{ "version": "https://jsonfeed.org/version/1.1", "user_comment": "This feed allows you to read the posts from this site in any feed reader that supports the JSON Feed format. To add this feed to your reader, copy the following URL -- https://www.macstories.net/tag/ios-reviews/feed/json/ -- and add it your reader.", "home_page_url": "https://www.macstories.net/tag/ios-reviews/", "feed_url": "https://www.macstories.net/tag/ios-reviews/feed/json/", "language": "en-US", "title": "iOS Reviews – MacStories", "description": "Apple news, app reviews, and stories by Federico Viticci and friends.", "items": [ { "id": "https://www.macstories.net/?p=76527", "url": "https://www.macstories.net/stories/ios-and-ipados-18-the-macstories-review/", "title": "iOS and iPadOS 18: The MacStories Review", "content_html": "
iOS 7, introduced in 2013 as a profound redesign, was a statement from a company ready to let go of its best-selling OS’ legacy. It was time to move on. With iOS 8 a year later, Apple proved that it could open up to developers and trust them to extend core parts of iOS. In the process, a new programming language was born. And with last year’s iOS 9, Apple put the capstone on iOS 7’s design ethos with a typeface crafted in-house, and gave the iPad the attention it deserved.
\nYou wouldn’t have expected it from a device that barely accounted for 10% of the company’s revenues, but iOS 9 was, first and foremost, an iPad update. After years of neglect, Apple stood by its belief in the iPad as the future of computing and revitalized it with a good dose of multitasking. Gone was the long-held dogma of the iPad as a one-app-at-a-time deal; Slide Over and Split View – products of the patient work that went into size classes – brought a higher level of efficiency. Video, too, ended its tenure as a full-screen-only feature. Even external keyboards, once first-party accessories and then seemingly forgotten in the attic of the iPad’s broken promises, made a comeback.
\niOS 9 melded foundational, anticipated improvements with breakthrough feature additions. The obvious advent of Apple’s own typeface in contrast to radical iPad updates; the next logical step for web views and the surprising embrace of content-blocking Safari extensions. The message was clear: iOS is in constant evolution. It’s a machine sustained by change – however that may happen.
\nIt would have been reasonable to expect the tenth iteration of iOS to bring a dramatic refresh to the interface or a full Home screen makeover. It happened with another version 10 before – twice. And considering last year’s iPad reboot, it would have been fair to imagine a continuation of that work in iOS 10, taking the iPad further than Split View.
\nThere’s very little of either in iOS 10, which is an iPhone release focused on people – consumers and their iPhone lifestyles; developers and a deeper trust bestowed on their apps. Like its predecessors, iOS 10 treads the line of surprising new features – some of which may appear unforeseen and reactionary – and improvements to existing functionalities.
\nEven without a clean slate, and with a release cycle that may begin to split across platforms, iOS 10 packs deep changes and hundreds of subtle refinements. The final product is a major leap forward from iOS 9 – at least for iPhone users.
\nAt the same time, iOS 10 is more than a collection of new features. It’s the epitome of Apple’s approach to web services and AI, messaging as a platform, virtual assistants, and the connected home. And as a cornucopia of big themes rather than trivial app updates, iOS 10 shows another side of Apple’s strategy:
\nSometimes, change is necessary.
\n\nAs more features have been added to iOS over the years, its first-run setup flow has become bloated, if not downright unintuitive.
\niOS 10 doesn’t take any meaningful steps to simplify the setup of a new iOS device, which is mostly unchanged from iOS 9. The only notable difference is the action required to begin the setup process, which is now “press Home to open”. As I’ll explore later, there’s a reason for this.
\nWhere iOS 10 does break away from the old is in the system requirements needed to install the OS. Most devices from 2011 and 2012 aren’t compatible with iOS 10.
\nDevices supported by iOS 10.
Progress, of course, marches on, but there are other notable points in this move.
\nThe iPad 2 – perhaps the most popular iPad model to date – supported iOS 9 (in a highly constrained fashion) despite developers clamoring for its demise. After 5 years of service, Apple is cutting ties with it in iOS 10. By leaving the A5 and A5X CPUs behind, developers are now free to create more computationally intensive iPad apps without worrying about the lack of Retina display on the iPad 2 and the performance issues of the third-generation iPad holding them back.
\nLook closer, and you’ll also notice that Apple is dropping support for all devices with the legacy 30-pin dock connector. If a device can run iOS 10, it is equipped with a Lightning port.
\nIn addition to Lightning, every iOS 10-eligible iPad has a Retina display, but not every device comes with a Touch ID sensor yet, let alone a 64-bit processor, Apple Pay, or background ‘Hey Siri’ support.
\nIt’s going to be a while until Apple can achieve its vision of 64-bit and one-tap payments across the board, but it’s good to see them moving in that direction by phasing out hardware that no longer fits what iOS has grown into. iOS 10 is starting this transition today.
\n\nOne of the first interactions with iOS 10 is likely going to be an accidental swipe.
\nFor the first time since the original iPhone, Apple is changing the “Slide to Unlock” behavior of the iOS Lock screen. iOS 10 gets rid of the popular gesture altogether, bringing tighter integration with Touch ID and an overhauled Lock screen experience.
\nLet’s back up a bit and revisit Steve Jobs’ famous unveiling of the iPhone and Slide to Unlock.
\nAt a packed Macworld in January 2007, Jobs wowed an audience of consumers and journalists by demonstrating how natural unlocking an iPhone was going to be. Apple devised an unlocking gesture that combined the security of an intentional command with the spontaneity of multitouch. In Jobs’ words:
\n\n\n And to unlock my phone I just take my finger and slide it across.
\nWe wanted something you couldn’t do by accident in your pocket. Just slide it across…and boom.\n
As the iPhone evolved to accommodate stronger passcodes, a fingerprint sensor, and a UI redesign, its unlocking mechanism stayed consistent. The passcode number pad remained on the left side of the Lock screen; even on the iPad’s bigger display, the architecture of the Lock screen was no different from the iPhone.
\nWith the iPhone 6s, it became apparent that Slide to Unlock was drifting away from its original purpose. Thanks to substantial speed and accuracy improvements, the second-generation Touch ID sensor obviated the need to slide and type a passcode. However, because users were accustomed to waking an iPhone by pressing the Home button, Touch ID would register that initial click as a successful fingerprint read and unlock the device, blowing past the Lock screen with no time to check notifications.
\nIronically, the convenience of Touch ID became too good for the Lock screen. As I wrote in my story on the iPhone 6s Plus:
\n\n\n The problem, at least for my habits, is that there is useful information to be lost by unlocking an iPhone too quickly. Since Apple’s move to a moderately bigger iPhone with the iPhone 5 and especially after the much taller iPhone 6 Plus, I tweaked my grip to click the Home button not only to unlock the device, but to view Lock screen notifications as well. While annoying, the aforementioned slowness of previous Touch ID sensors wasn’t a deal-breaker: a failed Touch ID scan meant I could at least view notifications. When I wanted to explicitly wake my locked iPhone’s screen to view notifications, I knew I could click the Home button because Touch ID wouldn’t be able to register a quick (and possibly oblique) click anyway.
\nThat’s not the case with the iPhone 6s Plus, which posed a peculiar conundrum in the first days of usage. Do I prefer the ability to reliably unlock my iPhone with Touch ID in a fraction of a second, or am I bothered too much by the speed of the process as it now prevents me from viewing notifications on the Lock screen?\n
Apple is making two changes to the unlocking process in iOS 10 – a structural one, with a redesign of the Lock screen and its interactivity; and a behavioral one to rethink how unlocking works.
\nApple hopes that you’ll no longer need to click any button to wake an iPhone. iOS 10 introduces Raise to Wake, a feature that, as on the Apple Watch, turns on the iPhone’s display as soon as it’s picked up.
\nRaise to Wake
\nRaise to Wake is based on a framework that uses sensors – such as the motion coprocessor, accelerometer, and gyroscope – to understand if a phone has been taken out of a pocket, but also if it’s been picked up from a desk or if it was already in the user’s hands and its elevation changed. Due to ergonomics and hardware requirements, Raise to Wake is only available on the iPhone 6s/7 generations and it’s not supported on the iPad.
\nApple has learned from the first iterations of watchOS: Raise to Wake on the iPhone 6s and iOS 10 is more accurate than the similar Watch feature that shipped in 2015. In my tests, Raise to Wake has worked well when taking the iPhone out of my pocket or picking it up from a flat surface; it occasionally struggled when the iPhone was already in my hands and it was tricky for the system to determine if it was being raised enough. In most everyday scenarios, Raise to Wake should wake an iPhone without having to click the Home or sleep buttons.
\nRaise to Wake is only one half of the new unlocking behavior in iOS 10: you’ll still need to authenticate and unlock a device to leave the Lock screen. This is where the iPhone’s original unlocking process is changing.
\nTo unlock a device running iOS 10, you need to click the Home button. If the display is already on and you place your finger on the Touch ID sensor without clicking it – as you used to do in iOS 9 – that won’t unlock the device. By default, iOS 10 wants you to physically press the Home button.
\nBye, slide to unlock.
This alteration stems from the unbundling of fingerprint recognition and Home button click, which are now two distinct steps. Placing a finger on Touch ID authenticates without unlocking; pressing the Home button unlocks.
\nIn Apple’s view, while Raise to Wake turns on the display, authentication may be required to interact with features on the Lock screen – such as actionable notifications, widgets, or Spotlight results. With iOS 10, users can pick up an iPhone, view what’s new on the Lock screen, and authenticate (if necessary1) without the risk of unlocking it.
\nFrom a design standpoint, this change is reflected in the icons and messages displayed to the user on the Lock screen. When the display turns on with Raise to Wake, a padlock icon in the status bar indicates that the user has not yet authenticated with Touch ID. At the bottom, a ‘Press home to unlock’ message replaces the old ‘slide to unlock’ one.
\nLocked.
With the display on and after Touch ID authentication, ‘Press home to unlock’ becomes ‘Press home to open’ and the status bar lock switches to an ‘Unlocked’ message.
\nUnlocked.
Under the hood, clicking the Home button and placing a finger on Touch ID are two separate actions. However, the wording of ‘Press home to unlock’ feels like Apple wants you to think of them as one. The entire message is an illusion – pressing the Home button by itself doesn’t actually unlock a device – but Raise to Wake combined with the second-generation Touch ID will make you believe in it.
\nOn an iPhone 6s, one click on the Home button is all that’s needed to exit the Lock screen – at least most of the time. If the iPhone’s display is off because Raise to Wake didn’t work (or because you manually locked it while holding it), the experience is similar to iOS 9. Clicking the Home button with a Touch ID-enabled finger will wake up the display and bypass the Lock screen.
\nYou can revert to a pre-iOS 10 unlocking experience if you don’t like the new one. First, Raise to Wake can be disabled in Settings > Display & Brightness, and your iPhone will no longer turn on when picked up. Additionally, tucked away in Settings > General > Accessibility > Home Button, you’ll find an option called ‘Rest Finger to Open’. When enabled, your iPhone will unlock through Touch ID alone, without having to press the Home button.
\nIt takes some time to get used to the new unlocking behavior of iOS 10. The apparent unification of Home button click and Touch ID makes less sense on devices without the second-generation sensor, where one click is rarely enough and tends to bring up the passcode view for a second attempt. And, nostalgically speaking, I miss the old ‘slide to unlock’ message, although for reasons that are merely emotional and not related to function.
\nAfter three months, Raise to Wake and Press to Unlock have made the overall unlocking experience faster and more intuitive. I now expect my iPhone to know when it’s time to wake up and show me the Lock screen, and I don’t miss the old unlocking process. Raise to Wake eliminates the need to click a button to wake an iPhone; having to press the Home button to unlock removes the risk of accidentally leaving the Lock screen.
\nBut it all goes back to that accidental swipe. Picture this: you’ve just upgraded to iOS 10, or you’ve bought a new iPhone with iOS 10 pre-installed, and, instinctively, you slide to unlock. What you’re going to see isn’t an error message, or the Lock screen bouncing back, telling you that you need to press the Home button instead. You’re going to see the biggest change to the Lock screen – potentially, a better way of interacting with apps without unlocking a device at all.
\nSlide to unlock, and you’ll meet the new Lock screen widgets.
\n\nTechnically, Lock screen widgets predate iOS 10. On both the iOS 8 and iOS 9 Lock screens, users could swipe down to reveal Notification Center and its Today view. However, iOS 10 adds an entirely new dimension to the Lock screen, as well as a refreshed design for widgets throughout the system.
\nThe Lock screen’s renovation in iOS 10 starts with three pages: widgets and search on the left, the Lock screen (with notifications and media controls) in the middle, and the Camera on the right. You can swipe to move across pages, as suggested by pagination controls at the bottom of the Lock screen.
\n\nThe leftmost page, called the Search screen, isn’t completely new either. Apple took the functionality of Spotlight search and Proactive of iOS 9, mixed it up with widgets, and made it a standalone page on the iOS 10 Lock screen (and Home screen, too).
\nFrom left to right: Lock screen widgets on the Search screen; Notification Center; widgets in Notification Center.
Notably absent from iOS 10’s Lock screen is the Camera launcher button. By getting rid of the tiny shortcut in the bottom right corner, Apple has made the Camera easier to launch: swiping anywhere to move between Lock screen and Camera is easier than carefully grabbing an icon from a corner. I’ve been taking more spontaneous, spur-of-the-moment pictures and videos thanks to iOS 10’s faster Camera activation on the Lock screen.
\nApple’s swipe-anywhere Lock screen navigation has one caveat. If notifications are shown, swiping horizontally can either conflict with actionable buttons (a left swipe) or open the app that sent a notification (a right swipe). You’ll have to remember to swipe either on the clock/date at the top or from the edge of the display; such is the trade-off of using the same gestures for page navigation and notification actions.
\n\nThree changes stand out when swiping right to open the Search screen:
\nUnlike their predecessors, widgets in iOS 10 don’t blend in with the dark background of Notification Center. This time, Apple opted for standalone units enclosed in light cells with an extensive use of custom interfaces, buttons, images, and dark text.
\nWidgets in Notification Center on iOS 9 and iOS 10.
There’s a common thread between widgets and notifications (also redesigned in iOS 10): they’re self-contained boxes of information, they sit on top of the wallpaper rather than meshing with it, and they display an app’s icon and name in a top bar.
\nNotifications and widgets. Spot the trend.
The new design is more than an aesthetic preference: the makeover has also brought functional changes that will encourage users and developers to rethink the role of widgets.
\nA widget in iOS 10 supports two modes: collapsed and expanded. The system loads all widgets in collapsed mode by default, which is roughly the height of two table rows (about 110 points). All widgets compiled for iOS 10 must support collapsed mode and consider the possibility that some users will never switch to the expanded version. Apps cannot activate expanded mode on the user’s behalf; switching from compact to expanded is only possible by tapping on a ‘Show More’ button in the top right corner of a widget.
\nCompact and expanded widgets.
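\nFor developers, the two modes map to a small API surface in the NotificationCenter framework. A minimal sketch of how a Today widget opts into expansion and responds to mode changes (the class name and the 280-point expanded height are illustrative, not from the original article):

```swift
import UIKit
import NotificationCenter

// A hypothetical Today widget view controller supporting both modes.
class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt in to the 'Show More' button. Without this line, the
        // widget stays compact-only and the button never appears.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    // Called when the user taps 'Show More' or 'Show Less'.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        if activeDisplayMode == .compact {
            // The compact height is fixed by the system (about 110 points).
            preferredContentSize = maxSize
        } else {
            // In expanded mode, request only the height the content needs.
            preferredContentSize = CGSize(width: maxSize.width, height: 280)
        }
    }
}
```

Note that the system, not the app, decides when the mode changes; the widget only reports its preferred size for each mode.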
This is no small modification, as it poses a problem for apps that have offered widgets since iOS 8. Under the new rules, apps updated for iOS 10 can’t show a widget that takes up half of the display as soon as it’s installed. Any widget that wants to use more vertical space for content – such as a todo list, a calendar, or even a list of workflows – will have to account for the default compact mode.
\nFor some developers, this will mean going back to the drawing board and creating two separate widget designs, as they’ll no longer be able to enforce a single one. Others will have to explain the difference to their users. Workflow, which used to offer a widget that could dynamically expand and collapse, is updating the widget for iOS 10 with a label to request expansion upon running a workflow that needs more space.
\nWorkflow’s new iOS 10 widget.
There’s one exception: legacy iOS 9 apps that haven’t been updated for iOS 10. In that case, the system won’t impose compact mode and it won’t cut off old widgets (which keep a darker background), but there’s a strong possibility that they won’t look nice next to native iOS 10 ones.
\nThe same widget in iOS 9 legacy mode and with native iOS 10 support.
I don’t see how Apple could have handled this transition differently. Design updates aside, there’s an argument to be made about some developers abusing Notification Center with needlessly tall and wasteful widgets in the past. Compact mode is about giving control to the users and letting them choose how they prefer to glance at information. Want to install a widget, but don’t need its full UI? Use it in compact mode. Need to get more out of it? Switch to expanded.
\nApple’s decision to adopt compact and expanded modes in iOS 10 is a nod to developers who shipped well-designed widgets in the past, and it provides a more stable foundation going forward.
\nI’ve been able to test a few third-party iOS 10 widgets that illustrate the advantages of these changes.
\nPCalc, James Thomson’s popular iOS calculator, has a new widget that displays a mini calculator in compact mode with numbers and basic operations split in two rows.
\nDespite the small touch targets, the compact interface is usable. If you want bigger buttons and a more familiar layout, you can switch to expanded mode, which looks like a small version of PCalc living inside a widget – edge-to-edge design included.
\nLauncher doesn’t modify its widget’s interface when toggling between compact and expanded, but the constraints of the smaller layout force you to prioritize actions that are most important to you.
\nUsing compact mode for summary-style UIs will be a common trend in iOS 10. CARROT Weather is a good example: it shows a summary of current conditions when the widget is compact, but it adds forecasts for the day and week ahead when expanded.
\n\nEven better, slots in the compact layout can be customized in the app, and you can choose to use the widget in light or dark mode.
\nDrafts has an innovative implementation of compact and expanded layouts, too. In compact mode, the widget features four buttons to create a note or start dictation. When expanded, the widget grows taller with a list of items from the app’s inbox, which can be tapped to resume editing.
\nIn the past, developer Greg Pierce would have had to ask users to customize the widget or make it big by default; in iOS 10, they can switch between modes as needed.
\nWidgets’ ubiquitous placement pushes them to a more visible stage; as soon as more developers adapt4, iOS 10 has the potential to take widgets to the next level.
\nI believe the new design will play an essential role in this.
\nApple advertises legibility and consistency as core tenets of widgets in iOS 10, and I agree: widget content and labels are easier to read than in iOS 9. Standalone light cells separate widgets more cleanly; I haven’t found translucency with the Lock screen wallpaper to be an issue.
\nIn addition, the light design brings deeper consistency between apps and widgets. Most iOS apps have light backgrounds and they employ color to outline content and indicate interactivity. In iOS 10, widgets are built the same way: the combination of light backgrounds, buttons, and custom interfaces is often consistent with the look of the containing app.
\nIn this regard, widgets feel more like mini-apps available anywhere rather than smaller, less capable extras. The line between widget and full app UIs is more blurred than ever in iOS 10.
\nApple’s new Notes and Calendar widgets showcase this newfound cohesiveness. The Notes widget displays the same snippets of the list in the Notes app. Buttons to create new notes and checklists are also the same. The widget looks and feels like a small version of Notes available anywhere on iOS.
\n\nThe Calendar widget is even more indicative. Glancing at events and recognizing their associated calendar wasn’t easy in iOS 9, as they only had a thin stripe of color for the calendar to which they belonged.
\nThe Calendar widget is more contextual on iOS 10.
In iOS 10, forgoing a dark background has allowed Apple to show Calendar events as tinted blocks matching the look of the app. Discerning events and the calendars they belong to is easier and familiar.
\nConsistency of apps and widgets.
I wouldn’t expect every app to adopt a widget design that exactly mirrors the interface users already know, but it can be done. Switching to a light design has given Apple a chance to reimagine widgets for consistency with apps and lively combinations of color, text, and icons. They are, overall, a step up from iOS 9 in both appearance and function.
\nThe new direction also opens up a future opportunity: what is light can be more easily converted to dark. I could see a system dark mode working well for widgets.
\nThe iPad’s Lock screen doesn’t break any new ground, but there are some differences from the iPhone.
\nOn the iPad, notifications are displayed on the left side of the screen when in landscape. They’re aligned with the system clock, and they leave room for media controls to be displayed concurrently on the right. Dealing with notifications while controlling music playback is a task well suited for the iPad’s larger display.
\nUnfortunately, Apple doesn’t think portrait orientation should warrant the same perks. If a notification comes in while album artwork is displayed on the Lock screen, the artwork will be hidden. Apple decided against using a two-column layout in portrait, which I don’t understand: they’re already doing it for widgets on the iPad.
\n\nFurthermore, if no music is playing on an iPad in landscape, having notifications aligned to the left for no apparent reason looks odd and seems…unnecessary.
\nThe right side seems cozy.
Widgets fare a little better. Apple has kept the two-column design first introduced in the Today view of iOS 9; you can still scroll the two lists of widgets independently.
\nI would have appreciated the ability to further control the resizing and placement of widgets on the iPad, and the Lock screen design seems uninspired. We’ll have to make the most of this bare minimum work for now.
\niOS 10 makes widgets more modular. Apple has done away with grouping multiple types of content under Siri Suggestions – most Apple apps and features have their own widget, which can be disabled from a revamped configuration screen.
\nWidgets’ new configuration screen.
Here’s an overview of what’s changed.
\nActivity
\nYour Activity rings from the Apple Watch, with a summary of Move, Exercise, and Stand statistics.
\nCalendar
\nA mini calendar interface. Events are displayed as colored blocks matching the calendar they belong to. You can tap on an event to open it, and expand the widget to reveal more events.
\nFavorites
\nShortcuts to your favorite contacts with different ways to get in touch with them. New in iOS 10, you can assign iMessage as well as third-party communication apps (messaging and VoIP) to contact entries in Favorites, which will be displayed in the widget.
\nMail
\n…yeah.
The Mail widget is the weakest of the bunch: it only displays shortcuts for VIP contacts. I would have preferred to see a preview of the unified inbox, or perhaps an option to show flagged messages.
\nMaps
\nMaps has three widgets: destinations, nearby, and transit. While the latter isn’t available for my area (Rome, Italy), the other two have worked inconsistently. I’ve never seen a nearby recommendation in the widget, despite being around places rich in POIs. The Destinations widget usually tells me how much time it’ll take me to drive home, but it doesn’t proactively suggest other locations I frequently visit.
\nMusic
\nThe Music widget is an odd one. It displays a grid of what appears to be either recently played music or your all-time most listened albums. The widget doesn’t clarify whether it’s showcasing albums or individual songs; it uses album artwork with no text labels, and it plays either the most played song from an album, or an entire album starting from the first song.
\nA nice perk: music starts playing after tapping the widget without opening Apple Music. But it always feels like a lottery.
\nNews
\nTop Stories from Apple News (shown even if you mute the channel). The widget uses image thumbnails and custom typography matching the bold font of Apple News for headlines.
\nThe best change from iOS 9: news can be disabled by removing the widget.
\nNotes
\nA preview of your most recent notes. In compact mode, the widget only shows the last modified note. In expanded mode, you get more notes, plus buttons to create a new note or checklist, snap a picture, and start a drawing.
\nPhotos
\nA collection of Memories created by the new Photos app in iOS 10. Each one can be tapped to view the associated memory in Photos.
\nSiri App Suggestions
\niOS 9’s proactive Siri Suggestions are now smaller in scope and they’re called Siri App Suggestions. The widget displays 4 app shortcuts (8 in expanded mode), and it doesn’t suggest other types of content.
\nLike News, it can also be removed and be placed anywhere on the Search screen.
\nTips
\nYou’d think that the Tips widget is useless – everyone likes to make fun of Tips – but hear me out. In compact mode, the widget shows a tip’s snippet; you can tap it and open the Tips app. Switch to expanded mode, though, and you’ll be presented with a custom interface with an explanation of the tip and a large animation at the top to show you the tip in action.
\nThe Tips widget looks great, and it’s the most technically impressive one on iOS 10.
\nUp Next
\nThe old Today Summary widget has been renamed Up Next. It displays a smaller version of your next event without the full UI of the Calendar widget. Alas, the Tomorrow Summary widget is gone from iOS 10.
\nWeather
\nPerhaps the best example of how widgets can use compact and expanded modes, Apple’s Weather widget shows weather conditions for the current location when compact, and a forecast of the next six hours when expanded.
\nWeather is the widget I’ve used the most in the past three months to look up forecasts from the Lock screen in just a couple of seconds.
\nThe move to apps as atomic units scattered across the system is everywhere in iOS 10, with widgets being the foremost example.
\nNoticeably absent from iOS 10’s widgets is a push for more proactive recommendations. As we’ll see later, Apple has shifted its Proactive initiative to run through the OS and inside apps rather than distilling it into widgets.
\n3D Touch is another illustrious no-show. While notifications have been overhauled to make good use of 3D Touch, pressing on a widget will result in a disappointing lack of feedback. 3D Touch would be a perfect fit for widgets – imagine previewing a full note or reading the first paragraphs of a news story from the Lock screen.
\nThe new widget design and Search screen placement make an iPhone more useful without having to unlock it. Apple has done a good job with their built-in widgets; it’s up to developers now to rethink how their apps can take advantage of them. I’m optimistic that everything will turn out better than two years ago.
\nI unlock my iPhone less thanks to iOS 10’s more capable Lock screen. Raise to Wake, Press to Open, widgets, search, and rich notifications make the entire Lock screen experience drastically superior to iOS 9.
\nEasier to navigate, better structured, less prone to unwanted unlocks. I wouldn’t be able to go back to the old Lock screen.
\n\niOS 10’s rethinking of apps as granular interactions doesn’t stop at widgets. With a new framework that can turn incoming notifications into rich, actionable interfaces, Apple wants users to spend less time jumping between apps.
\nNotifications in iOS 9 and 10.
Notifications in iOS 10 share the same design principles of widgets. Rather than being grouped in a list of items on top of a dark background, notifications are discrete light cells that can be pressed (with 3D Touch), pulled down (for incoming banners), or swiped and expanded into a floating card preview.
\nExpanding a Messages notification.
\nThe anatomy of an expanded notification – whether an app has been updated for iOS 10 or not – has fixed elements that developers can’t control. There’s a header bar at the top with the icon and name of the app, and a close button on the right to dismiss the notification. Tapping the icon on the left side will open the app that sent the notification.
\nThe standard look of a notification in iOS 10.
This is true for both iPhones with 3D Touch and devices without it; to expand a notification on an iPad or an older iPhone (or if you don’t want to use 3D Touch), you can pull down an incoming notification banner or swipe a notification to the left in Notification Center and tap ‘View’.5
\nNew APIs allow developers to take different actions for notifications that have been sent to the user – including ones that have been cleared. First, notifications can be dismissed with a Clear action by swiping on them. Apps can monitor the dismiss action and stop delivering the same notification on other devices.
\nAdditionally, developers can remove, update, and promote notifications that have already been sent. Apple’s goal was to prevent Notification Center from being cluttered with old notifications that aren’t relevant anymore. If developers implement this API, updating a notification with fresh content should help users see what’s changed. Imagine sports scores or live-streaming apps and how they could update notifications. I’m curious to see which services will convert to this behavior instead of spamming users with multiple alerts.
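\nA sketch of how this looks with the UserNotifications framework, assuming a hypothetical sports app; the key mechanism is that re-adding a request with the same identifier replaces the delivered notification instead of stacking a new one (the identifier and scores here are illustrative):

```swift
import UserNotifications

let center = UNUserNotificationCenter.current()

// Update a delivered notification in place: a new request with the same
// identifier ("score-update", hypothetical) replaces the old alert.
let content = UNMutableNotificationContent()
content.title = "Juventus vs. Roma"
content.body = "2–1, 78th minute"
let request = UNNotificationRequest(identifier: "score-update",
                                    content: content,
                                    trigger: nil) // nil trigger = deliver now
center.add(request, withCompletionHandler: nil)

// When the match ends and the alert is no longer relevant,
// remove it from Notification Center entirely.
center.removeDeliveredNotifications(withIdentifiers: ["score-update"])
```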
\nUnderneath the header of an expanded notification is the content developers can control, and where the most important changes to notifications are happening.
\nIn iOS 10, notifications can have a title and a subtitle. The title is displayed in a bold font, which helps identify the subject of a notification. In a Reminders notification, the name of a reminder will be the bold title at the top, with its note displayed as text content below it.
\nThe default look of a notification in iOS 10. Expansion is relative to a notification’s placement on screen.
Below the title and subtitle, iOS 10 shows a notification’s body text content (same as iOS 9) and actionable buttons. In a welcome change from the past, developers can define more than two notification actions, displayed in a list under the notification’s card.6 If an app requires a quick reply upon expanding a notification, the input field will sit above the keyboard – it’s not attached to the notification like in iOS 9.
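\nA minimal sketch of these pieces fitting together: a notification content object with a title and subtitle, plus a category registering more than two actions. The category and action identifiers ("EVENT_INVITE" and friends) are hypothetical names for illustration.

```swift
import UserNotifications

// Four actions – more than the two iOS 9 allowed – shown as a list
// under the expanded notification card.
let accept  = UNNotificationAction(identifier: "ACCEPT",  title: "Accept",  options: [])
let decline = UNNotificationAction(identifier: "DECLINE", title: "Decline", options: [.destructive])
let maybe   = UNNotificationAction(identifier: "MAYBE",   title: "Maybe",   options: [])
let later   = UNNotificationAction(identifier: "REMIND_LATER", title: "Remind Me Later", options: [])

let category = UNNotificationCategory(identifier: "EVENT_INVITE",
                                      actions: [accept, decline, maybe, later],
                                      intentIdentifiers: [],
                                      options: [])
UNUserNotificationCenter.current().setNotificationCategories([category])

let content = UNMutableNotificationContent()
content.title = "Dinner at Seven"          // rendered in bold
content.subtitle = "From Federico"          // new in iOS 10
content.body = "Are you joining us tonight?"
content.categoryIdentifier = "EVENT_INVITE" // links this notification to its actions
```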
\nQuick replies in iOS 9 and iOS 10.
Design changes alone, though, wouldn’t have sufficed to modernize notifications. To reinvent their feel and capabilities, Apple has created two new extension points for developers in iOS 10: Notification Service and Notification Content.
\nThe Notification Service extension doesn’t have an interface and runs in the background. When a notification is triggered, but just before it’s delivered to the user, an app can invoke its Notification Service extension to augment or replace the payload. The extension is meant to have a short execution time and isn’t designed for long tasks. Possible use cases for Notification Service extensions could be downloading an image or media file from a URL before showing a notification, or locally decrypting an encrypted payload for messaging apps that rely on end-to-end encryption.
\nThe Notification Service extension should come in handy given iOS 10’s ability to include a media attachment (images, audio, videos, and even GIFs) in both the notification banner and the expanded notification. If they adopt it, apps like WhatsApp and Telegram could omit the “[Contact] sent you an image” standard notification and display a thumbnail in the notification banner (like iMessage does) and a full image preview in the expanded notification.
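\nA sketch of such a service extension might look like this, assuming a hypothetical "image-url" key in the payload (apps define their own payload format, and I’m assuming a JPEG for the file extension):

```swift
import UserNotifications

// Notification Service extension: downloads an image referenced in the
// payload and attaches it before the notification is shown.
class NotificationService: UNNotificationServiceExtension {
    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttemptContent: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttemptContent = request.content.mutableCopy() as? UNMutableNotificationContent

        guard let content = bestAttemptContent,
              let urlString = content.userInfo["image-url"] as? String, // hypothetical key
              let url = URL(string: urlString) else {
            contentHandler(request.content) // nothing to do; deliver as-is
            return
        }

        // Must finish quickly; the system calls serviceExtensionTimeWillExpire otherwise.
        URLSession.shared.downloadTask(with: url) { location, _, _ in
            if let location = location {
                // Attachments need a recognizable file extension; assuming JPEG here.
                let dest = location.deletingPathExtension().appendingPathExtension("jpg")
                try? FileManager.default.moveItem(at: location, to: dest)
                if let attachment = try? UNNotificationAttachment(identifier: "image",
                                                                  url: dest,
                                                                  options: nil) {
                    content.attachments = [attachment]
                }
            }
            contentHandler(content)
        }.resume()
    }

    override func serviceExtensionTimeWillExpire() {
        // Time's up: deliver the best content we have so far.
        if let contentHandler = contentHandler, let content = bestAttemptContent {
            contentHandler(content)
        }
    }
}
```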
\nNotification Content extensions are what users are going to see the most in daily usage, and they motivate iOS 10’s notification card design.
\nA notification in iOS 10 can show a custom view between the header and default text content. Custom views can be anything – an embedded map, a message conversation, media, a calendar view, etc. – and they’re managed by the Notification Content extension. Custom views are non-interactive: they can’t receive touch events7, but they can be updated in-place in response to a task performed from a notification action. Apps can hide the default content of a notification if the custom view is informative enough.
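\nThe shape of a Content extension, sketched with a hypothetical label outlet and action identifier, could be along these lines. The system instantiates the view controller for the categories declared in the extension’s Info.plist and hands it the incoming notification:

```swift
import UIKit
import UserNotifications
import UserNotificationsUI

// Notification Content extension: supplies the custom (non-interactive)
// view shown between the header and the default text content.
class NotificationViewController: UIViewController, UNNotificationContentExtension {
    @IBOutlet var eventLabel: UILabel!   // hypothetical outlet in the extension's storyboard

    func didReceive(_ notification: UNNotification) {
        // Populate the custom view from the notification's payload.
        eventLabel.text = notification.request.content.body
    }

    // Optional: update the view in place when the user taps a notification action.
    func didReceive(_ response: UNNotificationResponse,
                    completionHandler completion: @escaping (UNNotificationContentExtensionResponseOption) -> Void) {
        if response.actionIdentifier == "ACCEPT" {   // hypothetical action identifier
            eventLabel.text = "You're in!"
            completion(.doNotDismiss)                // keep the card open, showing the update
        } else {
            completion(.dismiss)
        }
    }
}
```

The `.doNotDismiss` option is what enables the in-place updates described above: the card stays on screen and reflects the result of the action the user just performed.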
\nService and Content extensions, combined with the expanded design, have turned notifications in iOS 10 into a completely new experience. Notifications are no longer just text: they are custom app UIs delivered to you with rich previews and interactions that can live on longer than a couple of seconds. Notifications in iOS 10 are mini apps in and of themselves.
\nWhen you receive an iMessage that contains a photo, the incoming notification can be expanded, either with 3D Touch or a swipe. You’ll be treated to a full iMessage conversation UI, living inside the notification, with the same transcript, read receipts, and typing indicators you’d see in the Messages app.
\nTo expand a notification, you can pull it down or press on it.
Not only can you send a reply – you can keep the iMessage interface open as you keep a conversation going from the notification. It’s a fantastic way to check into a conversation without the constraints of a quick reply.
\nScroll up in the transcript to view older messages.
When you’re done, swipe down to dismiss the notification, and you’ll be back to whatever you were doing.8
\nCalendar notifications follow the same concept. If an event with a location attached is coming up, the expanded notification will display the default text content at the bottom, but also a preview of the address with a Maps view at the top.
\nThanks to actionable buttons, you can open directions in Maps without launching Calendar. If an upcoming event doesn’t have a location, you’ll see a preview of your agenda inside the notification.
\nI tested a version of Workflow optimized for iOS 10, which brings improved notification support with the ability to customize the content displayed in a notification card. In addition to a title, you’ll be able to embed pictures, videos, GIFs, and even Maps views into a Workflow notification.
\nRich notifications created with Workflow.
Pictures are displayed as thumbnails in a notification banner before expanding it; videos can be played inline within the card itself.
\nAnd if you often receive messages containing GIFs, iOS 10 will let you preview them directly from a notification.
\nCARROT Weather has a clever take on rich notifications in iOS 10. The daily digest and severe weather/precipitation alerts can be expanded into dynamic preview cards.
\n\nThrough a Notification Content extension, the app can embed a custom interface, sounds, and even animations inside the notification card. As a result, viewing CARROT’s notifications feels more like using the app rather than reading a plain text summary.
\nWith a new framework and the flexibility granted by extensions, we’re going to see a rise of interaction methods fueled primarily by notifications. Of all the places where an app can advertise its functionality on iOS (widgets, keyboards, extensions), a notification is the most direct, contextual way to reach users at an appropriate time.
\nA notification carries interest and, in many cases, a sense of urgency. iOS 10 transforms notifications from a passive delivery system into an active experience where users engage with an app through UIs, actions, and feedback they’re already familiar with. It’s a win-win for developers, who can make their apps more useful through richer notifications, and for users, who no longer have to open apps to benefit from their services.
\niOS 10’s notifications are a new layer on top of apps. They’re going to change how we deal with them every day.
\n\nThe iPhone 6s brought the first significant adjustment to the iOS Home screen in years – 3D Touch quick actions. With iOS 10, Apple is cautiously expanding the Home screen beyond app shortcuts, but in ways you might not expect.
\nSearch from the Home screen: pull down (left) or swipe right to open the new Search screen.
As in iOS 9, Spotlight search can be accessed from two locations: the Search screen on the left side of the Home screen and by pulling down on app icons. The Search screen on the left mirrors its Lock screen counterpart.
\nNotification Center has gone through some deeper changes. The segmented control to switch between notifications and widgets at the top is gone, replaced by another set of page indicators. Every time you open Notification Center, iOS 10 will default to showing you notifications in chronological order under a new ‘Recent’ header – it doesn’t remember your position in the two pages. Unfortunately, the option to group notifications by app has also been removed.
\nWhether by laziness or deliberate design, there’s an abundance of ways to activate Spotlight search in iOS 10. Let’s round them up:
\nThat’s seven ways to open Spotlight search on iOS 10.
\n\nBeing able to access search from everywhere – be it on the Home screen, the Lock screen, or when using an app – is convenient. It makes Spotlight pervasive. As Apple continues to grow their search efforts across native apps, web partnerships, and Proactive suggestions, Spotlight’s omnipresence will become a valuable strategic asset.
\nApple continues to be a steadfast supporter of the Home screen as a grid of icons. In a potential disappointment for those who hoped to see a major Home screen refresh this year, the biggest new feature is an extension of 3D Touch quick actions and widgets, rolled into one.
\n\nApps that offer a compact widget in iOS 10 can display it alongside quick actions when a user presses the app’s icon. The widget is the same used in the Search screen – in fact, there’s a button to install it directly from the Home screen.
\niPhone Plus models can display quick actions and widgets on the landscape Home screen as well.
I’m not sure I buy into Apple’s reasoning for combining widgets and quick actions – at least not yet. The glanceability of widgets finds its raison d’être on the Lock screen and inside apps; on the other hand, I associate going back to the Home screen and pressing an icon with launching, not glancing. Years of iOS usage trained me to see the Home screen as a launchpad for apps, not an information dashboard.
\nIn three months of iOS 10 – and with plenty of glanceable/actionable widgets to test – I’ve only remembered to use a widget on the Home screen once (it was PCalc). It’s not that having widgets alongside quick actions is bad; it’s just forgettable. It’s the equivalent of two neighbors being forced to live together under the same roof. Having company can be nice sometimes, but everyone would be better off at their own place.
\nThere are other smaller 3D Touch additions to the Home screen in iOS 10. You can press on folders to bring up a Rename action, and apps inside folders that have unread badges will be listed in the folder’s quick action menu.
\nFolders have also received a visual refresh, with a nicer background blur that shows the grid of icons in the current Home screen page.
\nOn the iPad, Apple didn’t bring any improvements to the Home screen in iOS 10, but I’m sure you’ll be relieved to know that closing an iPad app no longer adjusts the icon’s corner radius on the Home screen.
\nThis relates to a deeper change happening to Home screen animations. Apple has rebuilt the entire SpringBoard animation stack with faster, interruptible animations. Launch animations now have a shorter curve (slow app launching was one of the most criticized aspects of iOS 7), and you can click the Home button right after tapping an app’s icon to stop the animation and return to the Home screen in an instant.
\nHome screen animations
\nYou can try the same with a folder: tapping outside of it will cancel the animation instantly in mid-flight. The difference with iOS 9’s Home screen animations is staggering.
\nHome screen animations
\nThey’re not a “feature”, but new animations are the best Home screen change in iOS 10.
\nIt’s fair to wonder if Apple will ever desecrate the sanctity of the Home screen and allow users to mix icons and widgets.
\nAnyone who’s ever looked at Android will spot obvious similarities between widgets for Google’s platform and what Apple has done with widgets in iOS 10. Apple still believes in the separation of icons and app content; they only added widgets to 3D Touch quick actions and they didn’t even allow the iPad Pro’s large Home screen to go beyond icons. But for how long?
\nThe iOS Home screen has served us well for years, but as screens keep getting bigger, it’s time to do more than a grid of icons with quick actions. The other side of the fence is closer than ever; a final leap wouldn’t be too absurd.
\n\nSince its introduction in 2013, Control Center has become a staple of iOS, providing users with a panel of commonly accessed shortcuts. iOS 10’s Control Center is a radical shift from its origins, and a harbinger of how iOS is changing.
\nControl Center’s design has evolved over the years, from the wireframe-like look of iOS 7 to the friendlier, rounder buttons of iOS 9.
\n\nApple, however, was led astray by the expansion of iOS, to the point where cramming more functionality into Control Center turned into a balancing act of prioritizing important controls without sacrificing their purpose.
\nIt was clear that Control Center’s original vision couldn’t scale to the growing nature of iOS. And so with iOS 10, Apple has torn down Control Center and started from scratch. The single-page mosaic of tiny buttons is no more. The new Control Center breaks up system shortcuts and audio controls in two separate pages, with the addition of a third page for HomeKit (if available). Everything’s bigger, spacious, and colorful.
\n\nYou still open Control Center with a swipe from the bottom of the display. In iOS 10, swiping pulls up a card with paginated controls underneath it. The design is familiar, yet unmistakably new. Margins across each side convey the card metaphor; controls are bigger and buttons have more padding; there’s more color in every card.
\nAfter three years of Control Center, the new version in iOS 10 feels lively and friendly; perhaps even more fun. On the other hand, pagination and bigger controls raise a question: has simplicity come at the expense of efficiency in Control Center?
\nA useful exercise to understand Control Center in iOS 10 is to take stock of how much Apple is leaving behind. Let’s compare iOS 9’s Control Center to the same screen in iOS 10:
\nThe first page of Control Center in iOS 10 has lost audio playback. Initially, that may feel like a downgrade. But let’s swipe left and consider what Control Center has gained by separating system and audio controls:
\nThe difference is striking. Giving audio playback its own space lets Control Center present more information for the media being played. It’s also more accessible thanks to bigger text labels, buttons that don’t need to be carefully tapped, and hardware controls embedded in the same page.
\nThis won’t be easy to accept for iOS power users who cherish dense UIs: Control Center buys into a trend followed by many (but not all) parts of iOS 10. Big, bold controls, neatly laid out, spread over multiple views.
\nThe first beneficiary of such clarity is the system controls page. The first row of toggles at the top has kept iOS 9’s iconography and arrangement, but each button is color-matched to the setting it activates when toggled.9
\nControl Center is bleeding…four colors?
I found colored toggles extravagant at first; now, I like that I can glance at those buttons and know which setting is engaged.
\nDon’t forget about landscape mode.
The brightness slider and the AirPlay, AirDrop, and Night Shift buttons have been enlarged and simplified as well. For one, the slider’s puck is more comfortable to grab. The buttons reveal another tendency in iOS 10’s semi-refreshed design language: they’re actual buttons with rounded borders and they use color to indicate status.
\nIn a change that’s reminiscent of Sam Beckett’s fantastic concept, you can press on the bottom row of shortcuts to show a list of 3D Touch quick actions. These include three intensity levels for the flashlight, timer options, a shortcut to copy the last Calculator result, and different Camera modes.
\n\nAs I elaborated before, Control Center was an ideal candidate for 3D Touch actions. However, Apple’s implementation in iOS 10 is limited to the bottom row of apps; you can’t press on the Bluetooth icon to connect to previously paired devices, nor can you press on the Wi-Fi toggle to connect to a different network. The addition of 3D Touch to the lower end of Control Center shows that Apple recognizes the utility of quick actions for system-wide shortcuts, but they’re not fully committed to the idea yet.
\nDespite some missing features and growing pains to be expected with a redesign, iOS 10’s first Control Center page is an improvement. With a sensible reliance on color, a more legible layout, and the first steps toward full 3D Touch support, Control Center’s system card is easier to parse, nimble, and intuitive.
\nControl Center’s design direction has been taken to the extreme on the iPad. Only one page can be used at a time; the AirDrop, AirPlay, and Night Shift buttons are needlessly wide. It doesn’t take a design expert to figure out that Apple just wanted to ensure basic compatibility with an iPhone feature instead of designing Control Center around the iPad.
\nLook at it this way: if Control Center didn’t exist on the iPhone and Apple decided to introduce it on the iPad today, would it look like this?
\n\nThe lack of an iPad-first approach was passable in the old Control Center because of its compact design. But with iOS 10, following the iPhone’s model has a detrimental effect. Buttons are too big and little care went into optimizing the UI for the iPad’s screen. Apple should reconsider what they’re doing with Control Center on the iPad instead of upscaling their iPhone designs.
\nIn iOS 10, managing music and audio playback from Control Center is a richer experience, visually and functionally superior to iOS 9.
\nThe page is split into three areas: audio information and, for the first time, artwork at the top; progress, playback controls, and volume in the middle; hardware accessories at the bottom. This is true for Apple Music and Podcasts as well as third-party apps, which don’t need to optimize for iOS 10 to show album artwork.
\n\nI was skeptical when I saw that Apple moved audio controls to a separate card. The ubiquitous presence of an audio widget was my favorite aspect of Control Center; adding an extra step to reach it didn’t seem a good idea. After adjusting to Control Center’s audio page in the first month of iOS 10, I went back to iOS 9 and controlling music felt limited and bland.
\nThere are two aspects to Apple’s design worth noting. First, Control Center remembers the page you were using before dismissing it. If you swipe up, swipe left to open music playback, then close Control Center, the next time you open it, you’ll get the Now Playing card instead of being taken back to the first page. Thanks to this, having audio controls on a separate page hasn’t been a problem in my experience, but I wonder if Apple should allow reordering pages as an option.
\nSecond, the purpose of the redesign. With artwork and comfortable UI elements, the page feels like a miniaturized music app rather than a cumbersome mishmash of buttons and sliders. It’s almost as if Control Center was reimagined for how normal people like to know what’s playing.
\nFrom an interaction standpoint, artwork creates a bigger touch target that you can tap to be taken into the app playing audio10; in iOS 9, you had to precisely tap on a song’s small title in Control Center. There’s a deeper sense of context, too. Previously, it always took me a few seconds to read through a song’s information. With iOS 10, I can swipe up and glance at the artwork to see what I’m listening to.
\nThere’s a subtle touch I want to mention. When music is playing, artwork is big, it has a drop shadow, and Control Center says ‘Now Playing on…’ at the bottom with an icon for the device where audio output is happening. Hit pause, and the artwork shrinks, losing the drop shadow, as the ‘Now Playing…’ message disappears. Tap play again, and the artwork grows bigger with a delightful transition.
\nControl Center’s music playback
\nControl Center’s audio page has two functional problems Apple should address. Song details (title, artist, and album) have been turned into lines of text that don’t scroll and get cut off. Try to listen to songs with long titles – say, I’ve Got a Dark Alley and a Bad Idea That Says You Should Shut Your Mouth (Summer Song) – and you’ll be surprised Apple designers didn’t consider the issue.
\nThat Says…?
In addition, the ability to “love” songs to train Apple Music has been removed from Control Center (and the Lock screen). I don’t understand the decision, as having a dedicated page provides even more room for music controls.
\nDespite the merits of artwork and more intuitive controls, I don’t think Apple added a standalone audio card to Control Center for those reasons alone. To me, the most convincing explanation comes from the hardware menu:
\nPicking audio accessories in Control Center.
With just a few taps, you can connect to Bluetooth headphones or wireless speakers from anywhere on iOS without opening Settings. There’s an obvious subtext: for a device without a headphone jack, an easier way to switch between wireless audio accessories isn’t just a pet peeve – it’s a necessity.
\nAudio playback is the clear winner of the new Control Center in iOS 10. Apple freed themselves from the constraints of iOS 9’s tiny audio controls, and, after three years, music is claiming the prime spot it deserves in Control Center. The new audio page brings a more engaging, integrated listening experience that paves the road for what’s to come.
\nYou can’t use the third page of Control Center unless you’ve configured at least one HomeKit device. I don’t own a lot of HomeKit accessories (I have three Hue lights and a few Elgato sensors), but the new Home page has grown so much on me, I’m no longer using any third-party HomeKit widgets.
\nBesides being available to users with HomeKit devices, Control Center’s Home card only displays accessories and scenes that have been marked as favorites in the new Home app. The page doesn’t list every HomeKit accessory, nor does it work with third-party home automation devices that don’t support HomeKit.
\nIf you meet these requirements, you’ll be able to swipe over the Music card to reveal the Favorite Accessories view.
\nAccessory buttons carry a name and icon assigned in the Home app, and, if supported, a percentage label for intensity (lights have it, for example). A button in the top right lets you switch between accessories and scenes. To turn them on and off, you just tap a button once.
\nButtons can be long-tapped to open a detail screen with more options.11 For my Hue lights, holding a button for a fraction of a second reveals a vertical slider for intensity, which can be adjusted without lifting a finger off the screen.
\nA second layer of navigation is nested into the detail view. With multicolor lights, you can tap on a Colors button below the intensity slider to modify presets and open a color wheel to pick a different shade. The wheel even has a segmented control to switch between color and temperature – a surprisingly deep level of hierarchy for a Control Center page.
\n\nUnfortunately, accessories that only report basic status messages don’t have a useful detail view.
\nIn spite of my limited testing environment, Control Center has become my favorite way to manage HomeKit lights and scenes. It’s a testament to Apple’s penchant for native integrations: lights turn on immediately because commands don’t go through a third-party server, and the entire flow is faster than asking Siri to activate an accessory. I was a heavy user of third-party HomeKit widgets and apps before; on iOS 10, I have no reason to do that anymore thanks to Control Center.
\nIf Apple didn’t have big plans for the connected home, they wouldn’t have given HomeKit its own section in Control Center. With HomeKit expanding to new accessory lines, I think it’s going to be my second most used card after music.
\nAfter three years, Control Center is growing up. To make the pendulum swing back towards simplicity, Apple has traded some convenience of the original design for three standalone pages. By unbundling functionality in discrete units, Control Center is more legible, usable, and flexible.
\nThere are missteps. The lack of any kind of user customization is inexcusable in 2016. The bottom row of shortcuts, down to four icons again, still can’t be modified to accommodate user-selected apps. And you won’t be able to swap toggles at the top for settings you access on a frequent basis.
\nHalf-baked integration with 3D Touch feels like a timid attempt to take Control Center further. The addition of quick actions for apps in the first page is laudable, but why isn’t the same true for toggles at the top as well? And if HomeKit accessories can show nested detail views, why can’t Apple Music display a lyrics screen, too?
\nI want to believe that iOS 10’s Control Center is foreshadowing the ability for developers to provide their own “app pages” and for users to swap default shortcuts with their favorite ones. More than ever before, Control Center is ripe for extensibility and personalization. Like widgets, I can see a future where we interact with some types of apps primarily through mini interfaces in Control Center.
\nI wouldn’t have expected pagination to be what I wanted, but Apple was right in rethinking Control Center as a collection of pages rather than a complex unified dashboard. The majority of iOS users won’t be affected by Apple’s design trade-offs; they’ll appreciate a screen that doesn’t need a manual.
\nThe new Control Center experience isn’t a regression; it’s a much needed reassessment of its role in the modern iOS.
\n\nAs is evident by now, Apple has increased the presence of 3D Touch in iOS 10. On top of notifications, Control Center, and the Home screen, 3D Touch actions have been brought to more apps and system features.
\nNotification Center
\nLike on the Apple Watch, you can press on the Clear button in Notification Center to clear all notifications in one fell swoop. Finally.
\nSiri App Suggestions
\nApps suggested by Siri support 3D Touch to show the same quick actions available on the Home screen.
\nApple Music
\nAmong many changes, Apple Music has been given the extended 3D Touch treatment with a contextual menu for selected items and playback controls. Pressing a song or the bottom player brings up a list of options that include adding a song to a library, liking it, saving it to a playlist, or opening lyrics.
\nManage Downloads
\nWhen downloading apps from the App Store or restoring a device from an iCloud backup, you can press on an in-progress download to pause it, cancel it, or prioritize it over others.
\nShare Apps
\niOS 10 automatically adds a Share button to an app’s quick action menu on the Home screen to share its link with friends. Presumably, this is meant to bolster app discovery and sharing among users.
\nBeta Feedback
\nPressing on the icon of a TestFlight beta app shows a shortcut to send feedback to the developer via Mail.
\nThe pervasive use of 3D Touch in iOS 10 proves Apple wants it to be an essential iOS feature. After using iOS 10, going back to iOS 9 feels like missing several layers of interaction.
\nThis creates an even stronger tension between 3D Touch-capable iPhones and devices without it. Right now, Apple is resorting to swipes and long-taps to simulate 3D Touch on iPads and older iPhones; will they always be able to maintain backwards compatibility without making more features exclusive to 3D Touch?
\n\niMessage is a textbook example of how a feature can turn into a liability over time.
\nWhen it was introduced five years ago, iMessage promised to bring a grand unification of SMS and free, unlimited texting with media attachments. iMessage turned Apple’s Messages app into a single-stop solution for conversations between iOS users and those who would later be known as green-bubble friends. It was the right move at the time12, and it allowed Apple to have a communication service as a feature of iOS.
\nOver the last five years, messaging has outgrown texting. Meanwhile, iMessage (the service) and Messages (the app) have remained stuck in their ways.
\nServices like Facebook Messenger, WhatsApp, LINE, and WeChat haven’t only reached (or surpassed) iMessage in terms of users; as mobile-first messaging apps without SMS’ technical (and conceptual) debt, they have been able to relentlessly iterate on design, novel messaging concepts, notifications, and app integrations.
\nThese companies, free of past constraints, have envisioned new ways to communicate. They’ve grown messaging apps into platforms, enabling others to extend them. And maybe some of the current messaging trends will turn out to be fads, but it’s hard to argue against Apple’s competitors with their numbers, cultural influence, and progressive lock-in. They’re no joke, and Apple knows it.
\nBut I wouldn’t ascribe iMessage’s slow pace of evolution to its SMS legacy alone. Because of its end-to-end encryption and Apple’s strict policy on not storing sensitive user information, iMessage is by nature trickier to extend. Apple’s efforts in this area are commendable, particularly when you consider how the aforementioned services diminish in functionality once you add encryption.
\nHowever, security hurdles shouldn’t be an excuse for iMessage’s glaring shortcomings. As laudable as Apple’s stance is, most users aren’t willing to put up with an app that feels old. They want to liven up conversations with rich graphics and apps. They want messaging to be personal. Technologists won’t like this, but, ultimately, people just want a modern messaging app that works.
\nFrom a user’s perspective, it’s fair to say that Apple has been too complacent with iMessage. The service is by no means a failure – it serves hundreds of millions of users every day. But those metrics don’t matter when stasis yields something worse than numbers alone: cultural irrelevancy. That iMessage, as many see it, “is just for simple texting”.
\nThe time has come for iMessage to take the next step. With a willingness to welcome developers into its most important app, and without giving up on its security ideals, Apple is reshaping how users can communicate, express themselves, and share. With iMessage in iOS 10, Apple is ready to embrace change.
\n\nBefore delving into the bigger enhancements to Messages, I want to touch upon changes to the app’s interface and some minor features.
\nThe conversation’s title bar has been redesigned to embed the recipient’s profile picture. Having a photo above a conversation helps identify the other person; the increase in title bar height is a trade-off worth accepting.
\nThere’s new artwork for contacts without a profile picture, too.
The profile picture can be tapped to open a person’s contact card; and, you can press it to bring up a 3D Touch menu – the same one available in Contacts and Phone with a list of shortcuts to get in touch with that person.
\niOS 10 brings a new layout for the bottom conversation drawer. By default, a conversation opens with a narrow text field and three icons next to it – the camera, Digital Touch, and the iMessage app launcher. As you tap into the text field to reply to a message, the three icons collapse into a chevron that can be expanded without dismissing the keyboard.
\nApple has also redesigned how you can share pictures and videos. The new media picker consists of three parts: a live camera view to quickly take a picture; a scrollable grid of recent items from your library; and buttons to open the full camera interface or the photo library, accessed by swiping right.
\nThe assumption is that, on iMessage, people tend to share their most recent pictures or take one just before sharing it. The live camera view can be used to snap a photo in a second (you don’t even have to tap on the shutter button to take it). Moving the camera and library buttons to the side (hiding them by default) has freed up space for recent pictures: you can see more of them thanks to a compact grid UI.
\nSome won’t like the extra swipe required to open the camera or library, but the live photo view makes it easier to take a picture and send it.
\nAfter picking or taking a picture, you can tap on the thumbnail in the compose field to preview it in full screen. You can also tap and hold a picture in the grid to enter the preview screen more quickly.13
\nMarkup inside Messages.
Here, you have two options: you can edit a picture with the same tools as the Photos app (albeit without third-party app extensions) or use Markup to annotate it. You can tap on the Live Photo indicator to send a picture without the Live part, or press on it to preview the Live Photo.
\nSpeaking of photos, iMessage now lets you send images at lower quality, likely to save on cellular usage. You can enable Low Quality Image Mode in Settings -> Messages.
\nOne of the oldest entries of my iOS wish lists is also being addressed in iOS 10: you can choose to enable read receipts on a per-conversation basis.
\nIf you, like me, always keep read receipts turned off but would like to enable them for important threads, you can do so by tapping the ‘i’ button at the top of a conversation and then ‘Send Read Receipts’. The toggle matches the default you have in Settings and it can be overridden in each conversation.
\nWhile Messages may not look much different from iOS 9 on the surface, the core of the app – its conversation view – has been refreshed and expanded. iMessage conversations have received a host of new features in iOS 10, with a focus on rich previews and whimsical, fun interactions.
\nIn its modernization of iMessage, Apple started from web links. After years of plain, tappable URLs, Messages is adopting rich link previews, which are inspired by iOS 9’s link snippets in Notes, but also more flexible and capable.
\nRich links aren’t a special setting of the app: the first time you receive a link in an iMessage conversation in iOS 10, it’ll appear as a ‘Tap for Preview’ button in the conversation. This is a one-time confirmation that you want to load links as rich previews – which also look different from iOS 9 – instead of plain URLs.
\nLoading a rich link for the first time in iOS 10.
Like in Notes (and other services such as Slack and Facebook), rich previews use Open Graph meta tags to determine a link’s title, featured image, audio and video file, or description. A web crawler has been built into Messages: as soon as you send a link, the message’s bubble will show a spinner, and, depending on the speed of your Internet connection, it’ll expand into a rich message bubble after a second, within the conversation.
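\nAs a rough illustration (not Apple documentation), these are the kinds of Open Graph meta tags a site exposes in its HTML head for a crawler like the one in Messages to build a preview from; the URLs and values below are made up:

```html
<!-- Hypothetical Open Graph tags; a crawler reads these to assemble
     a rich preview with a title, featured image, and description. -->
<meta property="og:title" content="An Example Article" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://example.com/article/" />
<meta property="og:image" content="https://example.com/featured.jpg" />
<meta property="og:description" content="A short summary for previews." />
```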
\n\nRich link previews in Messages use the same technology Apple brought to Notes last year, but they’ve been designed differently. They’re message bubbles with a title and domain subtitle; the upper section, where the featured image of a link is, can grow taller than link snippets in Notes. Web articles tend to have rectangular image thumbnails; podcast episodes shared from overcast.fm are square; and links to iPhone apps shared from the App Store show a vertical screenshot.
\n\nFurthermore, the behavior of sharing links differs between Notes and Messages. Allow me to get a bit technical here.
\nIn Notes, only links captured from the share extension are expanded into rich previews; pasting text that contains a link into a note doesn’t turn the link into a rich preview.
\nNotes: rich links and plain URLs.
In Messages, both individual links and a string of text with a link will generate a rich preview. In the latter case, the link has to be either at the beginning or at the end of a sentence. Messages will break that single string into two pieces: the link’s preview, and the string of text without the link. Even sending a picture and a link simultaneously will create two message bubbles – one for the image, another for the link.
\n\nThe only instance where Messages will resort to the pre-iOS 10 behavior of a rich text (tappable) URL is when the link is surrounded by text:
\nUnless a link is placed inside a sentence, iOS 10 will never show the full path to the URL – only the root domain. Whether meta tags can’t be crawled14 or a link is successfully expanded, the URL will be hidden. If you need to see the full URL of a link in Messages, you can long-tap the link to show it in a contextual menu.
\nThere are multiple types of link previews in iOS 10. The majority of websites with a social presence (including MacStories) have added support for Open Graph meta tags and Facebook/Twitter cards, and their links will appear with a featured image and a title. Alas, Apple hasn’t brought Safari View Controller support to Messages, which doesn’t make the experience of following links as seamless as it is on Facebook Messenger.
\nTwitter links have been nicely formatted by Apple: they have a special light blue background and they display a tweet’s text, username, media (except GIFs), and avatar.
\nTwitter links on iMessage.
For Apple Music, the company has created a rich preview that, in addition to artwork, embeds a native play/pause button to listen to songs without leaving Messages. Unlike other web links, you can’t peek & pop Apple Music links, suggesting that it’s a custom implementation that uses an underlying URL to assemble a special message bubble.
\nApple Music links (left) vs. SoundCloud and Spotify.
Third-party companies can’t take advantage of this – neither Spotify nor SoundCloud links have a playback UI, and they’re treated as webpages with a featured image.
\nOther Apple apps with the ability to share links don’t fare as well as Apple Music. App Store and iTunes links show a title, an icon and screenshot, and app categories; you can’t install an app or watch a movie trailer inside Messages. Links to photo albums shared on iCloud.com don’t support rich previews in Messages, and shared notes only come with an icon and the title of a note.
\nYouTube links get expanded into a playable video preview that you can tap once to play, and tap again to pause. There are no additional controls (let alone a progress bar), but it’s great to be able to watch a YouTube clip inline without being yanked to the YouTube app.
\n\nMessages will even pause playback if you scroll down in the conversation, and resume it as you focus on the video again. It’s a nice touch.
\nRich link previews embody the idea of stages of change and how Apple often adds functionality to iOS.
\nUsers will see them as a new feature of Messages, which allows everyone in a thread to see a preview of the destination page. In some cases, message bubbles can even play media. I like how links get expanded inline; plain URLs in old iOS 9 message threads feel archaic already.
\nLink previews also build upon Apple’s work with Universal Links and adoption of open standards such as Open Graph and Schema.org for search. The same technologies Applebot and Spotlight have been using for years now power link previews in iMessage.
\nI’d like to see Apple open up link previews with more controls for developers in the future, but this is a solid start.
\n\nWith iOS 10, even how you send a message can be different. The blue ‘Send’ button has been replaced by an upward-facing arrow; tapping it once sends a regular iMessage as usual.
\nWithin the arrow lies a secret, though. Press with 3D Touch (tap and hold on the iPad), and you’ll bring up a ‘Send with effect’ screen, which lets you send a message with Bubble and Screen effects.
\nLet’s start with bubbles, as I believe they’ll be the more popular ones. There are four types of bubble effects, and they support any type of content you can share in Messages – text, emoji, media, and links.
\nSlam
\nYour message flies across the screen and is slammed to the ground, causing an invisible shock wave to ripple through adjacent messages.
\nBest used when you really want to be heard or make a point. Or for shaming a friend with an ugly selfie from the night before.
\nLoud
\nA more polite version of Slam that enlarges the message without affecting nearby bubbles.
\nThe way the text shakes briefly inside the bubble suggests this is appropriate to shout something, either in anger or happiness, without necessarily destroying everything around you.
\nGentle
\nApple’s version of a kind, intimate whisper. Gentle starts with a slightly larger bubble containing small text, which will quickly grow back to normal size as the bubble shrinks down.
\nPersonally, I think Gentle is ideal for dog pictures as well as the “I told you so” moments when you don’t want to upset the recipient too much. At least you’re being gentle about it.
\nInvisible Ink
\nI won’t explain the ideal use cases for this one, leaving them up to your imagination. Invisible Ink obfuscates the contents of a message and it’s the only interactive bubble of the four.
\nTo reveal text hidden by Invisible Ink, you have to swipe over the bubble to remove the magic dust that conceals it. It can be wiped off from notifications, too. Invisible Ink is automatically re-applied after ~6 seconds.
\nInvisible Ink gives you the time to make sure no one is looking at your screen. Ingenious.
\nBubble effects may not appeal to iOS power users, but they’re a lot of fun, they’re whimsical, and they add personality to conversations.
\n\nFrom a technical standpoint, the implementation of 3D Touch is spot-on: you can hold down on the Send button and scroll to preview each bubble effect before sending it. If you receive a message with a bubble effect, it’ll only play once after you open the conversation – effects won’t be constantly animating. I’ve been using them with friends and colleagues, and I like them.
\nScreen effects are a different story. Unlike bubble effects, they take over the entire Messages UI and play an animation with sounds that lasts a couple of seconds. Screen effects are deliberately over the top, to the point where they can almost be gaudy if misused. Lasers, for instance, will beam disco lights across a conversation.15 Shooting star will cause a star to fly through the screen with a final “ding” sound, while fireworks will put up celebratory explosions, turning the app’s interface dark as you gaze into the virtual New Year’s night of iMessage.
\nHere’s what they look like:
\nBalloons
\nConfetti
\nLasers
\nFireworks
\nShooting Star
\nMy problem with screen effects is that they can be triggered by certain keywords and phrases without any prior warning. Texting “congrats” will automatically fire off the Confetti effect, which is nice the first time, but gets annoying quickly when you find yourself texting the expression repeatedly and being showered in confetti every time. The same is true for “happy new year” and “happy birthday”, which will bring up Fireworks and Balloons without the user’s consent.
\nI use screen effects occasionally to annoy my friends and throw confetti when I feel like it – but the automatic triggering feels almost un-Apple in its opaque implementation. There should be an indicator, or a setting, to control the activation of screen effects, or Apple should abandon the idea altogether, letting screen effects behave like the bubble ones following a user’s command.16
\nScreen effects aren’t the most exciting aspect of the new iMessage, but they bring some unexpected quirkiness into the app, which isn’t bad either. Just use them responsibly.
\n\nWhen Apple introduced Digital Touch on watchOS in 2014, it was safe to assume it’d eventually find its way to iOS. Two years later, Digital Touch has been built into Messages in iOS 10, gaining a prominent spot between photos and the new iMessage App Store.
\nDigital Touch can be activated from the heart icon with two fingers – a reminder of its Apple Watch legacy. Tapping the button turns the lower half of the screen into an interactive pad where you can draw, send taps and heartbeats, and annotate photos and videos.
\nDigital Touch has three sections: a color picker along the left side (where, like on the Watch, you can long-tap a color to pick another one); a drawing area in the middle; and icons explaining Digital Touch features rotating on the right. At the bottom, a chevron lets you open Digital Touch in expanded mode, taking over the conversation in full-screen.
\n\nThere isn’t much to say about the functionalities adapted from watchOS. Sketches are easier to create thanks to the bigger screen, though I think that, to an extent, the constraints of the Watch incentivized creativity. Also, sketches look like images with a black background pasted into conversations: they’re animated, but they don’t feel as integrated as they used to be on the Apple Watch. They look like simple image attachments on iOS 10.
\nTaps and heartbeats are the kind of features someone decided to bring over to iOS so they wouldn’t go to waste. They fundamentally feel out of place on iOS, stripped of the wrist’s haptic feedback and pasted onto a black background.
\nWhen you receive a tap from someone on the Apple Watch, you feel a tap on your wrist. On iOS 10, taps are animated images, and there’s nothing special about them. The physical connection is lost. Apple could have made taps part of the conversation view, letting them ripple through bubbles like effects do, or used vibration as feedback; instead, they settled on GIFs.
\nHeartbeats are even more baffling, as they aren’t “real” heartbeats due to the lack of a heart rate sensor on iOS. When you hold two fingers on the screen to send your heartbeat on iMessage17, iOS generates a generic animation that isn’t a representation of anyone’s heartbeat. The sense of intimacy watchOS fostered thanks to Digital Touch and its heart rate sensor – of knowing that the heartbeat animation represented the actual beating heart of a friend or partner – isn’t there on iOS.
\nAnd don’t get me started on the sadness of swiping down with two fingers to send a heartbreak.
\nThen there’s 3D Touch, which is used in Digital Touch to send “fireballs”. If you press on the Digital Touch pad, iOS 10 creates a pulsing fireball that will be sent as an animated image.
\nThat’s a fireball.
I’m not sure what to make of the fireball – does sending one show you’re thinking of someone? That you’re upset with them? That you’ve realized 3D Touch exists in iMessage? Is it a reference to John Gruber? It’s an open-ended question I’ll leave to the public to resolve.
\nThe standout Digital Touch feature is one that has been built around the iPhone’s hardware. Tap the video icon, and you’ll bring up a camera UI to sketch on top of what the camera is seeing. You can also add Digital Touch effects in real-time while recording a 10-second video (to take a picture, tap the shutter icon).
\n\nThe combination of sketches and kisses with videos is fun and highly reminiscent of Snapchat; I’ve been using it to send short clips with funny/witty comments or sketches drawn on top of them. Apple should add more iOS-only “stamps” or animations to Digital Touch for photos/video without copying what they’ve done on watchOS.18
\nUnrelated to Digital Touch, but still aimed at making conversations more personal, is handwriting mode.
\nAnyone who’s familiar with handwritten signatures in Preview and Markup will recognize it: handwriting can be accessed by tapping the ink button on the iPad keyboard or turning the iPhone sideways. It opens an empty area where you can handwrite a message in black ink using your finger (or Apple Pencil). There’s a list of default and recent messages at the bottom (which can be deleted by long-tapping them), and no additional controls.
\nHow handwritten messages look in conversations.
I found handwriting mode to be nicer than Digital Touch. Handwritten messages aren’t contained in a black image and ink animates beautifully19 into the conversation view, which creates the illusion that someone has written a message for you inside Messages instead of sending an image attachment. It’s a better integration than Digital Touch.
\nDigital Touch on iOS 10 could have used more work. Features that had some reason to exist on watchOS’ hardware have been lazily ported to iOS, removing the physical interaction and feedback mechanism that made them unique on the Watch.
\nI’m not sure the iOS Digital Touch we have today is worth giving up a premium slot as a default iMessage app next to the Camera. It’s a “Friends button” scenario all over again. I wouldn’t be surprised if that permanent placement becomes customizable next year.
\niOS 10 brings new options to react to messages, too.
\nCalled Tapback, the feature is, essentially, Apple’s take on Facebook’s redesigned Like button and Slack’s reactions. If you want to tell someone what you’re thinking without texting back, you can double tap20 a message – any kind of bubble – to bring up a menu with six reactions: love, thumbs up, thumbs down, ha-ha, exclamation points, and question mark.
\nSending a Tapback.
The interaction of Tapback is delightful. Icons animate when you tap on them, and they play a sound effect once attached to a message. You can’t create your own reactions by picking any emoji like on Slack, but, looking at a conversation with a bunch of hearts, thumbs-ups, and ha-has, the feeling is the same.
\nTapback
\nTapbacks are especially effective in group threads where everyone can “vote” or express their immediate reactions without typing. A Tapback can be changed at any point during a conversation, but you can only leave one reaction per message.
\nIf what happened in my Slack teams over the past year is any indication, Tapback should become a useful way to let someone know you’ve acknowledged or liked their message without writing anything back.
\nSlack’s influence on iMessage has propagated to emoji as well. Messages that only contain emoji (no text) will be sent as big emoji (at 3x), so you can truly appreciate the details that make up Apple’s most popular characters.
\nRegular and big emoji.
I’ve been a fan of jumbo emoji since Slack rolled them out last year. They’re a perfect fit for iMessage. Emoji are expanded in the text field before sending them – I chuckle every time I see a big thinking face about to enter a conversation. Messages will only display up to three big emoji at a time; if you create a message containing four emoji, they’ll be sent at normal size.
\nEmoji improvements don’t stop there. Apple must have noticed that users like to write messages and replace words inside them with appropriate emoji, and they’re introducing an option to automate the process in iOS 10. Possibly, this innocuous feature (which only works in Messages) is even going to feed Apple’s Differential Privacy-based crowdsourced data collection.
\nIf you write a message in iOS 10 and then open the emoji keyboard, the system will scan words you’ve entered in the text field and try to match them up with emoji. If a related emoji is found, a word will be highlighted in orange. Tap it, and it’ll be replaced with the emoji.
\nTap the emoji keyboard to replace words with emoji.
If multiple emoji options are available for a single word, tapping it opens a menu to choose one.
\nMultiple emoji options.
I’m not exactly the target audience for this feature (I either only send emoji or put some next to a word), but I recognize that a lot of people treat emoji as substitutes for words. Apple devised a clever and thoughtful way to “emojify” text, letting the OS compensate for a search box still missing from the emoji keyboard.
\nUnder the hood, emoji replacements hinge on a system that has to build up associations and trigger words, follow trends, and adapt for international users and different meanings of the same emoji around the world. Based on what Apple has revealed about Differential Privacy, data on emoji picked by users will be collected in aggregate to improve the accuracy of suggestions.
\nMy understanding is that Apple started from a set of words curated from common expressions and Unicode annotations, and began scaling to millions of users and dozens of languages for over 1800 emoji during the iOS 10 beta stage. In my case, emoji replacements worked well for both English and Italian.
\nCrowdsourcing this aspect of iMessage makes sense given the popularity and many meanings of emoji. It’ll be interesting to see how suggestions will be refined as iOS 10 usage picks up.
\n\nDespite numerous design updates and enhancements to conversations, the most profound change to iMessage isn’t the app itself – it’s other apps developers will build for it.
\nApple is opening iMessage to developers in iOS 10, turning it into a platform that can be extended. The company has created a Messages framework for developers to plug into and build apps, which will be available on the new iMessage App Store.
\nThe stakes are high. For millions of users, their messaging app is a second Home screen – a highly personal, heavily curated gateway to contacts, private conversations, and shared memories. Messaging isn’t just texting anymore; it’s the touchstone of today’s mobile lifestyle, a condensation of everything smartphones have become.
\nApple won’t pass up this opportunity. Not this time. In opening up their most used app, Apple hopes that developers will take iMessage further with new ways to share and enrich our conversations.
\nDevelopers can write two types of Messages extensions in iOS 10: sticker packs and iMessage apps. Both can be discovered and installed from the iMessage App Store embedded into the Messages app, and both can be created as standalone apps or as extensions within a containing iOS app.
\nYou can access the iMessage App Store with the apps button next to the input field. Messages will hide the keyboard and bring up a scrollable gallery of all your installed Messages extensions, opening the last used one by default. Apps are organized in pages and you can swipe between them. You can expand the currently selected app with the chevron in the lower right, and browse recent content from all apps via the leftmost page.
\nOpening the last used iMessage app (left) and the Recents page (right).
There’s also a way to view a Home screen of iMessage apps as icons. If you tap on the icon in the bottom left corner, you’ll be presented with a grid of oval icons (the shape for iMessage apps) and a ‘+’ button to open the iMessage App Store.
\nThe iMessage app drawer (left) and the new iMessage App Store.
This view has been designed to resemble the iOS Home screen: you can swipe horizontally across apps, you can tap & hold to delete and rearrange them, and you can even click the Home button to exit wiggle mode.21
\nI like the idea of an iMessage SpringBoard, but it takes too many taps to open it22, especially if you want to launch an app in a hurry. Apps are tucked away behind three taps, and I wonder how that will impact usability in the long run. Right now, the compact app drawer (with the dots at the bottom) doesn’t scale to more than 30 installed apps, and it feels like the equivalent of the Slide Over app picker from iOS 9; there has to be a faster way to navigate and launch iMessage apps.23
\nPerhaps a Messenger-like design with top launchers embedded above the keyboard would have been a preferable solution.
\niMessage stickers can be seen as Apple’s response to the rise of third-party “emoji” keyboards that offer selections of sticker-like images, usually in collaboration with brands and celebrities. If you’ve seen the likes of KIMOJI, Justmoji, PetMOJI, Bitmoji, and literally anything -moji on the App Store lately, you know that’s an aspect of iOS Apple could improve for both users and developers.
\nWhat some third-party companies try to sell as “custom emoji” aren’t really emoji: they are images that can be pasted in conversations.24 Developers don’t control the availability of emoji in Apple’s keyboard, nor can they alter what is defined as emoji in the Unicode specification. By manipulating the public’s perception of what an emoji is, and by leveraging custom keyboards to make their “emoji” look like part of iOS, some developers were able to carve themselves a profitable niche on the App Store. Just ask Kanye West how to make a million a minute.25
\nHowever, I don’t blame developers for trying to ride the coattails of emoji.26 I’d argue that a lot of companies settled on the “moji” suffix because iMessage was the only big messaging service without native sticker support, and emoji were already in the typical iOS user’s vocabulary.
\nStickers provide an enormous opportunity for developers and, yes, brands to give users fun ways to express their feelings with a wider array of emotions and contexts than emoji alone. Look at LINE, and the massive, multi-million dollar success of Cony and Brown and revenue from their Creators Market; think about Twitter and how they commissioned a set of stickers to be used on photos.
\nIf every major messaging platform has found stickers to be popular and profitable, there must be something to them that appeals to people. With iOS 10, Apple, too, wants a piece of the action and is letting developers create sticker packs for iMessage. The goal is to entice users to personalize their iMessage conversations with stickers, download additional packs, and spread usage among friends. The company plans to do so with a better experience than custom keyboards, and the prospect of a new gold rush for developers.
\nStickers live in the standard sticker browser – the compact view that opens after choosing a sticker pack from the app drawer. This area can have a custom background color and it’s where you can interact with stickers.
\nTwo sticker packs.
You can tap on a sticker to place it in the input field and send it individually, or you can peel it off the browser and drag it around in a conversation.
\nTapping a sticker to send it (left) and peeling it off (right).
The animation for peeling stickers off the browser and re-attaching them is some of Apple’s finest OpenGL work in a while.
\nAttaching stickers
\nYou can attach stickers to any message bubble in the transcript: you can put one next to a text message, cover a photo with multiple stickers, or even put a sticker atop another one or a GIF. Want to peel off a sticker and use it on an older message? Drag it over the title bar, wait for the conversation to scroll back, and attach it wherever you want. How about covering your friend’s eyes with googly eye stickers? You can do that too.
\n\nOnce a sticker has been placed in a conversation, you can tap and hold it to open the sticker details. This is also how you view all stickers that cover a message bubble, with buttons to download the complete packs on the iMessage App Store27. Here, you can swipe on a sticker to delete it from the selected message bubble if you no longer want to see it.28
\nOpening sticker details.
You’ll come across two kinds of sticker packs. There are the basic ones, which are a collection of images displayed inside a sticker browser. This will probably be the most popular choice for developers, as creating these packs doesn’t require a single line of code. If you’re a developer and want to sell a sticker pack on the iMessage App Store, all you need to do is drop some image files into an Xcode sticker pack project, add icons, and submit it to Apple.29
\nStickers can also be rotated and enlarged using pinch gestures.
The second kind are sticker packs with a custom sticker browser or other additional features. Technically, these are iMessage apps that use the Messages framework for sticker functionality that goes beyond basic drag & drop. For instance, you may see apps where you can assemble your own stickers, or sticker packs with custom navigation elements and In-App Purchases. The sticker behavior in conversations is the same, but these packs require more work from developers.30
\nFrom a user’s perspective, stickers make iMessage conversations feel different. More lively and fun, but also busier and messier if overused.
\nI’ve been able to test about 30 different iMessage sticker packs from third-party developers in the past couple of months. One of the highlights is The Iconfactory, which leveraged their expertise in icons and illustrations to create some fantastic sticker packs for iMessage.
\n\nFrom Sunshine Smilies (emoji characters as stickers) and Tabletop RPG (role-playing emoji stickers) to Mystic 9 Ball and Dino, I believe The Iconfactory has found a perfect way to reinvent themselves for the iMessage era. They’re a great fit for stickers.
\nDeveloper Raul Riera has created what I believe is going to be a popular type of custom sticker app: Emoji Stickers lets you put together your own stickers containing emoji characters.
\n\nYou can create concoctions like a monkey wearing a crown or a pineapple pizza. This is done with a custom sticker-assembling UI and built-in emoji from the open source Emoji One set.
\nMonstermoji, created by Benjamin Mayo and James Byrd, features beautifully hand-drawn monster characters you can attach to messages.
\nThese stickers are unique, and they show how anyone can easily create a sticker pack and release it.
\nI also like Anitate, a set of 80+ animated stickers by Raven Yu.
\nLook at that sad pug with bunny ears.
Anitate’s stickers are like animated emoji, redrawn for a flat style with animations. They’re fun and I’ve been using them a lot.
\nLast, I want to mention Sticker Pals – by far, the most impressive and beautiful sticker pack I’ve tried. Designed by David Lanham in collaboration with Impending, Sticker Pals features a large collection of animated hand-drawn stickers for various emoji-like objects, symbols, and animals. The illustrations are gorgeous, animations are fun, and there are hundreds of stickers to choose from.
\n\nSticker Pals is a good example of what can be achieved by creating a custom sticker browser with the Messages framework. There are buttons at the top of the browser to switch categories, and each tap corresponds to a different sound effect. Plus, the developers have devised a clever unlocking mechanism for extra stickers with an in-app store and the ability to send stickers as gifts to your friends – all within an iMessage app with a sticker browser.
\nJudging from the amount of pre-release sticker packs I received during the summer, I have a feeling the iMessage App Store team at Apple is going to be busy over the next few weeks.31
\nWith iMessage stickers, Apple hasn’t just created a better way to paste images in conversations. They’re stickers in the literal sense – they can be attached anywhere, sometimes with questionable results, but always with a surprising amount of freedom and experimentation. Mixing multiple stickers at once on top of messages could become a new activity of its own32 – I know I’ve had fun placing them over photos of my friends.
\nStickers are often looked down upon by the tech community because they seem frivolous and juvenile. But emoji were met with the same reaction years ago, and they’ve gone on to reinvent modern communication, trickling into pop culture.
\niMessage stickers probably won’t have the same global impact as emoji, primarily because they only work in iMessage33 and the service isn’t cross-platform. But I also believe that stickers are the perfect addition to iMessage in 2016. Stickers are messaging’s lingua franca. Their adoption is going to be massive – bigger than custom keyboards have ever been. Stickers are lighthearted, fun to use, and they make each conversation unique.
\nLet’s check back in a year and see how many sticker packs we have installed.
\n\nThe iMessage platform’s opportunity lies in the second type of extensions available to developers: iMessage apps.
\nLike sticker packs, iMessage apps are installed and managed from the iMessage App Store, they live in the Messages app drawer, and they support compact and expanded mode. They can be standalone apps or extensions within a containing iOS app.
\nUnlike basic sticker packs, however, iMessage apps have to be programmed. They’re actual apps that can present a user interface with their own view controller: they can show custom UI in compact and expanded modes, create interactive message bubbles, and insert text, stickers, and media into the input field.
\nWith iMessage apps, developers can bring their apps’ interfaces, data, and experience into Messages.
\nExamples of iMessage apps.
Because of this, there are no limitations for what an iMessage app should look like. Anything developers can put in a view controller (bearing in mind compact mode and memory constraints) can be an iMessage app. Coming up with a miniaturized app that makes sense in Messages, though, will be just as hard as envisioning Watch apps that are suitable for the wrist.
\nThere are some differences to consider for compact and expanded mode. In compact, apps cannot access the system keyboard and they can’t support gestures (horizontal swipes are used to navigate between apps and sticker packs). Only taps and vertical scrolling are available in compact mode.
\niMessage apps in compact mode.
In expanded mode, both the system keyboard and gestures are supported. Developers can ask users to type information in the expanded layout, they can enable deeper gesture controls, and, generally speaking, they have more freedom in what they present to the user. When running in expanded mode, an iMessage extension that has a container app features an icon in the top left to launch the full app.
\niMessage apps in expanded mode.
The other peculiarity of iMessage apps is that they can create interactive messages with special message bubbles. These bubbles are based on a template with some strict limitations. There’s only one layout apps can use. An interactive message can display an image, audio, or video file as the main content; the app’s icon is always shown in the top left; at the bottom, developers can set textual properties for the bubble’s caption bar including title, subtitle, captions, and subcaptions (the caption bar is optional).
\niMessage apps can’t alter the standard layout of an interactive message, nor can they inject buttons around it. Any user interaction must be initiated from the bubble itself. iMessage apps can’t send interactive messages on the user’s behalf: they can only prepare an interactive message and place it in the input field.
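In code, an interactive message is an MSMessage paired with the template layout the framework provides; the extension stages it in the input field with insert(_:), and only the user can actually send it. A minimal sketch, with illustrative caption strings and a hypothetical URL:

```swift
import Messages
import UIKit

extension MSMessagesAppViewController {
    // Stages an interactive message bubble in the conversation's input field.
    // insert(_:) never sends on the user's behalf; the user must tap Send.
    func stageBubble(in conversation: MSConversation, artwork: UIImage?) {
        let layout = MSMessageTemplateLayout()
        layout.image = artwork                   // main content: an image, audio, or video file
        layout.caption = "Episode Title"         // optional caption bar text
        layout.subcaption = "Podcast Name"

        let message = MSMessage()
        message.layout = layout
        // App-specific data rides along in the URL; it also serves as a
        // fallback link on platforms that can't render the bubble.
        message.url = URL(string: "https://example.com/episode?id=212")

        conversation.insert(message) { error in
            if let error = error { print(error.localizedDescription) }
        }
    }
}
```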
\n\nWhen an interactive message bubble is tapped, an iMessage app can bring up a custom interface to let participants view more information on the shared item or continue a task. Keep in mind, though, that if you tap an interactive message to open it in full-screen and you don’t have the corresponding iMessage app installed on your device, you’ll be taken to the iMessage App Store to install it.
\nThe best way to understand what iMessage apps can do is to try some. Since June, I was able to test over 20 iMessage apps from third-party developers, and I have a general idea of what we should expect throughout the year.
\nSupertop’s podcast client, Castro, will soon let you share your favorite episodes with an iMessage app. Castro loads a list of episodes you’ve recently listened to; tap one, and it’ll turn into a rich bubble embedding artwork and episode title.
\nCastro’s iMessage app.
The best part: you can tap the bubble to open show notes in full-screen (and even follow webpage links inside Messages) and add an episode to your Castro queue. It’s a great way to share podcast episodes in iMessage conversations and save them with a couple of taps.
\nDrafts, Greg Pierce’s note-taking app, has added an iMessage app extension to share notes with friends. You can browse all notes from your inbox or switch to Flagged messages.
\nDrafts’ iMessage app.
Drafts places a note’s plain text in Messages’ input field, ready to be sent. The iMessage app is going to come in handy to share commonly accessed notes and bits of text with colleagues.
\nEver wished your GIFwrapped library – carefully curated over the years – was available in iMessage? With iOS 10, you’ll be able to paste your favorite GIFs without using a custom keyboard.
\nSending GIFs with GIFwrapped on iMessage.
I’ve been using GIFwrapped’s iMessage app to send GIFs of dogs smiling to my mom and girlfriend. They love them.
\nAlongside a widget and rich notifications, CARROT Weather is coming to iMessage with an app to share weather forecasts in conversations. It’s a solid example of the flexibility granted to apps: CARROT for iMessage carries its custom UI and hilarious sound effects, and it displays rich graphics and animations. It can access your current location from Messages, and it even lets you search for locations in expanded mode, where you can browse a full-screen forecast of the upcoming week – all without leaving Messages.
\n\nCARROT creates interactive messages that are prepared in the input field by tapping a Share button. These are bubbles with a custom preview graphic and text labels for location, temperature, and current conditions. If you receive one and tap on it, you’ll open CARROT’s expanded preview.
\nDeveloped by Sven Bacia, Couchy is another iMessage app that sends interactive bubbles that present a full-screen UI when tapped. Couchy is a TV show tracker; on iMessage, it displays a list of recently watched and upcoming show episodes. Pick one, and Couchy will assemble a bubble with the series’ artwork and name of the episode.
\nCouchy’s iMessage app.
When you tap a Couchy message, you get an expanded preview of the episode with metadata and artwork fetched from trakt.tv, plus the ability to view the episode in the main Couchy app.
\nETA, a navigation app I covered on MacStories before, is based on a similar design, using small snippets in compact mode for your favorite locations. Tap one, and the app will calculate travel time on the spot, preparing a message bubble to share with someone.
\nETA’s iMessage app.
The interactive message can be tapped to view more details about the other person’s estimated travel time, as well as get directions to the same address. You can also collaborate on the same travel time and respond with your status (more on collaborative apps below) and search for locations directly from iMessage. ETA is one of the most useful, technically impressive iMessage apps I’ve tried.
\nIt can get even more advanced than this, though. Snappy, for example, is a web browser for iMessage. You can search Google or paste URLs in a search box, or use search suggestions.
\nBrowse the web inside iMessage with Snappy.
Once you’ve found a webpage you want to share in a conversation, you can tap a Send button to insert the link in the input field. The link, of course, will expand into a rich preview. Given Messages’ lack of Safari View Controller, Snappy can be useful to paste links and view them without leaving the app; it’s also a convenient way to look something up on Google while talking to a friend.
\nPico, developed by Clean Shaven Apps, can send photos and videos at lower quality with deeper controls than Apple’s built-in Low Quality Image Mode for iMessage. After choosing media from the library, Pico opens a dark interface with a preview at the top and quality settings at the bottom. You can choose from four quality presets, compare savings with the original item, and tweak dimensions.
\nComparing image savings with Pico.
In addition to downscaling, Pico can remove metadata from media, such as location details. The app remembers settings for each conversation, and, overall, it’s a great way to save on cellular data with more options than iMessage’s default solution.
\nTouch ID can be integrated with iMessage apps, and Cipher uses the authentication framework to let you send “secret messages” encrypted with AES-256 that don’t appear in the transcript as normal text messages. Instead, Cipher generates custom bubbles that hide your text; on the other end, the recipient will have to authenticate with Touch ID (thus confirming the device isn’t being used by someone else) to read your message.
\nYou can also send digitally-signed messages to prove it’s really you by typing in Cipher and “signing” with your Touch ID.
\nThese are just a few examples of what developers can build with the Messages framework. Starting today, we’re going to see an avalanche of iMessage apps, but the best ones will stand out as intuitive utilities suited for sharing.
\nAlong with single-user apps, Apple has emphasized the ability for developers to build iMessage apps that let users collaborate on a task inside Messages.
\nIn a collaborative iMessage app, an interactive message can be modified by participants in a conversation. As two or more people interact with a message in the same session and update its content, Messages removes the previous message from the transcript, collapsing it into a succinct summary text (so that outdated messages with old data don’t pollute the conversation). Only the new message, with the updated content, is displayed at the bottom as normal.
\nLet’s work with a fictional example.
\nImagine that you’re planning a trip with your friends over iMessage. It’s always hard to keep track of everyone’s available times, so developer Myke Hurley has created 1-2-3 Trip Planner, an iMessage app that looks into a user’s calendar, brings up a custom calendar view in Messages, and lets the user choose up to three available slots in their schedule. Once three times are picked, 1-2-3 Trip Planner generates a message bubble with the user’s three choices as a title.
\nStephen has created an iMessage conversation with two of his friends, and they want to plan a trip together. Stephen brings up 1-2-3 Trip Planner, finds some available slots in his weekend schedule, selects three of them, and sends a message. The interactive message uses “Available Times – Stephen” in the bubble and the days of the week as title.
\nStephen creates the first 1-2-3 Trip Planner bubble.
On the other end of the conversation, Christina needs to look at her calendar and pick three available times. When she taps the 1-2-3 Trip Planner bubble, Stephen’s choices are displayed alongside her calendar events, and she can agree on one of his slots or pick different ones. She then replies with her preferences, sending another message bubble.
\nChristina replies with her schedule.
John is the third participant in this conversation. In his iMessage transcript, Stephen’s first bubble has been collapsed into a summary text that says “Stephen picked three time slots” and Christina’s message says “Stephen and Christina’s time slots”. John is only seeing the latest message bubble with the choices of both users. When he taps on it, a full-screen interface comes up, showing him a calendar view with his events and the times Stephen and Christina previously picked.
\nJohn picks his time slots and the trip is planned.
John can also agree on previously chosen time slots or pick new ones. When he’s done, he sends his reply, and the second message bubble from Christina also turns into a summary text. John’s third and final bubble has a title that says “Stephen, Christina, and John”. At this point, the three participants are looking at one interactive message; they can look at the results and decide on a time that works for everyone.
\nRight: what collapsing bubbles into summaries looks like.
Stephen, Christina, and John collaborated on a task within Messages without the back and forth of switching between calendars and texting each other’s available times. 1-2-3 Trip Planner has allowed multiple users to easily agree on a shared schedule in less than a minute.
\nThere are two additional aspects worth noting. In my imaginary (but technically accurate) example, 1-2-3 Trip Planner accessed the native iOS EventKit framework; I’ve tried actual iMessage apps that accessed the camera, location, photos, and the clipboard. Also, Apple is very concerned about user privacy and exposing contact information to iMessage apps. For this reason, the Messages framework doesn’t allow apps to see any details about the participants in a conversation, but only local identifiers (alphanumeric strings that don’t identify a single person).34
\nThe framework Apple has built into Messages should, in theory35, allow for the creation of moderately complex collaborative apps. Calendar collaboration is just one possible use case; imagine utilities to split bills, todo apps, photo compositions, and even games.
\nI tested a couple of straightforward collaborative iMessage apps in the past few weeks. The aforementioned ETA iMessage app lets you respond to a friend’s travel time with another interactive message.
\nETA’s bubbles and summaries.
Another app is ChibiStudio, which lets you assemble “chibi” avatars either by yourself or with a friend choosing from various pieces of clothing and body traits.
\nCollaborating on character creation on iMessage.
When creating a chibi collaboratively, each person can add new details to the character and send an interactive message back. To keep track of progress, the app tells you which items have been added in the title of the message bubble and it collapses previous messages into summaries. I tested ChibiStudio with John, and it was fun.
\nDo With Me uses collaboration in iMessage effectively, enabling you to create shared todo lists where other people can add and complete items inside a conversation.
\nJohn added items to our shared Do With Me list.
I wouldn’t use an iMessage todo app as my only task manager, but I think it’s useful to have something like Do With Me as an extension of a full task manager to collaborate with others on specific lists (grocery shopping, homework, etc.).
\nFinally, it wouldn’t be a new App Store without a re-interpretation of tic-tac-toe. In xoxo, you’ll be able to challenge friends on the classic game with a collaborative iMessage app that uses bubbles and full-screen views to advance the game.
\nSometimes, you need a simple iMessage game to kick back.
The app works surprisingly well, with a good use of summaries in the transcript and captions to describe player moves. It’s a nice way to pass the time during a conversation.36
\nCollaborative iMessage apps are only one part of the story. For single-user iMessage apps, the Messages framework should be enough to create deep, custom experiences unlike anything we’ve seen before.
\nWhen the App Store opened for business, no one could imagine the extent of developers’ imagination. No one could predict what the iPhone would become by letting app makers write software for it. And looking back at that moment today, it’s evident that our devices are deeply different, and dramatically more powerful, because of apps.
\nApple can attain a similar result with the iMessage App Store. iMessage apps create a new avenue for developers to bring any kind of experience into millions of daily conversations. And by plugging into iOS and the App Store, Apple can leverage the scale of an ecosystem other messaging services don’t have.
\nAfter using iMessage apps for the past three months, I have the same feeling as in the early App Store days. It’s a new frontier, it’s exciting, and developers are just getting started. Compared to companion App Stores like the Watch’s and Apple TV’s, I think the iMessage App Store will be a hit among iOS users.
\nApple needed to modernize iMessage in iOS 10, but they went beyond mere aesthetic and functional improvements to the Messages app. They’ve opened the door for apps to reimagine what we share and how we share it.
\nWe’re on the brink of a fundamental change to iMessage. If Apple plays its cards right, we could be witnessing the foundation of a second app platform within iOS.
\n\n“A delayed game is eventually good, but a rushed game is forever bad”, Nintendo’s Shigeru Miyamoto once quipped.
\nUnlike the console Miyamoto was concerned about, modern software and services can always be improved over time, but Apple knows the damage that can be caused by the missteps and perception of a rushed product. With iOS 10’s SiriKit, they’ve taken a different, more prudent route.
\nEver since the company debuted its virtual assistant in 2011, it was clear Siri’s full potential – the rise of a fourth interface – could only be unlocked by extending it to third-party apps. And yet, as Siri’s built-in functionalities grew, a developer SDK remained suspiciously absent from the roster of year-over-year improvements. While others shipped or demoed voice-controlled assistants enriched by app integrations, Siri retained its exclusivity to Apple’s sanctioned services.
\nAs top Apple executives recently revealed, however, work on Siri has continued apace behind the scenes, including a machine learning rollout that cut error rates in half. In iOS 10, Apple is confident that the Siri backend is strong and flexible enough to be opened up to third-party developers with extensions. At the same time, Apple is in no rush to bring support for every kind of app to Siri in this first release, taking a cautious approach with a few limitations.
\nDevelopers in iOS 10 can integrate their apps with Siri through SiriKit. The framework has been designed to let Siri handle natural language processing automatically, so developers only need to focus on their extensions and apps.
\nAt a high level, SiriKit understands domains – categories of tasks that can be verbally invoked by the user. In iOS 10, apps can integrate with 7 SiriKit domains37: messaging, VoIP calling, payments, photo search, workouts, ride booking, and CarPlay.
\nInside a domain, Siri deals with intents. An intent is an action that Siri asks an app to perform. It represents user intention and it can have properties to indicate parameters – like the location of a photo or the date a message was received. An app can support multiple intents within the same domain, and it always needs to ask for permission to integrate with Siri.
\nSiri permissions.
SiriKit is easy to grasp if you visualize it as a set of nested boxes: a domain containing multiple types of actions to be performed, each of which can be marked up with properties. In this structure, Apple isn’t asking developers to parse natural language for all the expressions a question can be asked with. They’re giving developers empty boxes that have to be filled with data in the right places.
\nImagine a messaging app that wants to support Siri to let users send messages via voice. Once SiriKit is implemented, a user would need to say something like “Tell Myke I’m going to be late using [app name]”, and the message would be composed in Siri, previewed visually or spoken aloud, and then passed to the app to be sent to Myke.
\nCraig Federighi with an example of WeChat in Siri.
This basic flow of Siri as a language interpreter and middleman between voice and apps is the same for all domains and intents available in SiriKit. Effectively, SiriKit is a framework where app extensions fill the blanks of what Siri understood.
\nThe syntax required by SiriKit simultaneously shows the rigidity and versatility of the framework. To summon an intent from a particular app, users have to say its name. However, thanks to Siri’s multilingual capabilities, developers don’t have to build support for multiple ways of asking the same question.
\nYou could say “Hey Siri, send a message to Stephen using WhatsApp” or “message Stephen via WhatsApp”, but you could also phrase your request differently, asking something like “Check with WhatsApp if I can message Stephen saying I’ll be late”. You can also turn an app’s name into a verb and ask Siri to “WhatsApp Stephen I’m almost home”, and SiriKit will take care of understanding what you said so your command can be turned into an intent and passed to WhatsApp.
\nIf multiple apps for the same domain are installed and you don’t specify an app’s name – let’s say you have both Lyft and Uber installed and you say “Hey Siri, get me a ride to the Colosseum” – Siri will ask you to confirm which app you want to use.
\nApple has built SiriKit so that users can speak naturally, in dozens of languages, with as much verbosity as they want, while developers only have to care about fulfilling requests with their extensions. Apple refers to this multi-step process as “resolve, confirm, and handle”, where Siri itself takes care of most of the work.
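For the messaging domain, resolve, confirm, and handle map onto the INSendMessageIntentHandling protocol; the extension only fills in the blanks Siri has already parsed. A simplified sketch (a real app would match recipients against its own data, which is stubbed out here):

```swift
import Intents

class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Resolve: vet each piece of data Siri extracted from the utterance.
    func resolveRecipients(forSendMessage intent: INSendMessageIntent,
                           with completion: @escaping ([INPersonResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // Siri will prompt the user to supply a recipient.
            completion([.needsValue()])
            return
        }
        // Hypothetical: accept recipients as-is instead of matching a contact list.
        completion(recipients.map { .success(with: $0) })
    }

    // Confirm: tell Siri the app is ready to carry out the request.
    func confirm(sendMessage intent: INSendMessageIntent,
                 completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .ready, userActivity: nil))
    }

    // Handle: actually send the message, then report success or failure.
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand off to the app's sending pipeline here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```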
\nDevelopers are given some control over certain aspects of the implementation. From a visual standpoint, they can customize their experiences with an Intents UI extension, which makes a Siri snippet look and feel like the app it comes from.
\nCustomizing Siri extensions is optional, but I’d bet on most developers adopting it as it helps with branding and consistency. Slack, for instance, could customize its Siri snippet with their channel interface, while a workout extension could show the same graphics as the main app. Intents UI extensions aren’t interactive (users can’t tap controls inside the customized snippet), but they can be used for updates on an in-progress intent (like an Uber ride or a workout session).
\nAn app might want to make sure what Siri heard is correct. When that’s the case, an app can ask Siri to have the user double-check some information with a Yes/No dialog, or provide a list of choices to Siri to make sure it’s dealing with the right set of data. By default, Siri will always ask to confirm requesting a ride or sending a payment before the final step.
\nOther behaviors might need authentication from the user. Apps can lock down their SiriKit extensions in more sensitive situations (such as when a device is locked) and request Touch ID verification. I’d imagine that a messaging app might allow sending messages via Siri from the Lock screen (the default behavior of the messaging intent), but put searching a user’s message history behind Touch ID or the passcode.
\nLast, while Siri takes care of natural language processing out of the box, apps can offer vocabularies with specific terms to aid recognition of requests. A Siri-enabled app can provide user words, which are specific to a single user and include contact names (when not managed by Contacts), contact groups, photo tags and album names, workout names, and vehicle names for CarPlay; or, it can offer a global vocabulary, which is common to all users of an app and indicates workout names and ride options. For example, if Uber and Google Photos integrate with SiriKit, you’ll be able to ask “Show me photos from the Italy Trip 2016 album in Google Photos” or “Get me an Uber Black to SFO”.
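User-specific terms are registered from the main app through INVocabulary; a brief sketch (the album and workout names are made up):

```swift
import Intents

// Registers user-specific terms so Siri can recognize them in requests.
// Call this from the main app whenever the user's data changes; the strings
// should be ordered by importance, most significant first.
func registerSiriVocabulary() {
    let albumNames = NSOrderedSet(array: ["Italy Trip 2016", "Dog Pictures"])
    INVocabulary.shared().setVocabularyStrings(albumNames, of: .photoAlbumName)

    let workoutNames = NSOrderedSet(array: ["7-Minute Blast", "Evening Run"])
    INVocabulary.shared().setVocabularyStrings(workoutNames, of: .workoutActivityName)
}
```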
\nSiriKit has the potential to bring a completely new layer of interaction to apps. On paper, it’s what we’ve always wanted from a Siri API: a way for developers to expose their app’s features to conversational requests without having to build a semantic engine. Precise but flexible, inherently elegant in its constraints, and customizable with native UIs and app vocabularies. SiriKit has it all.
\nThe problem with SiriKit today is that it’s too limited. The 7 domains supported at launch are skewed towards the types of apps large companies offer on iOS. It’s great that SiriKit will allow Facebook, Uber, WeChat, Square, and others to build new voice experiences, but Apple is leaving out obvious categories of apps that would benefit from it as well. Note-taking, media playback, social networking, task management, calendar event creation, weather forecasts – apps in these categories simply can’t integrate with SiriKit yet. We can only hope that Apple will continue to open up more domains in future iterations of iOS.
\nFor this reason, SiriKit might as well be considered a public beta for now: it covers a fraction of what users do on their iPhones and iPads. I’ve only been able to test one app with SiriKit integration over the past few days – an upcoming update to Airmail. Bloop’s powerful email client will include a SiriKit extension in the messaging domain (even if email isn’t strictly “messaging”) to let you send email messages to people in your Airmail contact list.
\nSiriKit and Airmail with different levels of verbosity.
In using Airmail and Siri together, I noticed how SiriKit took care of parsing natural language and multiple phrasings of the same request. The “resolve, confirm, and handle” flow was exemplified by the steps Siri takes to confirm the data it needs – in Airmail’s case, the recipient’s email address and the message text.
\nMultiple steps in SiriKit.
As for other domains, I can’t comment on the practical gains of SiriKit yet, but I feel like messaging and VoIP apps will turn out to be popular options among iPhone users.
\nI want to give Apple some credit. Conversational interactions are extremely difficult to get right. Unlike interface elements that can only be tapped in a limited number of ways, each language supported by Siri has a multitude of possible combinations for each sentence. Offloading language recognition to Siri and letting developers focus on the client side seems like the best solution for the iOS ecosystem.
\nWe’re in the early days of SiriKit. Unlike Split View or notifications, it’s not immediately clear if and how this technology will change how we interact with apps. But what’s evident is that Apple has been laying SiriKit’s foundation for quite some time now. From the pursuit of more accurate language understanding through AI to the extensibility framework and NSUserActivity38, we can trace back SiriKit’s origins to problems and solutions Apple has been working on for years.
\nUnsurprisingly, Apple is playing the long game: standardizing the richness of the App Store into domains will take years and a lot of patient, iterative work. It’s not the kind of effort that is usually appreciated by the tech press, but it’ll be essential to strike a balance between natural conversations and consistent behavior of app extensions.
\nApple isn’t rushing SiriKit. Hopefully, that will turn out to be a good choice.
\n\nAmong various minor enhancements, there’s one notable addition to Safari in iOS 10 that points at the possible direction of many iPad apps going forward. Effectively, it’s the most important iPad-only feature this year.
\nSafari for iPad now supports in-app split view to open two webpages at once in landscape mode. Apple named this “Safari split view”, but it’s not related to the namesake system-wide multitasking mode. Opening two webpages in Safari doesn’t engage the Slide Over app switcher.
\n\nThere are multiple ways to invoke Safari split view. You can tap and hold on the tabs icon in the top toolbar and choose ‘Open Split View’. This causes Safari to create two views and bring up the Favorites grid on the right side.
\n\n\nYou can also tap & hold on a link in a webpage, hit ‘Open in Split View’, and the destination page will load on the right. If split view is already active, holding a link on the right side will offer a similar ‘Open on Other Side’ option.
\n\nIf you’d rather tap once to open a webpage in split view, you can perform a two-finger tap on a link to either activate split view (if you’re in full-screen) or open a new tab on the other side.
\nLast, you can drag a tab out of the toolbar and take it to the other side (either left or right). If split view isn’t already enabled, the tab will morph into a small preview of the webpage as Safari resizes inwards, showing a gray area that indicates you can drop the page to open it in split view.
\nSafari’s new drag & drop for tabs.
\nIt’s a polished, fun animation, which also works the other way around to put a tab back on the left and close split view.39
\nIn addition to drag & drop, you can tap and hold the tabs button to merge all tabs in one screen and close split view. Because Safari for iOS 10 supports opening unlimited tabs (both on the iPhone and iPad), this menu also contains an option to close all tabs at once – one of my favorite tweaks in iOS 10.
\nClose all tabs at once.
Safari split view is aware of iOS’ system-wide Split View. If Safari is in split view and you bring in a second app to use alongside the browser, Safari’s split view is automatically dismissed by merging all tabs. When you close Split View and go back to Safari in full-screen, the browser’s split view resumes where it left off.
\nThere’s nothing surprising about the look of Safari split view: using Size Classes (we meet again, old friend), Safari creates two instances of the same view, each independent from the other and carrying the same controls.40
\nI’ve long wished for the ability to view and interact with multiple Safari tabs at once on my iPad Pro. Before iOS 10, developers who recognized this gap in Safari’s functionality were able to sidestep Apple’s limitations with clever uses of Safari View Controller. The new Safari on the iPad obviates the need for those third-party apps with a native solution. The feature is particularly effective on the 12.9-inch iPad Pro, where you can view two full webpages instead of smaller versions scaled to fit. It’s the same feeling of upgrading to Split View on the 12.9-inch iPad Pro from the 9.7-inch model.
\nThe 9.7-inch iPad Pro, of course, shows less content than the 12.9-inch model (left). (Tap for full size)
After incorporating Safari split view in my workflow, I wish every document-based iPad app offered a way to split the interface in two panes.
\nSafari split view is a brilliant showcase of drag & drop to move content across multiple views, too.
\nThe lack of a proper drag & drop framework for iPad apps, especially after the introduction of Split View in iOS 9, is baffling at this point. Multitouch and Split View are uniquely suited to breathe new life into the decades-old concept of drag & drop – just look at macOS and how well the system works even without multitouch. Drag & drop would make more sense on iOS than it ever did on the desktop by virtue of direct content manipulation.
\nSafari’s drag & drop tab behavior is, hopefully, showing a glimpse of the future we deserve. A system-wide drag & drop framework is going to be trickier to pull off than a single browser tab41, but we can keep the dream alive.
\nThere are other smaller changes in iOS 10’s Safari.
\nThe parsing engine of Safari Reader – Apple’s tool to increase the readability of webpages by stripping them of interface elements and ads – has been updated to support display of bylines, publication dates, and article subheads. The extraction of these bits of metadata isn’t perfect42, but it’s a step up from the previous version.
\nWhen Apple introduced Safari View Controller last year, they were adamant about its appearance: because the experience had to be consistent with Safari, developers couldn’t modify or style the new in-app web view with their own UI. Third-party apps could set a tint color for the toolbar icons of Safari View Controller to make them match their colors (something we’ve seen implemented in apps like Overcast and NewsBlur), but that was as far as customization went.
\nA customized Safari View Controller in Tweetbot for iOS 10, matching the dark theme.
Apple is letting developers customize Safari View Controller further in iOS 10. In addition to tinting UI controls, the color of the entire toolbar can now be set to something other than white. This should make the experience within apps more cohesive and the transition between app and web view less jarring.
\nSpeaking of Safari View Controller: Find in Page is now supported in app web views as an action extension.
\nWhen hitting Command-T on iOS 10, a new tab opens with the cursor placed in the address bar, ready to start typing. External keyboard users rejoice.
\nDownloads, a longtime Safari issue, haven’t been exactly “fixed” in iOS 1043, but Apple has found ways to circumvent old annoyances. First, hitting a link to a file download (such as a .zip file) now displays proper download progress in the address bar. Then, when the download is complete, the file can be saved to iCloud Drive with iOS 10’s new Add to iCloud Drive extension.
\nSaving a downloaded file from Safari to iCloud Drive is now possible with an extension.
We still haven’t reached the point where Safari automatically downloads files into an iCloud Drive folder, but the idea doesn’t seem so far-fetched anymore.
\nAnother limitation of Safari that has been fixed in iOS 10 is video playback. Thanks to a new webkit-playsinline property, web developers can specify videos that can be played inline on the iPhone without opening the full-screen player.
Minimize and expand.
Even if the property isn’t specified, playback will begin in full-screen, but users can pinch the video closed (or tap a resize button) to keep playing it inline. Being able to shrink videos down makes browsing on the iPhone more pleasant.
\nFurthermore, Safari in iOS 10 brings support for automatic playback of videos without audio tracks on page load (you may have seen such videos in this review). The change, outlined on the WebKit blog earlier this year, was motivated by the rising popularity of animated GIFs. As WebKit engineers noted, the GIF format itself can be computationally expensive to decode and isn’t energy-efficient – part of the reason why online GIF providers have embraced the <video>
element with disabled audio tracks to replace GIFs. This change should also help websites that use muted videos as animated backgrounds, which will display correctly on devices running iOS 10.
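Apps that embed their own WKWebView can opt into the same media behaviors. A minimal sketch, assuming the iOS 10 `WKWebViewConfiguration` APIs (`allowsInlineMediaPlayback` and the new `mediaTypesRequiringUserActionForPlayback` option set):

```swift
import WebKit

// Configure an embedded web view so that videos marked playsinline stay
// inline on the iPhone, and muted videos may autoplay on page load –
// mirroring Safari's iOS 10 behavior.
let configuration = WKWebViewConfiguration()
configuration.allowsInlineMediaPlayback = true
// Only audio-bearing media still requires a user gesture; silent video doesn't.
configuration.mediaTypesRequiringUserActionForPlayback = [.audio]

let webView = WKWebView(frame: .zero, configuration: configuration)
```
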
Speaking of websites, Safari on iOS 10 enables pinch to zoom by default on all sites – even those that have specifically disabled zooming through meta tags. From an accessibility standpoint, I can only applaud Apple’s decision.
\nMoving on to other features, you can search your Favorites and Reading List items by swiping down to reveal a search bar. Reading List doesn’t support full-text search, so you’ll only be able to search titles, not the text of a saved article.
\nFinally, smarter AutoFill. While iOS 9 could suggest passwords and emails when filling web forms, iOS 10 takes it a step further and replaces the Passwords button above the keyboard with an AutoFill Contact button. The new dialog offers multiple options for your own contact card (such as Work and Personal email addresses) with the ability to customize your card’s AutoFill without leaving Safari.
\nCustomizing AutoFill.
For the first time, you can also auto-fill any other contact on a webpage by hitting ‘Other Contact…’ and picking an entry from your address book (other contacts can be customized before auto-filling, too).
\nApple is taking advantage of QuickType suggestions to speed up AutoFill: if you don’t want to use the AutoFill interface, QuickType can suggest names, multiple email addresses, phone numbers, and other information from a contact’s card through predictive shortcuts.
\nThe deeper integration of contacts and AutoFill makes it easier to sign up for web services without having to type details. It’s another argument in favor of Safari View Controller for apps: developers of apps with an account component will get more flexible AutoFill features for free if they implement Apple’s web view in their signup flows. I know I wouldn’t want to type contact information (or use an extension) after testing the convenience of AutoFill in iOS 10.
\nEven without new headline features like Safari View Controller and Content Blockers, Safari remains Apple’s most mature and powerful iOS app. This year’s refinements are well thought-out and split view is a boon for web multitasking on the iPad. I have no complaints.
\n\nOf all system apps, Apple Music is the one that got a dramatic makeover in iOS 10.44
\nWith a redesign aimed at placating concerns about an overly complex interface, and with new algorithmic discovery features, iOS 10’s Apple Music is streamlined and more powerful. Beneath a veil of astonishing simplification, the new Apple Music packs significant enhancements to the discovery and listening experience.
\nIf you had to list the shortcomings of Apple Music since its debut in iOS 8.4, a hodgepodge of features tacked onto a crumbling pre-streaming foundation would easily sit at #1. With iOS 10, Apple wants its music streaming service to be more accessible and intuitive for everyone who’s moved past iTunes.
\nPart of this effort has resulted in a modern design language that does away with most iOS UI conventions in favor of big, bold headlines, larger buttons, and a conspicuous decrease of transparency in the interface. If you’re coming from Apple Music on iOS 9 and remember the information-dense layout and translucent Now Playing screen, prepare to be shocked by iOS 10’s rethinking of the Music app.
\nWhile the bold design has intriguing consequences for the overall consistency of iOS’ visuals, its impact on usability is just as remarkable. The new Apple Music is no longer following the interaction paradigms of the iTunes Store and App Store: there’s no front page highlighting dozens of top songs and curated recommendations. Instead, it’s been replaced by a simplified Browse page with a handful of scrollable featured items at the top and links to explore new music, curated playlists, top charts, and genres.
\nRemoving new releases and charts from the Browse page helped Apple accomplish two goals: give each section more room to breathe; and highlight the most important items (singles, albums, videos, etc.) with a big cover photo that can’t be missed.
\nBeing able to discern items with an effective sense of place is the underlying theme of navigation in the new Apple Music. On the one hand, information density has suffered and Apple Music can’t show as much content on screen as it used to. On the other, bold headlines, fewer items per page, and larger controls should prevent users (who aren’t necessarily well versed in the intricacies of the iTunes Store UI, upon which the old Apple Music was based) from feeling overwhelmed. Apple’s new design wants to guide users through Apple Music’s vast catalogue, and it mostly succeeds.
\nApple Music’s For You page on the iPad.
This is evident in the Library page, where Apple has switched from a hidden View menu to a set of vertical buttons displayed underneath the “title bar”. If you were confused by the taps needed to browse by Artist or view downloaded music in iOS 9, fear no more – Apple has created customizable buttons for you this time.
\nBig, customizable buttons.
The same philosophy is shared by every other screen in Apple Music. Radio, search, even For You recommendations – they’ve all moved on from the contortions of their iTunes-like predecessors to embrace clarity and simplicity through white space, large artworks, and big buttons.
\nAnother prominent change from iOS 9 is the removal of translucent panes of glass in the interface. Transparency effects look great in screenshots, but unpredictable color combinations don’t make a good case for legibility and consistency.
\nApple is ditching translucency in the Now Playing screen altogether. In iOS 10, they’ve opted for a plain black-and-white design that lays out every element clearly and keeps text readable at all times.
\n\nIt’s not as fancy as iOS 9, but it’s also not as over-engineered for the sake of beauty. Album artwork stands out against a white background; buttons are big and tappable; there’s even a nice, Material-esque effect when pausing and resuming a song. Alas, the ability to love a song with a single tap from the Now Playing screen is gone.
\nThe new Apple Music is equal parts appearance and substance. The bottom playback widget45 has been enlarged so it’s easier to tap, and it also supports 3D Touch.46 Pressing on it reveals a redesigned contextual menu with options for queue management, saving a song, sharing, liking (and, for the first time, disliking), and, finally, lyrics.
\nApple Music integrates officially licensed lyrics in the listening experience. Apple has struck deals with rightsholders to make this happen; even if lyrics aren’t available for every song on Apple Music yet, availability seemed to grow during the beta period this summer, and I expect more lyrics to be added on a regular basis for new and old releases.
\nWhen lyrics are available, you’ll see an additional button in the contextual menu as well as a new section when swiping up on the Now Playing screen. This is where shuffle, repeat, and the Up Next queue live now; I wish Apple had done a better job of hinting that this space exists below what you can see. There’s no indication that you can swipe up to reveal more options.
\nSwipe up to reveal repeat, shuffle, Up Next, and lyrics.
Unlike Musixmatch, lyrics don’t follow a song in real time. They’re either displayed in a full-screen popup (if opened from the contextual menu) or above Up Next in plain text.
\nLyrics are modal on the iPad when opened from the contextual menu.
That’s not a deal-breaker, though. As a lyrics aficionado (I’ve also learned English through the lyrics of my favorite songs), this is a fantastic differentiator from other streaming services. Not having to Google the lyrics of what I’m listening to and be taken to websites riddled with ads and often-incorrect lyrics? I’m not exaggerating when I say that this feature alone might push me to use Apple Music every day again. I was hoping Apple would eventually bring native lyrics to Apple Music, and they delivered with iOS 10.47
\nAnother functional improvement to the Now Playing screen is a built-in menu to control audio output. Taking a cue from Spotify, Apple Music sports a button in the middle of the bottom row of icons to switch between the iPhone’s speaker, wired and Bluetooth headphones, and external speakers with just a couple of taps.48
\nYou don’t have to open Bluetooth or AirPlay settings to stream music to different devices. This was probably built with the iPhone 7 and AirPods in mind, but it’s a feature that makes managing audio faster for everyone.
\nAvailable in Settings > Music, a new Optimize Storage option lets iOS automatically remove music from your device that you haven’t played in a while. Unlike the similar setting for iCloud Photo Library, you can control the minimum number of songs you want to keep downloaded on your device with four tiers. On my 64 GB iPhone 6s Plus, I see the following options:
\nIf you have an older iOS device with limited storage, this should be useful to strike a compromise between available space and offline songs.
\nApple Music for iPad doesn’t diverge from its iPhone counterpart much, but there are a few differences worth noting.
\nThe Browse page collects featured items, hot tracks, new albums, playlists, and more within a single view, with buttons to explore individual sections placed in a popover at the top.
\nInstead of taking over the app in full-screen, playing a song opens a Now Playing sidebar. The view is launched from a tab bar split between icons and the playback widget.
\nThe sidebar feels like having Slide Over within Music: it doesn’t behave like Safari or Mail’s in-app split view, where you can interact with two views at the same time; instead, it’s modal and overlaid on top of content.
\nIt’s not immediately clear why Apple didn’t stick to a full-screen Now Playing view on the iPad if the sidebar still prevents interactions on the other side. Perhaps they realized giant-sized album artwork didn’t make sense on the iPad Pro? Maybe the vertical layout lends itself better to peeking at Up Next and lyrics below playback controls? The sidebar is fine, but I’d rather have a real in-app split view in Music too.
\nThe Split View that Music does have on iOS 10 is the system-wide multitasking one. On the 12.9-inch iPad Pro, Now Playing is a sidebar in both Split View layouts when Music is the primary app, but it turns into a full-screen view when Music is the secondary app in Slide Over.
\niOS 10 brings discovery features that pit Apple Music against Spotify’s algorithmic playlists and personalized curation.
\nApple Music’s For You page features two personalized playlists in a carousel at the top – My New Music Mix and My Favorites Mix. Both are automatically refreshed every week and are personalized for each user based on their listening habits and favorite songs.
\nMy New Music Mix, refreshed every Friday, showcases new music Apple Music thinks you’ll like; My Favorites Mix is a collection of hit singles, deep cuts, and songs related to your favorite artists that is refreshed every Wednesday.
\nThe idea of a personalized mixtape refreshed on a weekly basis isn’t new. Spotify was a pioneer in this field with their excellent Discover Weekly, which recently expanded to Release Radar. Spotify’s system is powered by a neural network (its inner workings are utterly fascinating) and, as I previously wrote, it delivers impressive personalized picks that almost feel like another person made a mixtape for you.
\nIt’s too early to judge Apple’s efforts with personalized playlists in iOS 10. They only rolled out two weeks ago, and, in my experience, such functionalities are best evaluated over a longer span of time after judicious listening and “loving” of songs.
\nMy impression, however, is that Apple has succeeded at launching two great ways to discover new music and re-discover old gems every week. My first two My Favorites Mix playlists have been on point, collecting songs (both hits and lesser known ones) from all the artists I knew and liked. Apple Music’s first two My New Music Mix playlists weren’t as accurate as Spotify’s Release Radar, but, to be fair, I have been religiously using Spotify for over 9 months now, whereas I just came back to Apple Music. Accuracy may still be skewed in Spotify’s favor given my listening history.
\nStill, we don’t need to wait to argue that algorithmically-generated playlists refreshed weekly are a fantastic addition to Apple Music. As I noted in my story on Spotify’s Discover Weekly earlier this year, human curation is inherently limited. Apple has been at the forefront of human-made playlists, but it was missing the smart curation features of Spotify. Apple’s two personalized mixes seem more – pardon the use of the term – mainstream than Discover Weekly, but that isn’t a downside. Easing people into the idea of personalized playlists made by algorithms and then launching more specific types focused on music aficionados might be a better consumer approach than Spotify’s. I’d wager Apple is considering a product similar to Spotify’s Discover Weekly – a playlist that highlights back-catalogue songs you might like from artists you’re not familiar with.
\nMy New Music Mix and My Favorites Mix already seem very good, and they show that Apple can compete with Spotify when it comes to personalized music curation. As with other algorithmic features launched in iOS 10, Apple’s debut is surprisingly capable and well-reasoned.
\nThere are other changes in the For You section. Connect, already an afterthought in iOS 9, has been demoted from standalone view to a sub-section of For You.
\nThose links don’t even open in Apple Music.
Some people must be using Connect (who’s leaving those comments?), but I just don’t see the incentive for artists to post on it and for users to subscribe. Apple doesn’t seem to care about it as a social network, and everyone is better off engaging with fans and following artists on Twitter and Facebook. Unless Apple gives it another try, I don’t think Connect can suddenly gain relevancy. Rolling Connect into For You feels like Apple’s version of “going to live on a farm upstate”.
\nPlaylists and sections recommended in For You have been redesigned and shuffled around. Every section can now be scrolled horizontally to reveal more content; Recently Played and Heavy Rotation tend to float towards the top for easier access; and there’s the usual mix of artist spotlights, essentials (they’re not called “Intro To” anymore), human-curated playlists, and a new section called New Releases For You.
\nThe refreshed For You in iOS 10.
If you liked Apple’s For You section before, you won’t be disappointed by iOS 10’s refresh. But I believe My New Music Mix and My Favorites Mix will steal the show for many.
\nI’m still not sure if I want to give Apple Music another try – I’ve been truly satisfied with Spotify since I moved back to it in January. Apple’s updates in iOS 10 are compelling, though. I got used to the “big and bold” design quickly, and I find it easier to parse and more fun than Spotify’s boring black and green. Apple Music may sport lower information density, but, at least for me, it’s easier to use than Spotify. Personalized playlists are solid already, and I’ve been keeping My Favorites Mix synced offline for fast access to a collection of songs I know I’m going to like. And then there’s lyrics, which is nothing short of a game changer for me.
\nApple’s firing on all cylinders against Spotify and others in the music streaming industry. It might be time to take Apple Music for a spin again.
\n\nWithout new exploration and location editing modes (transit launched in September 2015, and it’s slowly rolling out to more cities; crowdsourced POI collection is still a no-go), Apple is making design and third-party apps the focal points of Maps in iOS 10.
\nMaps’ new look removes UI chrome and enhances usability on large iPhones through lowered controls, intuitive buttons, and more proactive suggestions. There are floating buttons to find your position and open Maps’ settings. Apple has gotten rid of the search bar at the top and replaced it with a card at the bottom (a floating “sidebar” on the iPad). You can swipe up the card to reveal suggestions below the search field.
\nThe sense is that Apple wanted to ship a smarter, more conversational search feature, which now offers proactive place suggestions. Instead of a handful of recent addresses, Maps now curates a richer list of locations based on recently viewed and marked places, favorites, places you’ve been to, and addresses you probably want to go next based on proactive iOS features.
\nA webpage with an address I was viewing in Safari, proactively suggested by Maps.
Each suggestion is associated with a relevant icon, so they’re prettier and easier to identify. You can even swipe on them to remove them from the list or share them with other apps.
\nColorful business and landmark icons are used in search results, which are more lively than in iOS 9 and include more Nearby categories. In selected regions, Nearby results can be filtered with sub-categories in a scrollable bar at the bottom of the screen.
\n\nIconography has always been one of the strong suits of Apple Maps, and the company is doubling down on it with iOS 10. Previously, when searching for places that pertained to a specific category such as Restaurants, Maps would drop generic red pins on the map, requiring you to tap on them to open a first popup, then tap again to open a detail view with information about the place. It was a slow, unattractive process that hid useful details from the first step of search results.
\niOS 10 improves upon this in two ways. First, instead of red pins, multiple search results are dropped on the map with more descriptive pins that suggest what a result is before tapping it. In the restaurant example, you’ll end up with icons that contain pizza slices, hamburgers, or a fork and knife, for instance. If two results are close to each other at the current zoom level, they’ll be grouped in a numeric orange pin that you can tap to choose one result.
\nSecond, choosing a result uses the iPhone’s search panel as a split view to display business information and the map at the same time. As you tap through results, you can preview place details with a card UI at the bottom that shows ratings, distance, and a button to start directions.
\nThe interaction is similar on the iPad. Instead of managing result cards on the vertical axis, they’re overlaid horizontally in a sidebar on the left.
\nMaps results on iPad
\nBy combining these elements with cards that are more comfortable to reach, iOS 10’s Maps feels like it’s been optimized for humans and nimble exploration. By comparison, the old Maps feels static and arbitrary.
\nThe same philosophy has been brought to navigation. In iOS 10, you can pan freely on the map and re-center navigation with a button.
\nYou can pan around during navigation in iOS 10.
Details for the current trip, such as estimated arrival time and distance, are displayed in a bottom card, which, like results, can be swiped up to access more options. These include audio settings, turn-by-turn details, an overview, and, for the first time, en-route suggestions for places you might want to stop by, like gas stations or coffee shops.
\n\nAfter selecting a category of suggestions during navigation, Maps will return a list of nearby results and tell you how many minutes each will add to your trip. Select one, confirm that you want to stop by, and Maps will update directions for the new destination. When you’re done, you can resume your route to the first destination with a blue banner at the top.
\nApple is also going to let developers plug into Maps with extensions. If an app offers ride booking, restaurant reservations, and “other location-related services”, it can embed its functionalities in Maps.
\nMaps extensions, like SiriKit’s, are based on intents, and developers can provide custom interfaces with an Intents UI extension. The same extensions that allow users to hail an Uber and track its status with Siri can be used from Maps to get a ride to a selected place.49 Maps extensions contained inside iOS apps are disabled by default; they have to be activated from Settings > Maps > Extensions.
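Under the hood, a Maps extension shares its entry point with a Siri integration. A rough sketch of that shared `INExtension` (the class names and trivial success response are hypothetical, and the Swift 3-era handler method names may differ slightly from later SDKs):

```swift
import Intents

// The single Intents extension that both Siri and Maps call into.
class RideIntentExtension: INExtension {
    override func handler(for intent: INIntent) -> Any {
        // Maps, like Siri, hands us a ride-request intent to resolve and handle.
        if intent is INRequestRideIntent {
            return RideRequestHandler()
        }
        return self
    }
}

class RideRequestHandler: NSObject, INRequestRideIntentHandling {
    func handle(requestRide intent: INRequestRideIntent,
                completion: @escaping (INRequestRideIntentResponse) -> Void) {
        // A real extension would start a (hypothetical) booking here and
        // report progress; this sketch just reports success.
        completion(INRequestRideIntentResponse(code: .success, userActivity: nil))
    }
}
```

The custom card shown in Maps comes from a companion Intents UI extension; the logic above only handles the booking itself.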
\nOpenTable’s Maps extension.
I was only able to test OpenTable’s Maps extension earlier this week, and it has limited integration in Rome with a few restaurants. Once enabled, OpenTable’s extension adds a button to view more information about a restaurant and make a reservation. You can set table size, pick available times, and enter special requests in Maps. OpenTable will ask you to continue in the main app to confirm a reservation, but it’s nice to have a way to quickly check times and availability without leaving Maps.
\nI’m curious to see how ride-sharing and other location-based services available in Italy will implement Maps extensions.
\nThe quality of Apple Maps data for my area still isn’t comparable to Google Maps. Apple Maps has improved since iOS 6, but I still wouldn’t trust it to guide me through a sketchy neighborhood in Rome at night. At the same time, I prefer the design of Apple Maps and its many thoughtful touches to Google Maps. From my perspective, Apple has created a more intuitive, better designed app without the data and intelligence of Google. It’s an odd predicament to be in: while I appreciate Apple Maps’ look in iOS 10, I also want navigation to be reliable and trustworthy.
\nThere’s a lot to like in Maps for iOS 10 and great potential for developers to elevate location-driven apps to a more contextual experience. The revised interface imbued with proactive suggestions is a step forward from iOS 9; the richer presentation of results makes Maps friendlier and informative. Maps in iOS 10 feels like someone at Apple finally sat down and tried to understand how regular people want to use maps on a phone. The redesign is outstanding.
\nApple has perfected Maps’ interface and interactions, and now they have a developer platform, too. An underlying problem remains: when it comes to data accuracy, your mileage with Apple Maps may vary.
\n\nApple’s home automation framework, HomeKit, is ready for prime time in iOS 10. In addition to a dedicated page of shortcuts in Control Center, HomeKit is getting a native app for accessory management. It’s also expanding to new types of accessories, including cameras.
\nJust as iCloud Drive graduated to an app after a framework-only debut, all your HomeKit accessories can now be accessed from a Home app in iOS 10. You won’t find the complexity of advanced tools such as Matthias Hochgatterer’s unfortunately-named Home in Apple’s take. Instead, Apple’s Home app will greet you with the same bold look of Apple Music and News.
\nCustomizable edge-to-edge photo backgrounds and large buttons command the interface.
Home works with any HomeKit accessories previously set up on iOS 9. One of the biggest flaws of the old HomeKit implementation – the inability to set up new accessories without an app from the vendor – has been fixed with iOS 10’s Home app, which offers a complete setup flow from start to finish.
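For context, the accessory database the Home app manages is the same one HomeKit exposes to every app. A minimal sketch of reading it, assuming a home the user has already configured (the printed output depends entirely on that setup):

```swift
import HomeKit

// Wait for the home manager to sync, then walk homes → rooms → accessories –
// the same hierarchy the Home app's dashboard presents.
class HomeBrowser: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called once HomeKit has loaded the user's home database.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            print("Home: \(home.name)")
            for room in home.rooms {
                let accessories = home.accessories.filter { $0.room == room }
                print("  \(room.name): \(accessories.map { $0.name })")
            }
        }
    }
}
```
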
\nRooms are a section of the app, while your favorite accessories and scenes are highlighted in the main Home dashboard. They’re the same shortcuts used in Control Center.
\n\nApple offers a collection of suggested scenes to get started – such as “Good morning” or “I’m home” – but you’ll want to create your own scenes, choosing from custom icons50 and any accessory action you want.
\n\nMost users will only use Home for the initial accessory/scene configuration and to add favorites in Control Center, but there are hidden tricks in the app that are worth exploring (and, like Apple Music, concerning from a discoverability perspective).
\nYou can find a summary of average conditions and statuses at the top of the Home page. You might see humidity, temperature, and door lock status in this summary, and you can tap Details for an accessory overview.
\nYour home’s wallpaper can be modified by tapping the location icon in the top left and choosing a new one. You can do the same for rooms: after picking a room, tap the list icon in the top left, open Room Settings, and assign a new wallpaper.
\nCustom wallpapers for multiple rooms are a nice touch: they make the Home app look like your home, but I wish they synced with iCloud.
\nSome of the app’s features are too hidden. To navigate between rooms, you can tap the menu at the top, but you can also swipe between rooms. There’s no visual cue to indicate that multiple rooms live on the same horizontal pane. The design language Home shares with Apple Music and News means it inherits their feature discoverability issues, too.
\nSimilarly, buttons can be pressed with 3D Touch or long-pressed to open a modal view with intensity levels and settings for colors and more.
\n\nThere’s no way of knowing that more functionality lies beyond these “special taps”. And that’s too bad, because this view lets you manage useful options such as accessory grouping51 and bridge configuration.52
\nA front-end HomeKit interface has allowed Apple to bring deeper management features to iOS. First up, sharing: if you want to add members to your home, you can invite other people and give them administrative access to accessories. You can allow editing on a per-user basis, and you can also choose to let them control accessories while inside the house or remotely.
\nSharing with HomeKit.
This ties into the Home app’s second advanced feature – home hubs. What used to be an opaque, poorly documented option in iOS 9 is now a settings screen: your Apple TV or iPad can be used as a HomeKit hub when you’re not at home. As long as the device is plugged into power and connected to Wi-Fi, it acts as a bridge between a remote device and your accessories at home, with no additional configuration required.
\nRemote control comes in handy when you consider HomeKit’s deep integration with iOS in Siri and Control Center. In my tests, I was able to turn on my espresso machine remotely when I was driving home just by talking to Siri. Control Center’s Home page works with remote control: I can turn off my lights with one swipe, or I can check the status of my door anywhere on iOS.53
\nThere’s also automation. Third-party HomeKit management apps have long offered ways to set up rules and triggers to automate accessories and scenes based on specific conditions. iOS 10’s Home app brings a simpler interface to have accessories react to changes at home in four different ways:
\nWhen creating a new automation, you won’t be presented with an intimidating workflow UI. Apple has nicely separated the individual steps behind an automation: first you’ll choose the accessory or trigger that will start an automation, then you’ll be shown a handful of options. If you want to turn off your lights when the door closes, for instance, you first choose from Door: Open/Closed then move onto selecting scenes or lights.
\n\nI set up some automation rules in the Home app a couple of months ago, and they’ve been running smoothly since. Every day at 5 AM, lights in my bedroom and kitchen are turned off because I’ve likely gone to sleep by then. In another automation, my bedroom light turns red if the humidity level rises over 60%.
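At the framework level, a rule like the humidity one above is an `HMEventTrigger` tying a characteristic event to an action set. A simplified sketch (the home, sensor characteristic, and scene are assumed to already exist; a real "rises over 60%" rule would also attach a comparison predicate, omitted here for brevity):

```swift
import HomeKit

// Install a trigger that fires on the humidity characteristic and runs a
// "turn the bedroom light red" scene.
func installHumidityRule(in home: HMHome,
                         humidity: HMCharacteristic,
                         redLightScene: HMActionSet) {
    // Fire when the humidity characteristic reports the trigger value.
    let event = HMCharacteristicEvent(characteristic: humidity,
                                      triggerValue: NSNumber(value: 60))
    let trigger = HMEventTrigger(name: "High humidity",
                                 events: [event],
                                 predicate: nil)
    home.addTrigger(trigger) { error in
        guard error == nil else { return }
        // Attach the scene, then enable the trigger so HomeKit evaluates it.
        trigger.addActionSet(redLightScene) { _ in
            home.enableTrigger(trigger, completionHandler: { _ in })
        }
    }
}
```

Because triggers live in the HomeKit database rather than in any one app, a rule created this way keeps running via the home hub even when the creating app isn't open.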
\nIn the future, I’d like to see the ability to create nested automations with support for presence recognition. Currently, I can’t tell the Home app to send me a notification if the main door opens and I’m not at home, or to turn off the lights if it’s after sunset and nobody’s home.
\nLast, HomeKit is expanding to new types of accessories. With iOS 10, third-party manufacturers can create:
\nCameras and doorbells were two highly requested enhancements to HomeKit. Third-party HomeKit cameras aren’t available on the market yet – which is unfortunate, as I couldn’t test them for this review – but I plan on buying one as soon as possible.
\nApple’s plan for the connected home is coming together in iOS 10. Platform fragmentation has been a fundamental problem of third-party smart home devices and hubs: we’ve all heard the tales of devices being unable to talk to each other, being discontinued after a couple of years, or having to support external APIs to bring some communication into the mix.
\nWith HomeKit, Apple’s closed and slower approach is paying off in consistency, durability, and integration with the OS. The Elgato sensors I bought nearly two years ago have worked perfectly with iOS 10 since the first beta. I don’t have to worry about companies supporting IFTTT, Wink, or other protocols as long as they work with HomeKit.
\nIn Apple’s ecosystem, I can always extend my setup. When you consider extra functionalities such as rich notifications, Siri, remote hubs, and Control Center, it’s clear that home automation is best experienced as a tightly integrated extension of our smartphones.
\nI want to believe that the rollout of HomeKit accessories will continue at a steady pace with a Home app front and center in iOS 10. Even if that’s going to be a problem for my wallet.
\n\nAs is often the case with new versions of iOS, Apple added a variety of improvements to its suite of apps – some of which, for the first time, can also be deleted from the Home screen.
\nOn the 12.9-inch iPad Pro, Mail has received a three-panel mode that shows a mailbox sidebar next to the inbox and message content in landscape.
\nThree-panel view on the iPad Pro.
This extended mode is optional; it can be disabled by tapping a button in the top left of the title bar. If you were wondering why iPad apps couldn’t show more content like on a Mac, this is Apple’s answer. It’s the right move, and I’d like it to propagate to more apps.
\nConversation threading has also been updated in iOS 10 to resemble macOS’ conversation view.
\nIn iOS 10, messages in a thread are shown as scrollable cards. Each message can be swiped to bring up actions, and it can be expanded to full-screen by tapping its header (or ‘See More’ at the bottom).
\nYou can control the appearance of conversation view in Settings > Mail (Contacts and Calendars have received their own separate setting screens, too). Mail lets you complete threads (load all messages from a thread even if they’ve been moved to other mailboxes) and display recent messages on top. Conversation view makes it easier to follow replies without having to squint at quoted text. It’s nicer and more readable; I wish more third-party email clients had this feature.
\nThis willingness to make Mail more desktop-like doesn’t apply to smart folders, which are still nowhere to be found on iOS. Instead, Apple hopes that filters will help you sift through email overload.
\n\nFilters can be enabled with the icon at the bottom left of the inbox. You can customize them by tapping ‘Filtered By’ next to the icon. Filters include accounts, unread and flagged messages, messages that are addressed to you or where you’re CC’d, and only mail with attachments or from VIPs.
\nFilters aren’t a replacement for smart folders’ automatic filing, but they can still provide a useful way to cut down a busy inbox to just the most important messages. I wish it was possible to create custom filters, or that Apple added more of them, such as a filter for Today or the ability to include messages from a specific address (without marking it as VIP).
\nLast, like Outlook, Mail now recognizes messages from mailing lists and lets you unsubscribe with one tap without opening Safari.
\nTapping the Unsubscribe button will send an unsubscribe request as a message on your behalf, which you can find in the Sent folder. In my experience, Mail has done a solid job at finding newsletters and putting its Unsubscribe banner at the top.
\nCompared to apps like Outlook, Airmail, and Google Inbox, Apple is advancing Mail at a deliberately slow pace. You can’t send an email message to extensions with the share sheet (more on this problem here); several macOS Mail functionalities are still missing from the iOS app; and Google is way ahead of Apple when it comes to smart suggestions and automatic message categorization.
\nMail is a fine client for most people, but it feels like it’s stuck between longing for desktop features and adopting what third parties are doing. There’s a lot of work left to do.
\nApple’s system dictionary – built into every app via the copy & paste menu – has been overhauled as Look Up, a more versatile interface meshing Spotlight and Safari search suggestions.
\nThe new Look Up in iOS 10.
Look Up still provides dictionary definitions for selected words. The dictionary opens as a translucent full-screen view on the iPhone (a modal window on the iPad) with cards you can tap to read thorough definitions. New in iOS 10, the Italian and Dutch dictionaries can display multilingual translations in English, which I’ve found useful to expand my vocabulary without opening Google or a third-party dictionary app.
\nWhat makes Look Up one of the best additions to iOS 10 is the expansion of available sources. Besides definitions, iOS 10 shows suggestions from Apple Music, Wikipedia, iTunes, suggested websites, web videos, news, Maps, and more. These are the same data providers powering suggestions in Safari and Spotlight, with the advantage of being available from any app as long as you can select text.
\nLike in iOS 9, some results can be expanded inline, such as Wikipedia summary cards, while others take you to a website in Safari. The presentation style is also the same, with rich snippets and thumbnails that make results contextual and glanceable.
\nSmart data detectors have also been updated with Look Up integration. If iOS 10 finds a potential result in text, it’ll be underlined to suggest it can be tapped to open Look Up.
\nLook Up triggered from a data detector in an email subject.
In my tests, Look Up suggestions in text showed up in apps like Messages and Mail, and they often matched names of popular artists (e.g. “Bon Iver”) or movies.
\nBy plugging into a broader collection of sources, Look Up is more than a dictionary. It’s Spotlight for selected text – an omnipresent search engine and reference tool that can take you directly to a relevant result without Google.
\nI’ve become a heavy user of Look Up for all kinds of queries. I look up topics on Wikipedia54 from my text editor or Notes without launching Safari. I even use it for restaurant reviews and Maps directions: iOS can pop up a full-screen Maps UI with the location, a button to get directions, and reviews from TripAdvisor. Look Up is a useful, clever addition, and I wish it worked for more types of content. It’d be nice to have POIs from Foursquare and Yelp in Look Up, for example.55
\nWe first saw the potential for deeply integrated search with Spotlight in iOS 9. It’s not only a matter of competition between Apple and Google – any suggestion that requires fewer interactions is a better experience for Apple and its users. Look Up makes web search a feature of any app; it’s an intelligent continuation of the company’s strategy.
\nNotes was, together with Safari, the crown jewel of Apple’s app updates in iOS 9. This year, Apple is building upon it with subtle refinements and a new sharing feature.
\nLike Mail, Notes on the 12.9-inch iPad Pro offers a three-panel view. If you spend time moving between folders to manage notes, this should be a welcome change.
\nThree-panel view in Notes.
When using an external keyboard, you can now indent items in a bulleted list with the Tab key. The same can be done with the copy & paste menu; curiously, Apple labeled the opposite behavior ‘Indent Left’ instead of ‘Outdent’.
\nWhen a note refreshes with content added on another device, the new bits are temporarily highlighted in yellow. This helps you see what has changed when syncing with iCloud.
\nNote sharing is the big change in iOS 10. Arguably the most requested feature since the app’s relaunch in iOS 9, collaboration puts Notes on the same playing field as two established competitors – Evernote and OneNote. In pure Apple fashion, collaboration has been kept simple: it’s based on CloudKit, and there’s an API for developers to implement the same functionality in their apps.
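The API Apple is opening up here is CloudKit sharing, built around the new CKShare class in iOS 10. Below is a minimal sketch of sharing a record; the ‘Note’ record type and its field are hypothetical illustrations, not Apple’s actual Notes schema:

```swift
import CloudKit

// Hypothetical record to share; type and field names are illustrative.
let note = CKRecord(recordType: "Note")
note["body"] = "Grocery list" as NSString

// CKShare (new in iOS 10) wraps a root record for collaboration.
let share = CKShare(rootRecord: note)
share[CKShareTitleKey] = "Groceries" as NSString
share.publicPermission = .none // participants must be invited explicitly

// The root record and its share must be saved together.
let operation = CKModifyRecordsOperation(recordsToSave: [note, share],
                                         recordIDsToDelete: nil)
operation.modifyRecordsCompletionBlock = { _, _, error in
    if let error = error {
        print("Sharing failed: \(error)")
    } else {
        // share.url is the iCloud.com link a participant accepts.
        print("Share link: \(share.url?.absoluteString ?? "pending")")
    }
}
CKContainer.default().privateCloudDatabase.add(operation)
```

In practice, an app would present UICloudSharingController to pick participants and send the link – the same invitation flow Notes itself uses.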
\nIn iOS 10, every note has a button to start collaborating with someone. Tapping it opens a screen to share a note, which is done by sending a link to an app like Messages or Mail (you can also copy a link or send it to a third-party extension). Once you’ve picked how you want to share the note’s link, you can add people by email address or phone number.56 As soon as the recipient opens the iCloud.com link for the note and accepts it, the note will gain an icon in the main list to indicate that it’s a shared one.57
\nSharing a note with someone on iMessage.
Collaborating with someone else on the same note doesn’t look different from normal editing. Unlike more capable collaborative editing environments such as Google Docs, Quip, or Dropbox Paper, there are no typing indicators with participant names and you can’t watch someone type in real-time. The experience is somewhat crude: entire sentences simply show up after a couple of seconds (they’re also highlighted in yellow).
\nApple doesn’t view Notes collaboration as a real-time editing service. Rather, it’s meant to offer multiple users a way to permanently store information in a note that is accessed regularly.
\nI believe Notes collaboration will be a hit. I can see families sharing a grocery list or travel itinerary in Notes without having to worry about creating online accounts and downloading apps. Colleagues keeping a collection of screenshots and links, teams sharing sketches and snippets of text – the flexibility of Notes lends itself to any kind of sharing in multiple formats.58
\nEven without the real-time features of Google and Dropbox (and the upcoming iWork update), Notes collaboration works well and is fast. In my three months of testing, I haven’t run into conflicts or prompts to take action.
\nI was skeptical, but Notes collaboration works. In a post-Evernote world, Notes is still the best note-taking app for every iOS user.
\nLike last year, we’re going to have a separate story on Apple News. I wanted to briefly touch upon a few changes, though.
\nApple News is the third iOS 10 app to sport a redesign centered on bold headlines, sizeable sections, and a more varied use of color.
\nThe app launches to a For You view that does away with a traditional title bar to show the date and local weather conditions. Top Stories is the first section, highlighting 4-5 stories curated by Apple editors. These stories tend to be general news articles from well-known publications, and there’s no way to turn them off even if you mute the channel.
\nSections in the main feed are differentiated by color, whether they’re curated by Apple (such as Trending or Featured) or collected algorithmically for your interests. Bold headlines don’t help information density (on an iPhone 6s Plus, you’ll be lucky to see more than four headlines at once), but they don’t look bad either. The large, heavy version of San Francisco employed in the app makes it feel like a digital newspaper.
\nBecause of my job and preferences in terms of news readers, I can’t use Apple News as a replacement for Twitter or RSS. I want to have fine-grained control over my subscriptions, and the power-user tools offered by services like NewsBlur and Inoreader aren’t a good fit for Apple News. There are also editorial choices I don’t like: the more I keep muting and disliking certain topics (such as politics and sports), the more they keep coming back from Apple’s editors or other publications. Apple’s staff believes those are the stories I should care about, but I’ve long moved past this kind of news consumption. I don’t have time for a news feed that I can’t precisely control and customize.
\nAs a general-purpose news reader, Apple News does a decent job, and the redesign gives sections and headlines more personality and structure. At the same time, Apple News still feels less inspired than Apple Music; the changes in iOS 10 aren’t enough to convince me to try it again.
\n\nThe Clock app has been updated with two new features aimed at people who use it at night: a dark theme (it looks nice) and Bedtime.
\nWith Bedtime, Apple wants to give users with a morning routine an easy way to remember when it’s time to sleep. Like other sleep trackers on the App Store, Bedtime sends a notification a few minutes before bed, and it wakes you up with gentle melodies of growing intensity (you can choose from 9 of them, with optional vibration). The goal is consistency: going to bed and waking up at the same time every day, and getting a regular amount of sleep each night.
\nBedtime has a good UI design with a dial you can spin to adjust when you’d like to sleep and wake up, and it’s integrated with HealthKit to save data under the Sleep Analysis category.
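Bedtime writes to the same HealthKit Sleep Analysis category that third-party sleep trackers can read and write. A sketch of how an app might save one night’s sleep sample, assuming the user has already granted write access via requestAuthorization:

```swift
import HealthKit

let store = HKHealthStore()

// The Sleep Analysis category Bedtime's data lands in.
let sleepType = HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!

// One night of sleep; the dates here are illustrative.
let bedtime = Date(timeIntervalSinceNow: -8 * 3600)
let sample = HKCategorySample(type: sleepType,
                              value: HKCategoryValueSleepAnalysis.asleep.rawValue,
                              start: bedtime,
                              end: Date())

// Assumes write access to Sleep Analysis was requested earlier with
// store.requestAuthorization(toShare:read:completion:).
store.save(sample) { success, error in
    print(success ? "Sleep sample saved" : "Save failed: \(String(describing: error))")
}
```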
\nI can’t use Bedtime because, as someone who works from home, I never wake up at the same time every day and I don’t have kids to drive to school. Bedtime is too optimistic for my bad habits. I think it’s a nice enhancement, though, and I bet it’ll be quite popular among iOS users.
\nIf all you ever wanted from the Activity app was a way to stay motivated by comparing your friends’ progress to yours, Apple has you covered in iOS 10.
\nSharing is now built into Activity: once you’ve invited a friend to share data with you, the Sharing tab will display activity circles, total completion, and burned calories. You can tap through to see more details, hide your activity, and mute notifications; at any point, you can start an iMessage conversation with a friend – presumably to taunt or motivate them.
\nIn my defense, I haven’t been wearing my Apple Watch for the past few weeks.
The “gamification” of Activity, combined with the Apple Watch, should help users push towards a daily goal and stay active. It’s a feature I plan to test more in depth once I get back into my exercise regimen.59 We’ll cover more workout and Activity changes in our review of watchOS 3.
\nAs for Health, Apple has overhauled the app’s dashboard with four main sections represented by colorful artwork: Activity, Mindfulness, Nutrition, and Sleep. Each of these primary categories has an explanation video, and there’s also a general overview video about the Health app that you can watch by tapping a button at the bottom of the Health Data screen.
\nIn an effort to make browsing Health less intimidating, Apple has simplified how you can view statistics recorded by your iPhone and Apple Watch. There’s a new Today page with a scrollable calendar ticker; you can tap any day to see all recorded data points as small previews (which support 3D Touch). Tapping one will take you to the category’s detail page, unchanged from iOS 9.
\nIn the top right corner of the Health Data and Today pages, you’ll find a user icon to quickly access details such as date of birth, sex, and blood type. You can configure wheelchair use here, as well as export your recorded Health data as a .zip archive containing an XML backup. U.S. residents will be able to sign up to become organ donors with Donate Life (previously announced by Apple) in the Medical ID screen.
\nAs I’ve been arguing for the past couple of years, the Health app will eventually have to find correlations between categories to help users understand how they’re living and what they should improve. Going beyond data collection and graphs should be the ultimate goal to turn Health into an assistant rather than a dashboard of data points. Until that’s the case, making the app prettier and easier to use is a good call.
\nApple is adding another feature as part of the Continuity initiative launched two years ago: clipboard transfer between devices.
\nThe option, dubbed Universal Clipboard, is designed to have (almost) no interface and “just work” in the background. After you’ve copied something on one device, pasting on another nearby will fetch what you originally copied on the first device and paste it. Universal Clipboard works with text, URLs, images, and other data types that can be pasted on iOS.
\nLike other Continuity functionalities, Universal Clipboard uses Apple IDs and peer-to-peer connectivity (Wi-Fi and Bluetooth) to determine devices in your proximity. Universal Clipboard is only engaged when you paste on a second device – it’s not constantly pushing your copied items to iCloud or broadcasting them to all devices nearby. Because Universal Clipboard is meant to quickly switch from one device to another, there’s a two-minute timeout on copied items – you won’t be able to paste an image you copied two days ago on your iPhone in a message thread on the iPad today.
\nIn my tests, Universal Clipboard worked well. It takes about a second to paste text copied from another device. Pasting a photo was the only case where I came across a “Pasting from…” dialog that loaded for a couple of seconds.
\nPasting an image with Universal Clipboard.
Universal Clipboard’s no-configuration approach may concern developers who don’t want data copied from their apps to propagate across devices. To ease those qualms, iOS 10 includes an API to restrict the pasteboard to the local device or set an expiration timestamp. I suppose AgileBits and makers of other content-sensitive apps will provide settings to control the behavior of Universal Clipboard and disable it permanently.
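That restriction API is part of UIPasteboard in iOS 10: setItems(_:options:) accepts a localOnly flag and an expirationDate. A sketch of copying a string that never leaves the device and expires after two minutes, mirroring the system timeout:

```swift
import UIKit

let pasteboard = UIPasteboard.general

// Plain-text item; the UTI string identifies the data type.
let items: [[String: Any]] = [["public.utf8-plain-text": "one-time secret"]]

pasteboard.setItems(items, options: [
    .localOnly: true,                                 // never handed to Universal Clipboard
    .expirationDate: Date(timeIntervalSinceNow: 120)  // cleared after two minutes
])
```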
\nIn the latest update, Workflow lets you configure Universal Clipboard options.
It’s not a replacement for dedicated clipboard managers such as Copied and Clips, but Universal Clipboard is ideal if you don’t want to think about transferring clipboard contents between devices. When you need it, Universal Clipboard lets you paste a link copied on the iPad into a WhatsApp message on the iPhone, or a photo from the iPhone into Notes on a second device. There are no clipboard entries to organize and no persistent storage of information to worry about. Like opening apps through Handoff, it’s a nice option to always have with you.
\nCalendar’s new features in iOS 10 are aimed at speed and location.
\nData detectors for dates and times in iMessage conversations have been improved so they can pre-fill the event creation screen with details fetched from messages. If you’re planning a dinner with friends over iMessage and mention a place and time in the conversation, tapping the data detector should bring up the event creation UI with “dinner” as title and location/time properly assigned. When it works, it’s a neat trick to save time in creating events.
\nWhen creating an event in the Calendar app, iOS 10 suggests similar events so you can re-add them with one tap.
\nIt’s not clear how far back into your history iOS 10 goes looking for old events. Event suggestions are handy – they’re not real event templates, but they pre-fill locations and times, too.
\nSpeaking of locations, iOS 10’s Calendar can suggest a location to add to an event with one tap. If I had to guess, I’d say that iOS uses old events with the same name and frequent locations to suggest an address. And, if you create an event with travel time, Calendar will use the location of a previous event (not your current location) to calculate how long it’ll take you to get there.
\nApple’s Calendar app isn’t as versatile or powerful as Fantastical or Week Calendar. But I’m not a heavy calendar user, and iOS 10’s proactive Calendar features have been smart in small and useful increments. I’m going to stick with Apple’s Calendar app for a while.
\niOS users have long clamored for the removal of pre-installed Apple apps from their devices. Such desire is understandable: Apple has kept adding built-in apps every other year, which hasn’t helped the perception that Apple itself is wasting users’ storage while also selling 16 GB iPhones as base models.60
\niOS 10 adds the ability to remove the majority of pre-installed Apple apps61 from the Home screen. These are:
\nThere’s a catch. By removing an Apple app, you won’t delete it from the system entirely – you’ll remove the icon and delete user data inside the app, but the core bits of each app – the actual binary – will still live in the operating system. According to Apple, removing every app listed above will only recover 150 MB from your device. If you were hoping to get rid of every Apple app and suddenly gain a couple of GBs of storage, you’ll be disappointed.
\nDeleting a pre-installed Apple app works just like any other app on iOS: tap & hold on the icon, hit Delete, and you’re done. For each app, iOS will warn you that you’ll either lose your data or access to specific features, such as location sharing with Find My Friends, the Calculator icon in Control Center, or email data stored locally.62
\nRemoving apps based on core system frameworks won’t delete data inside them. If you remove Contacts, your contacts won’t be deleted; the same applies to Reminders and Calendar, plus other iCloud data and documents. Effectively, you’re removing the shell of an app and its settings; deleting Mail, for instance, removes all your accounts from the app. If you remove Reminders and ask Siri to create a reminder, though, it’ll still be created and made available to third-party clients via EventKit.
\nRestoring previously removed Apple apps could have been more intuitive. You have to open the App Store and search for the name of an app, or look for Apple’s developer profile page and tap the download button for each app you want to bring back. It would have been nice to have a dedicated area in Settings to view which apps have been removed with an easier way to restore them.
\nRestoring Apple apps from the App Store.
Restoring Apple apps is further confirmation of the fact that those apps aren’t actually gone – they’re just hidden. The download isn’t a download: it takes less than a second and it doesn’t even show a progress bar. Try this for yourself: remove an Apple app, find Apple’s developer page on the App Store, put your device in Airplane Mode, and hit download. The app will reappear on your Home screen without the need for an Internet connection.63
\nAs a company that prides itself on the tight integration of its hardware and software, caving to user pressure on the matter of pre-installed apps must have been, politically and technically, tough for Apple. The result is a compromise: Apple is letting users get rid of those Tips and Stocks apps (among others) that few seem to like, but they also can’t completely delete apps because of their ties with the OS.
\nSome people will complain about this decision. I’m not sure users would like the opposite scenario where entire frameworks are deleted from iOS (if even possible without breaking Apple’s code signing), third-party apps lose access to essential APIs, and each download consumes several hundred MBs. Given the architectural complexities involved, the current solution seems the most elegant trade-off.
\n\nA world-class portable camera is one of the modern definitions of the iPhone. Among many factors, people buy an iPhone because it takes great pictures. And Apple’s relentless camera innovation isn’t showing any signs of slowing down this year.
\nBut the importance of the iPhone’s camera goes deeper than technical prowess. The Camera and its related app, Photos, create memories. Notes, Reminders, Maps, and Messages are essential iOS apps; only the Camera and Photos have a lasting, deeply emotional impact on our lives that goes beyond utility. They’re personal. They’re us.
\niOS 10 strives to improve photography in two parallel directions: the capturing phase and the discovery of memories – the technical and the emotional. Each relies on the other; together, they show us a glimpse of where Apple’s hardware and services may be heading next.
\nLet’s start with the technical bits.
\nThe camera capture pipeline has been updated to support wide-gamut color in iOS 10. All iOS devices can take pictures in the sRGB color space; the 9.7-inch iPad Pro and the upcoming iPhone 7 hardware also support capturing photos in wide-gamut color formats.
\nWhen viewed on displays that support the P3 color space, pictures taken in wide color are displayed with richer, deeper, and more accurate colors drawn from a wider palette – a level of color reproduction that wasn’t possible on the iPhone until iOS 10 (the 9.7-inch iPad Pro was the only device with a wide color-enabled display on iOS 9).
\nThere are some noteworthy details in how Apple is rolling out wide color across its iOS product line, using photography as an obvious delivery method.
\nWide color in iOS 10 is used for photography, not video. JPEGs (still images) captured in wide color fall in the P3 color space; Live Photos, despite the presence of an embedded movie file, also support wide color when viewed on the iPad Pro and iPhone 7 (or the Retina 5K iMac).
\nApple has been clever in implementing fallback options for photos displayed on older devices outside of the P3 color space. The company’s photo storage service, iCloud Photo Library, has been made color-aware and it can automatically convert pictures to sRGB for devices without wide color viewing support.
\nMore interestingly, wide-gamut pictures shared via Mail and iMessage are converted to an Apple Wide Color Sharing Profile by iOS 10. This color profile takes care of displaying the image file in the appropriate color space depending on the device it’s viewed on.
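Apps can make the same decision at runtime: iOS 10 adds a displayGamut trait, so code can check whether the current screen renders P3 before choosing wide-color assets. A quick sketch:

```swift
import UIKit

// displayGamut (new in iOS 10) reports what the current screen can render.
switch UIScreen.main.traitCollection.displayGamut {
case .P3:
    print("Wide color display – use P3 assets")
case .SRGB:
    print("sRGB display – fall back to sRGB assets")
case .unspecified:
    print("Gamut unknown")
@unknown default:
    break
}
```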
\nEven as a tentpole feature of the iPhone 7, wide-gamut photography isn’t something most users will care (or know) about. Wide color is relevant in the context of another major change for iOS photographers and developers of photo editing software – native RAW support.
\nApple used an apt and delicious analogy to describe RAW photo capture at WWDC: it’s like carrying around the ingredients for a cake instead of the fully baked product. Like two chefs can use the same ingredients to produce wildly different cakes, RAW data can be edited by multiple apps to output different versions of the same photo.
\nRAW stores unprocessed scene data: it contains more bits because no compression is involved, which leads to heavier file sizes and higher performance required to capture and edit RAW. On iOS 10, RAW capture is supported on the iPhone SE, 6s and 6s Plus, 7 and 7 Plus (only when not using the dual camera), and 9.7-inch iPad Pro with the rear camera only, and it’s an API available to third-party apps (Apple’s Camera app doesn’t capture in RAW).
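The capture side of that API lives in AVCapturePhotoOutput. A sketch of requesting a RAW (DNG) capture, assuming an already-configured capture session and delegate (note that the original iOS 10 SDK bridges the format list as [NSNumber]; later SDKs expose it as [OSType]):

```swift
import AVFoundation

// Assumes a running AVCaptureSession with `photoOutput` attached and a
// delegate conforming to AVCapturePhotoCaptureDelegate.
func captureRAW(with photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    // Pick the first Bayer RAW pixel format the current device supports.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        print("RAW capture not supported on this device")
        return
    }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
    // The delegate receives the RAW sample buffer and can produce .dng data
    // with AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer:previewPhotoSampleBuffer:).
}
```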
\nTo store RAW buffers, Apple is using the Adobe Digital Negative (DNG) format; among the many proprietary RAW formats used by camera manufacturers, DNG is as close to an open, publicly available standard as it gets.64
\nAt a practical level, the upside of RAW capture is the ability to reverse and modify specific values in post-production to improve photos in a way that wouldn’t be possible with processed JPEGs. On iOS 10, RAW APIs allow developers to create apps that can tweak exposure, temperature, noise reduction, and more after having taken a picture, giving professionals more creative control over photo editing.
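On the editing side, Core Image’s RAW filter exposes exactly these develop parameters. A sketch of adjusting exposure, white balance temperature, and noise reduction on a DNG file (the file path is illustrative):

```swift
import CoreImage

// Load a DNG through Core Image's RAW developing filter.
let url = URL(fileURLWithPath: "/path/to/photo.dng") // illustrative path
let rawFilter = CIFilter(imageURL: url, options: nil)

// Tweak develop parameters after the fact.
rawFilter.setValue(0.5, forKey: kCIInputEVKey)                  // +0.5 EV exposure
rawFilter.setValue(6500, forKey: kCIInputNeutralTemperatureKey) // white balance in Kelvin
rawFilter.setValue(0.6, forKey: kCIInputLuminanceNoiseReductionAmountKey)

if let output = rawFilter.outputImage {
    // Render with a CIContext, e.g. createCGImage(output, from: output.extent),
    // then save the result back to the photo library.
    print("Developed image extent: \(output.extent)")
}
```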
\nThings are looking pretty good in terms of performance, too. On iOS devices with 2 GB of RAM or more, the system can edit RAW files up to 120 megapixels; on devices with 1 GB of RAM, or if accessed from an editing extension inside Photos (where memory is more limited), apps can edit RAW files up to 60 megapixels.
\nNative RAW support opens up an opportunity for developers to fill a gap on the App Store: desktop-class photo editing and management apps for pros. If adopted by the developer community, native RAW capture and editing could enable workflows that were previously exclusive to the Mac. Imagine shooting RAW with a DSLR, or even an iPhone 7, and then sitting down with an iPad Pro to organize footage, flag pictures, and edit values with finer, deeper controls, while also enjoying the beauty and detail of wide-gamut images (which RAW files inherently are).
\nShooting RAW in Obscura.
I tested an upcoming update to Obscura, Ben McCarthy’s professional photo app for iOS, with RAW support on iOS 10. RAW can be enabled from the app’s viewfinder; after shooting with Obscura, RAW photos are saved directly into iOS’ Photos app.
\nEditing RAW in Snapseed.
Google’s Snapseed photo editor imported RAW files shot in Obscura without issues, and I was able to apply edits with Snapseed’s RAW Develop tool, saving changes back to Photos. I’m not a professional photographer, but I was still impressed by the RAW workflows now possible with third-party apps and iOS 10.
\nOn the other hand, while Apple has improved developer tools for RAW capture and editing, hurdles remain in terms of photo management. iCloud Photo Library, even at its highest tier, only offers 2 TB of storage; professional photographers have libraries that span decades and require several TBs. The situation is worse when it comes to local storage on an iPad, with 256 GB being the maximum capacity you can buy for an iPad Pro today. Perhaps Apple is hoping that these limitations will push users to rely on cloud-based archival solutions that go beyond what’s offered by iCloud and iOS’ offline storage. However, it’s undeniable that it’s still easier for a creative professional to organize 5 TB of RAW footage on a Mac than an iPad.
\nI have no reason to doubt that companies like Adobe will be all over Apple’s RAW APIs in iOS 10. I’m also curious to see how indie developers will approach standalone camera apps for RAW capture and quick edits. There’s still work to be done, but the dream of a full-featured photo capture, editing, and management workflow on iOS is within our grasp.
\nApple isn’t altering the original idea behind Live Photos with iOS 10: they still capture the fleeting moment around a still image, which roughly amounts to 1.5 seconds before a picture is taken and 1.5 seconds after. Photos have become more than still images thanks to Live Photos, and there are some nice additions in iOS 10.
\nLive Photos now use video stabilization for the movie file bundled within them. This doesn’t mean that the iPhone’s camera generates videos as smooth as Google’s Motion Stills, but they’re slightly smoother than iOS 9. Another nice change: taking pictures on iOS 10 no longer stops music playback.
\nFurthermore, editing is fully supported for Live Photos in iOS 10. Apps can apply filters to the movie inside a Live Photo, with the ability to tweak video frames, audio volume, and size.65 To demonstrate the new editing capabilities, Apple has enabled iOS’ built-in filters to work with Live Photos, too.
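The editing API centers on PHLivePhotoEditingContext, whose frame processor runs over every video frame plus the still image. A sketch that applies a sepia filter, assuming a PHContentEditingInput already obtained from a Live Photo asset (the Swift 3 SDK spells the CIImage call applyingFilter(_:withInputParameters:)):

```swift
import Photos

// Assumes `input` came from PHAsset.requestContentEditingInput(with:completionHandler:)
// for an asset whose mediaSubtypes include .photoLive.
func applySepia(to input: PHContentEditingInput) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }

    // Called once per video frame and once for the still image.
    context.frameProcessor = { frame, _ in
        frame.image.applyingFilter("CISepiaTone",
                                   parameters: [kCIInputIntensityKey: 0.8])
    }

    // Render a preview-quality Live Photo for display in a PHLivePhotoView.
    context.prepareLivePhotoForPlayback(withTargetSize: .zero, options: nil) { livePhoto, error in
        // To persist the edit, call saveLivePhoto(to:options:completionHandler:)
        // with a PHContentEditingOutput inside a photo library change block.
        print(error.map { "Editing failed: \($0)" } ?? "Edited Live Photo ready")
    }
}
```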
\nThe key advantage of Apple’s Live Photos is integration with the system Camera, which can’t be beaten by third-party options. I’d like to see higher frame rates in the future of Live Photos; for now, they’re doing a good enough job at defining what capturing a moment feels like.
\n\nThe photos on our devices are more than files in a library. They’re tiny bits of our past. The places we went to; the people we were; those we met. Together, they’re far more powerful than memory alone. Photos allow us to ache, cherish, and remember.
\nWithout tools to rediscover and relive memories, none of that matters. A camera that’s always with us has enabled us to take a picture for every moment, but it created a different set of issues. There’s too much overhead in finding our old selves in a sea of small thumbnails. And what purpose does a photo serve if it’s never seen again?
\nApple sees this as a problem, too, and they want to fix it with iOS 10. With storage, syncing, and 3D Touch now taken care of, the new Photos focuses on a single, all-encompassing aspect of the experience:
\nYou.
\nApple’s rethinking of what Photos can do starts with a layer of intelligence built into our devices. The company refers to it as “advanced computer vision”, and it spans elements such as recognition of scenes, objects, places, and faces in photos, categorization, relevancy thresholds, and search.
\nSecond, Apple believes iOS devices are smart and powerful enough to handle this aspect of machine learning themselves. The intelligence-based features of Photos are predicated on an implementation of on-device processing that doesn’t transmit private user information to the cloud – not even Apple’s own iCloud (at least not yet).
\nPhotos’ learning is done locally on each device by taking advantage of the GPU: after a user upgrades to iOS 10, the first backlog of photos will be analyzed overnight when a device is connected to Wi-Fi and charging; after the initial batch is done, new pictures will be processed almost instantaneously after taking them. Photos’ deep learning classification is encrypted locally, it never leaves the user’s device, and it can’t be read by Apple.
\nAs a Google Photos user, I was more than doubtful when Apple touted the benefits of on-device intelligence with iOS 10’s Photos app. What were the chances Apple, a new player in the space, could figure out deep learning in Photos just by using the bits inside an iPhone?
\nYou’ll be surprised by how much Apple has accomplished with Photos in iOS 10. It’s not perfect, and, occasionally, it’s not as eerily accurate as Google Photos, but Photos’ intelligence is good enough, sometimes great, and it’s going to change how we relive our memories.
\nOf the three intelligence features in Photos, Memories is the one that gained a spot in the tab bar. Memories creates collections of photos automatically grouped by people, date, location, and other criteria. They’re generated almost daily depending on the size of your library, quantity of information found in photos, and progress of on-device processing.
\nBrowsing Memories in iOS 10.
The goal of Memories is to let you rediscover moments from your past. There are some specific types of memories. For instance, you’ll find memories for a location, a person, a couple, a day, a weekend, a trip spanning multiple weeks, a place, or “Best Of” collections that highlight photos from multiple years.
\nIn my library, I have memories for my trip to WWDC (both “Great Britain and United States” and “Myke and Me”), pictures taken “At the Beach”, and “Best of This Year”. There’s a common thread in the memories Photos generates, but they’re varied enough and iOS does a good job at bringing up relevant photos at the right time.
\n\nBehind the scenes, Memories are assembled with metadata contained in photos or recognized by on-device intelligence. Pieces of data like location, time of the day, and proximity to points of interest are taken into consideration, feeding an engine that also looks at aspects such as faces.
\nScrolling Memories feels like flipping through pages of a scrapbook. Cover images are intelligently chosen by the app; if you press a memory’s preview, iOS brings up a collage-like peek with buttons to delete a memory or add it to your favorites.
\nTapping a memory transitions to a detail screen where the cover morphs into a playable video preview at the top. Besides photos, Memories generates a slideshow movie that you can save as a video in your library. Slideshows combine built-in soundtracks (over 80), pictures, videos, and Live Photos to capture an extended representation of a memory that you can share with friends or stream for the whole family on an Apple TV.
\nChoosing styles for Memories’ slideshows.
Each video comes with quick adjustment controls and deeper settings reminiscent of iMovie. In the main view, there’s a scrollable bar at the bottom to pick one of eight “moods”, ranging from dreamy and sentimental to club and extreme. Photos picks a neutral mood by default, which is a mix of uplifting and sentimental; moods affect the music used in the slideshows, as well as the cover text, selection of media, and transitions between items. You can also change the duration of a movie (short, medium, and long); doing so may require Photos to download additional assets from iCloud.
\nDeeper movie settings.
To have finer control over Memories’ movies, you can tap the editing button in the bottom right (the three sliders). Here, you can customize the title and subtitle with your own text and different styles, enter a duration in seconds, manually select photos and videos from a memory, and replace Apple’s soundtrack with your favorite music.66
\nBelow the slideshow, Memories displays a grid of highlights. Both in the grid and the slideshow, Photos applies de-duplication, removing photos similar to each other.67 Apple’s Memories algorithm tends to promote pictures that are well-lit, or where people are smiling, to a bigger size in the grid. In Memories, a photo’s 3D Touch peek menu includes a ‘Show Photos from this Day’ option to jump to a specific moment.
\nAs you scroll further down a memory’s contents, you’ll notice how Photos exposes some of the data it uses to build Memories with People and Places.
\nThe memories you see in the main Memories page are highlights – the best memories recommended for you. In reality, iOS 10 keeps a larger collection of memories generated under the hood. For example, every moment (the sub-group of photos taken at specific times and locations) can be viewed as a memory. In each memory, you’ll find up to four suggestions for related memories, where the results are more hit-and-miss.
\nIn many ways, Apple’s Memories are superior to Google Assistant’s creations: they’re not as frequent and they truly feel like the best moments from your past. Where Google Photos’ Assistant throws anything at the wall to see what you might want to save, I can’t find a memory highlighted by Photos that isn’t at least somewhat relevant to me. iOS 10’s Memories feel like precious stories made for me instead of clever collages.68
\nMemories always bring back some kind of emotion. I find myself anticipating new entries in the Memories screen to see where I’ll be taken next.
\nAvailable for years on the desktop, Faces have come to Photos on iOS with the ability to browse and manage people matched by the app.
\nThere are multiple ways to organize people recognized in your photo library. The easiest is the People view, a special album with a grid of faces that have either been matched and assigned to a person or that still need to be tagged.
\nLike on macOS, the initial tagging process is manual: when you tap on an unnamed face, photos from that person have an ‘Add Name’ button in the title bar. You can choose one of your contacts to assign the photos to.
\nAdding John as a recognized contact.
As you start building a collection of People, the album’s grid will populate with more entries. To have quicker access to the most important people – say, your kids or partner – you can drag faces towards the top and drop them in a favorites area.69
\nMarking people as favorites
\nAnother way to deal with faces is from a photo’s detail view. In iOS 10, you can swipe up on a photo (or tap ‘Details’ in the top right) to locate it on a map, show nearby photos, view related memories (again, mostly chosen randomly), and see which people Photos has recognized.
\nSwipe up to view details of a photo, including people.
This is one of my favorite additions to Photos.70 Coalescing location metadata and faces in the same screen is an effective way to remember a photo’s context.
\nNo matter how you get to a person’s photos, there will always be a dedicated view collecting them all. If there are enough pictures, a Memories-like slideshow is available at the top. Below, you get a summary of photos in chronological order, a map of where photos were taken, more related memories, and additional people. When viewing people inside a person’s screen71, iOS will display a sub-filter to view people and groups. Groups help you find photos of that person and yourself together.
\nDue to EU regulations on web photo services, I can’t use Google Photos’ face recognition in Italy, so I can’t compare the quality of Google’s feature with Photos in iOS 10. What I have noticed, though, is that local face recognition in Photos isn’t too dissimilar from the functionality that existed in iPhoto. Oftentimes, Photos gets confused by people with similar facial features such as beards; occasionally, Photos can’t understand that a photo of someone squinting belongs to a person it has already recognized. But then other times, Photos’ face recognition is surprisingly accurate, correctly matching photos of the same person through the years with different hairstyles, beards, hair color, and more. It’s inconsistently good.
\nDespite some shortcomings, I’d rather have face recognition that needs to be trained every couple of weeks than not have it at all.
\nYou have to train face recognition when things go wrong.
You can “teach” photos about matched people in two ways: you can merge unnamed entries that match an existing person (just assign the same name to the second group of photos and you’ll be asked to merge them), or you can confirm a person’s additional photos manually. You can find the option at the bottom of a person’s photos.
\nThe biggest downside of face support in iOS 10 is the lack of iCloud sync. Photos runs its face recognition locally on each device, populating the Faces album without syncing sets of people via iCloud Photo Library. The face-matching algorithm is the same across devices, but you’ll have to recreate favorites and perform training on every device. I’ve ended up managing and browsing faces mostly on my iPhone to avoid the annoyance of inconsistent face sets between devices. I hope Apple adds face sync in a future update to iOS 10.
\nConfirming faces in Photos is a time-consuming, boring process that, however, yields a good return on investment. It’s not compulsory, but you’ll want to remember to train Photos every once in a while to help face recognition. In my first training sessions, suggestions were almost hilariously bad – going so far as to suggest that pictures of Myke Hurley and me showed the same person. After some good laughs and taps, Photos’ questions have become more pertinent, stabilizing suggestions for new photos as well.
\nFace recognition in iOS 10’s Photos is not a dramatic leap from previous implementations in Apple’s Mac clients, but it’s good enough, and it can be useful.
\nDisplay of location metadata has never been Photos’ forte, which created a gap for third-party apps to fill. In iOS 10, Apple has brought MapKit-fueled Places views to, er, various places inside the app.
\nIf Location Services were active when taking a picture, a photo’s detail view will have a map to show where it was taken. The map preview defaults to a single photo. You can tap it to open a bigger preview, with buttons to show photos taken nearby in addition to the current one.
\nViewing nearby photos.
When in full-screen, you can switch from the standard map style to hybrid or satellite (with and without 3D enabled). The combination of nearby photos and satellite map is great to visualize clusters of photos taken around the same location across multiple years. When you want to see the dates of all nearby photos, there’s a grid view that organizes them by moment.
\nNearby photos make 3D maps useful, too. I seldom use Flyover on iOS, but I like to zoom into a 3D map and view, for instance, photos taken around the most beautiful city in the world.
\nYou can view all places at once from a special Places album. By default, this album loads a zoomed-out view of your country, but you can move around freely (like in Nearby) and pan to other countries and continents. It’s a nice way to visualize all your photos on a map, but it can also be used to explore old photos you’ve taken at your current location thanks to the GPS icon in the bottom left.
\nAs someone who’s long wanted proper Maps previews inside Photos, I can’t complain. Nearby and Places are ubiquitous in Photos and they add value to the photographic memory of a picture. Apple waited until they got this feature right.
\nProactive suggestion of memories and faces only solves one half of Photos’ discovery. Sometimes, you have a vague recollection of the contents of a photo and want to search for it. Photos’ content search is where Apple’s artificial intelligence efforts will be measured up against Google’s admirable natural language search.
\nPhotos in iOS 10 lets you search for things in photos. Apple is tackling photo search differently than Google, though. While Google Photos lets you type anything into the search field and see if it returns any results, iOS 10’s Photos search is based on categories. When typing a query, you have to tap on one of the built-in categories for scenes and objects supported by Apple. If there’s no category suggestion for what you’re typing, it means you can’t search for it.
\nIntelligent search in Photos.
The search functionality is imbued with categories added by Apple, plus memories, places, albums, dates, and people – some of which were already supported in iOS 9. Because of Apple’s on-device processing, an initial indexing will be performed after upgrading to iOS 10.72
\nThe range of categories Photos is aware of varies. There are macro categories, such as “animal”, “food”, or “vehicle”, to search for families of objects; mid-range categories that include generic types like “dog”, “hat”, “fountain”, or “pizza”; and there are fewer, but more specific, categories like “beagle”, “teddy bear”, “dark glasses”, or, one of my favorites, the ever-useful “faucet”.
\n\nApple’s goal was to provide users with a reasonable set of common words that represent what humans take pictures of. The technology gets all the more impressive when you start concatenating categories with each other or with other search filters. Two categories like “aircraft” and “sky” can be combined in the same search query and you’ll find the classic picture taken from inside a plane. You can also mix and match categories with places and dates: “Beach, Apulia, 2015” shows me photos of the beach taken during my vacation in Puglia last year; “Rome, food” lets me remember the many times I’ve been at restaurants here. I’ve been able to concatenate at least four search tokens in the same query; more may be possible.
\n\nAll this may not be shocking for tech-inclined folks who have used Google Photos. But there are millions of iOS users who haven’t signed up for Google’s service and have never tried AI-powered photo search before. To have a similar feature in a built-in app, developed in a privacy-conscious way, with a large set of categories to choose from – that’s a terrific change for every iOS user.
\nApple isn’t storing photos’ content metadata in the cloud to analyze them at scale – your photos are private and indexing/processing are performed on-device, like Memories (even if you have iCloud Photo Library with Optimize Storage turned on). It’s an over-simplification, but, for the sake of the argument, this means that iOS 10 ships with a “master algorithm” that contains knowledge of its own and indexes photos locally without sending any content-related information to the cloud. Essentially, Apple had to create its computer vision from scratch and teach it what a “beach” looks like.
\nIn everyday usage, Photos’ scene search is remarkable when it works – and a little disappointing when it doesn’t.
\nWhen a query matches a category and results are accurate, content-aware search is amazing. You can type “beach” and Photos will show you pictures of beaches because it knows what a beach is. You can search for pictures of pasta and suddenly feel hungry. Want to remember how cute your dog was as a puppy? There’s a category for that.
\nI’ve tested search in Photos for the past three months, and I’ve often been able to find the photo I was looking for thanks to query concatenation and mid-range descriptions, such as “pasta, 2014” or “Rome, dog, 2016”. Most of the time, what Apple has achieved is genuinely impressive.
\nOn a few occasions, Photos’ categories didn’t contain results I was expecting to be in there, or they matched a photo that belonged to a different category (such as my parents’ border collie, recognized as a “bear”, or fireworks tagged as “Christmas tree”).
\nThat’s one adorable bear.
Understandably, Apple’s first take on scene search with computer vision isn’t perfect. These issues could be remedied if there was a way to fix false positives and train recognition on unmatched photos, but no such option is provided in iOS 10. The decision to omit manual intervention hinders the ability to let users help Photos’ recognition, and it makes me wonder how long we’ll have to wait for improvements to the algorithm.
\nCompared to Google Photos’ search, Apple’s version in iOS 10 is already robust. It’s a good first step, especially considering that Apple is new to this field and they’re not compromising on user privacy.
\nWhat’s most surprising about the new Photos is how, with one iOS update, Apple has gone from zero intelligence built into the app to a useful, capable alternative to Google Photos – all while taking a deeply different approach to image analysis.
\nAdmittedly, iOS 10’s Photos is inspired by what Google has been doing with Google Photos since its launch in May 2015. 200 million monthly active users can’t be wrong: Google Photos has singlehandedly changed consumer photo management thanks to automated discovery tools and scene search. Any platform owner would pay attention to the third party asking users to delete photos from their devices to archive them in a different cloud.
\nApple has a chance to replicate the success of Google Photos at a much larger scale, directly in an app that millions of users open every day. It isn’t just a matter of taking a page from Google for the sake of feature parity: photos are, arguably, the most precious data for iPhone users. Bringing easier discovery of memories, new search tools, and emotion into photo management yields loyalty and, ultimately, lock-in.
\nThis isn’t a fight Apple is willing to give up. In their first round, Apple has shown that they can inject intelligence into Photos without sacrificing our privacy. Let’s see where they go from here.
\n\nWhile iOS 10 hasn’t brought a sweeping UI redesign, changes sprinkled throughout the interface underscore how Apple has been refining the iOS 7 design language. On the other hand, a new direction for some apps appears to hint at something bigger.
\nApple Music epitomizes a strikingly different presentation of full-screen views, content grids, and affordances that hint at user interaction.
\nIn iOS 10, Apple Music eschews the traditional title bar in favor of large, bold headlines for first-level views such as For You and Browse.
\nA new look for title bars in Apple Music.
The use of San Francisco bold in lieu of a centered title bar label is similar to a newspaper headline. The heavy typeface sticks out as an odd choice initially, but it clarifies the structure and increases the contrast of Apple Music – two areas in which the company was criticized over the past year.
\n\nTo group content within a view, or to label sub-views in nested navigation, Apple relies on Dynamic Type to scale text at different sizes. Dynamic Type doesn’t affect headlines.
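\nFor developers, opting a label into this scaling behavior is a one-liner. Here’s a minimal Swift sketch (the label and its text are illustrative, not taken from Apple Music) using the `adjustsFontForContentSizeCategory` property that iOS 10 introduced to keep fonts updated when the user changes the system text size:

```swift
import UIKit

// A section label that tracks the user's Dynamic Type setting.
// The label name and text are hypothetical examples.
let sectionLabel = UILabel()
sectionLabel.text = "Recently Played"
sectionLabel.font = UIFont.preferredFont(forTextStyle: .headline)
// New in iOS 10: the label re-renders automatically when the
// system content size category changes, no notification handling needed.
sectionLabel.adjustsFontForContentSizeCategory = true
```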
\n\nThe text-based back button at the top of a sub-view isn’t gone, but titles are always displayed in bold next to the content the user is viewing. An album’s name, for instance, isn’t centered in the title bar anymore; instead, it sits atop the artist’s name.
\nAlbum titles no longer sit in the title bar – they’re part of the content itself.
The combination of multiple font weights, color, and thicker labels provides superior hierarchy for content displayed on a page, separating multiple types of tappable items. By doing less, Apple ends up with a set of stronger affordances.
\nThe visual statement is clear: when you see a black headline or sub-title, it can’t be tapped. You’ll have to tap on the content preview (artwork, photos) or colored label (artist names, buttons) to continue navigation or perform a task.
\nThis goes beyond fonts. To further limit confusion, Apple Music now displays fewer items per page. Every element – whether it’s an album, a text button, or a collection of playlists – is also larger and more inviting to the touch.
\nFewer, bigger touch targets.
The trade-off is reduced information density and the perception that Apple is babysitting their users with albums and buttons that get in the way too much. It’s a case of over-shooting in the opposite direction of last year’s button-laden Music app; Apple has a history of introducing new design languages and intentionally exaggerating them in the first version. The new Apple Music is a reset of visual expectations.
\nThis is best exemplified by the Now Playing widget at the bottom of the screen: besides being taller (and hence more tappable), the contextual menu it opens blurs the background and is filled with large, full-width buttons that combine text and icons.
\nIt’s impossible to misunderstand what each of these buttons does, and selecting them doesn’t feel like playing a tap lottery, as was the case with the old contextual menu of iOS 9. Apple doesn’t appear too worried about breaking design consistency with other share dialogs on iOS as long as Apple Music’s works better.
\nThe company’s newfound penchant for big titles and explaining functionalities ahead of interaction doesn’t stop at Apple Music. Apple News makes plenty of use of bold headlines for article titles (where they feel like an appropriate fit) and multiple colors to distinguish sections.
\n\nThe Home app adheres to similar principles. There’s no fixed title bar at the top of the screen; rather, a customizable background extends to the top of a view, with a large title indicating which room is being managed.
\nHome has no real title bars either.
We can also look outside of apps for a manifestation of Apple’s bold design sentiment. In Control Center, splitting features across three pages lets functionality stand out more with bigger, comfortable buttons that aren’t constrained by a single-page design. This is evident in the music control page, where album artwork can be tapped to open the app that is currently playing audio.
\nFinally, let’s consider the Lock screen. In addition to redesigned notifications and widgets (which can be simply pressed for expansion), Apple is using thicker fonts and expanded audio controls.
\n\nBigger song information, larger buttons, and a volume nub that can be grabbed effortlessly. I see these as improvements over the iOS 9 Lock screen.
\nAh, buttons. The much contested, derided aspect of the iOS 7 design isn’t officially changing with iOS 10. According to the iOS Human Interface Guidelines, this is still the default look of a system button in iOS 10:
\nAcross iOS 10, however, we can see signs of Apple moving back to eye-catching buttons with borders and filled states in more apps.
\nLet’s start with Apple Music again. In the iOS 10 version, there are numerous instances of button-y buttons that weren’t there in iOS 9.
\nButtons that don’t have borders or a filled state are still present, but most of them have been redrawn with a thicker stroke to increase contrast with the app’s white background.
\nMessages has an interesting take on buttons. Most of them are consistent with iOS 9, but the two buttons to open the Camera or pick a photo from the library are displayed as icons inside a square with rounded corners.
\nThose are two big buttons.
These replace the textual buttons of the iOS 9 photo picker, where one of them could be confused as the label of the scrollable gallery shown above it.
\nThe same look is used for HomeKit accessories. Device icons are contained in a square that shows the accessory’s name, its icon, and intensity level.
\nThe use of highlights and colors helps discern on-off states for devices that are turned off (black text, translucent button) and on (colored icon, white-filled button).
\nFilled circles with glyphs are a recurring button type in iOS 10. They’re used in a few places:
\nCircular buttons in iOS 10.
Other examples of buttons redesigned for iOS include the back button in the status bar to return to an app (it’s got a new icon) and variations of ‘Get Started’ buttons for apps like Calendar and Apple News, which are now filled rectangles.
\nUpdates to buttons in iOS 10 may indicate that Apple heard the feedback about text labels that many users don’t realize can be tapped to initiate an action, but we’ll have to wait until next year for further proof.
\nIn Apple Music, Messages, and Maps, the company has rolled out new types of views that could be interesting if ported to other apps.
\nFirst, stacked views. In Music’s Now Playing screen and the iMessage App Store, iOS 10 features stacked panels that open on top of the current view, keeping a tiny part of it visible in the background. There’s a nice animation for the view that recedes and shrinks from the status bar.
\nStacked views.
Stacked views are an intriguing way to show nested navigation. I wonder if more full-screen views that use a back button in the top left could be redesigned with this layout, perhaps using a vertical swipe to dismiss the foreground panel.
\nThere are plenty of card-like interfaces being used in iOS 10 to supplant full-screen views, popups, and other kinds of panels.
\nFrom Maps’ search suggestions that slide up from the bottom of the map to Control Center’s pages, the Apple TV (and AirPods) setup card, and, in a way, expanded notifications, it feels like Apple has realized it’s time to take advantage of bigger iPhone displays to drop modal popups and full-screen views.
\nCards in iOS 10.
Cards enhance usability with definite boundaries and a concise presentation of content. I like where this is going.
\nContextual animations and transitions have always been part of iOS’ visual vocabulary. Several improvements have been made on this front in iOS 10, including APIs that allow for interactive and interruptible animations.
\nIf developers support the object-based animation framework added to UIKit with iOS 10, they’ll be able to have deeper control over interrupting animations and linking them with gesture-based responses. These improved animations (based on UIViewPropertyAnimator) can be paused and stopped, scrubbed (moved forward and back), and reversed at any point in their lifecycle. In short, it means apps no longer have to finish an entire animation if they want to react to other changes.
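\nAs a rough sketch of what this looks like in practice – the view being animated is hypothetical – an animation built with `UIViewPropertyAnimator` can be started, then paused and scrubbed mid-flight when, say, a pan gesture begins:

```swift
import UIKit

// A minimal sketch of an interruptible animation with UIViewPropertyAnimator,
// the object-based animation API added to UIKit in iOS 10.
// `cardView` is a hypothetical view being animated off screen.
let cardView = UIView(frame: CGRect(x: 0, y: 0, width: 300, height: 200))

let animator = UIViewPropertyAnimator(duration: 0.5, curve: .easeInOut) {
    cardView.transform = CGAffineTransform(translationX: 0, y: 300)
}
animator.startAnimation()

// Later — for instance, when a pan gesture begins — the animation
// can be paused, scrubbed to an arbitrary point, reversed, and resumed:
animator.pauseAnimation()
animator.fractionComplete = 0.4   // scrub forward or back
animator.isReversed = true        // send the view back where it started
animator.continueAnimation(withTimingParameters: nil, durationFactor: 1)
```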
\nApps can feel more responsive – and faster – with interruptible animations. It’s not a major change per se, but it’s a welcome response to iOS 7’s indulgent animation curve.
\nNew emoji from the Unicode 9.0 spec aren’t available in iOS 10.0 (they’ll likely be added in a point release in the near future), but Apple still found a way to ship notable emoji updates that will entice users to upgrade.
\nSeveral emoji have been redesigned with less gloss, more details, and new shading. This is most apparent in the Faces category where characters have a more accentuated 3D look. They remind me of emoticons from the original MSN Messenger, redrawn for the modern age.
\n\nApple has implemented the ZWJ (zero-width joiner) technique to add more gender-diverse emoji. Technically, these are combinations of multiple existing characters (codepoints) joined in a ZWJ sequence. To users, they’ll look like new emoji added to the system keyboard, and Apple didn’t miss the opportunity to announce them with a press release.
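\nTo illustrate how a ZWJ sequence works, here’s a Swift sketch that assembles a gender-modified emoji out of existing codepoints – no new character needs to exist in the font for it to render as one glyph:

```swift
// Building "woman running" from existing codepoints with a zero-width joiner.
let runner = "\u{1F3C3}"          // 🏃 person running
let zwj = "\u{200D}"              // zero-width joiner (invisible glue)
let female = "\u{2640}\u{FE0F}"   // ♀ female sign + emoji presentation selector

let womanRunning = runner + zwj + female  // renders as 🏃‍♀️, a single glyph

// Four Unicode scalars, but one user-perceived character:
print(womanRunning.unicodeScalars.count)  // 4
print(womanRunning.count)                 // 1
```

On systems that don’t understand the sequence, the same string degrades gracefully into its component emoji, which is what makes ZWJ an attractive way to expand the emoji set.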
\nAlas, Apple’s emoji keyboard still doesn’t have a search field. If emoji suggestions fail to bring up the emoji you’re looking for, you’ll still want to keep Gboard installed as a fast way to search for emoji.
\nIn addition to visual tweaks, Apple did some work in the aural department as well.
\nThe keyboard in iOS 10 has distinctive new “pop” sounds for different kinds of keys, including letters, the delete key, and the space bar.73 Some people will find these sounds distracting and cartoon-ish; I think they add an extra dimension to typing on the software keyboard. Because the keyboard has multiple layers of “popping bubbles”, you can now hear what you type besides seeing it. I’m a fan.
\nIt is the lock sound, though, that takes the crown for the most surprising sound effect of iOS 10. I still can’t decide what it is, but I like it.
\nSound design is often underrated. An intelligent use of sound effects can augment the visual experience with context, personality, and just the right amount of whimsy. Whoever is behind the sound-related changes in iOS 10, I want to hear more from them.
\nApple is continuing to iterate on the design language they introduced two years ago, but they’re doing so inconsistently across the system, experimenting with new ideas without fully committing to them. There are multiple design languages coexisting in iOS 10. At times, it’s hard to reconcile them.
\nThe most notable changes mentioned above – the bold look of Apple Music and the revised look of buttons – aren’t new guidelines for a system-wide refresh. They’re isolated test-drives scattered throughout the system without a common thread.
\nMusic, News, and Home have little in common from a functional standpoint, and yet they share the same aesthetic. Does Apple consider these apps the baseline of iOS interfaces going forward? Or should we prepare for an increasingly diversified constellation of Apple apps, each built around a design specifically tailored for it? What types of apps should adopt the “big and bold” style? Should developers read the tea leaves in Apple’s app redesigns this year and prepare for updated guidelines twelve months from now?
\nTaken at face value, what we have in iOS 10 is a collection of design refinements. We also have a clique of apps that look different from the rest of Apple’s portfolio, which may portend future change.
\nUltimately, we’re left asking: where do we go from here?
\n\niOS’ Proactive assistant, introduced last year as a set of suggested shortcuts for apps based on user habits and context, is expanding to locations and contacts in iOS 10, and gaining a foothold in the system keyboard.
\nIf you’re in an iMessage conversation and someone asks you for a contact’s phone number or email address, iOS will automatically put that suggestion in the QuickType keyboard for one-tap insertion. It doesn’t have to be a reply to an existing message: if you compose a new email and type “[Name]’s phone number is”, QuickType will also proactively suggest the phone number from your address book.
\nEven more impressively, if someone asks “Where are you?” on iMessage, QuickType will show a button to send your current location. Tap it, and a Maps bubble will be sent; the other person can tap it to open a full-screen preview and get directions.74
\nSharing your current location from QuickType in iMessage.
NSUserActivity plays a role in proactive suggestions, too. Apps can push out activities for places and have them appear as suggestions in other apps.
\nA Yelp suggestion in Maps.
A restaurant listing from Yelp, for example, can be suggested in Maps’ search view automatically; an app that displays hotel reviews can mark the location the user is viewing, and if the user switches to a travel planning app, that address can be proactively suggested without the need to search for it again.
\n\nRecently viewed places can even be suggested as shortcuts when switching between apps to open directions in Maps.
\nMaps shortcuts for places viewed in third-party apps.
The system is an ingenious spin on NSUserActivity – a framework that developers were asked to start supporting last year for Spotlight search and Siri Reminders. By leveraging existing APIs and work developers have already put into their apps, iOS 10 can be smarter and use location-based activities as dynamic “bookmarks” in the system keyboard.
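\nA sketch of how an app might participate: iOS 10 adds a `mapItem` property to NSUserActivity, so an app can attach location metadata to the place the user is viewing. The activity type, coordinates, and restaurant name below are illustrative, not from a real app:

```swift
import MapKit

// Publishing a viewed place so the system can proactively suggest it
// elsewhere (Maps' search view, QuickType). All identifiers are hypothetical.
let placemark = MKPlacemark(coordinate:
    CLLocationCoordinate2D(latitude: 41.9028, longitude: 12.4964))
let restaurant = MKMapItem(placemark: placemark)
restaurant.name = "Example Restaurant"

let activity = NSUserActivity(activityType: "com.example.app.viewRestaurant")
activity.title = restaurant.name
activity.mapItem = restaurant       // new in iOS 10: attach location metadata
activity.isEligibleForSearch = true
activity.becomeCurrent()            // mark as the user's current activity
```

When the listing is closed, resigning the activity removes the suggestion – which matches the behavior described below for Safari webpages and Yelp listings.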
\nWhen these suggestions work, they’re impressive and delightfully handy. In my tests, I received suggestions for addresses listed on webpages in Safari (and properly marked up with schema.org tags) and Yelp inside Maps; iOS 10 suggested addresses for stores and restaurants when I was switching between Yelp, Safari, Maps, and Messages, and it removed suggestions after I closed the webpages in Safari or the listings in Yelp.
\nI’ve found other QuickType suggestions to be more inconsistent. When talking in English on iMessage, QuickType was pre-filled with suggestions for trigger sentences such as “Let’s meet at” or “We’re going to” because I was viewing a location in Maps or Yelp. I couldn’t get the same suggestions for different phrases like “Let’s have dinner at” or “See you in 10 minutes at”.
\nI couldn’t get proactive QuickType suggestions to work in Italian at all. This is an area where Apple’s deep learning tech should understand how users share addresses and contact information with each other. I’d expect Proactive to gain more predictive capabilities down the road, such as Calendar or Apple Music integration.
\nThere are more instances of Proactive suggestions in iOS 10 that are subtle, but useful. When searching in Spotlight, QuickType will offer suggestions for locations and other content as soon as you start typing. Previous searches are listed at the bottom of Siri suggestions (and I haven’t found a way to disable them, which could be problematic).
\nProactive shortcuts and previous searches in Spotlight.
If you’re already looking at a location in Maps or apps that markup addresses correctly, you can invoke Siri and say “get me there” to open directions to the address you’re viewing. ETA uses this feature to start directions to a place you’re viewing in the app.
\nOpening directions from ETA with Siri.
It’s no Google Now on Tap, but it’s easy to see how Apple could soon replicate some of that functionality through various types of NSUserActivity.75
\nApple is moving towards making Proactive more than a standalone page of shortcuts. Rather, Proactive is becoming an underlying feature of iOS, connecting an invisible web of activities when and where they make the most sense.
\n\nWhile last year’s software keyboard improvements focused on iPad productivity, iOS 10 brings pleasant enhancements that will benefit every iOS user.
\nThe most unexpected change in iOS 10 will be as important as copy & paste for millions of international users. iOS 10 adds support for multilingual typing without switching between keyboards.
\nThe annoyance of alternating keyboards isn’t an issue everyone can relate to. Most single-language speakers only deal with emoji as a separate “keyboard” that requires switching from the QWERTY layout. Those users probably don’t even see emoji as an additional keyboard but just as a special mode of the main system one. Millions of people have never seen iOS’ old keyboard system as a problem.
\nHow most English speakers deal with the system keyboard.
For speakers of multiple languages, the experience couldn’t be more different. As soon as a third keyboard is added to iOS, the emoji face turns into a globe button to switch between keyboards. Tapping it repeatedly cycles between all keyboards; alternatively, holding the globe button brings up a list of them.
\nHow international users switch between keyboards.
Anyone who uses an iOS device to hold conversations in multiple languages is subject to a slower experience. When you’re constantly jumping between iMessage conversations, Twitter replies, Facebook, email, Slack, and Notes, and when you’re staying in touch with friends in multiple languages, and after you’ve been doing it every day for years, those seconds spent cycling through keyboards add up. Millions of people see this as one of the biggest flaws of iOS.
\nIn iOS 10, Apple is taking the first steps to build a better solution: you can now type in multiple languages from one keyboard without having to switch between international layouts. You don’t even have to keep multiple keyboards installed: to type in English and French, leaving the English one enabled will suffice. Multilingual typing appears to be limited to selected keyboards, but it works as advertised, and it’s fantastic.
\nThe idea is simple enough: iOS 10 understands the language you’re typing in and adjusts auto-correct and QuickType predictions on the fly from the same keyboard. Multilingual typing supports two languages at once and doesn’t work with dictation, but it can suggest emoji in the QuickType bar for both languages.
\nSwitching between English and Italian from the English keyboard.
I started testing multilingual typing on my iPhone and iPad Pro on the first beta of iOS 10. The best part is that there’s very little to explain: suggestions retain the predictive nature of QuickType based on the context of the app or conversation, and you can even switch between languages within the same sentence. There’s no training or configuration involved: it’s as if two keyboards were rolled into one and gained the dynamic context-switching of a multilingual person.
\nKnowing which languages can work with multilingual typing is a different discussion. Apple hasn’t updated their iOS feature availability page with details on multilingual typing yet. My understanding is that only keyboards with support for QuickType predictive suggestions and with a traditional QWERTY layout support multilingual typing. You should be able to mix and match Italian and English, or Dutch and French, or German and Spanish, for instance, but not Chinese and English within the same keyboard due to differences in the alphabet and characters.
\nI’ve been having conversations with my family in Italian while talking to colleagues and readers in English. I’m impressed with iOS 10’s ability to detect languages on a word-by-word basis. I assumed the system could be confused easily, particularly with typos or words that are similar between two languages, but that only happened a couple of times over three months. Switching mid-sentence between Italian and English (as I often do when talking about work stuff with my girlfriend, for example) is fast and accurate.
\nMost new iOS features take some time to get used to; multilingual typing isn’t one of them. After years spent fighting the keyboard switcher and auto-correct with multiple languages, multilingual typing is a huge relief. It’s an elegant solution to a difficult problem, and it makes conversations flow naturally. I’m happy to see Apple catering to users who speak multiple languages with a feature that others will never (understandably) care about.
\nMultilingual typing has already become an essential feature of my iOS experience. I love it.
\nApple’s improvements to typing and QuickType don’t stop at text and Proactive – they include emoji as well.
\n\nIf you’ve typed a word or expression that iOS 10 associates with an emoji, such as “pizza” or “not sure”, a suggested emoji will appear in QuickType. You can either put the emoji next to the word you’ve typed (by putting a space after the word and then tapping the emoji) or replace the word with the emoji itself (don’t add a space and tap the emoji). If emoji suggestions don’t immediately appear in an app, try inserting at least 5 emoji from the emoji keyboard first.76
\nEmoji suggestions
\nIn my tests, emoji suggestions have been good, often impressive. I’ve received emoji suggestions in both English and Italian, for a variety of common expressions (like “yum”, “love you”, or “I’m fine”) and with up to three suggestions for a single word (such as “lol”). Popular emoji like the thumbs up/down, clapping hands, and high five can be suggested if you know the trigger word/expression. From this point of view, emoji suggestions are visual text replacements – for instance, I now type “great” and replace the word whenever I want to insert a thumbs up in a message.
\nHowever, because the predictive engine is young and there are so many different ways to describe an emoji, the dictionary is still growing. Italian doesn’t support as many suggestions as English (“think”, for instance, brings up the Thought Balloon emoji in English; the Italian equivalent, “penso”, doesn’t – but the infinitive form, “pensare”, does); some expressions don’t show an obvious emoji suggestion (try with “blue heart” or “friends”).77
\nAccording to Apple, their Differential Privacy technology will be used to understand how iOS users type emoji. Hopefully, such a system can learn and improve its emoji definitions over time as it looks at how people in aggregate use emoji in the real world. If it works, it’s going to make one of the best tweaks to iOS even better.
\nDespite the creativity shown by developers, third-party keyboards haven’t received much love from Apple since their debut in iOS 8. Even without meaningful improvements to the API, two small adjustments in iOS 10 make using custom keyboards slightly better than iOS 9.
\nThe first change sounds like an Apple engineer remembered a bug and found the time to fix it. In iOS 10, custom keyboards transition on screen with the same bottom-up slide of Apple’s keyboard. Thanks to this, opening a custom keyboard isn’t as jarring as before.
\niOS 10’s new slide transition for custom keyboards
\nFurthermore, iOS 10 lets third-party keyboards display the system keyboard switcher (the globe key menu) with the same options you get in the Apple keyboard.78
\nGboard (left) with a custom keyboard switcher; TextExpander updated for iOS 10 (right) with the new system one.
I still don’t think Apple is particularly invested in the idea of custom keyboards (the lack of any new features is telling), but at least they’ve done the bare minimum to ensure that a third-party keyboard can be used as a primary one without too much struggle. Apple must have recognized the value of some custom keyboards for accessibility purposes, languages iOS doesn’t support, and sharing features for messaging apps that aren’t iMessage.
\nThe likes of Google and Microsoft benefitting from these improvements is the kind of trade-off Apple will have to consider as they keep opening up iOS for everyone.
\n\niPad users who were craving the same attention as last year will be disappointed by iOS 10’s scarcity of iPad-only features. There are some iPad changes, but none of them have the impact of Split View or Picture in Picture.
\nAs mentioned before, there are new three-panel modes for Mail and Notes, a Now Playing sidebar in Apple Music, and in-app split view for Safari. There’s also a different look for alarms in the Clock app. Everything else is a basic adaptation of iPhone layouts or a refinement of the same views in iOS 9.
\nApple brought a few tweaks to the Camera viewfinder in iOS 10. On the iPhone, the camera flip button has been moved to the bottom, which makes it easier to switch between the front and rear cameras as you don’t have to reach to the top of the screen. On the iPad, most of the interface has been redrawn with circular buttons on the right and a persistent zoom slider on the left.
\nThe bigger iPad-only interface changes in iOS 10 can be collected in a single gallery:
\n\nMoving on to other features, Spotlight search invoked from an external keyboard with Command-Space will now open on top of the app(s) you’re currently using without exiting back to the Home screen. When in Split View, this can be used as a quicker app switcher for the primary app on the left side.
\n\nIt’s nice to use a Spotlight that behaves more like the Mac. Unfortunately, apps (including Apple’s Messages, Mail, and Notes) don’t restore cursor position after dismissing Spotlight. If you’re typing a message, open Spotlight, and then close it, you’ll have to tap the screen to focus the cursor on the last active app and continue typing.
\nThere are more external keyboard enhancements that are steps in the right direction. A Home screen icon has been added to the Command-Tab app switcher, so you can return to the Home screen without having to use Command-H. And, Command-tilde (~) can now move backwards in the app switcher, like on macOS.79 Last, you can take a screenshot with Command-Shift-3, which will be saved in the Photos app.
\nA Home screen shortcut has been added to the Command-Tab app switcher.
I’d be remiss if I didn’t mention Playgrounds. Apple hasn’t brought a full Xcode suite to the iPad, but the more streamlined Playgrounds environment feels like a better solution to introduce a new generation of iOS users to programming. Playgrounds isn’t a built-in app (it’s available from the App Store), and it’s got some surprising innovations in terms of code interactions and in-app multitasking. It’s also more powerful than you imagine if you know your way around Swift and native iOS frameworks. We’ll have a separate story on Playgrounds later this week.
\nThe lack of deeper iPad improvements in iOS 10 amplifies problems Apple still hasn’t fixed.
\nOn the 12.9-inch iPad Pro, the Home screen is still a wasteland of icons that don’t take advantage of the space offered to them. This year, the contrast is especially harsh given how iPhones with 3D Touch have received Home screen widgets in addition to quick actions.
\nManaging multiple files at once across different apps is still a task that will test the endurance of the most patient users. The Open In menu, untouched in iOS 10, continues to be limited to moving one file at a time from one app to another. The new ‘Add to iCloud Drive’ extension doesn’t help when even a basic task such as saving multiple attachments from an email message isn’t supported.
\nMore importantly, it’s obvious that Split View could be so much more. Having the clipboard and extensions as the sole data sharing mechanisms between two apps feels too limited when iOS is clearly suited for a system drag & drop framework. And that’s not to mention the Slide Over app picker – unchanged from last year and in desperate need of a redesign.
\nApple says that “there are great iPad features” in iOS 10, but that’s not accurate. There are great iOS features in this update, and, sure, they also work on the iPad, but the iPad-only changes are minor and sparse – with the sole exception of Safari. iOS 10 doesn’t share the same commitment to the iPad as iOS 9, when Apple was willing to reinvent the device’s most fundamental aspects. In many ways, this feels like a regression to the days of iOS being barely “optimized” for the iPad.
\niOS 10 is by no means “bad” on the iPad; it’s just not particularly exciting or what the platform deserves right now. If Apple is planning their own tick-tock schedule for iOS releases going forward, the iPad’s tock had better be a good one.
\n\nFollowing last year’s focus on iPad, built-in apps, and performance, iOS 10 marks Apple’s return to opening up the platform to developers with extensions. Beyond Messages, Maps, and Siri, iOS 10 has a few more significant extensibility tricks up its sleeve.
\nAfter its debut in Mail with iOS 9, Apple’s Preview-like annotation tool has graduated to a system extension for images and documents.
\nUsing Markup in Photos.
The tools available in Markup haven’t changed. You can draw colored lines of varying thickness80, add magnification loupes, and place text annotations. Notably, Markup can be used in Photos as an editing extension; it doesn’t offer the advanced tools of Annotable, but it should be enough for most users.
\nFollowing iOS 9’s inconsistent use of an iCloud Drive extension (which was only available for attachments in Mail), iOS 10 makes “Add to iCloud Drive” a system-wide option that can be used anywhere, for any file.
\nAdd to iCloud Drive is an action extension that copies a file passed to it into iCloud Drive. It works for individual files shared from apps as well as media from Photos.
\nUnfortunately, the extension is hindered by questionable design decisions. When saving a file, the dialog box shows every folder and sub-folder in your iCloud Drive without a way to collapse them. There’s no quick way to open a specific destination: you’ll have to scroll a seemingly endless list of folders every time you want to save a file. There are no recent locations, no bookmarks, no search. No person who deals with documents on iOS would ever want to save them with an interface like this.
\nI appreciate Apple making iCloud Drive a system extension, but its design is amateur hour. It makes me wonder if anyone at Apple has ever used iCloud Drive with more than a handful of folders. It’s such an obvious misstep, it almost looks like a joke.
\nApple is granting third-party developers access to another part of the OS through extensions: telephony.
\nFor years, VoIP apps for audio and video calling have been relegated to a second-class experience. Apple created an API six years ago to bless VoIP apps with background execution privileges, but without a framework to integrate calls with the rest of the system, apps still needed to maintain their own contact lists and use standard push notifications for incoming calls.
\niOS’ old VoIP calling experience.
It was too easy to miss a call from apps like Skype or WhatsApp; accepting a call from a third-party app was also slow and confusing (why would you pick up a call from a banner alert?). Plus, developers couldn’t get access to functionalities such as blocked contacts, which remained exclusive to Apple’s Phone app.
\nAll this is changing with CallKit, a framework that elevates third-party VoIP apps to a front-seat spot on iOS, allowing them to plug into advanced controls that have complemented Apple’s Phone and FaceTime services for years.
\nThe CallKit framework permits an incoming call from a third-party VoIP app to take over everything else (including the Lock screen) with a full-screen view, just like Apple’s Phone and FaceTime apps. In a prime example of dogfooding, Apple itself has adopted CallKit in all of their telephony services.
\nCallKit’s interface and behavior are consistent with Phone and FaceTime calls on iOS, with some differences. The calling UI is the same as Apple’s, with a label that describes which app the call is happening with, and the icon of the app replacing the dialer button. Tapping the icon takes users directly to the app for additional features. Developers can customize the in-call UI with a camera icon that indicates whether an app supports video calling or not.
\nLike Phone and FaceTime, CallKit boosts the priority of third-party VoIP apps. Other apps can’t interrupt a call during a CallKit session; routing for Accessibility features, CarPlay, and Bluetooth connections is handled by the system automatically without developers having to optimize for them.
\nA demo CallKit app on iOS 10.
CallKit’s integration with iOS’ calling infrastructure goes beyond a shared UI. VoIP apps built with CallKit get access to the same block list and Do Not Disturb settings used by Apple’s apps, they can support switching between multiple calls, and they can even appear in Contacts via the Recents and Favorites views.
\nApple doesn’t seem to be religious about pushing users to FaceTime anymore. If iOS 10 sees that the same contact is also registered with other VoIP services, buttons to initiate calls through third-party apps will be embedded in the contact card.81 Users only need to give an app permission to be used as a Service Provider, and it’ll be promoted to a first-class calling experience by iOS 10.82
\nApple’s embrace of third-party services with CallKit isn’t an admission of defeat. Rather, it’s a recognition of the fact that millions of people use iPhones to communicate with their friends and families through apps that aren’t FaceTime – that the App Store has reinvented communications beyond FaceTime and iMessage.
\nAs platform owners, Apple understands that they have to help customers who are seeking alternative calling services. With CallKit, they’ve created a secure and consistent framework that takes advantage of every feature that makes an iPhone the ultimate communication device.
\nUsing VoIP apps through CallKit feels and works like any other normal phone call. It’s refreshing to see this happen, and it’s a testament to the power of Apple’s extensibility APIs. I’m looking forward to seeing WhatsApp, Skype, and others update their apps for CallKit.
\nCall Directory is a surprising inclusion in the CallKit framework. With this extension type, apps can label phone numbers for incoming calls on the Lock screen.
\nApple described the use case for call directory extensions at WWDC: spam calls. According to the company, robo-callers and spam calls are particularly problematic in China (though I can vouch for their annoyance in Italy, too), and they’ve set out to improve upon this problem by letting developers maintain a database of phone numbers known to be spam.
\nCraig will tolerate no spam.
In Apple’s examples, a company like Tencent could build a call directory extension. When a call from a spam number comes in, the extension could add a label that identifies it as potential spam so the user can decide to reject the call without answering it.
\nCall Directory is another instance of Apple letting developers take over key bits of iOS in areas where the company doesn’t want to be involved.
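As a hedged sketch of what such an extension might look like, the handler below labels a couple of invented numbers as spam; real extensions ship a much larger database and run in an app extension target, not the main app.

```swift
import CallKit

// Sketch of a Call Directory extension that labels known spam numbers.
// The numbers and label here are fabricated for illustration.
final class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Entries must be added in ascending numeric order.
        let spamNumbers: [CXCallDirectoryPhoneNumber] = [18005550100, 18005550199]
        for number in spamNumbers {
            context.addIdentificationEntry(withNextSequentialPhoneNumber: number,
                                           label: "Possible Spam")
        }
        context.completeRequest()
    }
}
```

iOS loads the extension's database out of process, so the labels appear on the incoming call screen without the app ever seeing who is calling the user.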
\n\nWith the breadth and depth of iOS, it’s impossible to list every single change or new feature. Whether it’s a setting, a briefly documented API, or a subtle visual update, there are plenty of details and tidbits in iOS 10.
\nDifferential Privacy
\nDifferential privacy is a branch of cryptography and mathematics, so I want to leave a proper discussion of Apple’s application of it to folks who are better equipped to talk about it (Apple is supposed to publish a paper on the subject in the near future). See this great explanation by Matthew Green and ‘The Algorithmic Foundations of Differential Privacy’ (PDF link), published by Cynthia Dwork and Aaron Roth.
\nHere’s my attempt to offer a layman’s interpretation of differential privacy: it’s a way to collect user data at scale without personally identifying any individual. Differential privacy, used in conjunction with machine learning, can help software spot patterns and trends while also ensuring privacy with a system that goes beyond anonymization of users. It can’t be mathematically reversed. iOS 10 uses differential privacy in specific ways; ideally, the goal is to apply this technique to more data-based features to make iOS smarter.
\nFrom Apple’s explanation of differential privacy:
\n\n\n Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.\n
If Apple’s approach works, iOS will be able to offer more intelligent suggestions at scale without storing identifiable information for individual users. Differential privacy has the potential to give Apple a unique edge on services and data collection. Let’s wait and see how it’ll play out.
\nSpeech recognition
\niOS has offered transcription of spoken commands with a dictation button in the keyboard since the iPhone 4S and iOS 5. According to Apple, a third of all dictation requests comes from apps, with over 65,000 apps using dictation services per day for the 50 languages and dialects iOS supports.
\niOS 10 introduces a new API for continuous speech recognition that enables developers to build apps that can recognize human speech and transcribe it to text. The speech recognition API has been designed for those times when apps don’t want to present a keyboard to start dictation, giving developers more control.
\nSpeech recognition uses the same underlying technology as Siri and dictation. Unlike dictation in the keyboard, though, speech recognition also works for recorded audio files stored locally in addition to live audio. After feeding audio to the API, developers are given rich transcriptions that include alternative interpretations, confidence levels, and timing details. None of this is exposed to the microphone button in the keyboard, and it can be implemented natively in an app’s UI.
\nSame API, different interfaces.
There are some limitations to keep in mind. Speech recognition is free, but not unlimited. There’s a limit of 1 minute for audio recordings (roughly the same of dictation) with per-device and per-day recognition limits that may result in throttling. Also, speech recognition usually requires an Internet connection. On newer devices (including the iPhone 6s), speech recognition is supported offline, too. User permission will always be required to enable speech recognition and allow apps to transcribe audio. Apple itself is likely using the API in their new voicemail transcription feature available in the Phone app.
\nVoicemail transcription in iOS 10, possibly using speech recognition as well.
I was able to test speech recognition with new versions of Drafts and Just Press Record for iOS 10. In Drafts, my iPhone 6s supported offline speech recognition and transcription was nearly instantaneous – words appeared on screen a fraction of a second after I spoke them. Greg Pierce has built a custom UI for audio transcription inside the app; other developers will be able to design their own and implement the API as they see fit. In Just Press Record, transcripts aren’t displayed in real-time as you speak – they’re generated after an audio file has been saved, and they are embedded in the audio player UI.
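Under the hood, transcribing a saved recording with the new API looks roughly like this sketch; the locale and file path are placeholders, and a shipping app would also surface the alternative interpretations instead of just printing the best one.

```swift
import Speech

// Sketch of transcribing a locally stored audio file with the iOS 10
// Speech framework. User authorization is always required first.
SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else { return }

    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    let request = SFSpeechURLRecognitionRequest(
        url: URL(fileURLWithPath: "/path/to/recording.m4a"))  // placeholder

    recognizer?.recognitionTask(with: request) { result, error in
        guard let result = result else { return }
        if result.isFinal {
            // bestTranscription is the top interpretation;
            // result.transcriptions holds the alternatives.
            print(result.bestTranscription.formattedString)
        }
    }
}
```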
\nI’m looking forward to podcast clients that will let me share an automatically generated quote from an episode I’m listening to.
\nDo Not Disturb gets smarter
\nDo Not Disturb has a setting to always allow phone calls from everyone while every other notification is being muted.
\nAn Emergency Bypass toggle has been added to a contact’s editing screen for Ringtone and Text Tone. When enabled, it’ll allow sounds and vibrations from that person even when Do Not Disturb is on. If you enable Emergency Bypass, it’ll be listed as a blue button in the contact card to quickly edit it again.
\nTap and hold links for share sheet
\nApple is taking a page from Airmail (as I hoped) to let you tap & hold a link and share it with extensions – a much-needed time saver.
\nParked car
\nI couldn’t test this because I don’t have a car with a Bluetooth system (yet), but iOS 10 adds a proactive Maps feature that saves the location of your car as soon as it’s parked. iOS sends you a notification after you disconnect from your car’s Bluetooth, dropping a special pin in Maps to remind you where you parked. The feature also works with CarPlay systems.
\nSpotlight search continuation
\nSearches for app content that began in Spotlight can now continue inside an app with the tap of a button.
\nDrafts uses the new Spotlight search continuation API to let users continue looking for content on the app’s own search page. Maps has also implemented search continuation to load places in the app.
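In code, handling a continued search is a small addition to an app's existing activity handling; this is a sketch, and `showSearchResults` is a hypothetical helper standing in for the app's own search screen.

```swift
import CoreSpotlight
import UIKit

// Sketch of search continuation handling in the app delegate. It requires
// the CoreSpotlightContinuation key to be set in the app's Info.plist.
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([Any]?) -> Void) -> Bool {
    if userActivity.activityType == CSQueryContinuationActionType,
       let query = userActivity.userInfo?[CSSearchQueryString] as? String {
        // Re-run the Spotlight query inside the app's own search UI.
        showSearchResults(for: query)  // hypothetical helper
        return true
    }
    return false
}
```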
\nBetter clipboard detection
\niOS 10 brings a more efficient way for apps to query the system pasteboard. Instead of reading clipboard data, developers can now check whether specific data types are stored in the pasteboard without actually reading them.
\nFor example, a text editor can ask iOS 10 if the clipboard contains text before offering to import a text clipping; if it doesn’t, the app can stop the task before reading the pasteboard altogether. This API should help make clipboard data detection more accurate for a lot of apps, and it’s more respectful of a user’s privacy.
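A minimal sketch of the new pattern:

```swift
import UIKit

// iOS 10 lets apps ask *whether* the pasteboard holds a data type
// before actually reading its contents.
let pasteboard = UIPasteboard.general
if pasteboard.hasStrings {
    // Only now is the clipboard content actually read.
    let text = pasteboard.string
    print("Clipboard text: \(text ?? "")")
} else if pasteboard.hasURLs {
    print("Clipboard holds a URL, not plain text")
}
```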
\nPrint to PDF anywhere
\nA hidden feature of iOS 9 was the ability to 3D Touch on the print preview screen to pop into the PDF version of a document and export it. iOS 10 makes this available to every device (with and without 3D Touch) by pinching on the print preview to open Quick Look.
\nVideos cellular playback quality settings
\nIf you use Apple’s Videos app to stream movies and TV shows, you can now choose from Good and Best Available settings. I wish this also affected playback quality of YouTube embeds in Safari.
\nHLS and fragmented MP4 files
\nApple’s HTTP Live Streaming framework (HLS) has added support for fragmented MP4 files. In practical terms, this means more flexibility for developers of video player apps that want to stream movie files encoded in MPEG-4.
\nI tested a version of ProTube – the most powerful third-party YouTube client – with HLS optimizations for iOS 10. The upcoming update to ProTube will introduce streaming of videos up to 4K resolution (including 1440p) and 60fps playback thanks to changes in the HLS API.
\nIf your favorite video apps use HLS and deal with MP4 files, expect to see some nice changes in iOS 10.
\nTouch ID for Apple ID settings
\nSettings > iTunes & App Store > View Apple ID no longer requires you to type a password. You can view and manage your account with Touch ID authentication. This one deserves a finally.
\nNo more App Store password prompts after rebooting
\nIn a similar vein, the App Store will no longer ask you for a password to download a new app after rebooting your device. You can just use Touch ID instead.
\nContinuity Keyboard for Apple TV
\nIf your iPhone is paired with an Apple TV, you’ll get a notification whenever the Apple TV brings up a text field (such as search on the tvOS App Store).
\nYou can press (or swipe down) the notification on iOS to start typing in the quick reply box and send text directly to tvOS. A clever and effective way to reduce tvOS keyboard-induced stress.
\nApp Store categories, iPad, and search ads
\nThe App Store’s Explore section, launched with iOS 8 and mostly untouched since, has been discontinued in iOS 10. Categories are back in the tab bar, with the most popular ones (you can count on Games always being there) available as shortcuts at the top.
\nApple had to sacrifice the Nearby view to discover apps popular around you, but categories (with curated sections for each one of them) seem like the most sensible choice after years of experiments.
\nOn the iPad, the App Store now supports Split View so you can browse and search apps while working in another app.
\nThis has saved me a few minutes every week when preparing the App Debuts section for MacStories Weekly.
\nApple is also launching paid search ads on the App Store. Developers will be able to bid for certain keywords and buy paid placements in search results. Ads are highlighted with a subtle blue background and an ‘Ad’ label, and they’re listed before the first actual search result – like on Google search.
\n\nIt’s too early to tell how beneficial App Store ads will be for smaller studios and indie developers that can’t afford to be big spenders in search ad bids. Apple argues that the system is aimed at helping app discovery for large companies and small development shops alike, but I have some reservations.
\nAs a user, I would have liked to see Apple focus on more practical improvements to App Store search, but maybe the company is right and all kinds of developers will benefit from search ads. We’ll follow up on this.
\nNew ‘Add to Favorites’ UI
\nSimilar to the 3D Touch menu for a contact card, the view for adding a contact to your favorites has been redesigned with icons and expandable menus.
\nMore responsive collection views
\nExpect to see nice performance improvements in apps that use UICollectionView. iOS 10 introduces a new cell lifecycle that pre-fetches cells before displaying them to the user, holding onto them a little longer (pre-fetching is opt-out and automatically disabled when the user scrolls very fast). In daily usage, you should notice that some apps feel more responsive and don’t drop frames while scrolling anymore.
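For developers, the data-source side of pre-fetching is a small protocol adoption; in this sketch, `ImageLoader` is a hypothetical helper (not a UIKit type) standing in for whatever expensive work a cell needs.

```swift
import UIKit

// Sketch of the iOS 10 prefetching API: the data source is told which
// items are likely to appear soon, so expensive work can start early.
final class FeedPrefetcher: NSObject, UICollectionViewDataSourcePrefetching {
    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        // Kick off image downloads before the cells scroll on screen.
        indexPaths.forEach { ImageLoader.shared.startLoading(at: $0) }
    }

    func collectionView(_ collectionView: UICollectionView,
                        cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        // Called when the scroll direction changes, to avoid wasted work.
        indexPaths.forEach { ImageLoader.shared.cancelLoading(at: $0) }
    }
}
// Enabled by assigning: collectionView.prefetchDataSource = prefetcher
```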
\nSecurity recommendations for Wi-Fi networks, connection status
\nIf you connect to a public Wi-Fi network (such as a restaurant hotspot), iOS 10 will show you recommendations to stay secure and keep your wireless traffic safe. There’s also better detection of poor connectivity with an orange “No Internet Connection” message in the Wi-Fi settings.
\nAccessibility: Magnifier and Color Filters
\nThere are dozens of Accessibility features added to iOS every year. I want to highlight three of them.
\nA new Magnifier app allows you to use the iPhone’s camera to magnify what’s around you and zoom into objects or text. The Magnifier isn’t another Apple app on the Home screen: if enabled in the Settings, a triple-click on the Home button will launch Magnifier as a custom app (it even shows up in the multitasking switcher) with options to control zoom level, color filters, color inversion, and turn on the camera flash. You can opt to adjust brightness and contrast automatically based on ambient light.
\niOS 10’s new Magnifier app.
While in Magnifier, you can move the camera around and apply filters in real-time. If you don’t want to hold up your iPhone for more than a few seconds, you can capture a still frame to zoom into the image and adjust colors.
\niOS 10’s Magnifier is technically impressive and it’s going to help millions of people with vision impairments. I’d suggest everyone keep it enabled as a quick way to use the iPhone’s camera as a magnifier – it’s incredibly well done and convenient.
\nUnder Display accommodations, a Color Filters menu can help users with color blindness or who have difficulty reading text on the display. Apple has included filters for grayscale, protanopia, deuteranopia, tritanopia, and color tint. It’s also a good reminder for developers that not all users see an app’s interface the same way.
\nFinally, you can now define custom pronunciations to be used when iOS reads text aloud. Available in Settings > Accessibility > Speech > Pronunciations, you’ll be able to type a phrase and dictate or spell how you want it to be pronounced by the system voice.
\nDictating a pronunciation is remarkable as iOS automatically inserts it with the phonetic alphabet after recognizing your voice. You can then choose to apply a custom pronunciation to selected languages, ignore case, and pick which apps need to support it.
\n\niOS 10 is characterized by an intrinsic duality: an acknowledgement of the platform’s maturity; and a relentless, yet disciplined pursuit of what’s next. Both depend on each other, and they’re the lens through which iOS 10 is best explained.
\nThe iMessage App Store, SiriKit, rich notifications, CallKit, and Maps extensions are a display of Apple’s willingness to let apps be more than disconnected silos. iOS 10 is continuing what iOS 8 started: third-party apps are becoming system features.
\nIt’s not just a matter of nurturing developer goodwill: the App Store ecosystem can be leveraged to increase the functionality of iOS, building features that appeal to how people want to use their iPhones and iPads. For Apple, such effort is a nod to the App Store’s strengths and progress. For developers and users, it means apps can have ramifications in the most important parts of iOS.
\nAt the same time, allowing apps to reach further into iOS shows how the concept of “app” itself is evolving.
\nWhen different features of an app can be experienced throughout the system, the app becomes more of a collection of services, broken into atomic units. They’re pervasive. Providing apps with more extensibility hooks results in moving more interactions away from the traditional app experience and into single-purpose mini interfaces. Whether it’s an interactive notification, a widget, an iMessage app, or a SiriKit extension, iOS 10 has a clear vision of apps as contextual helpers in addition to being standalone utilities. It’s only reasonable to expect Apple to follow this path going forward.
\nSigns of maturity include fixing what isn’t working, too. The redesigned Apple Music makes the case for a simplified streaming interface that addresses what many found confusing in its debut release. The pagination of Control Center is a welcome enhancement to its capabilities as much as it’s an admission of its original complexity. I’d argue that letting users remove Apple apps falls under the same category.
\nAlas, not every glaring problem has been remedied by iOS 10. File management continues to feel like a chore due to cumbersome document providers, and Apple managed to ship an incomprehensible iCloud Drive extension that doesn’t help at all. Mail is lagging behind a competition that is shipping useful integrations and modernized email features. The Slide Over app picker – one of the worst design decisions of iOS 9 – is still with us.
\nThe most disappointing aspect of iOS 10, in fact, is the treatment the iPad received, with uninspired adaptations of iPhone UIs and a lack of attention that’s in stark contrast with last year. In iOS 10, the iPad feels like a second-class citizen again, left in the backseat, waiting for resources to be devoted to it. Perhaps all this will be resolved as Apple’s plans for iPad updates are revealed, but we can’t know yet. Today, iOS 10 isn’t the big milestone for iPad users that iOS 9 was.
\nAn acceptance of iOS’ grown-up status – and the responsibility that comes with it – isn’t the sole driver of its advancements. iOS 10 demonstrates how, at a fundamental level, change is the only constant in Apple’s software. Ironically, the company’s approach to change is what hasn’t changed at all: it’s iterative, divisive, farsighted, often surprising, and, frankly, never boring.
\nLooking at iOS 10’s features in isolation, we can spot every shade of change that has steered Apple so far. The need to make iMessage a platform and rethink Control Center. The patient expansion of the extensibility framework, done gradually – some might say too slowly – to ensure good performance and security. The first steps towards AI as a feature of our devices, built in a unique Apple way around privacy and laying the groundwork for the future.
\nBut these changes are more than discrete improvements. They’re no islands. As the tenth anniversary of the iPhone and its software draws closer, it’s time we take a holistic view of what iOS has become. iOS’ changes are simply a reflection of our own changes – whether it’s how much time we spend messaging with friends, how many pictures we take, the sensors we put in our homes, or the music we listen to. The memories we cherish, the conversations we carry on.
\nApple understands that, beyond technology, to improve iOS is to realize how much our lifestyles have changed. How software, after all, is nothing but an extension of ourselves. From that perspective, iOS is never quite finished – it can only be relevant.
\nAnd even at its tenth version, iOS is still forging ahead.
\n\nThis review wouldn’t have been possible without the help, feedback, and existence of the following people, animals, beverages, and pieces of software:
\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.
\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.
\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;
\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;
\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.
\nLearn more here and from our Club FAQs.
\nJoin Now", "content_text": "Sometimes, change is unexpected. More often than not, change sneaks in until it feels grand and inevitable. Gradually, and then suddenly. iOS users have lived through numerous tides of such changes over the past three years.iOS 7, introduced in 2013 as a profound redesign, was a statement from a company ready to let go of its best-selling OS’ legacy. It was time to move on. With iOS 8 a year later, Apple proved that it could open up to developers and trust them to extend core parts of iOS. In the process, a new programming language was born. And with last year’s iOS 9, Apple put the capstone on iOS 7’s design ethos with a typeface crafted in-house, and gave the iPad the attention it deserved.\nYou wouldn’t have expected it from a device that barely accounted for 10% of the company’s revenues, but iOS 9 was, first and foremost, an iPad update. After years of neglect, Apple stood by its belief in the iPad as the future of computing and revitalized it with a good dose of multitasking. Gone was the long-held dogma of the iPad as a one-app-at-a-time deal; Slide Over and Split View – products of the patient work that went into size classes – brought a higher level of efficiency. Video, too, ended its tenure as a full-screen-only feature. Even external keyboards, once first-party accessories and then seemingly forgotten in the attic of the iPad’s broken promises, made a comeback.\niOS 9 melded foundational, anticipated improvements with breakthrough feature additions. The obvious advent of Apple’s own typeface in contrast to radical iPad updates; the next logical step for web views and the surprising embrace of content-blocking Safari extensions. The message was clear: iOS is in constant evolution. It’s a machine sustained by change – however that may happen.\nIt would have been reasonable to expect the tenth iteration of iOS to bring a dramatic refresh to the interface or a full Home screen makeover. 
It happened with another version 10 before – twice. And considering last year’s iPad reboot, it would have been fair to imagine a continuation of that work in iOS 10, taking the iPad further than Split View.\nThere’s very little of either in iOS 10, which is an iPhone release focused on people – consumers and their iPhone lifestyles; developers and a deeper trust bestowed on their apps. Like its predecessors, iOS 10 treads the line of surprising new features – some of which may appear unforeseen and reactionary – and improvements to existing functionalities.\n\niOS 10 is a major leap forward from iOS 9 – at least for iPhone users.\n\nEven without a clean slate, and with a release cycle that may begin to split across platforms, iOS 10 packs deep changes and hundreds of subtle refinements. The final product is a major leap forward from iOS 9 – at least for iPhone users.\nAt the same time, iOS 10 is more than a collection of new features. It’s the epitome of Apple’s approach to web services and AI, messaging as a platform, virtual assistants, and the connected home. And as a cornucopia of big themes rather than trivial app updates, iOS 10 shows another side of Apple’s strategy:\nSometimes, change is necessary.\n\neBook Version & Exclusive Making Of\n\nAn eBook version of this review is available exclusively for Club MacStories members. Club MacStories offers access to weekly MacStories extras – including workflows, app recommendations, and interviews – and it starts at $5/month.\nThe eBook version contains all the media (screenshots and videos) of the web version, including eBook-specific layout optimizations.\n\nThe eBook can be downloaded from the member Downloads area (to download files on iOS, see here).\nIn addition to the eBook, we’ll publish an exclusive Making Of newsletter for Club MacStories members later this week. 
In the Making Of, you’ll be able to read more about my writing process, interesting review stats, the image workflows we used, and how this special web layout was put together.\nGet exclusive extras and support MacStories by signing up for Club MacStories today.\n\n\nSupported Devices\nAs more features have been added to iOS over the years, its first-run setup flow has become bloated, if not downright unintuitive.\niOS 10 doesn’t take any meaningful steps to simplify the setup of a new iOS device, which is mostly unchanged from iOS 9. The only notable difference is the action required to begin the setup process, which is now “press Home to open”. As I’ll explore later, there’s a reason for this.\nWhere iOS 10 does break away from the old is in the system requirements needed to install the OS. Most devices from 2011 and 2012 aren’t compatible with iOS 10, including:\niPhone 4S\niPad 2\niPad (3rd generation)\niPad mini\niPod touch (5th generation)\nDevices supported by iOS 10.\nProgress, of course, marches on, but there are other notable points in this move.\nThe iPad 2 – perhaps the most popular iPad model to date – supported iOS 9 (in a highly constrained fashion) despite developers clamoring for its demise. After 5 years of service, Apple is cutting ties with it in iOS 10. By leaving the A5 and A5X CPUs behind, developers are now free to create more computationally intensive iPad apps without worrying about the lack of Retina display on the iPad 2 and the performance issues of the third-generation iPad holding them back.\nLook closer, and you’ll also notice that Apple is dropping support for all devices with the legacy 30-pin dock connector. 
If a device can run iOS 10, it is equipped with a Lightning port.\nIn addition to Lightning, every iOS 10-eligible iPad has a Retina display, but not every device comes with a Touch ID sensor yet, let alone a 64-bit processor, Apple Pay, or background ‘Hey Siri’ support.\nIt’s going to be a while until Apple can achieve its vision of 64-bit and one-tap payments across the board, but it’s good to see them moving in that direction by phasing out hardware that no longer fits what iOS has grown into. iOS 10 is starting this transition today.\n\n\n \nThe Lock Screen\nOne of the first interactions with iOS 10 is likely going to be an accidental swipe.\nFor the first time since the original iPhone, Apple is changing the “Slide to Unlock” behavior of the iOS Lock screen. iOS 10 gets rid of the popular gesture altogether, bringing tighter integration with Touch ID and an overhauled Lock screen experience.\nPress to Unlock\nLet’s back up a bit and revisit Steve Jobs’ famous unveiling of the iPhone and Slide to Unlock.\nAt a packed Macworld in January 2007, Jobs wowed an audience of consumers and journalists by demonstrating how natural unlocking an iPhone was going to be. Apple devised an unlocking gesture that combined the security of an intentional command with the spontaneity of multitouch. In Jobs’ words:\n\n And to unlock my phone I just take my finger and slide it across.\n We wanted something you couldn’t do by accident in your pocket. Just slide it across…and boom.\n\nAs the iPhone evolved to accommodate stronger passcodes, a fingerprint sensor, and a UI redesign, its unlocking mechanism stayed consistent. The passcode number pad remained on the left side of the Lock screen; even on the iPad’s bigger display, the architecture of the Lock screen was no different from the iPhone.\nWith the iPhone 6s, it became apparent that Slide to Unlock was drifting away from its original purpose. 
Thanks to substantial speed and accuracy improvements, the second-generation Touch ID sensor obviated the need to slide and type a passcode. However, because users were accustomed to waking an iPhone by pressing the Home button, Touch ID would register that initial click as a successful fingerprint read. The iPhone 6s’ Touch ID often caused the first Home button click to unlock an iPhone, blowing past the Lock screen with no time to check notifications.\nIronically, the convenience of Touch ID became too good for the Lock screen. As I wrote in my story on the iPhone 6s Plus:\n\n The problem, at least for my habits, is that there is useful information to be lost by unlocking an iPhone too quickly. Since Apple’s move to a moderately bigger iPhone with the iPhone 5 and especially after the much taller iPhone 6 Plus, I tweaked my grip to click the Home button not only to unlock the device, but to view Lock screen notifications as well. While annoying, the aforementioned slowness of previous Touch ID sensors wasn’t a deal-breaker: a failed Touch ID scan meant I could at least view notifications. When I wanted to explicitly wake my locked iPhone’s screen to view notifications, I knew I could click the Home button because Touch ID wouldn’t be able to register a quick (and possibly oblique) click anyway.\n That’s not the case with the iPhone 6s Plus, which posed a peculiar conundrum in the first days of usage. Do I prefer the ability to reliably unlock my iPhone with Touch ID in a fraction of a second, or am I bothered too much by the speed of the process as it now prevents me from viewing notifications on the Lock screen?\n\nApple is making two changes to the unlocking process in iOS 10 – a structural one, with a redesign of the Lock screen and its interactivity; and a behavioral one to rethink how unlocking works.\nApple hopes that you’ll no longer need to click any button to wake an iPhone. 
iOS 10 introduces Raise to Wake, a feature that, like the Apple Watch, turns on the iPhone’s display as soon as it’s picked up.\n\n \nRaise to Wake\n\nRaise to Wake is based on a framework that uses sensors – such as the motion coprocessor, accelerometer, and gyroscope – to understand if a phone has been taken out of a pocket, but also if it’s been picked up from a desk or if it was already in the user’s hands and its elevation changed. Due to ergonomics and hardware requirements, Raise to Wake is only available on the iPhone 6s/7 generations and it’s not supported on the iPad.\nApple has learned from the first iterations of watchOS: Raise to Wake on the iPhone 6s and iOS 10 is more accurate than the similar Watch feature that shipped in 2015. In my tests, Raise to Wake has worked well when taking the iPhone out of my pocket or picking it up from a flat surface; it occasionally struggled when the iPhone was already in my hands and it was tricky for the system to determine if it was being raised enough. In most everyday scenarios, Raise to Wake should wake an iPhone without having to click the Home or sleep buttons.\nRaise to Wake is only one half of the new unlocking behavior in iOS 10: you’ll still need to authenticate and unlock a device to leave the Lock screen. This is where the iPhone’s original unlocking process is changing.\nTo unlock a device running iOS 10, you need to click the Home button. If the display is already on and you place your finger on the Touch ID sensor without clicking it – as you used to do in iOS 9 – that won’t unlock the device. By default, iOS 10 wants you to physically press the Home button.\nBye, slide to unlock.\nThis alteration stems from the unbundling of fingerprint recognition and Home button click, which are now two distinct steps. 
Placing a finger on Touch ID authenticates without unlocking; pressing the Home button unlocks.\nIn Apple’s view, while Raise to Wake turns on the display, authentication may be required to interact with features on the Lock screen – such as actionable notifications, widgets, or Spotlight results. With iOS 10, users can pick up an iPhone, view what’s new on the Lock screen, and authenticate (if necessary1) without the risk of unlocking it.\nFrom a design standpoint, this change is reflected in the icons and messages displayed to the user on the Lock screen. When the display turns on with Raise to Wake, a padlock icon in the status bar indicates that the user has not yet authenticated with Touch ID. At the bottom, a ‘Press home to unlock’ message replaces the old ‘slide to unlock’ one.\nLocked.\nWith the display on and after Touch ID authentication, ‘Press home to unlock’ becomes ‘Press home to open’ and the status bar lock switches to an ‘Unlocked’ message.\nUnlocked.\nUnder the hood, clicking the Home button and placing a finger on Touch ID are two separate actions. However, the wording of ‘Press home to unlock’ feels like Apple wants you to think of them as one. The entire message is an illusion – pressing the Home button by itself doesn’t actually unlock a device – but Raise to Wake combined with the second-generation Touch ID will make you believe in it.\nTouch ID Nuances\nIf a device doesn’t have Touch ID (or if Touch ID can’t read a fingerprint), pressing the Home button will bring up the number pad or keyboard. And, if an iPhone’s display is activated by clicking the Home button (try this with a non-Touch ID finger) and then the finger is lifted off the button, placing it on Touch ID again (without clicking it) will unlock the device.\nOn an iPhone 6s, one click on the Home button is all that’s needed to exit the Lock screen – at least most of the time. 
If the iPhone’s display is off because Raise to Wake didn’t work (or because you manually locked it while holding it), the experience is similar to iOS 9. Clicking the Home button with a Touch ID-enabled finger will wake up the display and bypass the Lock screen.\nYou can revert to a pre-iOS 10 unlocking experience if you don’t like the new one. First, Raise to Wake can be disabled in Settings > Display & Brightness, and your iPhone will no longer turn on when picked up. Additionally, tucked away in Settings > Accessibility > Home Button, you’ll find an option called ‘Rest Finger to Open’. When enabled, your iPhone will unlock through Touch ID alone, without having to press the Home button.\nIt takes some time to get used to the new unlocking behavior of iOS 10. The apparent unification of Home button click and Touch ID makes less sense on devices without the second-generation sensor, where one click is rarely enough and tends to bring up the passcode view for a second attempt. And, nostalgically speaking, I miss the old ‘slide to unlock’ message, although for reasons that are merely emotional and not related to function.\n\nI now expect my iPhone to know when it’s time to wake up.\n\nAfter three months, Raise to Wake and Press to Unlock have made the overall unlocking experience faster and more intuitive. I now expect my iPhone to know when it’s time to wake up and show me the Lock screen, and I don’t miss the old unlocking process. Raise to Wake eliminates the need to click a button to wake an iPhone; having to press the Home button to unlock removes the risk of accidentally leaving the Lock screen.\nBut it all goes back to that accidental swipe. Picture this: you’ve just upgraded to iOS 10, or you’ve bought a new iPhone with iOS 10 pre-installed, and, instinctively, you slide to unlock. What you’re going to see isn’t an error message, or the Lock screen bouncing back, telling you that you need to press the Home button instead. 
You’re going to see the biggest change to the Lock screen – potentially, a better way of interacting with apps without unlocking a device at all.\nSlide to unlock, and you’ll meet the new Lock screen widgets.\n\nLock Screen Widgets\nTechnically, Lock screen widgets predate iOS 10. On both the iOS 8 and iOS 9 Lock screens, users could swipe down to reveal Notification Center and its Today view. However, iOS 10 adds an entirely new dimension to the Lock screen, as well as a refreshed design for widgets throughout the system.\nThe Lock screen’s renovation in iOS 10 starts with three pages: widgets and search on the left, the Lock screen (with notifications and media controls) in the middle, and the Camera on the right. You can swipe to move across pages, as suggested by pagination controls at the bottom of the Lock screen.\nThe Lock screen’s new horizontal hierarchy, with widgets on the left.\nThe leftmost page, called the Search screen, isn’t completely new either. Apple took the functionality of Spotlight search and Proactive of iOS 9, mixed it up with widgets, and made it a standalone page on the iOS 10 Lock screen (and Home screen, too).\nFrom left to right: Lock screen widgets on the Search screen; Notification Center; widgets in Notification Center.\nNotably absent from iOS 10’s Lock screen is the Camera launcher button. By getting rid of the tiny shortcut in the bottom right corner, Apple has made the Camera easier to launch: swiping anywhere to move between Lock screen and Camera is easier than carefully grabbing an icon from a corner. I’ve been taking more spontaneous, spur-of-the-moment pictures and videos thanks to iOS 10’s faster Camera activation on the Lock screen.\nApple’s sloppy swiping for Lock screen navigation has one caveat. If notifications are shown, swiping horizontally can either conflict with actionable buttons (swipe to the left) or open the app that sent a notification (swipe right). 
You’ll have to remember to swipe either on the clock/date at the top or from the edge of the display; such is the trade-off of using the same gestures for page navigation and notification actions.\nWhere to swipe when notifications fill the Lock screen. (Tap for full size)\nThree changes stand out when swiping right to open the Search screen:\nThere’s a search field at the top, shown by default;\nThe clock2 stays pinned to the right3;\nWidgets have a new design that favors richer, bigger content areas.\nUnlike their predecessors, widgets in iOS 10 don’t blend in with the dark background of Notification Center. This time, Apple opted for standalone units enclosed in light cells with an extensive use of custom interfaces, buttons, images, and dark text.\nWidgets in Notification Center on iOS 9 and iOS 10.\nThere’s a common thread between widgets and notifications (also redesigned in iOS 10): they’re self-contained boxes of information, they sit on top of the wallpaper rather than meshing with it, and they display an app’s icon and name in a top bar.\nNotifications and widgets. Spot the trend.\nThe new design is more than an aesthetic preference: the makeover has also brought functional changes that will encourage users and developers to rethink the role of widgets.\nA widget in iOS 10 supports two modes: collapsed and expanded. The system loads all widgets in collapsed mode by default, which is about the height of two table rows (about 110 points). All widgets compiled for iOS 10 must support collapsed mode and consider the possibility that some users will never switch to the expanded version. Apps cannot activate expanded mode on the user’s behalf; switching from compact to expanded is only possible by tapping on a ‘Show More’ button in the top right corner of a widget.\nCompact and expanded widgets.\nThis is no small modification, as it poses a problem for apps that have offered widgets since iOS 8. 
Under the new rules, apps updated for iOS 10 can’t show a widget that takes up half of the display as soon as it’s installed. Any widget that wants to use more vertical space for content – such as a todo list, a calendar, or even a list of workflows – will have to account for the default compact mode.\nFor some developers, this will mean going back to the drawing board and create two separate widget designs as they’ll no longer be able to always enforce one. Others will have to explain the difference to their users. Workflow, which used to offer a widget that could dynamically expand and collapse, is updating the widget for iOS 10 with a label to request expansion upon running a workflow that needs more space.\nWorkflow’s new iOS 10 widget.\nThere’s one exception: legacy iOS 9 apps that haven’t been updated for iOS 10. In that case, the system won’t impose compact mode and it won’t cut off old widgets (which keep a darker background), but there’s a strong possibility that they won’t look nice next to native iOS 10 ones.\nThe same widget in iOS 9 legacy mode and with native iOS 10 support.\nI don’t see how Apple could have handled this transition differently. Design updates aside, there’s an argument to be made about some developers abusing Notification Center with needlessly tall and wasteful widgets in the past. Compact mode is about giving control to the users and letting them choose how they prefer to glance at information. Want to install a widget, but don’t need its full UI? Use it in compact mode. Need to get more out of it? 
Switch to expanded.\nApple’s decision to adopt compact and expanded modes in iOS 10 is a nod to developers who shipped well-designed widgets in the past, and it provides a more stable foundation going forward.\nI’ve been able to test a few third-party iOS 10 widgets that illustrate the advantages of these changes.\nPCalc, James Thomson’s popular iOS calculator, has a new widget that displays a mini calculator in compact mode with numbers and basic operations split in two rows.\n\nDespite the small touch targets, the compact interface is usable. If you want bigger buttons and a more familiar layout, you can switch to expanded mode, which looks like a small version of PCalc living inside a widget – edge-to-edge design included.\nLauncher doesn’t modify its widget’s interface when toggling between compact and expanded, but the constraints of the smaller layout force you to prioritize actions that are most important to you.\n\nUsing compact mode for summary-style UIs will be a common trend in iOS 10. CARROT Weather is a good example: it shows a summary of current conditions when the widget is compact, but it adds forecasts for the day and week ahead when expanded.\nCARROT’s widget can be customized with two styles.\nEven better, slots in the compact layout can be customized in the app, and you can choose to use the widget in light or dark mode.\nDrafts has an innovative implementation of compact and expanded layouts, too. In compact, the widget features four buttons to create a note or start dictation. 
When the widget expanded, it grows taller with a list of items from the app’s inbox, which can be tapped to resume editing.\n\nIn the past, developer Greg Pierce would have had to ask users to customize the widget or make it big by default; in iOS 10, they can switch between modes as needed.\nWidgets’ ubiquitous placement pushes them to a more visible stage; as soon as more developers adapt4, iOS 10 has the potential to take widgets to the next level.\nI believe the new design will play an essential role in this.\nThe Design of Widgets\nApple advertises legibility and consistency as core tenets of widgets in iOS 10, and I agree: widget content and labels are easier to read than iOS 9. Standalone light cells separate widgets with further precision; I haven’t found translucency with the Lock screen wallpaper to be an issue.\nIn addition, the light design brings deeper consistency between apps and widgets. Most iOS apps have light backgrounds and they employ color to outline content and indicate interactivity. In iOS 10, widgets are built the same way: the combination of light backgrounds, buttons, and custom interfaces is often consistent with the look of the containing app.\nIn this regard, widgets feel more like mini-apps available anywhere rather than smaller, less capable extras. The line between widget and full app UIs is more blurred than ever in iOS 10.\nApple’s new Notes and Calendar widgets showcase this newfound cohesiveness. The Notes widget displays the same snippets of the list in the Notes app. Buttons to create new notes and checklists are also the same. The widget looks and feels like a small version of Notes available anywhere on iOS.\nFrom app to widget.\nThe Calendar widget is even more indicative. 
Glancing at events and recognizing their associated calendar wasn’t easy in iOS 9, as they only had a thin stripe of color for the calendar to which they belonged.\nThe Calendar widget is more contextual on iOS 10.\nIn iOS 10, forgoing a dark background has allowed Apple to show Calendar events as tinted blocks matching the look of the app. Discerning events and the calendars they belong to is easier and familiar.\nConsistency of apps and widgets.\nI wouldn’t expect every app to adopt a widget design that exactly mirrors the interface users already know, but it can be done. Switching to a light design has given Apple a chance to reimagine widgets for consistency with apps and lively combinations of color, text, and icons. They are, overall, a step up from iOS 9 in both appearance and function.\nThe new direction also opens up a future opportunity: what is light can be more easily converted to dark. I could see a system dark mode working well for widgets.\nThe iPad Lock Screen\nThe iPad’s Lock screen doesn’t break any new ground, but there are some differences from the iPhone.\nOn the iPad, notifications are displayed on the left side of the screen when in landscape. They’re aligned with the system clock, and they leave room for media controls to be displayed concurrently on the right. Dealing with notifications while controlling music playback is a task well suited for the iPad’s larger display.\n\nUnfortunately, Apple doesn’t think portrait orientation should warrant the same perks. If a notification comes in while album artwork is displayed on the Lock screen, the artwork will be hidden. Apple decided against using a two-column layout in portrait, which I don’t understand: they’re already doing it for widgets on the iPad.\nNo artwork for you, Mr. 
Portrait.\nFurthermore, if no music is playing on an iPad in landscape, having notifications aligned to the left for no apparent reason looks odd and seems…unnecessary.\nThe right side seems cozy.\nWidgets fare a little better. Apple has kept the two-column design first introduced in the Today view of iOS 9; you can still scroll the two lists of widgets independently.\n\nI would have appreciated the ability to further control the resizing and placement of widgets on the iPad, and the Lock screen design seems uninspired. We’ll have to make the most of this bare minimum work for now.\nApple’s Widgets\niOS 10 sports an increased modularity of widgets. Apple has done away with grouping multiple types of content under Siri Suggestions – most Apple apps/features have their own widget, which can be disabled from a revamped configuration screen.\nWidget’s new configuration screen.\nHere’s an overview of what’s changed.\nActivity\nYour Activity rings from the Apple Watch, with a summary of Move, Exercise, and Stand statistics.\nCalendar\n\nA mini calendar interface. Events are displayed as colored blocks matching the calendar they belong to. You can tap on an event to open it, and expand the widget to reveal more events.\nFavorites\n\nShortcuts to your favorite contacts with different ways to get in touch with them. New in iOS 10, you can assign iMessage as well as third-party communication apps (messaging and VoIP) to contact entries in Favorites, which will be displayed in the widget.\nMail\n…yeah.\nThe Mail widget is the weakest of the bunch: it only displays shortcuts for VIP contacts. I would have preferred to see a preview of the unified inbox, or perhaps an option to show flagged messages.\nMaps\nMaps has three widgets: destinations, nearby, and transit. While the latter isn’t available for my area (Rome, Italy), the other two have worked inconsistently. I’ve never seen a nearby recommendation in the widget, despite being around places rich in POIs. 
The Destinations widget usually tells me how much time it’ll take me to drive home, but it doesn’t proactively suggest other locations I frequently visit.\nMusic\nThe Music widget is an odd one. It displays a grid of what appears to be either recently played music or your all-time most listened albums. The widget doesn’t clarify whether it’s showcasing albums or individual songs; it uses album artworks with no text labels, and it plays either the most played song from an album, or an entire album starting from the first song.\n\nA nice perk: music starts playing after tapping the widget without opening Apple Music. But it always feels like a lottery.\nNews\nTop Stories from Apple News (shown even if you mute the channel). The widget uses image thumbnails and custom typography matching the bold font of Apple News for headlines.\n\nThe best change from iOS 9: news can be disabled by removing the widget.\nNotes\nA preview of your most recent notes. In compact mode, the widget only shows the last modified note. In expanded mode, you get more notes and buttons to create a new note, a checklist, snap a picture, and create a drawing.\nPhotos\n\nA collection of Memories created by the new Photos app in iOS 10. Each one can be tapped to view the associated memory in Photos.\nSiri App Suggestions\niOS 9’s proactive Siri Suggestions are now smaller in scope and they’re called Siri App Suggestions. The widget displays 4 app shortcuts (8 in expanded mode), and it doesn’t suggest other types of content.\n\nLike News, it can also be removed and be placed anywhere on the Search screen.\nTips\nYou’d think that the Tips widget is useless – everyone likes to make fun of Tips – but hear me out. In compact mode, the widget shows a tip’s snippet; you can tap it and open the Tips app. 
Switch to expanded mode, though, and you’ll be presented with a custom interface with an explanation of the tip and a large animation at the top to show you the tip in action.\n\nThe Tips widget looks great, and it’s the most technically impressive one on iOS 10.\nUp Next\nThe old Today Summary widget has been renamed Up Next. It displays a smaller version of your next event without the full UI of the Calendar widget. Alas, the Tomorrow Summary widget is gone from iOS 10.\nWeather\nPerhaps the best example of how widgets can use compact and expanded modes, Apple’s Weather widget shows weather conditions for the current location when compact, and a forecast of the next six hours when expanded.\n\nWeather is the widget I’ve used the most in the past three months to look up forecasts from the Lock screen in just a couple of seconds.\nSlide to Glance\nThe move to apps as atomic units scattered across the system is everywhere in iOS 10, with widgets being the foremost example.\nNoticeably absent from iOS 10’s widgets is a push for more proactive recommendations. As we’ll see later, Apple has shifted its Proactive initiative to run through the OS and inside apps rather than distilling it into widgets.\n3D Touch is another illustrious no-show. While notifications have been overhauled to make good use of 3D Touch, pressing on a widget will result in a disappointing lack of feedback. 3D Touch would be a perfect fit for widgets – imagine previewing a full note or reading the first paragraphs of a news story from the Lock screen.\nThe new widget design and Search screen placement make an iPhone more useful without having to unlock it. Apple has done a good job with their built-in widgets; it’s up to developers now to rethink how their apps can take advantage of them. 
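On the developer side, the compact/expanded behavior described above is exposed through the NotificationCenter framework’s NCWidgetProviding protocol in iOS 10. A minimal sketch of a Today widget that opts into expanded mode – the view contents and the 280-point expanded height are my own illustrative choices, not anything Apple prescribes:

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt into the "Show More" toggle; widgets default to compact-only.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    // Called when the user switches between compact and expanded modes.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        if activeDisplayMode == .compact {
            preferredContentSize = maxSize  // compact height is fixed by the system
        } else {
            preferredContentSize = CGSize(width: maxSize.width, height: 280)
        }
    }

    // The system periodically asks the widget to refresh its content.
    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        // Reload data here, then report whether anything changed.
        completionHandler(.newData)
    }
}
```

The Weather widget’s two-tier behavior – conditions when compact, a forecast when expanded – presumably comes down to exactly this display-mode switch.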
I’m optimistic that everything will turn out better than two years ago.\n\nI unlock my iPhone less thanks to iOS 10’s more capable Lock screen.\n\nI unlock my iPhone less thanks to iOS 10’s more capable Lock screen. Raise to Wake, Press to Open, widgets, search, and rich notifications make the entire Lock screen experience drastically superior to iOS 9.\nEasier to navigate, better structured, less prone to unwanted unlocks. I wouldn’t be able to go back to the old Lock screen.\n\n\n \nNotifications\niOS 10’s rethinking of apps as granular interactions doesn’t stop at widgets. With a new framework that can turn incoming notifications into rich, actionable interfaces, Apple wants users to spend less time jumping between apps.\nNotifications in iOS 9 and 10.\nNotifications in iOS 10 share the same design principles as widgets. Rather than being grouped in a list of items on top of a dark background, notifications are discrete light cells that can be pressed (with 3D Touch), pulled down (for incoming banners), or swiped and expanded into a floating card preview.\n\n \nExpanding a Messages notification.\n\nThe anatomy of an expanded notification – whether an app has been updated for iOS 10 or not – has fixed elements that developers can’t control. There’s a header bar at the top with the icon and name of the app, and a close button on the right to dismiss the notification. Tapping the icon on the left side will open the app that sent the notification.\nThe standard look of a notification in iOS 10.\nThis is true for both iPhones with 3D Touch and devices without it; to expand a notification on an iPad or an older iPhone (or if you don’t want to use 3D Touch), you can pull down an incoming notification banner or swipe a notification to the left in Notification Center and tap ‘View’.5\nNew APIs allow developers to take different actions for notifications that have been sent to the user – including ones that have been cleared. 
First, notifications can be dismissed with a Clear action by swiping on them. Apps can monitor the dismiss action and stop delivering the same notification on other devices.\nAdditionally, developers can remove, update, and promote notifications that have already been sent. Apple’s goal was to prevent Notification Center from being cluttered with old notifications that aren’t relevant anymore. If developers implement this API, updating a notification with fresh content should help users see what’s changed. Imagine sports scores or live-streaming apps and how they could update notifications. I’m curious to see which services will adopt this behavior instead of spamming users with multiple alerts.\nUnderneath the header of an expanded notification is the content developers can control, and where the most important changes to notifications are happening.\nIn iOS 10, notifications can have a title and a subtitle. The title is displayed in a bold font, which helps identify the subject of a notification. In a Reminders notification, the name of a reminder will be the bold title at the top, with its note displayed as text content below it.\nThe default look of a notification in iOS 10. Expansion is relative to a notification’s placement on screen.\nBelow the title and subtitle, iOS 10 shows a notification’s body text content (same as iOS 9) and actionable buttons. In a welcome change from the past, developers can define more than two notification actions, displayed in a list under the notification’s card.6 If an app requires a quick reply upon expanding a notification, the input field will sit above the keyboard – it’s not attached to the notification like in iOS 9.\nQuick replies in iOS 9 and iOS 10.\nDesign changes alone, though, wouldn’t have sufficed to modernize notifications. 
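To ground the API changes above, here’s a hedged Swift sketch using the new UserNotifications framework: a category with more than two actions that also reports dismissals, plus a delivered notification being updated in place by reusing its request identifier. All identifiers and the sports-score content are made up for illustration:

```swift
import UserNotifications

let center = UNUserNotificationCenter.current()

// A category with three actions, which iOS 10 lists under the expanded card.
// .customDismissAction asks the system to report Clear swipes to the app,
// so it can stop delivering the same alert on other devices.
let snooze = UNNotificationAction(identifier: "snooze", title: "Snooze", options: [])
let done   = UNNotificationAction(identifier: "done", title: "Mark as Done", options: [])
let share  = UNNotificationAction(identifier: "share", title: "Share", options: [])
let category = UNNotificationCategory(identifier: "score-update",
                                      actions: [snooze, done, share],
                                      intentIdentifiers: [],
                                      options: [.customDismissAction])
center.setNotificationCategories([category])

// Updating a delivered notification: adding a new request with the same
// identifier replaces the old one instead of stacking a second alert.
let content = UNMutableNotificationContent()
content.title = "Full Time"            // the bold title line
content.subtitle = "Serie A"           // optional subtitle
content.body = "Roma 2 – 1 Juventus"
content.categoryIdentifier = "score-update"
let request = UNNotificationRequest(identifier: "match-roma-juve",
                                    content: content,
                                    trigger: nil)  // deliver immediately
center.add(request)

// Stale notifications can also be removed outright.
center.removeDeliveredNotifications(withIdentifiers: ["match-roma-juve"])
```

This is the mechanism a live-score service could use to keep a single, always-current notification in Notification Center rather than a pile of outdated ones.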
To reinvent their feel and capabilities, Apple has created two new extension points for developers in iOS 10: Notification Service and Notification Content.\nThe Notification Service extension doesn’t have an interface and runs in the background. When a notification is triggered, but just before it’s delivered to the user, the system can invoke an app’s Notification Service extension to augment or replace the notification’s payload. The extension is granted a short execution window and isn’t designed for long tasks. Possible use cases for Notification Service extensions could be downloading an image or media file from a URL before showing a notification, or decrypting an encrypted payload locally for messaging apps that rely on end-to-end encryption.\nThe Notification Service extension should come in handy given iOS 10’s ability to include a media attachment (images, audio, videos, and even GIFs) in both the notification banner and the expanded notification. If they adopt it, apps like WhatsApp and Telegram could omit the “[Contact] sent you an image” standard notification and display a thumbnail in the notification banner (like iMessage does) and a full image preview in the expanded notification.\nNotification Content extensions are what users are going to see the most in daily usage, and they motivate iOS 10’s notification card design.\nA notification in iOS 10 can show a custom view between the header and default text content. Custom views can be anything – an embedded map, a message conversation, media, a calendar view, etc. – and they’re managed by the Notification Content extension. Custom views are non-interactive: they can’t receive touch events7, but they can be updated in-place in response to a task performed from a notification action. Apps can hide the default content of a notification if the custom view is informative enough.\nService and Content extensions, combined with the expanded design, have turned notifications in iOS 10 into a completely new experience. 
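The media-download use case maps onto the Notification Service extension point roughly like this – a sketch, with the `image-url` payload key being a hypothetical example rather than a standard field:

```swift
import UserNotifications

class NotificationService: UNNotificationServiceExtension {

    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttempt: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttempt = request.content.mutableCopy() as? UNMutableNotificationContent

        // Hypothetical payload key carrying a thumbnail URL.
        guard let bestAttempt = bestAttempt,
              let urlString = request.content.userInfo["image-url"] as? String,
              let url = URL(string: urlString) else {
            contentHandler(request.content)
            return
        }

        // Download the image, wrap it in an attachment, then deliver.
        // (Real code should pass a type-hint option so the system can
        // infer the file format from the temporary download location.)
        URLSession.shared.downloadTask(with: url) { location, _, _ in
            if let location = location,
               let attachment = try? UNNotificationAttachment(identifier: "thumbnail",
                                                              url: location,
                                                              options: nil) {
                bestAttempt.attachments = [attachment]
            }
            contentHandler(bestAttempt)
        }.resume()
    }

    // Called when the system is about to cut off a long-running extension:
    // deliver whatever we have so the notification isn't dropped.
    override func serviceExtensionTimeWillExpire() {
        if let contentHandler = contentHandler, let bestAttempt = bestAttempt {
            contentHandler(bestAttempt)
        }
    }
}
```

The time-expiration override is the short-execution-window constraint in practice: if the download doesn’t finish in time, the unmodified notification still reaches the user.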
Notifications are no longer just text: they are custom app UIs delivered to you with rich previews and interactions that can live on longer than a couple of seconds. Notifications in iOS 10 are mini apps in and of themselves.\n\nNotifications in iOS 10 are mini apps in and of themselves.\n\nWhen you receive an iMessage that contains a photo, the incoming notification can be expanded, either with 3D Touch or a swipe. You’ll be treated to a full iMessage conversation UI, living inside the notification, with the same transcript, read receipts, and typing indicators you’d see in the Messages app.\nTo expand a notification, you can pull it down or press on it.\nNot only can you send a reply – you can keep the iMessage interface open as you keep a conversation going from the notification. It’s a fantastic way to check into a conversation without the constraints of a quick reply.\nScroll up in the transcript to view older messages.\nWhen you’re done, swipe down to dismiss the notification, and you’ll be back to whatever you were doing.8\nCalendar notifications follow the same concept. If an event with a location attached is coming up, the expanded notification will display the default text content at the bottom, but also a preview of the address with a Maps view at the top.\n\nThanks to actionable buttons, you can open directions in Maps without launching Calendar. If an upcoming event doesn’t have a location, you’ll see a preview of your agenda inside the notification.\nI tested a version of Workflow optimized for iOS 10, which brings improved notification support with the ability to customize the content displayed in a notification card. 
In addition to a title, you’ll be able to embed pictures, videos, GIFs, and even Maps views into a Workflow notification.\nRich notifications created with Workflow.\nPictures are displayed as thumbnails in a notification banner before expanding it; videos can be played inline within the card itself.\n\n \n\nAnd if you often receive messages containing GIFs, iOS 10 will let you preview them directly from a notification.\n\n \n\nCARROT Weather has a clever take on rich notifications in iOS 10. The daily digest and severe weather/precipitation alerts can be expanded into dynamic preview cards.\n\nThrough a Notification Content extension, the app can embed a custom interface, sounds, and even animations inside the notification card. As a result, viewing CARROT’s notifications feels more like using the app rather than reading a plain text summary.\nWith a new framework and the flexibility granted by extensions, we’re going to see a rise of interaction methods fueled primarily by notifications. Of all the places where an app can advertise its functionality on iOS (widgets, keyboards, extensions), a notification is the most direct, contextual way to reach users at an appropriate time.\nA notification carries interest and, in many cases, a sense of urgency. iOS 10 transforms notifications from a passive delivery system into an active experience where users engage with an app through UIs, actions, and feedback they’re already familiar with. It’s a win-win for developers, who can make their apps more useful through richer notifications, and for users, who no longer have to open apps to benefit from their services.\niOS 10’s notifications are a new layer on top of apps. They’re going to change how we deal with them every day.\n\n\nThe Home Screen\nThe iPhone 6s brought the first significant adjustment to the iOS Home screen in years – 3D Touch quick actions. 
With iOS 10, Apple is cautiously expanding the Home screen beyond app shortcuts, but in ways you might not expect.\nSearch from the Home screen: pull down (left) or swipe right to open the new Search screen.\nAs in iOS 9, Spotlight search can be accessed from two locations: the Search screen on the left side of the Home screen and by pulling down on app icons. The Search screen on the left mirrors its Lock screen counterpart.\nA Pale Page Dot\n\nThe Search screen (both on the Lock screen and Home screen) doesn’t have a special page indicator at the bottom – it has a standard page dot. Apple may be hinting that the page can do more than search alone, but the difference, at least visually, sticks out. A different icon would have been better.\nNotification Center has gone through some deeper changes. The segmented control to switch between notifications and widgets at the top is gone, replaced by another set of page indicators. Every time you open Notification Center, iOS 10 will default to showing you notifications in chronological order under a new ‘Recent’ header – it doesn’t remember your position in the two pages. Unfortunately, the option to group notifications by app has also been removed.\nWhether by laziness or deliberate design, there’s an abundance of ways to activate Spotlight search in iOS 10. Let’s round them up:\nSearch from the Lock screen (above widgets);\nOpen the Search screen (left side of the Home screen) and pull down or tap the search field;\nPull down on icons on the Home screen;\nSwipe down to open Notification Center and tap Search above notifications;\nSwipe right on Notification Center to open widgets and find Search at the top;\nUse Command-Space on an iPad with an external keyboard and Spotlight will open modally on top of whatever app you’re using without going back to the Home screen;\nLast, and perhaps more perplexingly, there’s a hidden way to open Spotlight modally when inside apps on the iPhone 6s. 
When using an app, swipe down slowly from the status bar until you feel a first haptic feedback, then let go. Instead of opening notifications, the text cursor will focus in the search field. If you don’t let go after the first vibration but keep swiping down, you’ll open Notification Center. This method doesn’t work on the Home screen – only in apps. It’s also supported on older devices, albeit without haptic feedback.\nThat’s seven ways to open Spotlight search on iOS 10.\nSix shades of Spotlight search on iPhone.\nBeing able to access search from everywhere – be it on the Home screen, the Lock screen, or when using an app – is convenient. It makes Spotlight pervasive. As Apple continues to grow their search efforts across native apps, web partnerships, and Proactive suggestions, Spotlight’s omnipresence will become a valuable strategic asset.\nApple continues to be a steadfast supporter of the Home screen as a grid of icons. In a potential disappointment for those who hoped to see a major Home screen refresh this year, the biggest new feature is an extension of 3D Touch quick actions and widgets, rolled into one.\nQuick actions and widgets on the Home screen.\nApps that offer a compact widget in iOS 10 can display it alongside quick actions when a user presses the app’s icon. The widget is the same used in the Search screen – in fact, there’s a button to install it directly from the Home screen.\niPhone Plus models can display quick actions and widgets on the landscape Home screen as well.\nI’m not sure I buy into Apple’s reasoning for combining widgets and quick actions – at least not yet. The glanceability of widgets finds its raison d’être on the Lock screen and inside apps; on the other hand, I associate going back to the Home screen and pressing an icon with launching, not glancing. 
Years of iOS usage trained me to see the Home screen as a launchpad for apps, not an information dashboard.\nIn three months of iOS 10 – and with plenty of glanceable/actionable widgets to test – I’ve only remembered to use a widget on the Home screen once (it was PCalc). It’s not that having widgets alongside quick actions is bad; it’s just forgettable. It’s the equivalent of two neighbors being forced to live together under the same roof. Having company can be nice sometimes, but everyone would be better off at their own place.\nThere are other smaller 3D Touch additions to the Home screen in iOS 10. You can press on folders to bring up a Rename action, and apps inside folders that have unread badges will be listed in the folder’s quick action menu.\n\nFolders have also received a visual refresh, with a nicer background blur that shows the grid of icons in the current Home screen page.\n\nOn the iPad, Apple didn’t bring any improvements to the Home screen in iOS 10, but I’m sure you’ll be relieved to know that closing an iPad app no longer adjusts the icon’s corner radius on the Home screen.\nThis relates to a deeper change happening to Home screen animations. Apple has rebuilt the entire SpringBoard animation stack with faster, interruptible animations. Along with a reduced animation curve to launch apps (which was one of the most criticized aspects of iOS 7), you can click the Home button right after tapping an app’s icon and the animation will stop, going back to the Home screen in an instant.\n\n \nHome screen animations\n\nYou can try the same with a folder: tapping outside of it will cancel the animation instantly in mid-flight. 
The difference with iOS 9’s Home screen animations is staggering.\n\n \nHome screen animations\n\nThey’re not a “feature”, but the new animations are the best Home screen change in iOS 10.\nIt’s fair to wonder if Apple will ever desecrate the sanctity of the Home screen and allow users to mix icons and widgets.\nAnyone who’s ever looked at Android will spot obvious similarities between widgets for Google’s platform and what Apple has done with widgets in iOS 10. Apple still believes in the separation of icons and app content; they only added widgets to 3D Touch quick actions and they didn’t even allow the iPad Pro’s large Home screen to go beyond icons. But for how long?\nThe iOS Home screen has served us well for years, but as screens keep getting bigger, it’s time to do more than display a grid of icons with quick actions. The other side of the fence is closer than ever; a final leap wouldn’t be too absurd.\n\n\nControl Center\nSince its introduction in 2013, Control Center has become a staple of iOS, providing users with a panel of commonly accessed shortcuts. iOS 10’s Control Center is a radical shift from its origins, and a harbinger of how iOS is changing.\nControl Center’s design has evolved over the years, from the wireframe-like look of iOS 7 to the friendlier, rounder buttons of iOS 9.\n\nControl Center, however, wasn’t immune to the expansion of iOS: cramming more functionality into it turned into a balancing act of prioritizing important controls without sacrificing their purpose.\nIt was clear that Control Center’s original vision couldn’t scale to the growing nature of iOS. And so with iOS 10, Apple has torn down Control Center and started from scratch. The single-page mosaic of tiny buttons is no more. The new Control Center breaks up system shortcuts and audio controls into two separate pages, with the addition of a third page for HomeKit (if available). 
Everything is bigger, more spacious, and more colorful.\nThe three pages of Control Center in iOS 10.\nYou still open Control Center with a swipe from the bottom of the display. In iOS 10, swiping pulls up a card with paginated controls underneath it. The design is familiar, yet unmistakably new. Margins across each side convey the card metaphor; controls are bigger and buttons have more padding; there’s more color in every card.\nAfter three years of Control Center, the new version in iOS 10 feels lively and friendly; perhaps even more fun. On the other hand, pagination and bigger controls raise a question: has simplicity come at the expense of efficiency in Control Center?\nSystem Controls\nA useful exercise to understand Control Center in iOS 10 is to take stock of how much Apple is leaving behind. Let’s compare iOS 9’s Control Center to the same screen in iOS 10:\n\nThe first page of Control Center in iOS 10 has lost audio playback. Initially, that may feel like a downgrade. But let’s swipe left and consider what Control Center has gained by separating system and audio controls:\n\nThe difference is striking. Giving audio playback its own space lets Control Center present more information for the media being played. It’s also more accessible thanks to bigger text labels, buttons that don’t need to be carefully tapped, and hardware controls embedded in the same page.\nThis won’t be easy to accept for iOS power users who cherish dense UIs: Control Center buys into a trend followed by many (but not all) parts of iOS 10. Big, bold controls, neatly laid out, spread over multiple views.\nThe first beneficiary of such clarity is the system controls page. 
The first row of toggles at the top has kept iOS 9’s iconography and arrangement, but each button is color-matched to the setting it activates when toggled.9\nControl Center is bleeding…four colors?\nI found colored toggles extravagant at first; now, I like that I can glance at those buttons and know which setting is engaged.\nDon’t forget about landscape mode.\nThe brightness slider and the AirPlay, AirDrop, and Night Shift buttons have been enlarged and simplified as well. For one, the slider’s puck is more comfortable to grab. The buttons reveal another tendency in iOS 10’s semi-refreshed design language: they’re actual buttons with rounded borders and they use color to indicate status.\n\nIn a change that’s reminiscent of Sam Beckett’s fantastic concept, you can press on the bottom row of shortcuts to show a list of 3D Touch quick actions. These include three intensity levels for the flashlight, timer options, a shortcut to copy the last Calculator result, and different Camera modes.\n\nAs I elaborated before, Control Center was an ideal candidate for 3D Touch actions. However, Apple’s implementation in iOS 10 is limited to the bottom row of apps; you can’t press on the Bluetooth icon to connect to previously paired devices, nor can you press on the Wi-Fi toggle to connect to a different network. The addition of 3D Touch to the lower end of Control Center shows that Apple recognizes the utility of quick actions for system-wide shortcuts, but they’re not fully committed to the idea yet.\nDespite some missing features and growing pains to be expected with a redesign, iOS 10’s first Control Center page is an improvement. With a sensible reliance on color, a more legible layout, and the first steps toward full 3D Touch support, Control Center’s system card is easier to parse, nimble, and intuitive.\n“It Also Looks Great on the iPad”\nControl Center’s design direction has been taken to the extreme on the iPad. 
Only one page can be used at a time; the AirDrop, AirPlay, and Night Shift buttons are needlessly wide. It doesn’t take a design expert to figure out that Apple just wanted to ensure basic compatibility with an iPhone feature instead of designing Control Center around the iPad.\nLook at it this way: if Control Center didn’t exist on the iPhone and Apple decided to introduce it on the iPad today, would it look like this?\n\nThe lack of an iPad-first approach was passable in the old Control Center because of its compact design. But with iOS 10, following the iPhone’s model has a detrimental effect. Buttons are too big and little care went into optimizing the UI for the iPad’s screen. Apple should reconsider what they’re doing with Control Center on the iPad instead of upscaling their iPhone designs.\nMusic Controls\nIn iOS 10, managing music and audio playback from Control Center is a richer experience, visually and functionally superior to iOS 9.\n\nThe page is split into three areas: audio information and, for the first time, artwork at the top; progress, playback controls, and volume in the middle; hardware accessories at the bottom. This is true for Apple Music and Podcasts as well as third-party apps, which don’t need to optimize for iOS 10 to show album artwork.\n\nI was skeptical when I saw that Apple moved audio controls to a separate card. The ubiquitous presence of an audio widget was my favorite aspect of Control Center; adding an extra step to reach it didn’t seem like a good idea. After adjusting to Control Center’s audio page in the first month of iOS 10, I went back to iOS 9 and controlling music felt limited and bland.\nThere are two aspects to Apple’s design worth noting. First, Control Center remembers the page you were using before dismissing it. If you swipe up, swipe left to open music playback, then close Control Center, the next time you open it, you’ll get the Now Playing card instead of being taken back to the first page. 
Thanks to this, having audio controls on a separate page hasn’t been a problem in my experience, but I wonder if Apple should allow reordering pages as an option.\nSecond, the purpose of the redesign. With artwork and comfortable UI elements, the page feels like a miniaturized music app rather than a cumbersome mishmash of buttons and sliders. It’s almost as if Control Center was reimagined around how normal people check what’s playing.\nFrom an interaction standpoint, artwork creates a bigger touch target that you can tap to be taken into the app playing audio10; in iOS 9, you had to precisely tap on a song’s small title in Control Center. There’s a deeper sense of context, too. Previously, it always took me a few seconds to read through a song’s information. With iOS 10, I can swipe up and glance at the artwork to see what I’m listening to.\nThere’s a subtle touch I want to mention. When music is playing, artwork is big, it has a drop shadow, and Control Center says ‘Now Playing on…’ at the bottom with an icon for the device where audio output is happening. Hit pause, and the artwork shrinks, losing the drop shadow, as the ‘Now Playing…’ message disappears. Tap play again, and the artwork grows bigger with a delightful transition.\n\n \nControl Center’s music playback\n\nControl Center’s audio page has two functional problems Apple should address. Song details (title, artist, and album) have been turned into lines of text that don’t scroll and get cut off. Try to listen to songs with long titles – say, I’ve Got a Dark Alley and a Bad Idea That Says You Should Shut Your Mouth (Summer Song) – and you’ll be surprised Apple designers didn’t consider the issue.\nThat Says…?\nIn addition, the ability to “love” songs to train Apple Music has been removed from Control Center (and the Lock screen). 
I don’t understand the decision, as having a dedicated page provides even more room for music controls.\nDespite the merits of artwork and more intuitive controls, I don’t think Apple added a standalone audio card to Control Center for those reasons alone. To me, the most convincing explanation comes from the hardware menu:\nPicking audio accessories in Control Center.\nWith just a few taps, you can connect to Bluetooth headphones or wireless speakers from anywhere on iOS without opening Settings. There’s an obvious subtext: for a device without a headphone jack, an easier way to switch between wireless audio accessories isn’t just a nicety – it’s a necessity.\n\nAudio playback is the clear winner of the new Control Center in iOS 10.\n\nAudio playback is the clear winner of the new Control Center in iOS 10. Apple freed themselves from the constraints of iOS 9’s tiny audio controls, and, after three years, music is claiming the prime spot it deserves in Control Center. The new audio page brings a more engaging, integrated listening experience that paves the way for what’s to come.\nHomeKit Controls\nYou can’t use the third page of Control Center unless you’ve configured at least one HomeKit device. I don’t own a lot of HomeKit accessories (I have three Hue lights and a few Elgato sensors), but the new Home page has grown on me so much that I’m no longer using any third-party HomeKit widgets.\nBeyond requiring at least one HomeKit device, Control Center’s Home card only displays accessories and scenes that have been marked as favorites in the new Home app. The page doesn’t list every HomeKit accessory, nor does it work with third-party home automation devices that don’t support HomeKit.\nIf you meet these requirements, you’ll be able to swipe over the Music card to reveal the Favorite Accessories view.\n\nAccessory buttons carry a name and icon assigned in the Home app, and, if supported, a percentage label for intensity (lights have it, for example). 
A button in the top right lets you switch between accessories and scenes. To turn them on and off, you just tap a button once.\nButtons can be long-tapped to open a detail screen with more options.11 For my Hue lights, holding a button for a fraction of a second reveals a vertical slider for intensity, which can be adjusted without lifting a finger off the screen.\n\nA second layer of navigation is nested into the detail view. With multicolor lights, you can tap on a Colors button below the intensity slider to modify presets and open a color wheel to pick a different shade. The wheel even has a segmented control to switch between color and temperature – a surprisingly deep level of hierarchy for a Control Center page.\nAdjusting colors and temperature for lights inside Control Center.\nUnfortunately, accessories that only report basic status messages don’t have a useful detail view.\n\nIn spite of my limited testing environment, Control Center has become my favorite way to manage HomeKit lights and scenes. It’s a testament to Apple’s penchant for native integrations: lights turn on immediately because commands don’t go through a third-party server, and the entire flow is faster than asking Siri to activate an accessory. I was a heavy user of third-party HomeKit widgets and apps before; on iOS 10, I have no reason to do that anymore thanks to Control Center.\nIf Apple didn’t have big plans for the connected home, they wouldn’t have given HomeKit its own section in Control Center. With HomeKit expanding to new accessory lines, I think it’s going to be my second most used card after music.\nExtended Control\nAfter three years, Control Center is growing up. To make the pendulum swing back towards simplicity, Apple has traded some convenience of the original design for three standalone pages. By unbundling functionality in discrete units, Control Center is more legible, usable, and flexible.\nThere are missteps. 
The lack of any kind of user customization is inexcusable in 2016. The bottom row of shortcuts, down to four icons again, still can’t be modified to accommodate user-selected apps. Nor can you swap the toggles at the top for settings you access frequently.\nHalf-baked integration with 3D Touch feels like a timid attempt to take Control Center further. The addition of quick actions for apps in the first page is laudable, but why isn’t the same true for toggles at the top as well? And if HomeKit accessories can show nested detail views, why can’t Apple Music display a lyrics screen, too?\nI want to believe that iOS 10’s Control Center is foreshadowing the ability for developers to provide their own “app pages” and for users to swap default shortcuts with their favorite ones. More than ever before, Control Center is ripe for extensibility and personalization. As with widgets, I can see a future where we interact with some types of apps primarily through mini interfaces in Control Center.\nI wouldn’t have expected pagination to be what I wanted, but Apple was right in rethinking Control Center as a collection of pages rather than a complex unified dashboard. The majority of iOS users won’t be affected by Apple’s design trade-offs; they’ll appreciate a screen that doesn’t need a manual.\nThe new Control Center experience isn’t a regression; it’s a much-needed reassessment of its role in the modern iOS.\n\n\nMore 3D Touch\nAs is evident by now, Apple has increased the presence of 3D Touch in iOS 10. On top of notifications, Control Center, and the Home screen, 3D Touch actions have been brought to more apps and system features.\nNotification Center\nLike on the Apple Watch, you can press on the Clear button in Notification Center to clear all notifications in one fell swoop. 
Finally.\n\nSiri App Suggestions\nApps suggested by Siri support 3D Touch to show the same quick actions available on the Home screen.\n\nApple Music\nAmong many changes, Apple Music has been given the extended 3D Touch treatment with a contextual menu for selected items and playback controls. Pressing a song or the bottom player brings up a list of options that include adding a song to a library, liking it, saving it to a playlist, or opening lyrics.\nManage Downloads\nWhen downloading apps from the App Store or restoring a device from an iCloud backup, you can press on an in-progress download to pause it, cancel it, or prioritize it over others.\nShare Apps\niOS 10 automatically adds a Share button to an app’s quick action menu on the Home screen to share its link with friends. Presumably, this is meant to bolster app discovery and sharing among users.\nBeta Feedback\nPressing on the icon of a TestFlight beta app shows a shortcut to send feedback to the developer via Mail.\nThe pervasive use of 3D Touch in iOS 10 proves Apple wants it to be an essential iOS feature. After using iOS 10, going back to iOS 9 feels like missing several layers of interaction.\nThis creates an even stronger tension between 3D Touch-capable iPhones and devices without it. Right now, Apple is resorting to swipes and long-taps to simulate 3D Touch on iPads and older iPhones; will they always be able to maintain backwards compatibility without making more features exclusive to 3D Touch?\n\n\nMessages\niMessage is a textbook example of how a feature can turn into a liability over time.\nWhen it was introduced five years ago, iMessage promised to bring a grand unification of SMS and free, unlimited texting with media attachments. iMessage turned Apple’s Messages app into a single-stop solution for conversations between iOS users and those who would later be known as green-bubble friends. 
It was the right move at the time12, and it allowed Apple to have a communication service as a feature of iOS.\nOver the last five years, messaging has outgrown texting. Meanwhile, iMessage (the service) and Messages (the app) have remained stuck in their ways.\nServices like Facebook Messenger, WhatsApp, LINE, and WeChat haven’t only reached (or surpassed) iMessage in terms of users; as mobile-first messaging apps without SMS’ technical (and conceptual) debt, they have been able to relentlessly iterate on design, novel messaging concepts, notifications, and app integrations.\nThese companies, free of past constraints, have envisioned new ways to communicate. They’ve grown messaging apps into platforms, enabling others to extend them. And maybe some of the current messaging trends will turn out to be fads, but it’s hard to argue against Apple’s competitors with their numbers, cultural influence, and progressive lock-in. They’re no joke, and Apple knows it.\nBut I wouldn’t ascribe iMessage’s slow pace of evolution to its SMS legacy alone. Because of its end-to-end encryption and Apple’s strict policy on not storing sensitive user information, iMessage is by nature trickier to extend. Apple’s efforts in this area are commendable, particularly when you consider how the aforementioned services diminish in functionality once you add encryption.\nHowever, security hurdles shouldn’t be an excuse for iMessage’s glaring shortcomings. As laudable as Apple’s stance is, most users aren’t willing to put up with an app that feels old. They want to liven up conversations with rich graphics and apps. They want messaging to be personal. Technologists won’t like this, but, ultimately, people just want a modern messaging app that works.\nFrom a user’s perspective, it’s fair to say that Apple has been too complacent with iMessage. The service is by no means a failure – it serves hundreds of millions of users every day. 
But those metrics don’t matter when stasis yields something worse than numbers alone: cultural irrelevancy. That iMessage, as many see it, “is just for simple texting”.\nThe time has come for iMessage to take the next step. With a willingness to welcome developers into its most important app, and without giving up on its security ideals, Apple is reshaping how users can communicate, express themselves, and share. With iMessage in iOS 10, Apple is ready to embrace change.\n\nApp Changes\nBefore delving into the bigger enhancements to Messages, I want to touch upon changes to the app’s interface and some minor features.\nThe conversation’s title bar has been redesigned to embed the recipient’s profile picture. Having a photo above a conversation helps identify the other person; the increase in title bar height is a trade-off worth accepting.\nThere’s new artwork for contacts without a profile picture, too.\nThe profile picture can be tapped to open a person’s contact card; and, you can press it to bring up a 3D Touch menu – the same one available in Contacts and Phone with a list of shortcuts to get in touch with that person.\n\niOS 10 brings a new layout for the bottom conversation drawer. By default, a conversation opens with a narrow text field and three icons next to it – the camera, Digital Touch, and the iMessage app launcher. As you tap into the text field to reply to a message, the three icons collapse into a chevron that can be expanded without dismissing the keyboard.\n\nApple has also redesigned how you can share pictures and videos. The new media picker consists of three parts: a live camera view to quickly take a picture; a scrollable grid of recent items from your library; and buttons to open the full camera interface or the photo library, accessed by swiping right.\n\nThe assumption is that, on iMessage, people tend to share their most recent pictures or take one just before sharing it. 
The live camera view can be used to snap a photo in a second (you don’t even have to tap on the shutter button to take it). Moving the camera and library buttons to the side (hiding them by default) has freed up space for recent pictures: you can see more of them thanks to a compact grid UI.\n\nSome won’t like the extra swipe required to open the camera or library, but the live photo view makes it easier to take a picture and send it.\nAfter picking or taking a picture, you can tap on the thumbnail in the compose field to preview it in full screen. You can also tap and hold a picture in the grid to enter the preview screen more quickly.13\nMarkup inside Messages.\nHere, you have two options: you can edit a picture with the same tools of the Photos app (albeit without third-party app extensions) or use Markup to annotate it. You can tap on the Live Photo indicator to send a picture without the Live part, or press on it to preview the Live Photo.\nSpeaking of photos, iMessage now lets you send images at lower quality, likely to save on cellular usage. You can enable Low Quality Image Mode in Settings -> Messages.\nOne of the oldest entries of my iOS wish lists is also being addressed in iOS 10: you can choose to enable read receipts on a per-conversation basis.\n\nIf you, like me, always keep read receipts turned off but would like to enable them for important threads, you can do so by tapping the ‘i’ button at the top of a conversation and then ‘Send Read Receipts’. The toggle matches the default you have in Settings and it can be overridden in each conversation.\nRicher Conversations\nWhile Messages may not look much different from iOS 9 on the surface, the core of the app – its conversation view – has been refreshed and expanded. iMessage conversations have received a host of new features in iOS 10, with a focus on rich previews and whimsical, fun interactions.\nLinks\nIn its modernization of iMessage, Apple started from web links. 
After years of plain, tappable URLs, Messages is adopting rich link previews, which are inspired by iOS 9’s link snippets in Notes, but also more flexible and capable.\nRich links aren’t a special setting of the app: the first time you receive a link in an iMessage conversation in iOS 10, it’ll appear as a ‘Tap for Preview’ button in the conversation. This is a one-time dialog to confirm you want to load links as rich previews instead of URLs, which also look different than they did in iOS 9.\nLoading a rich link for the first time in iOS 10.\nLike in Notes (and other services such as Slack and Facebook), rich previews use Open Graph meta tags to determine a link’s title, featured image, audio and video file, or description. A web crawler has been built into Messages: as soon as you send a link, the message’s bubble will show a spinner, and, depending on the speed of your Internet connection, it’ll expand into a rich message bubble after a second, within the conversation.\nPaste, fetch, expand into rich link.\nRich link previews in Messages use the same technology Apple brought to Notes last year, but they’ve been designed differently. They’re message bubbles with a title and domain subtitle; the upper section, where the featured image of a link is, can grow taller than link snippets in Notes. Web articles tend to have rectangular image thumbnails; podcast episodes shared from overcast.fm are square; and links to iPhone apps shared from the App Store show a vertical screenshot.\nMultiple types of shared links in Messages.\nFurthermore, the behavior of sharing links differs between Notes and Messages. Allow me to get a bit technical here.\nIn Notes, only links captured from the share extension are expanded into rich previews; pasting text that contains a link into a note doesn’t turn the link into a rich preview.\nNotes: rich links and plain URLs.\nIn Messages, both individual links and a string of text with a link will generate a rich preview. 
In the latter case, the link has to be either at the beginning or at the end of a sentence. Messages will break up that single string into two pieces: the link’s preview, and the string of text without the link. Even sending a picture and a link simultaneously will create two message bubbles – one for the image, another for the link.\n\nThe only instance where Messages will resort to the pre-iOS 10 behavior of a rich text (tappable) URL is when the link is surrounded by text:\n\nUnless a link is placed inside a sentence, iOS 10 will never show the full path to the URL – only the root domain. Whether meta tags can’t be crawled14 or a link is successfully expanded, the URL will be hidden. If you need to see the full URL of a link in Messages, you can long-tap the link to show it in a contextual menu.\n\nThere are multiple types of link previews in iOS 10. The majority of websites with a social presence (including MacStories) have added support for Open Graph meta tags and Facebook/Twitter cards, and their links will appear with a featured image and a title. Alas, Apple hasn’t brought Safari View Controller support to Messages, so the experience of following links isn’t as seamless as it is on Facebook Messenger.\nTwitter links have been nicely formatted by Apple: they have a special light blue background and they display a tweet’s text, username, media (except GIFs), and avatar.\nTwitter links on iMessage.\nFor Apple Music, the company has created a rich preview that, in addition to artwork, embeds a native play/pause button to listen to songs without leaving Messages. Unlike other web links, you can’t peek & pop Apple Music links, suggesting that it’s a custom implementation that uses an underlying URL to assemble a special message bubble.\nApple Music links (left) vs. 
SoundCloud and Spotify.\nThird-party companies can’t take advantage of this – neither Spotify nor SoundCloud links have a playback UI, and they’re treated as webpages with a featured image.\nOvercast Links\n\nMy favorite podcast player, Overcast, supports sharing links to individual episodes that can also be listened to via the web. When sharing Overcast links, Messages will expand them into media thumbnails with a button to start playback inside the app.\niOS 10 treats Overcast links as media that starts playing with no audio by default. There’s a button to turn on the volume, but no additional controls for playback – and it’s not even clear if Messages streams these files, downloads them once, or downloads them every time. I wouldn’t recommend listening to a podcast inside Messages, but it’s possible.\nOther Apple apps with the ability to share links don’t fare as well as Apple Music. App Store and iTunes links show a title, an icon and screenshot, and app categories; you can’t install an app or watch a movie trailer inside Messages. Links to photo albums shared on iCloud.com don’t support rich previews in Messages, and shared notes only come with an icon and the title of a note.\nYouTube links get expanded into a playable video preview that you can tap once to play, and tap again to pause. There are no additional controls (let alone a progress bar), but it’s great to be able to watch a YouTube clip inline without being yanked to the YouTube app.\nPlaying YouTube videos inside Messages.\nMessages will even pause playback if you scroll down in the conversation, and resume it as you focus on the video again. It’s a nice touch.\nRich link previews embody the idea of stages of change and how Apple often adds functionality to iOS.\nUsers will see them as a new feature of Messages, which allows everyone in a thread to see a preview of the destination page. In some cases, message bubbles can even play media. 
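Messages’ crawler itself is private, but the Open Graph lookup it relies on is an open standard and easy to illustrate. Here’s a minimal, hypothetical sketch in Python (the `preview_for` helper and the sample page are invented for demonstration; Apple’s implementation is, of course, nothing like this):

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects Open Graph <meta property="og:..."> tags from a page."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            # First occurrence wins, as most crawlers read the head top-down
            self.tags.setdefault(prop, attrs["content"])

def preview_for(html: str) -> dict:
    """Return the fields a rich link bubble needs: title, image, description."""
    parser = OpenGraphParser()
    parser.feed(html)
    return {
        "title": parser.tags.get("og:title"),
        "image": parser.tags.get("og:image"),
        "description": parser.tags.get("og:description"),
    }

page = """
<html><head>
<meta property="og:title" content="iOS 10: The MacStories Review">
<meta property="og:image" content="https://example.com/hero.jpg">
<meta property="og:description" content="Federico Viticci reviews iOS 10.">
</head><body>Article body here.</body></html>
"""

print(preview_for(page))
```

A page without these tags would come back with empty fields – which matches the fallback behavior described above, where Messages hides everything but the root domain.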
I like how links get expanded inline; plain URLs in old iOS 9 message threads feel archaic already.\nLink previews also build upon Apple’s work with Universal Links and adoption of open standards such as Open Graph and Schema.org for search. The same technologies Applebot and Spotlight have been using for years now power link previews in iMessage.\nI’d like to see Apple open up link previews with more controls for developers in the future, but this is a solid start.\n\nEffects\nWith iOS 10, even how you send a message can be different. The blue ‘Send’ button has been replaced by an upward-facing arrow; tapping it once sends a regular iMessage as usual.\nWithin the arrow lies a secret, though. Press with 3D Touch (tap and hold on the iPad), and you’ll bring up a ‘Send with effect’ screen, which lets you send a message with Bubble and Screen effects.\n\nLet’s start with bubbles, as I believe they’ll be the more popular ones. There are four types of bubble effects, and they support any type of content you can share in Messages – text, emoji, media, and links.\nSlam\nYour message flies across the screen and is slammed to the ground, causing an invisible shock wave to ripple through adjacent messages.\n\n \n\nBest used when you really want to be heard or make a point. Or for shaming a friend with an ugly selfie from the night before.\nLoud\nA more polite version of Slam that enlarges the message without affecting nearby bubbles.\n\n \n\nThe way the text shakes briefly inside the bubble suggests this is appropriate to shout something, either in anger or happiness, without necessarily destroying everything around you.\nGentle\nApple’s version of a kind, intimate whisper. Gentle starts with a slightly larger bubble containing small text, which will quickly grow back to normal size as the bubble shrinks down.\n\n \n\nPersonally, I think Gentle is ideal for dog pictures as well as the “I told you so” moments when you don’t want to upset the recipient too much. 
At least you’re being gentle about it.\nInvisible Ink\nI won’t explain the ideal use cases for this one, leaving them up to your imagination. Invisible Ink obfuscates the contents of a message and it’s the only interactive bubble of the four.\n\n \n\nTo reveal text hidden by Invisible Ink, you have to swipe over the bubble to remove the magic dust that conceals it. It can be wiped off from notifications, too. Invisible Ink is automatically re-applied after ~6 seconds.\nInvisible Ink gives you the time to make sure no one is looking at your screen. Ingenious.\nBubble effects may not appeal to iOS power users, but they’re a lot of fun, they’re whimsical, and they add personality to conversations.\nBubble effects in iOS 10.\nFrom a technical standpoint, the implementation of 3D Touch is spot-on: you can hold down on the Send button and scroll to preview each bubble effect before sending it. If you receive a message with a bubble effect, it’ll only play once after you open the conversation – they won’t be constantly animating. I’ve been using them with friends and colleagues, and like them.\nScreen effects are a different story. Unlike bubble effects, they take over the entire Messages UI and they play an animation with sounds that lasts a couple of seconds. Screen effects are deliberately over the top, to the point where they can almost be gaudy if misused. Lasers, for instance, will start beaming disco lasers across a conversation.15 Shooting star will cause a star to fly through the screen with a final “ding” sound, while fireworks will put up celebratory explosions, turning the app’s interface dark as you gaze into the virtual New Year’s night of iMessage.\nHere’s what they look like:\nBalloons\n\n \n\nConfetti\n\n \n\nLasers\n\n \n\nFireworks\n\n \n\nShooting Star\n\n \n\nMy problem with screen effects is that they can be triggered by certain keywords and phrases without any prior warning. 
Texting “congrats” will automatically fire off the Confetti effect, which is nice the first time, but gets annoying quickly when you find yourself texting the expression repeatedly and being showered in confetti every time. The same is true for “happy new year” and “happy birthday”, which will bring up Fireworks and Balloons without the user’s consent.\nI use screen effects occasionally to annoy my friends and throw confetti when I feel like it – but the automatic triggering feels almost un-Apple in its opaque implementation. There should be an indicator, or a setting, to control the activation of screen effects, or Apple should abandon the idea altogether, letting screen effects behave like the bubble ones following a user’s command.16\nScreen effects aren’t the most exciting aspect of the new iMessage, but they bring some unexpected quirkiness into the app, which isn’t bad either. Just use them responsibly.\n\nDigital Touch and Handwriting\nWhen Apple introduced Digital Touch on watchOS in 2014, it was safe to assume it’d eventually find its way to iOS. Two years later, Digital Touch has been built into Messages in iOS 10, gaining a prominent spot between photos and the new iMessage App Store.\nDigital Touch can be activated from the heart icon with two fingers – a reminder of its Apple Watch legacy. Tapping the button turns the lower half of the screen into an interactive pad where you can draw, send taps and heartbeats, and annotate photos and videos.\n\nDigital Touch has three sections: a color picker along the left side (where, like on the Watch, you can long-tap a color to pick another one); a drawing area in the middle; and icons explaining Digital Touch features rotating on the right. At the bottom, a chevron lets you open Digital Touch in expanded mode, taking over the conversation in full-screen.\n\nThere isn’t much to say about the functionalities adapted from watchOS. 
Sketches are easier to create thanks to the bigger screen, though I think that, to an extent, the constraints of the Watch incentivized creativity. Also, sketches look like images with a black background pasted into conversations: they’re animated, but they don’t feel as integrated as they used to be on the Apple Watch. They look like simple image attachments on iOS 10.\nTaps and heartbeats are the kind of features someone decided to bring over to iOS so they wouldn’t go to waste. They fundamentally feel out of place on iOS given the absence of the Watch’s wrist-based haptic feedback and their black background.\nWhen you receive a tap from someone on the Apple Watch, you feel a tap on your wrist. Taps are animated images on iOS 10 and there’s nothing special about them. The physical connection is lost. Apple could have made taps part of the conversation view, letting them ripple through bubbles like effects do, or use vibration as feedback; instead, they settled on GIFs.\nHeartbeats are even more baffling, as they aren’t “real” heartbeats due to the lack of a heart rate sensor on iOS. When you hold two fingers on the screen to send your heartbeat on iMessage17, iOS generates a generic animation that isn’t a representation of anyone’s heartbeat. The sense of intimacy watchOS fostered thanks to Digital Touch and its heart rate sensor – of knowing that the heartbeat animation represented the actual beating heart of a friend or partner – isn’t there on iOS.\nAnd don’t get me started on the sadness of swiping down with two fingers to send a heartbreak.\n\nThen there’s 3D Touch, which is used in Digital Touch to send “fireballs”. If you press on the Digital Touch pad, iOS 10 creates a pulsing fireball that will be sent as an animated image.\nThat’s a fireball.\nI’m not sure what to make of the fireball – does sending one show you’re thinking of someone? That you’re upset with them? That you’ve realized 3D Touch exists in iMessage? Is it a reference to John Gruber? 
It’s an open-ended question I’ll leave to the public to resolve.\nThe standout Digital Touch feature is one that has been built around the iPhone’s hardware. Tap the video icon, and you’ll bring up a camera UI to sketch on top of what the camera is seeing. You can also add Digital Touch effects in real-time while recording a 10-second video (to take a picture, tap the shutter icon).\nEffects with the Digital Touch camera are fun.\nThe combination of sketches and kisses with videos is fun and highly reminiscent of Snapchat; I’ve been using it to send short clips with funny/witty comments or sketches drawn on top of them. Apple should add more iOS-only “stamps” or animations to Digital Touch for photos/video without copying what they’ve done on watchOS.18\nUnrelated to Digital Touch, but still aimed at making conversations more personal, is handwriting mode.\n\nAnyone who’s familiar with handwritten signatures in Preview and Markup will recognize it: handwriting can be accessed by tapping the ink button on the iPad keyboard or turning the iPhone sideways. It opens an empty area where you can handwrite a message in black ink using your finger (or Apple Pencil). There’s a list of default and recent messages at the bottom (which can be deleted by long-tapping them), and no additional controls.\nHow handwritten messages look in conversations.\nI found handwriting mode to be nicer than Digital Touch. Handwritten messages aren’t contained in a black image and ink animates beautifully19 into the conversation view, which creates the illusion that someone has written a message for you inside Messages instead of sending an image attachment. It’s a better integration than Digital Touch.\nNew Indicators\n\nMessages displays a special typing indicator for new kinds of message bubbles in iOS 10. My two favorites: the sketched bubble for handwritten messages, and the black one for Digital Touch. 
Even apps support custom typing indicators with their own icons.\nDigital Touch on iOS 10 could have used more work. Features that had some reason to exist on watchOS’ hardware have been lazily ported to iOS, removing the physical interaction and feedback mechanism that made them unique on the Watch.\nI’m not sure the iOS Digital Touch we have today is worth giving up a premium slot as a default iMessage app next to the Camera. It’s a “Friends button” scenario all over again. I wouldn’t be surprised if that permanent placement becomes customizable next year.\nTapback\niOS 10 brings new options to react to messages, too.\nCalled Tapback, the feature is, essentially, Apple’s take on Facebook’s redesigned Like button and Slack’s reactions. If you want to tell someone what you’re thinking without texting back, you can double tap20 a message – any kind of bubble – to bring up a menu with six reactions: love, thumbs up, thumbs down, ha-ha, exclamation points, and question mark.\nSending a Tapback.\nThe interaction of Tapback is delightful. Icons animate when you tap on them, and they play a sound effect once attached to a message. You can’t create your own reactions by picking any emoji like on Slack, but, looking at a conversation with a bunch of hearts, thumbs-ups, and ha-has, the feeling is the same.\n\n \nTapback\n\nTapbacks are especially effective in group threads where everyone can “vote” or express their immediate reactions without typing. A Tapback can be changed at any point during a conversation, but you can only leave one reaction per message.\nIf what happened in my Slack teams over the past year is any indication, Tapback should become a useful way to let someone know you’ve acknowledged or liked their message without writing anything back.\nEmoji\nSlack’s influence on iMessage has propagated to emoji as well. 
Messages that only contain emoji (no text) will be sent as big emoji (at 3x), so you can truly appreciate the details that make up Apple’s most popular characters.\nRegular and big emoji.\nI’ve been a fan of jumbo emoji since Slack rolled them out last year. They’re a perfect fit for iMessage. Emoji are expanded in the text field before sending them – I chuckle every time I see a big thinking face about to enter a conversation. Messages will only display up to three big emoji at a time; if you create a message containing four emoji, they’ll be sent at normal size.\nEmoji improvements don’t stop there. Apple must have noticed that users like to write messages and replace words inside them with appropriate emoji, and they’re introducing an option to automate the process in iOS 10. Possibly, this innocuous feature (which only works in Messages) is even going to feed Apple’s Differential Privacy-based crowdsourced data collection.\nIf you write a message in iOS 10 and then open the emoji keyboard, the system will scan words you’ve entered in the text field and try to match them up with emoji. If a related emoji is found, a word will be highlighted in orange. Tap it, and it’ll be replaced with the emoji.\nTap the emoji keyboard to replace words with emoji.\nIf multiple emoji options are available for a single word, tapping it opens a menu to choose one.\nMultiple emoji options.\nI’m not exactly the target audience for this feature (I either only send emoji or put some next to a word), but I recognize that a lot of people treat emoji as substitutes for words. Apple devised a clever and thoughtful way to “emojify” text, letting the OS compensate for a search box still missing from the emoji keyboard.\nUnder the hood, emoji replacements hinge on a system that has to build up associations and trigger words, follow trends, and adapt for international users and different meanings of the same emoji around the world. 
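To make the matching mechanics concrete, here’s a toy sketch in Python. The seed dictionary and helper names (`EMOJI_FOR_WORD`, `emoji_candidates`, `replace_word`) are invented for illustration; Apple’s actual matcher is curated, multilingual, and refined against aggregate usage:

```python
# Invented seed mapping; Apple's curated list covers 1,800+ emoji.
EMOJI_FOR_WORD = {
    "pizza": ["🍕"],
    "dog": ["🐶", "🐕"],
    "love": ["❤️", "😍"],
}

def emoji_candidates(message: str) -> dict:
    """Find the words the emoji keyboard would highlight in orange,
    mapped to their replacement options."""
    candidates = {}
    for raw in message.split():
        word = raw.strip(".,!?").lower()
        if word in EMOJI_FOR_WORD:
            candidates[word] = EMOJI_FOR_WORD[word]
    return candidates

def replace_word(message: str, word: str, choice: int = 0) -> str:
    """Swap one tapped word for the chosen emoji. When a word has several
    options, Messages shows a menu; `choice` stands in for that pick."""
    options = emoji_candidates(message).get(word)
    if not options:
        return message
    return message.replace(word, options[choice], 1)

print(replace_word("i love pizza", "pizza"))   # i love 🍕
```

The hard part isn’t the lookup, of course – it’s building and localizing the mapping itself, which is where the crowdsourcing comes in.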
Based on what Apple has revealed about Differential Privacy, data on emoji picked by users will be collected in aggregate to improve the accuracy of suggestions.\nMy understanding is that Apple started from a set of words curated from common expressions and Unicode annotations, and began scaling to millions of users and dozens of languages for over 1800 emoji during the iOS 10 beta stage. In my case, emoji replacements worked well for both English and Italian.\nCrowdsourcing this aspect of iMessage makes sense given the popularity and many meanings of emoji. It’ll be interesting to see how suggestions will be refined as iOS 10 usage picks up.\n\niMessage as a Platform\nDespite numerous design updates and enhancements to conversations, the most profound change to iMessage isn’t the app itself – it’s other apps developers will build for it.\nApple is opening iMessage to developers in iOS 10, turning it into a platform that can be extended. The company has created a Messages framework for developers to plug into and build apps, which will be available on the new iMessage App Store.\nThe stakes are high. For millions of users, their messaging app is a second Home screen – a highly personal, heavily curated gateway to contacts, private conversations, and shared memories. Messaging isn’t just texting anymore; it’s the touchstone of today’s mobile lifestyle, a condensation of everything smartphones have become.\nApple won’t pass up this opportunity. Not this time. In opening up their most used app, Apple hopes that developers will take iMessage further with new ways to share and enrich our conversations.\niMessage App Store\nDevelopers can write two types of Messages extensions in iOS 10: sticker packs and iMessage apps. 
Both can be discovered and installed from the iMessage App Store embedded into the Messages app, and both can be created as standalone apps or as extensions within a containing iOS app.\nYou can access the iMessage App Store with the apps button next to the input field. Messages will hide the keyboard and bring up a scrollable gallery of all your installed Messages extensions, opening the last used one by default. Apps are organized in pages and you can swipe between them. You can expand the currently selected app with the chevron in the lower right, and browse recent content from all apps via the leftmost page.\nOpening the last used iMessage app (left) and the Recents page (right).\nThere’s also a way to view a Home screen of iMessage apps as icons. If you tap on the icon in the bottom left corner, you’ll be presented with a grid of oval icons (the shape for iMessage apps) and a ‘+’ button to open the iMessage App Store.\nThe iMessage app drawer (left) and the new iMessage App Store.\nThis view has been designed to resemble the iOS Home screen: you can swipe horizontally across apps, you can tap & hold to delete them and rearrange them, and you can even click the Home button to stop wiggling mode.21\nThe iMessage App Store\n\nIn addition to the Home screen-like app drawer, iMessage apps can be deleted from the Manage view of the iMessage App Store. Like on the Apple Watch, you can choose to automatically add iMessage apps on your device to your app drawer (if you install an iOS app that contains an iMessage extension). I wish the normal App Store supported deleting apps through toggles like the iMessage App Store does.\nAs for content on the iMessage App Store, it’s been modeled after the watchOS App Store with a front page collecting featured apps, categories, games, and other curated picks by Apple. 
At launch, Apple is heavily promoting sticker packs from brands such as Disney and Nintendo, but there’s a good variety of apps and stickers from smaller developers as well. One nice detail: on the iMessage App Store, Apple is using emoji in section names, which isn’t the case for other App Stores.\nI’m more optimistic about the iMessage App Store as it’s embedded in Messages, which should encourage more users to check it out.\nI like the idea of an iMessage SpringBoard, but it takes too many taps to open it22, especially if you want to launch an app in a hurry. Apps are tucked away behind three taps, and I wonder how that will impact usability in the long run. Right now, the compact app drawer (with the dots at the bottom) doesn’t scale to more than 30 installed apps and it feels like the equivalent of the Slide Over app picker from iOS 9; there has to be a faster way to navigate and launch iMessage apps.23\nPerhaps a Messenger-like design with top launchers embedded above the keyboard would have been a preferable solution.\nStickers\niMessage stickers can be seen as Apple’s response to the rise of third-party “emoji” keyboards that offer selections of sticker-like images, usually in collaboration with brands and celebrities. If you’ve seen the likes of KIMOJI, Justmoji, PetMOJI, Bitmoji, and literally anything -moji on the App Store lately, you know that’s an aspect of iOS Apple could improve for both users and developers.\nWhat some third-party companies try to sell as “custom emoji” aren’t really emoji: they are images that can be pasted in conversations.24 Developers don’t control the availability of emoji in Apple’s keyboard, nor can they alter what is defined as emoji in the Unicode specification. By manipulating the public’s perception of what an emoji is, and by leveraging custom keyboards to make their “emoji” look like part of iOS, some developers were able to carve themselves a profitable niche on the App Store. 
Just ask Kanye West, who made a million a minute.25\nHowever, I don’t blame developers for trying and riding on the coattails of emoji.26 I’d argue that a lot of companies settled on the “moji” suffix because iMessage was the only big messaging service without native sticker support, and emoji were already in the typical iOS user’s vocabulary.\nStickers provide an enormous opportunity for developers and, yes, brands to give users fun ways to express their feelings with a wider array of emotions and contexts than emoji alone. Look at LINE, and the massive, multi-million dollar success of Cony and Brown and revenue from their Creators Market; think about Twitter and how they commissioned a set of stickers to be used on photos.\nIf every major messaging platform has found stickers to be popular and profitable, there must be something to them that appeals to people. With iOS 10, Apple, too, wants a piece of the action and is letting developers create sticker packs for iMessage. The goal is to entice users to personalize their iMessage conversations with stickers, download additional packs, and spread usage with friends. The company plans to do so with a superior experience to custom keyboards, with the prospect of a new gold rush for developers.\nStickers live in the standard sticker browser – the compact view that opens after choosing a sticker pack from the app drawer. 
This area can have a custom background color and it’s where you can interact with stickers.\nTwo sticker packs.\nYou can tap on a sticker to place it in the input field and send it individually, or you can peel it off the browser and drag it around in a conversation.\nTapping a sticker to send it (left) and peeling it off (right).\nThe animation for peeling stickers off the browser and re-attaching them is some of Apple’s finest OpenGL work in a while.\n\n \nAttaching stickers\n\nYou can attach stickers to any message bubble in the transcript: you can put one next to a text message, cover a photo with multiple stickers, or even put a sticker atop another one or a GIF. Want to peel off a sticker and use it on an older message? Drag it over the title bar, wait for the conversation to scroll back, and attach it wherever you want. How about covering your friend’s eyes with googly eye stickers? You can do that too.\nThings can get out of hand quickly.\nOnce a sticker has been placed in a conversation, you can tap and hold it to open the sticker details. This is also how you view all stickers that cover a message bubble, with buttons to download the complete packs on the iMessage App Store27. Here, you can swipe on a sticker to delete it from the selected message bubble if you no longer want to see it.28\nOpening sticker details.\nYou’ll come across two kinds of sticker packs. There are the basic ones, which are a collection of images displayed inside a sticker browser. This will probably be the most popular choice for developers, as creating these packs doesn’t require a single line of code. If you’re a developer and want to sell a sticker pack on the iMessage App Store, all you need to do is drop some image files into an Xcode sticker pack project, add icons, and submit it to Apple.29\nStickers can also be rotated and enlarged using pinch gestures.\nThe second kind are sticker packs with a custom sticker browser or other additional features. 
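Such a custom browser is built by subclassing a dedicated view controller from the Messages framework. A minimal sketch, assuming a pack that bundles its own images (the class name and file names are hypothetical):

```swift
import Messages

// Sketch of the second kind of pack: a Messages extension that builds
// its own sticker browser instead of relying on a no-code sticker pack.
class StickerPackViewController: MSStickerBrowserViewController {
    var stickers: [MSSticker] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        loadStickers()
    }

    private func loadStickers() {
        // Load bundled images as stickers; a real app might assemble
        // these dynamically (e.g. user-created emoji combinations).
        for name in ["monkey_crown", "pineapple_pizza"] {
            if let url = Bundle.main.url(forResource: name, withExtension: "png"),
               let sticker = try? MSSticker(contentsOfFileURL: url,
                                            localizedDescription: name) {
                stickers.append(sticker)
            }
        }
    }

    // MSStickerBrowserViewDataSource: supply stickers to the browser.
    override func numberOfStickers(in stickerBrowserView: MSStickerBrowserView) -> Int {
        return stickers.count
    }

    override func stickerBrowserView(_ stickerBrowserView: MSStickerBrowserView,
                                     stickerAt index: Int) -> MSSticker {
        return stickers[index]
    }
}
```

With this approach, the peel-and-attach behavior in conversations stays identical to a basic pack; only the browsing UI and the sticker source change.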
Technically, these are iMessage apps that use the Messages framework for sticker functionality that goes beyond basic drag & drop. For instance, you may see apps where you can assemble your own stickers, or sticker packs with custom navigation elements and In-App Purchases. The sticker behavior in conversations is the same, but these packs require more work from developers.30\nFrom a user’s perspective, stickers make iMessage conversations feel different. More lively and fun, but also busier and messier if overused.\nI’ve been able to test about 30 different iMessage sticker packs from third-party developers in the past couple of months. One of the highlights is The Iconfactory, which leveraged their expertise in icons and illustrations to create some fantastic sticker packs for iMessage.\nAn example of what The Iconfactory has prepared for iOS 10. (Tap for full size)\nFrom Sunshine Smilies (emoji characters as stickers) and Tabletop RPG (role-playing emoji stickers) to Mystic 9 Ball and Dino, I believe The Iconfactory has found a perfect way to reinvent themselves for the iMessage era. They’re a great fit for stickers.\nDeveloper Raul Riera has created what I believe is going to be a popular type of custom sticker app: Emoji Stickers lets you put together your own stickers containing emoji characters.\nA custom emoji sticker.\nYou can create concoctions like a monkey wearing a crown or a pineapple pizza. This is done with a custom sticker-assembling UI and built-in emoji from the open source Emoji One set.\nMonstermoji, created by Benjamin Mayo and James Byrd, features beautifully hand-drawn monster characters you can attach to messages.\n\nThese stickers are unique, and they show how anyone can easily create a sticker pack and release it.\nI also like Anitate, a set of 80+ animated stickers by Raven Yu.\nLook at that sad pug with bunny ears.\nAnitate’s stickers are like animated emoji, redrawn for a flat style with animations. 
They’re fun and I’ve been using them a lot.\nLast, I want to mention Sticker Pals – by far, the most impressive and beautiful sticker pack I’ve tried. Designed by David Lanham in collaboration with Impending, Sticker Pals features a large collection of animated hand-drawn stickers for various emoji-like objects, symbols, and animals. The illustrations are gorgeous, animations are fun, and there are hundreds of stickers to choose from.\n\nSticker Pals is a good example of what can be achieved by creating a custom sticker browser with the Messages framework. There are buttons at the top of the browser to switch categories, and each tap corresponds to a different sound effect. Plus, the developers have devised a clever unlocking mechanism for extra stickers with an in-app store and the ability to send stickers as gifts to your friends – all within an iMessage app with a sticker browser.\nJudging from the number of pre-release sticker packs I received during the summer, I have a feeling the iMessage App Store team at Apple is going to be busy over the next few weeks.31\nWith iMessage stickers, Apple hasn’t just created a better way to paste images in conversations. They’re stickers in the literal sense – they can be attached anywhere, sometimes with questionable results, but always with a surprising amount of freedom and experimentation. Mixing multiple stickers at once on top of messages could become a new activity of its own32 – I know I’ve had fun placing them over photos of my friends.\nStickers are often looked down upon by the tech community because they seem frivolous and juvenile. But emoji were met with the same reaction years ago, and they’ve gone on to reinvent modern communication, trickling into pop culture.\n\nStickers are messaging’s lingua franca.\n\niMessage stickers probably won’t have the same global impact as emoji, primarily because they only work in iMessage33 and the service isn’t cross-platform.
But I also believe that stickers are the perfect addition to iMessage in 2016. Stickers are messaging’s lingua franca. Their adoption is going to be massive – bigger than custom keyboards have ever been. Stickers are lighthearted, fun to use, and they make each conversation unique.\nLet’s check back in a year and see how many sticker packs we have installed.\n\niMessage Apps\nThe iMessage platform’s opportunity lies in the second type of extensions available to developers: iMessage apps.\nLike sticker packs, iMessage apps are installed and managed from the iMessage App Store, they live in the Messages app drawer, and they support compact and expanded mode. They can be standalone apps or extensions within a containing iOS app.\nUnlike basic sticker packs, however, iMessage apps have to be programmed. They’re actual apps that can present a user interface with their own view controller. iMessage apps can:\nOffer more control to developers who want to build an interactive sticker browser;\nInsert text and media files in the input field;\nDisplay their custom UI;\nAccess iOS frameworks;\nCreate, send, and update interactive messages.\nWith iMessage apps, developers can bring their apps’ interfaces, data, and experience into Messages.\nExamples of iMessage apps.\n\nBecause of this, there are no limitations for what an iMessage app should look like. Anything developers can put in a view controller (bearing in mind compact mode and memory constraints) can be an iMessage app. Coming up with a miniaturized app that makes sense in Messages, though, will be just as hard as envisioning Watch apps that are suitable for the wrist.\nThere are some differences to consider for compact and expanded mode. In compact, apps cannot access the system keyboard and they can’t support gestures (horizontal swipes are used to navigate between apps and sticker packs). 
Only taps and vertical scrolling are available in compact mode.\niMessage apps in compact mode.\nIn expanded mode, both the system keyboard and gestures are supported. Developers can ask users to type information in the expanded layout, they can enable deeper gesture controls, and, generally speaking, they have more freedom in what they present to the user. When running in expanded mode, an iMessage extension that has a container app features an icon in the top left to launch the full app.\niMessage apps in expanded mode.\nIn-App Notifications\n\nA nice detail of Messages in iOS 10: when using Digital Touch, an app, or a sticker pack in expanded mode, notifications from the underlying conversation are displayed through small popups at the top, so you won’t miss what is being shared.\nThe other peculiarity of iMessage apps is that they can create interactive messages with special message bubbles. These bubbles are based on a template with some strict limitations. There’s only one layout apps can use. An interactive message can display an image, audio, or video file as the main content; the app’s icon is always shown in the top left; at the bottom, developers can set textual properties for the bubble’s caption bar including title, subtitle, captions, and subcaptions (the caption bar is optional).\niMessage apps can’t alter the standard layout of an interactive message, nor can they inject buttons around it. Any user interaction must be initiated from the bubble itself. iMessage apps can’t send interactive messages on the user’s behalf: they can only prepare an interactive message and place it in the input field.\n\nWhen an interactive message bubble is tapped, an iMessage app can bring up a custom interface to let participants view more information on the shared item or continue a task. 
Keep in mind, though, that if you tap on an interactive message to open it in full-screen when you don’t have the iMessage app to view it on your device, you’ll be taken to the iMessage App Store to install it.\nThe best way to understand what iMessage apps can do is to try some. Since June, I was able to test over 20 iMessage apps from third-party developers, and I have a general idea of what we should expect throughout the year.\nSupertop’s podcast client, Castro, will soon let you share your favorite episodes with an iMessage app. Castro loads a list of episodes you’ve recently listened to; tap one, and it’ll turn into a rich bubble embedding artwork and episode title.\nCastro’s iMessage app.\nThe best part: you can tap the bubble to open show notes in full-screen (and even follow webpage links inside Messages) and add an episode to your Castro queue. It’s a great way to share podcast episodes in iMessage conversations and save them with a couple of taps.\nDrafts, Greg Pierce’s note-taking app, has added an iMessage app extension to share notes with friends. You can browse all notes from your inbox or switch to Flagged messages.\nDrafts’ iMessage app.\nDrafts places a note’s plain text in Messages’ input field, ready to be sent. The iMessage app is going to come in handy to share commonly accessed notes and bits of text with colleagues.\nEver wished your GIFwrapped library – carefully curated over the years – was available in iMessage? With iOS 10, you’ll be able to paste your favorite GIFs without using a custom keyboard.\nSending GIFs with GIFwrapped on iMessage.\nI’ve been using GIFwrapped’s iMessage app to send GIFs of dogs smiling to my mom and girlfriend. They love them.\nAlongside a widget and rich notifications, CARROT Weather is coming to iMessage with an app to share weather forecasts in conversations. 
It’s a solid example of the flexibility granted to apps: CARROT for iMessage carries its custom UI and hilarious sound effects, and it displays rich graphics and animations. It can access your current location from Messages, and it even lets you search for locations in expanded mode, where you can browse a full-screen forecast of the upcoming week – all without leaving Messages.\n\nCARROT creates interactive messages that are prepared in the input field by tapping a Share button. These are bubbles with a custom preview graphic and text labels for location, temperature, and current conditions. If you receive one and tap on it, you’ll open CARROT’s expanded preview.\nDeveloped by Sven Bacia, Couchy is another iMessage app that sends interactive bubbles that present a full-screen UI when tapped. Couchy is a TV show tracker; on iMessage, it displays a list of recently watched and upcoming show episodes. Pick one, and Couchy will assemble a bubble with the series’ artwork and name of the episode.\nCouchy’s iMessage app.\nWhen you tap a Couchy message, you get an expanded preview of the episode with metadata and artwork fetched from trakt.tv, plus the ability to view the episode in the main Couchy app.\nETA, a navigation app I covered on MacStories before, is based on a similar design, using small snippets in compact mode for your favorite locations. Tap one, and the app will calculate travel time on the spot, preparing a message bubble to share with someone.\nETA’s iMessage app.\nThe interactive message can be tapped to view more details about the other person’s estimated travel time, as well as get directions to the same address. You can also collaborate on the same travel time and respond with your status (more on collaborative apps below) and search for locations directly from iMessage. ETA is one of the most useful, technically impressive iMessage apps I’ve tried.\nIt can get even more advanced than this, though. Snappy, for example, is a web browser for iMessage. 
You can search Google or paste URLs in a search box, or use search suggestions.\nBrowse the web inside iMessage with Snappy.\nOnce you’ve found a webpage you want to share in a conversation, you can tap a Send button to insert the link in the input field. The link, of course, will expand into a rich preview. Given Messages’ lack of Safari View Controller, Snappy can be useful to paste links and view them without leaving the app; it’s also a convenient way to look something up on Google while talking to a friend.\nPico, developed by Clean Shaven Apps, can send photos and videos at lower quality with deeper controls than Apple’s built-in Low Quality Image Mode for iMessage. After choosing media from the library, Pico opens a dark interface with a preview at the top and quality settings at the bottom. You can choose from four quality presets, compare savings with the original item, and tweak dimensions.\nComparing image savings with Pico.\nIn addition to downscaling, Pico can remove metadata from media, such as location details. The app remembers settings for each conversation, and, overall, it’s a great way to save on cellular data with more options than iMessage’s default solution.\nTouch ID can be integrated with iMessage apps, and Cipher uses the authentication framework to let you send “secret messages” encrypted with AES-256 that don’t appear in the transcript as normal text messages. Instead, Cipher generates custom bubbles that hide your text; on the other end, the recipient will have to authenticate with Touch ID (thus confirming the device isn’t being used by someone else) to read your message.\n\nYou can also send digitally-signed messages to prove it’s really you by typing in Cipher and “signing” with your Touch ID.\nThese are just a few examples of what developers can build with the Messages framework.
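To give a concrete sense of how these apps work under the hood, here is roughly what preparing an interactive message looks like with the Messages framework, using the template layout and the insert-into-input-field behavior described earlier. The podcast-sharing scenario and all names are illustrative, not any specific app's implementation:

```swift
import UIKit
import Messages

// Sketch of an iMessage extension preparing an interactive message.
// Note: the extension can only place the bubble in the input field;
// the user decides whether to actually send it.
class EpisodeShareViewController: MSMessagesAppViewController {

    func share(episodeTitle: String, artwork: UIImage?) {
        guard let conversation = activeConversation else { return }

        // The single template layout: main content plus an optional caption bar.
        let layout = MSMessageTemplateLayout()
        layout.image = artwork
        layout.caption = episodeTitle
        layout.subcaption = "Shared from a podcast app"

        let message = MSMessage()
        message.layout = layout
        // Encode app-specific state in the URL so the receiving
        // extension can reconstruct the shared item when tapped.
        message.url = URL(string: "https://example.com/episode?id=42")
        message.summaryText = "Shared “\(episodeTitle)”"

        // Places the bubble in the input field, ready to be sent.
        conversation.insert(message) { error in
            if let error = error { print(error) }
        }
    }
}
```

The fixed `MSMessageTemplateLayout` is why every interactive bubble looks structurally alike: apps fill in the image and caption fields but can't alter the layout or add buttons around it.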
Starting today, we’re going to see an avalanche of iMessage apps, but the best ones will stand out as intuitive utilities suited for sharing.\nCollaborative iMessage Apps\nAlong with single-user apps, Apple has emphasized the ability for developers to build iMessage apps that let users collaborate on a task inside Messages.\nIn a collaborative iMessage app, an interactive message can be modified by participants in a conversation. As two or more people interact with a message in the same session and update its content, Messages removes the previous message from the transcript, collapsing it into a succinct summary text (so that outdated messages with old data don’t pollute the conversation). Only the new message, with the updated content, is displayed at the bottom as normal.\nLet’s work with a fictional example.\nImagine that you’re planning a trip with your friends over iMessage. It’s always hard to keep track of everyone’s available times, so developer Myke Hurley has created 1-2-3 Trip Planner, an iMessage app that looks into a user’s calendar, brings up a custom calendar view in Messages, and lets the user choose up to three available slots in their schedule. Once three times are picked, 1-2-3 Trip Planner generates a message bubble with the user’s three choices as a title.\nStephen has created an iMessage conversation with two of his friends, and they want to plan a trip together. Stephen brings up 1-2-3 Trip Planner, finds some available slots in his weekend schedule, selects three of them, and sends a message. The interactive message shows “Available Times – Stephen” in the bubble, with the days of the week as the title.\nStephen creates the first 1-2-3 Trip Planner bubble.\nOn the other end of the conversation, Christina needs to look at her calendar and pick three available times. When she taps the 1-2-3 Trip Planner bubble, Stephen’s choices are displayed alongside her calendar events, and she can agree on the same slot, or pick a different one.
She then replies with her preferences, sending another message bubble.\nChristina replies with her schedule.\nJohn is the third participant in this conversation. In his iMessage transcript, Stephen’s first bubble has been collapsed into a summary text that says “Stephen picked three time slots” and Christina’s message says “Stephen and Christina’s time slots”. John is only seeing the latest message bubble with the choices of both users. When he taps on it, a full-screen interface comes up, showing him a calendar view with his events and the times Stephen and Christina previously picked.\nJohn picks his time slots and the trip is planned.\nJohn can also agree on previously chosen time slots or pick new ones. When he’s done, he sends his reply, and the second message bubble from Christina also turns into a summary text. John’s third and final bubble has a title that says “Stephen, Christina, and John”. At this point, the three participants are looking at one interactive message; they can look at the results and decide on a time that works for everyone.\nRight: what collapsing bubbles into summaries looks like.\nStephen, Christina, and John collaborated on a task within Messages without the back and forth of switching between calendars and texting each other’s available times. 1-2-3 Trip Planner has allowed multiple users to easily agree on a shared schedule in less than a minute.\nThere are two additional aspects worth noting. In my imaginary (but technically accurate) example, 1-2-3 Trip Planner accessed the native iOS EventKit framework; I’ve tried actual iMessage apps that accessed the camera, location, photos, and the clipboard. Also, Apple is very concerned about user privacy and exposing contact information to iMessage apps. 
For this reason, the Messages framework doesn’t allow apps to see any details about the participants in a conversation, but only local identifiers (alphanumeric strings that don’t identify a single person).34\nThe framework Apple has built into Messages should, in theory35, allow for the creation of moderately complex collaborative apps. Calendar collaboration is just one possible use case; imagine utilities to split bills, todo apps, photo compositions, and even games.\nWhat About Apple’s iMessage Apps?\nIt would have been nice to have more examples of iMessage apps from Apple. The company built an Apple Music app for iMessage to share songs you’ve recently listened to, which are displayed as interactive previews in the transcript.\n\nSongs can be shared quickly with 3D Touch, and the design of this app has inspired other third-party iMessage apps I’ve tested. However, not offering apps for Notes, Calendar, and other system apps feels like an oversight.\nI tested a couple of straightforward collaborative iMessage apps in the past few weeks. The aforementioned ETA iMessage app lets you respond to a friend’s travel time with another interactive message.\nETA’s bubbles and summaries.\nAnother app is ChibiStudio, which lets you assemble “chibi” avatars either by yourself or with a friend choosing from various pieces of clothing and body traits.\nCollaborating on character creation on iMessage.\nWhen creating a chibi collaboratively, each person can add new details to the character and send an interactive message back. To keep track of progress, the app tells you which items have been added in the title of the message bubble and it collapses previous messages into summaries. 
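The collapsing behavior these apps rely on is driven by sessions: an update sent with the same session object replaces the previous bubble, which Messages collapses into its summary text. A rough sketch, using the fictional trip-planning scenario from above (function and label names are hypothetical):

```swift
import Messages

// Sketch of a collaborative update. Reusing the selected message's
// session tells Messages to supersede the old bubble, collapsing it
// into its summary text in the transcript.
func reply(in conversation: MSConversation, withPickedSlots slots: [String]) {
    // Continue the session of the tapped message if there is one;
    // otherwise start a new collaborative session.
    let session = conversation.selectedMessage?.session ?? MSSession()

    let layout = MSMessageTemplateLayout()
    layout.caption = slots.joined(separator: ", ")

    let message = MSMessage(session: session)
    message.layout = layout
    // Shown in the transcript once this bubble is later superseded.
    message.summaryText = "Picked \(slots.count) time slots"

    conversation.insert(message) { error in
        if let error = error { print(error) }
    }
}
```

Participants, meanwhile, surface in the API only as the anonymous identifiers mentioned above (`conversation.localParticipantIdentifier` and `remoteParticipantIdentifiers` are opaque UUIDs).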
I tested ChibiStudio with John, and it was fun.\nDo With Me uses collaboration in iMessage effectively, enabling you to create shared todo lists where other people can add and complete items inside a conversation.\nJohn added items to our shared Do With Me list.\nI wouldn’t use an iMessage todo app as my only task manager, but I think it’s useful to have something like Do With Me as an extension of a full task manager to collaborate with others on specific lists (grocery shopping, homework, etc.).\nFinally, it wouldn’t be a new App Store without a re-interpretation of tic-tac-toe. In xoxo, you’ll be able to challenge friends on the classic game with a collaborative iMessage app that uses bubbles and full-screen views to advance the game.\nSometimes, you need a simple iMessage game to kick back.\nThe app works surprisingly well, with a good use of summaries in the transcript and captions to describe player moves. It’s a nice way to pass the time during a conversation.36\nCollaborative iMessage apps are only one part of the story. For single-user iMessage apps, the Messages framework should be enough to create deep, custom experiences unlike anything we’ve seen before.\nThe Future of iMessage\nWhen the App Store opened for business, no one could imagine the extent of developers’ imagination. No one could predict what the iPhone would become by letting app makers write software for it. And looking back at that moment today, it’s evident that our devices are deeply different, and dramatically more powerful, because of apps.\nApple can attain a similar result with the iMessage App Store. iMessage apps create a new avenue for developers to bring any kind of experience into millions of daily conversations. 
And by plugging into iOS and the App Store, Apple can leverage the scale of an ecosystem other messaging services don’t have.\n\nWe’re on the brink of a fundamental change to iMessage.\n\nAfter using iMessage apps for the past three months, I have the same feeling as in the early App Store days. It’s a new frontier, it’s exciting, and developers are just getting started. Compared to companion App Stores like the Watch and Apple TV ones, I think the iMessage App Store will be a hit among iOS users.\nApple needed to modernize iMessage in iOS 10, but they went beyond mere aesthetic and functional improvements to the Messages app. They’ve opened the door for apps to reimagine what we share and how we share it.\nWe’re on the brink of a fundamental change to iMessage. If Apple plays its cards right, we could be witnessing the foundation of a second app platform within iOS.\n\n\nSiri\n“A delayed game is eventually good, but a rushed game is forever bad”, Nintendo’s Shigeru Miyamoto once quipped.\nUnlike the console Miyamoto was concerned about, modern software and services can always be improved over time, but Apple knows the damage that can be caused by the missteps and perception of a rushed product. With iOS 10’s SiriKit, they’ve taken a different, more prudent route.\nEver since the company debuted its virtual assistant in 2011, it was clear Siri’s full potential – the rise of a fourth interface – could only be unlocked by extending it to third-party apps. And yet, as Siri’s built-in functionalities grew, a developer SDK remained suspiciously absent from the roster of year-over-year improvements. While others shipped or demoed voice-controlled assistants enriched by app integrations, Siri retained its exclusivity to Apple’s sanctioned services.\nAs top Apple executives recently revealed, however, work on Siri has continued apace behind the scenes, including the rollout of machine learning that cut Siri’s error rates in half.
In iOS 10, Apple is confident that the Siri backend is strong and flexible enough to be opened up to third-party developers with extensions. But at the same time, Apple is in no rush to bring support for any kind of app to Siri in this first release, taking a cautious approach with a few limitations.\nDevelopers in iOS 10 can integrate their apps with Siri through SiriKit. The framework has been designed to let Siri handle natural language processing automatically, so developers only need to focus on their extensions and apps.\nAt a high level, SiriKit understands domains – categories of tasks that can be verbally invoked by the user. In iOS 10, apps can integrate with 7 SiriKit domains37:\nAudio and video calling: initiate a call or search the user’s call history;\nMessaging: send messages and search a user’s message history;\nPayments: send and request payments;\nPhotos: search photos or play slideshows;\nBook rides: shared by Maps and Siri. Siri can book a ride or get the status of a booked ride. Maps can also display a list of available rides for an area;\nWorkouts: start, end, and manage workouts;\nCarPlay: manage vehicle environment by adjusting settings such as climate control, radio stations, defroster, and more.\nInside a domain, Siri deals with intents. An intent is an action that Siri asks an app to perform. It represents user intention and it can have properties to indicate parameters – like the location of a photo or the date a message was received. An app can support multiple intents within the same domain, and it always needs to ask for permission to integrate with Siri.\nSiri permissions.\nSiriKit is easy to grasp if you visualize it like a Chinese box with a domain, inside of which there are multiple types of actions to be performed, where each can be marked up with properties. In this structure, Apple isn’t asking developers to parse natural language for all the expressions a question can be asked with. 
They’re giving developers empty boxes that have to be filled with data in the right places.\nImagine a messaging app that wants to support Siri to let users send messages via voice. Once SiriKit is implemented, a user would need to say something like “Tell Myke I’m going to be late using [app name]”, and the message would be composed in Siri, previewed visually or spoken aloud, and then passed to the app to be sent to Myke.\nCraig Federighi with an example of WeChat in Siri.\nThis basic flow of Siri as a language interpreter and middleman between voice and apps is the same for all domains and intents available in SiriKit. Effectively, SiriKit is a framework where app extensions fill the blanks of what Siri understood.\nThe syntax required by SiriKit simultaneously shows the rigidity and versatility of the framework. To summon an intent from a particular app, users have to say its name. However, thanks to Siri’s multilingual capabilities, developers don’t have to build support for multiple ways of asking the same question.\nYou could say “Hey Siri, send a message to Stephen using WhatsApp” or “message Stephen via WhatsApp”, but you could also phrase your request differently, asking something like “Check with WhatsApp if I can message Stephen saying I’ll be late”. You can also turn an app’s name into a verb and ask Siri to “WhatsApp Stephen I’m almost home”, and SiriKit will take care of understanding what you said so your command can be turned into an intent and passed to WhatsApp.\n\nIf multiple apps for the same domain are installed and you don’t specify an app’s name – let’s say you have both Lyft and Uber installed and you say “Hey Siri, get me a ride to the Colosseum” – Siri will ask you to confirm which app you want to use.\nApple has built SiriKit so that users can speak naturally, in dozens of languages, with as much verbosity as they want, while developers only have to care about fulfilling requests with their extensions. 
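In code, those "empty boxes" take the shape of intent-handling protocols: the extension resolves each parameter Siri extracted from speech, then handles the completed intent. A minimal sketch for the messaging domain, with the actual sending logic stubbed out (method names follow the iOS 10-era Swift API):

```swift
import Intents

// Sketch of a messaging-domain SiriKit extension. Siri performs the
// language parsing; the extension only validates and acts on the intent.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    // Resolve: confirm Siri matched the recipients correctly.
    func resolveRecipients(forSendMessage intent: INSendMessageIntent,
                           with completion: @escaping ([INPersonResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // Siri will ask the user to supply a recipient.
            completion([INPersonResolutionResult.needsValue()])
            return
        }
        completion(recipients.map { INPersonResolutionResult.success(with: $0) })
    }

    // Resolve: make sure there's message text to send.
    func resolveContent(forSendMessage intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(INStringResolutionResult.success(with: text))
        } else {
            completion(INStringResolutionResult.needsValue())
        }
    }

    // Handle: deliver the message through the app's own service.
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // App-specific sending would happen here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

However the user phrases the request, the extension only ever sees a structured `INSendMessageIntent` with `recipients` and `content` already filled in.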
Apple refers to this multi-step process as “resolve, confirm, and handle”, where Siri itself takes care of most of the work.\nDevelopers are given some control over certain aspects of the implementation. From a visual standpoint, they can customize their experiences with an Intents UI extension, which makes a Siri snippet look and feel like the app it comes from.\nCustomizing Siri extensions is optional, but I’d bet on most developers adopting it, as it helps with branding and consistency. Slack, for instance, could customize its Siri snippet with their channel interface, while a workout extension could show the same graphics as the main app. Intents UI extensions aren’t interactive (users can’t tap on controls inside the customized snippet), but they can be used for updates on an in-progress intent (like an Uber ride or a workout session).\nAn app might want to make sure what Siri heard is correct. When that’s the case, an app can ask Siri to have the user double-check some information with a Yes/No dialog, or provide a list of choices to Siri to make sure it’s dealing with the right set of data. By default, Siri will always ask to confirm requesting a ride or sending a payment before the final step.\nOther behaviors might need authentication from the user. Apps can restrict their SiriKit extensions in certain situations (such as when a device is locked) and increase security by requesting Touch ID verification. I’d imagine that a messaging app might allow sending messages via Siri from the Lock screen (the default behavior of the messaging intent), but restrict searching a user’s message history to Touch ID or the passcode.\nLast, while Siri takes care of natural language processing out of the box, apps can offer vocabularies with specific terms to aid recognition of requests.
A Siri-enabled app can provide user words, which are specific to a single user and include contact names (when not managed by Contacts), contact groups, photo tag and album names, workout names, and vehicle names for CarPlay; or, it can offer a global vocabulary, which is common to all users of an app and indicates workout names and ride options. For example, if Uber and Google Photos integrate with SiriKit, this means you’ll be able to ask “Show me photos from the Italy Trip 2016 album in Google Photos” or “Get me an Uber Black to SFO”.\nSiriKit has the potential to bring a completely new layer of interaction to apps. On paper, it’s what we’ve always wanted from a Siri API: a way for developers to expose their app’s features to conversational requests without having to build a semantic engine. Precise but flexible, inherently elegant in its constraints, and customizable with native UIs and app vocabularies. SiriKit has it all.\n\nSiriKit has the potential to bring a completely new layer of interaction to apps.\n\nThe problem with SiriKit today is that it’s too limited. The 7 domains supported at launch are skewed towards the types of apps large companies offer on iOS. It’s great that SiriKit will allow Facebook, Uber, WeChat, Square, and others to build new voice experiences, but Apple is leaving out obvious categories of apps that would benefit from it as well. Note-taking, media playback, social networking, task management, calendar event creation, weather forecasts – none of these app categories can integrate with SiriKit today. We can only hope that Apple will continue to open up more domains in future iterations of iOS.\nFor this reason, SiriKit might as well be considered a public beta for now: it covers a fraction of what users do on their iPhones and iPads. I’ve only been able to test one app with SiriKit integration over the past few days – an upcoming update to Airmail.
Bloop’s powerful email client will include a SiriKit extension in the messaging domain (even if email isn’t strictly “messaging”) to let you send email messages to people in your Airmail contact list.\nSiriKit and Airmail with different levels of verbosity.\nIn using Airmail and Siri together, I noticed how SiriKit took care of parsing natural language and multiple ways to phrase the same request. The “resolve, confirm, and handle” flow was exemplified by the steps Siri takes to confirm each piece of data it needs – in Airmail’s case, the recipient’s email address and message text.\nMultiple steps in SiriKit.\nAs for other domains, I can’t comment on the practical gains of SiriKit yet, but I feel like messaging and VoIP apps will turn out to be popular options among iPhone users.\nI want to give Apple some credit. Conversational interactions are extremely difficult to get right. Unlike interface elements that can only be tapped in a limited number of ways, each language supported by Siri has a multitude of possible combinations for each sentence. Offloading language recognition to Siri and letting developers focus on the client side seems like the best solution for the iOS ecosystem.\nWe’re in the early days of SiriKit. Unlike Split View or notifications, it’s not immediately clear if and how this technology will change how we interact with apps. But what’s evident is that Apple has been laying SiriKit’s foundation for quite some time now. From the pursuit of more accurate language understanding through AI to the extensibility framework and NSUserActivity38, we can trace back SiriKit’s origins to problems and solutions Apple has been working on for years.\nUnsurprisingly, Apple is playing the long game: standardizing the richness of the App Store in domains will take years and a lot of patient, iterative work.
It’s not the kind of effort that is usually appreciated by the tech press, but it’ll be essential to strike a balance between natural conversations and consistent behavior of app extensions.\nApple isn’t rushing SiriKit. Hopefully, that will turn out to be a good choice.\n\n\nSafari\nAmong various minor enhancements, there’s one notable addition to Safari in iOS 10 that points at the possible direction of many iPad apps going forward. Effectively, it’s the most important iPad-only feature this year.\nSafari for iPad now supports in-app split view to open two webpages at once in landscape mode. Apple named this “Safari split view”, but it’s not related to the namesake system-wide multitasking mode. Opening two webpages in Safari doesn’t engage the Slide Over app switcher.\n\nThere are multiple ways to invoke Safari split view. You can tap and hold on the tabs icon in the top toolbar and choose ‘Open Split View’. This causes Safari to create two views and bring up the Favorites grid on the right side.\nHold the button to show the menu…\n…and enter split view.\nYou can also tap & hold on a link in a webpage, hit ‘Open in Split View’, and the destination page will load on the right. If split view is already active, holding a link on the right side will offer a similar ‘Open on Other Side’ option.\n\nIf you’d rather tap once to open a webpage in split view, you can perform a two-finger tap on a link to either activate split view (if you’re in full-screen) or open a new tab on the other side.\nLast, you can drag a tab out of the toolbar and take it to the other side (either left or right). 
If split view isn’t already enabled, the tab will morph into a small preview of the webpage as Safari resizes inwards, showing a gray area that indicates you can drop the page to open it in split view.\n\n \nSafari’s new drag & drop for tabs.\n\nIt’s a polished, fun animation, which also works the other way around to put a tab back on the left and close split view.39\nIn addition to drag & drop, you can tap and hold the tabs button to merge all tabs in one screen and close split view. Because Safari for iOS 10 supports opening unlimited tabs (both on the iPhone and iPad), this menu also contains an option to close all tabs at once – one of my favorite tweaks in iOS 10.\nClose all tabs at once.\nSafari split view is aware of iOS’ system-wide Split View. If Safari is in split view and you bring in a second app to use alongside the browser, Safari’s split view is automatically dismissed by merging all tabs. When you close Split View and go back to Safari in full-screen, the browser’s split view resumes where it left off.\nThere’s nothing surprising about the look of Safari split view: using Size Classes (we meet again, old friend), Safari creates two instances of the same view, each independent from the other and carrying the same controls.40\nI’ve long wished for the ability to view and interact with multiple Safari tabs at once on my iPad Pro. Before iOS 10, developers who recognized this gap in Safari’s functionality were able to sidestep Apple’s limitations with clever uses of Safari View Controller. The new Safari on the iPad obviates the need for those third-party apps with a native solution. The feature is particularly effective on the 12.9-inch iPad Pro, where you can view two full webpages instead of smaller versions scaled to fit. It’s the same feeling of upgrading to Split View on the 12.9-inch iPad Pro from the 9.7-inch model.\nThe 9.7-inch iPad Pro, of course, shows less content than the 12.9-inch model (left). 
(Tap for full size)\nAfter incorporating Safari split view in my workflow, I wish every document-based iPad app offered a way to split the interface into two panes.\nSafari split view is a brilliant showcase of drag & drop to move content across multiple views, too.\nThe lack of a proper drag & drop framework for iPad apps, especially after the introduction of Split View in iOS 9, is baffling at this point. Multitouch and Split View are uniquely suited to breathe new life into the decades-old concept of drag & drop – just look at macOS and how well the system works even without multitouch. Drag & drop would make more sense on iOS than it ever made on the desktop by virtue of direct content manipulation.\n\nSafari split view is a brilliant showcase of drag & drop.\n\nSafari’s drag & drop tab behavior is, hopefully, showing a glimpse of the future we deserve. A system-wide drag & drop framework is going to be trickier to pull off than a single browser tab41, but we can keep the dream alive.\nMore Changes\nThere are other smaller changes in iOS 10’s Safari.\nThe parsing engine of Safari Reader – Apple’s tool to increase the readability of webpages by stripping them of interface elements and ads – has been updated to support display of bylines, publication dates, and article subheads. The extraction of these bits of metadata isn’t perfect42, but it’s a step up from the previous version.\nWhen Apple introduced Safari View Controller last year, they were adamant about its appearance: because the experience had to be consistent with Safari, developers couldn’t modify or style the new in-app web view with their own UI.
Third-party apps could set a tint color for the toolbar icons of Safari View Controller to make them match their colors (something we’ve seen implemented in apps like Overcast and NewsBlur), but that was as far as customization went.\nA customized Safari View Controller in Tweetbot for iOS 10, matching the dark theme.\nIn iOS 10, Apple is letting developers customize Safari View Controller with a tint color for the toolbar background as well. In addition to color tinting for UI controls, the color of the entire toolbar can now be set to something other than white. This should make the experience within apps more cohesive and the transition between app and web view less jarring.\nSpeaking of Safari View Controller: Find in Page is now supported in app web views as an action extension.\n\nWhen hitting Command-T on iOS 10, a new tab opens with the cursor placed in the address bar, ready to start typing. External keyboard users rejoice.\nDownloads, a longtime Safari issue, haven’t been exactly “fixed” in iOS 1043, but Apple has found ways to circumvent old annoyances. First, hitting a link to a file download (such as a .zip file) now displays proper download progress in the address bar. Then, when the download is complete, the file can be saved to iCloud Drive with iOS 10’s new Add to iCloud Drive extension.\nSaving a downloaded file from Safari to iCloud Drive is now possible with an extension.\nWe still haven’t reached the point where Safari automatically downloads files into an iCloud Drive folder, but the idea doesn’t seem so far-fetched anymore.\nAnother limitation of Safari that has been fixed in iOS 10 is video playback. Thanks to a new webkit-playsinline attribute, web developers can specify videos that can be played inline on the iPhone without opening the full-screen player.\nMinimize and expand.\nEven if the attribute isn’t specified, playback will commence in full-screen, but users can pinch the video closed (or tap a resize button) to keep playing it inline.
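Safari gets this inline behavior automatically; apps embedding their own web content got matching knobs in iOS 10’s WKWebView configuration. A small sketch (the two configuration properties are WebKit API; the surrounding setup is illustrative):

```swift
import WebKit

let config = WKWebViewConfiguration()
// Let videos marked for inline playback play without the
// full-screen player taking over.
config.allowsInlineMediaPlayback = true
// New in iOS 10: require a user gesture only for media with audio,
// so muted, GIF-style videos can autoplay.
config.mediaTypesRequiringUserActionForPlayback = [.audio]
let webView = WKWebView(frame: .zero, configuration: config)
```
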
Being able to shrink videos down makes the iPhone’s browsing experience more pleasant.\nFurthermore, Safari in iOS 10 brings support for automatic playback of videos without audio tracks on page load (you may have seen such videos in this review). The change, outlined on the WebKit blog earlier this year, was motivated by the rising popularity of animated GIFs. As WebKit engineers noted, the GIF format itself can be computationally expensive to decode and isn’t energy-efficient – part of the reason why online GIF providers have embraced the <video> element with disabled audio tracks to replace GIFs. This change should also help websites that use muted videos as animated backgrounds, which will display correctly on devices running iOS 10.\nSpeaking of websites, Safari on iOS 10 enables pinch to zoom by default on all sites – even those that have specifically disabled zooming through meta tags. From an accessibility standpoint, I can only applaud Apple’s decision.\nMoving on to other features, you can search your Favorites and Reading List items by swiping down to reveal a search bar. Reading List doesn’t support full-text search, so you’ll only be able to search titles and not inside the text of a saved article.\n\nFinally, smarter AutoFill. While iOS 9 could suggest passwords and emails when attempting to fill web forms, iOS 10 takes it a step further and replaces the Passwords button above the keyboard with AutoFill Contact.
The new dialog offers multiple options for your own contact card (such as Work and Personal email addresses) with the ability to customize your card’s AutoFill without leaving Safari.\nCustomizing AutoFill.\nFor the first time, you can also auto-fill any other contact on a webpage by hitting ‘Other Contact…’ and picking an entry from your address book (other contacts can be customized before auto-filling, too).\nApple is taking advantage of QuickType suggestions to speed up AutoFill: if you don’t want to use the AutoFill interface, QuickType can suggest names, multiple email addresses, phone numbers, and other information from a contact’s card through predictive shortcuts.\n\nThe deeper integration of contacts and AutoFill makes it easier to sign up for web services without having to type details. It’s another argument in favor of Safari View Controller for apps: developers of apps with an account component will get more flexible AutoFill features for free if they implement Apple’s web view in their signup flows. I know I wouldn’t want to type contact information (or use an extension) after testing the convenience of AutoFill in iOS 10.\n\nSafari remains Apple’s most mature and powerful iOS app.\n\nEven without new headline features like Safari View Controller and Content Blockers, Safari remains Apple’s most mature and powerful iOS app. This year’s refinements are well thought-out and split view is a boon for web multitasking on the iPad. I have no complaints.\n\n\nApple Music\nOf all system apps, Apple Music is the one that got a dramatic makeover in iOS 10.44\nWith a redesign aimed at placating concerns about an overly complex interface, and with new algorithmic discovery features, iOS 10’s Apple Music is streamlined and more powerful. 
Beneath a veil of astonishing simplification, the new Apple Music packs significant enhancements to the discovery and listening experience.\nIt’s Dangerous to Go Alone\nIf you had to list the shortcomings of Apple Music since its debut in iOS 8.4, a hodgepodge of features tacked onto a crumbling pre-streaming foundation would easily sit at #1. With iOS 10, Apple wants its music streaming service to be more accessible and intuitive for everyone who’s moved past iTunes.\nPart of this effort has resulted in a modern design language that does away with most iOS UI conventions in favor of big, bold headlines, larger buttons, and a conspicuous decrease in transparency across the interface. If you’re coming from Apple Music on iOS 9 and remember the information-dense layout and translucent Now Playing screen, prepare to be shocked by iOS 10’s rethinking of the Music app.\n\nWhile the bold design has intriguing consequences for the overall consistency of iOS’ visuals, its impact on usability is just as remarkable. The new Apple Music no longer follows the interaction paradigms of the iTunes Store and App Store: there’s no front page highlighting dozens of top songs and curated recommendations. In its place is a simplified Browse page with a handful of scrollable featured items at the top and links to explore new music, curated playlists, top charts, and genres.\n\nRemoving new releases and charts from the Browse page helped Apple accomplish two goals: give each section more room to breathe; and highlight the most important items (singles, albums, videos, etc.) with a big cover photo that can’t be missed.\nBeing able to discern items with an effective sense of place is the underlying theme of navigation in the new Apple Music. On the one hand, information density has suffered and Apple Music can’t show as much content on screen as it used to.
On the other, bold headlines, fewer items per page, and larger controls should prevent users (who aren’t necessarily well versed in the intricacies of the iTunes Store UI, upon which the old Apple Music was based) from feeling overwhelmed. Apple’s new design wants to guide users through Apple Music’s vast catalogue, and it mostly succeeds.\nApple Music’s For You page on the iPad.\nThis is evident in the Library page, where Apple has switched from a hidden View menu to a set of vertical buttons displayed underneath the “title bar”. If you were confused by the taps needed to browse by Artist or view downloaded music in iOS 9, fear no more – Apple has created customizable buttons for you this time.\nBig, customizable buttons.\nThe same philosophy is shared by every other screen in Apple Music. Radio, search, even For You recommendations – they’ve all moved on from the contortions of their iTunes-like predecessors to embrace clarity and simplicity through white space, large artworks, and big buttons.\n\nAnother prominent change from iOS 9 is the removal of translucent panes of glass in the interface. Transparency effects look great in screenshots, but unpredictable color combinations don’t make a good case for legibility and consistency.\nApple is ditching translucency in the Now Playing screen altogether. In iOS 10, they’ve opted for a plain black-and-white design that lays out every element clearly and keeps text readable at all times.\nLegible text, large album artwork.\nIt’s not as fancy as iOS 9, but it’s also not as over-engineered for the sake of beauty. Album artwork stands out against a white background; buttons are big and tappable; there’s even a nice, Material-esque effect when pausing and resuming a song. Alas, the ability to love a song with a single tap from the Now Playing screen is gone.\n\n \n\nThe new Apple Music is equal parts appearance and substance.
The bottom playback widget45 has been enlarged so it’s easier to tap, and it also supports 3D Touch.46 Pressing on it reveals a redesigned contextual menu with options for queue management, saving a song, sharing, liking (and, for the first time, disliking), and, finally, lyrics.\n\nApple Music integrates officially licensed lyrics in the listening experience. Apple has struck deals with rightsholders to make this happen; even if lyrics aren’t available for every song on Apple Music yet, coverage seemed to grow during the beta period this summer, and I expect more lyrics to become available on a regular basis for new and old releases.\nWhen lyrics are available, you’ll see an additional button in the contextual menu as well as a new section when swiping up on the Now Playing screen. This is where shuffle, repeat, and the Up Next queue live now; I wish Apple had done a better job of hinting that this space exists below what you can see. There’s no indication that you can swipe up to reveal more options.\nSwipe up to reveal repeat, shuffle, Up Next, and lyrics.\nUnlike Musixmatch, lyrics don’t follow a song in real time. They’re either displayed in a full-screen popup (if opened from the contextual menu) or above Up Next in plain text.\nLyrics are modal on the iPad when opened from the contextual menu.\nThat’s not a deal-breaker, though. As a lyrics aficionado (I’ve also learned English through the lyrics of my favorite songs), I consider this a fantastic differentiator from other streaming services. Not having to Google the lyrics of what I’m listening to and being taken to websites riddled with ads and often incorrect lyrics? I’m not exaggerating when I say that this feature alone might push me to use Apple Music every day again. I was hoping Apple would eventually bring native lyrics to Apple Music, and they delivered with iOS 10.47\nAnother functional improvement to the Now Playing screen is a built-in menu to control audio output.
Taking a cue from Spotify, Apple Music sports a button in the middle of the bottom row of icons to switch between the iPhone’s speaker, wired and Bluetooth headphones, and external speakers with just a couple of taps.48\n\nYou don’t have to open Bluetooth or AirPlay settings to stream music to different devices. This was probably built with the iPhone 7 and AirPods in mind, but it’s a feature that makes managing audio faster for everyone.\nAvailable in Settings > Music, a new Optimize Storage option lets iOS automatically remove music from your device that you haven’t played in a while. Unlike the similar setting for iCloud Photo Library, you can control the minimum number of songs you want to keep downloaded on your device with four tiers. On my 64 GB iPhone 6s Plus, I see the following options:\n4 GB (800 songs)\n8 GB (1600 songs)\n16 GB (3200 songs)\n32 GB (6400 songs)\nIf you have an older iOS device with limited storage, this should help you strike a compromise between available space and offline songs.\nMusic on iPad\nApple Music for iPad doesn’t diverge from the iPhone counterpart much, but there are a few differences worth noting.\nThe Browse page collects featured items, hot tracks, new albums, playlists, and more within a single view, with buttons to explore individual sections placed in a popover at the top.\n\nInstead of taking over the app in full-screen, playing a song opens a Now Playing sidebar. The view is launched from a tab bar split between icons and the playback widget.\n\nThe sidebar feels like having Slide Over within Music: it doesn’t behave like Safari or Mail’s in-app split view, where you can interact with two views at the same time; instead, it’s modal and overlaid on top of content.\nIt’s not immediately clear why Apple didn’t stick to a full-screen Now Playing view on the iPad if the sidebar still prevents interactions on the other side. Perhaps they realized giant-sized album artwork didn’t make sense on the iPad Pro?
Maybe the vertical layout lends itself better to peeking at Up Next and lyrics below playback controls? The sidebar is fine, but I’d rather have a real in-app split view in Music too.\n\nThe Split View that Music does have on iOS 10 is the system-wide multitasking one. On the 12.9-inch iPad Pro, Now Playing is a sidebar in both Split View layouts when Music is the primary app, but it turns into a full-screen view when Music is the secondary app in Slide Over.\nNew Discovery\niOS 10 brings discovery features that pit Apple Music against Spotify’s algorithmic playlists and personalized curation.\nApple Music’s For You page features two personalized playlists in a carousel at the top – My New Music Mix and My Favorites Mix. Both are automatically refreshed every week and are personalized for each user based on their listening habits and favorite songs.\n\nMy New Music Mix, refreshed every Friday, showcases new music Apple Music thinks you’ll like; My Favorites Mix is a collection of hit singles, deep cuts, and songs related to your favorite artists that is refreshed every Wednesday.\nThe idea of a personalized mixtape refreshed on a weekly basis isn’t new. Spotify was a pioneer in this field with their excellent Discover Weekly, which recently expanded to Release Radar. Spotify’s system is powered by a neural network (its inner workings are utterly fascinating) and, as I previously wrote, it delivers impressive personalized picks that almost feel like another person made a mixtape for you.\nIt’s too early to judge Apple’s efforts with personalized playlists in iOS 10. They only rolled out two weeks ago, and, in my experience, such functionalities are best evaluated over a longer span of time after judicious listening and “loving” of songs.\nMy impression, however, is that Apple has succeeded at launching two great ways to discover new music and re-discover old gems every week. 
My first two My Favorites Mix playlists have been on point, collecting songs (both hits and lesser-known ones) from artists I knew and liked. Apple Music’s first two My New Music Mix playlists weren’t as accurate as Spotify’s Release Radar, but, to be fair, I have been religiously using Spotify for over 9 months now, whereas I just came back to Apple Music. Accuracy may still be skewed in Spotify’s favor given my listening history.\nStill, we don’t need to wait to argue that algorithmically-generated playlists refreshed weekly are a fantastic addition to Apple Music. As I noted in my story on Spotify’s Discover Weekly earlier this year, human curation is inherently limited. Apple has been at the forefront of human-made playlists, but it was missing the smart curation features of Spotify. Apple’s two personalized mixes seem more – pardon the use of the term – mainstream than Discover Weekly, but that isn’t a downside. Easing people into the idea of personalized playlists made by algorithms and then launching more specific types focused on music aficionados might be a better consumer approach than Spotify’s. I’d wager Apple is considering a product similar to Spotify’s Discover Weekly – a playlist that highlights back-catalogue songs you might like from artists you’re not familiar with.\nMy New Music Mix and My Favorites Mix already seem very good, and they show that Apple can compete with Spotify when it comes to personalized music curation. As with other algorithmic features launched in iOS 10, Apple’s debut is surprisingly capable and well-reasoned.\nThere are other changes in the For You section. Connect, already an afterthought in iOS 9, has been demoted from standalone view to a sub-section of For You.\nThose links don’t even open in Apple Music.\nSome people must be using Connect (who’s leaving those comments?), but I just don’t see the incentive for artists to post on it and for users to subscribe.
Apple doesn’t seem to care about it as a social network, and everyone is better off engaging with fans and following artists on Twitter and Facebook. Unless Apple gives it another try, I don’t think Connect can suddenly gain relevancy. Rolling Connect into For You feels like Apple’s version of “going to live in a farm upstate”.\n\nRolling Connect into For You feels like Apple’s version of “going to live in a farm upstate”.\n\nPlaylists and sections recommended in For You have been redesigned and shuffled around. Every section can now be scrolled horizontally to reveal more content; Recently Played and Heavy Rotation tend to float towards the top for easier access; and there’s the usual mix of artist spotlights, essentials (they’re not called “Intro To” anymore), human-curated playlists, and a new section called New Releases For You.\nThe refreshed For You in iOS 10.\nIf you liked Apple’s For You section before, you won’t be disappointed by iOS 10’s refresh. But I believe My New Music Mix and My Favorites Mix will steal the show for many.\n\nApple’s firing on all cylinders against Spotify and others in the music streaming industry.\n\nI’m still not sure if I want to give Apple Music another try – I’ve been truly satisfied with Spotify since I moved back to it in January. Apple’s updates in iOS 10 are compelling, though. I got used to the “big and bold” design quickly, and I find it easier to parse and more fun than Spotify’s boring black and green. Apple Music may sport lower information density, but, at least for me, it’s easier to use than Spotify. Personalized playlists are solid already, and I’ve been keeping My Favorites Mix synced offline for fast access to a collection of songs I know I’m going to like. And then there’s lyrics, which is nothing short of a game changer for me.\nApple’s firing on all cylinders against Spotify and others in the music streaming industry. 
It might be time to take Apple Music for a spin again.\n\n\nMaps\nWithout new exploration and location editing modes (transit launched in September 2015, and it’s slowly rolling out to more cities; crowdsourced POI collection is still a no-go), Apple is making design and third-party apps the focal points of Maps in iOS 10.\nMaps’ new look removes UI chrome and enhances usability on large iPhones through lowered controls, intuitive buttons, and more proactive suggestions. There are floating buttons to find your position and open Maps’ settings. Apple has gotten rid of the search bar at the top and replaced it with a card at the bottom (a floating “sidebar” on the iPad). You can swipe up the card to reveal suggestions below the search field.\n\nThe sense is that Apple wanted to ship a smarter, more conversational search feature, which now offers proactive place suggestions. Instead of a handful of recent addresses, Maps now curates a richer list of locations based on recently viewed and marked places, favorites, places you’ve been to, and addresses you probably want to go to next based on proactive iOS features.\nA webpage with an address I was viewing in Safari, proactively suggested by Maps.\nEach suggestion is associated with a relevant icon, so they’re prettier and easier to identify. You can even swipe on them to remove them from the list or share them with other apps.\n\nColorful business and landmark icons are used in search results, which are livelier than in iOS 9 and include more Nearby categories. In selected regions, Nearby results can be filtered with sub-categories in a scrollable bar at the bottom of the screen.\n\nIconography has always been one of the strong suits of Apple Maps, and the company is doubling down on it with iOS 10.
Previously, when searching for places that pertained to a specific category such as Restaurants, Maps would drop generic red pins on the map, requiring you to tap on them to open a first popup, then tap again to open a detail view with information about the place. It was a slow, unattractive process that hid useful details from the first step of search results.\niOS 10 improves upon this in two ways. Instead of red pins, multiple search results are dropped on the map with more descriptive pins that suggest what a result is before tapping it. In the restaurant example, you’ll end up with icons that contain pizza slices, hamburgers, or a fork and knife, for instance. If two results are close to each other on the current zoom level, they’ll be grouped in a numeric orange pin that you can tap to choose one result.\n\nSecond, choosing a result uses the iPhone’s search panel as a split view to display business information and the map at the same time. As you tap through results, you can preview place details with a card UI at the bottom that shows ratings, distance, and a button to start directions.\nThe interaction is similar on the iPad. Instead of managing result cards on the vertical axis, they’re overlaid horizontally in a sidebar on the left.\n\n \nMaps results on iPad\n\nBy combining these elements with cards that are more comfortable to reach, iOS 10’s Maps feels like it’s been optimized for humans and nimble exploration. By comparison, the old Maps feels static and arbitrary.\nThe same philosophy has been brought to navigation. In iOS 10, you can pan freely on the map and re-center navigation with a button.\nYou can pan around during navigation in iOS 10.\nDetails for the current trip, such as estimated arrival time and distance, are displayed in a bottom card, which, like results, can be swiped up to access more options. 
These include audio settings, turn-by-turn details, an overview, and, for the first time, en-route suggestions for places you might want to stop by, like gas stations or coffee shops.\nMore card-like UIs in Maps’ navigation. (Tap for full size)\nAfter selecting a category of suggestions during navigation, Maps will return a list of nearby results and tell you how many minutes each will add to your trip. Select one, confirm that you want to stop by, and Maps will update directions for the new destination. When you’re done, you can resume your route to the first destination with a blue banner at the top.\nMaps Settings\n\niOS 10 brings settings to control your preferred transportation types (driving, walking, or transit) and options to avoid tolls and highways or show the compass in navigation. You can choose which kinds of transit vehicles to use when planning a trip, such as bus, subway/light rail, commuter rail, or ferry. You can customize everything in Settings > Maps.\nApple is also going to let developers plug into Maps with extensions. If an app offers ride booking, restaurant reservations, and “other location-related services”, it can embed its functionalities in Maps.\nMaps extensions, like SiriKit’s, are based on intents and developers can provide custom interfaces with an Intents UI extension. The same extensions that allow users to hail an Uber and track status with Siri can be used from Maps to get a ride to a selected place.49 Maps extensions contained inside iOS apps are disabled by default; they have to be activated from Settings > Maps > Extensions.\nOpenTable’s Maps extension.\nThe only Maps extension I’ve been able to test, earlier this week, is OpenTable’s, which has limited integration in Rome for a few restaurants. Once enabled, OpenTable’s extension adds a button to view more information about a restaurant and make a reservation. You can set table size, pick available times, and enter special requests in Maps.
OpenTable will ask you to continue the task in the main app to confirm a reservation, but it’s nice to have a way to quickly check times and availability without leaving Maps.\nI’m curious to see how ride-sharing and other location-based services available in Italy will implement Maps extensions.\nThe quality of Apple Maps data for my area still isn’t comparable to Google Maps’. Apple Maps has improved since iOS 6, but I still wouldn’t trust it to guide me through a sketchy neighborhood in Rome at night. At the same time, I prefer the design of Apple Maps and its many thoughtful touches to Google Maps. From my perspective, Apple has created a more intuitive, better designed app without the data and intelligence of Google. It’s an odd predicament to be in: while I appreciate Apple Maps’ look in iOS 10, I also want navigation to be reliable and trustworthy.\nThere’s a lot to like in Maps for iOS 10 and great potential for developers to elevate location-driven apps to a more contextual experience. The revised interface imbued with proactive suggestions is a step forward from iOS 9; the richer presentation of results makes Maps friendlier and more informative. Maps in iOS 10 feels like someone at Apple finally sat down and tried to understand how regular people want to use maps on a phone. The redesign is outstanding.\nApple has perfected Maps’ interface and interactions, and now they have a developer platform, too. An underlying problem remains: when it comes to data accuracy, your mileage with Apple Maps may vary.\n\n\nHome\nApple’s home automation framework, HomeKit, is ready for prime time in iOS 10. In addition to a dedicated page of shortcuts in Control Center, HomeKit is getting a native app for accessory management. It’s also expanding to new types of accessories, including cameras.\nJust as iCloud Drive graduated to an app after a framework-only debut, all your HomeKit accessories can be accessed from a Home app in iOS 10.
You won’t find the complexity of advanced tools such as Matthias Hochgatterer’s unfortunately-named Home in Apple’s take. Instead, Apple’s Home app will greet you with the same bold look of Apple Music and News.\nCustomizable edge-to-edge photo backgrounds and large buttons command the interface.\nHome works with any HomeKit accessories previously set up on iOS 9. One of the biggest flaws of the old HomeKit implementation – the inability to set up new accessories without an app from the vendor – has been fixed with iOS 10’s Home app, which offers a complete setup flow from start to finish.\nRooms are a section of the app, while your favorite accessories and scenes are highlighted in the main Home dashboard. They’re the same shortcuts used in Control Center.\nLong-tapping an accessory in a room to open its detail screen.\nApple offers a collection of suggested scenes to get started – such as “Good morning” or “I’m home” – but you’ll want to create your own scenes, choosing from custom icons50 and any accessory action you want.\n\nMost users will only use Home for the initial accessory/scene configuration and to add favorites in Control Center, but there are hidden tricks in the app that are worth exploring (and, like Apple Music, concerning from a discoverability perspective).\nYou can find a summary of average conditions and statuses at the top of the Home page. You might see humidity, temperature, and door lock status in this message. You can tap Details for an accessory overview.\n\nYour home’s wallpaper can be modified by tapping the location icon in the top left and choosing a new one. You can do the same for rooms: after picking a room, tap the list icon in the top left, open Room Settings, and assign a new wallpaper.\n\nCustom wallpapers for multiple rooms are a nice touch: they make the Home app look like your home, but I wish they synced with iCloud.\nSome of the app’s features are too hidden. 
To navigate between rooms, you can tap the menu at the top, but you can also swipe between rooms. There’s no visual cue to indicate that multiple rooms live on the same horizontal pane. The design language shared by Apple Music and Apple News means both apps have this feature discoverability issue in common.\nSimilarly, buttons can be pressed with 3D Touch or long-tapped to open a modal view with intensity levels and settings for colors and more.\nColor options for lights and group settings.\nThere’s no way of knowing that more functionality lies beyond these “special taps”. And that’s too bad, because this view lets you manage useful options such as accessory grouping51 and bridge configuration.52\nA front-end HomeKit interface has allowed Apple to bring deeper management features to iOS. First up, sharing: if you want to add members to your home, you can invite other people and give them administrative access to accessories. You can allow editing on a per-user basis, and you can also choose to let them control accessories while inside the house or remotely.\nSharing with HomeKit.\nThis ties into the Home app’s second advanced feature – home hubs. What used to be an opaque, poorly documented option in iOS 9 is now a setting screen: your Apple TV or iPad can be used as HomeKit hubs when you’re not at home. As long as the devices are plugged into power and connected to Wi-Fi, you can use them as bridges between a remote device and your accessories at home without additional configuration required.\nRemote control comes in handy when you consider HomeKit’s deep integration with iOS in Siri and Control Center. In my tests, I was able to turn on my espresso machine remotely when I was driving home just by talking to Siri. Control Center’s Home page works with remote control: I can turn off my lights with one swipe, or I can check the status of my door anywhere on iOS.53\nThere’s also automation. 
Third-party HomeKit management apps have long offered ways to set up rules and triggers to automate accessories and scenes based on specific conditions. iOS 10’s Home app brings a simpler interface to have accessories react to changes at home in four different ways:\nYour location changes;\nA time of day occurs;\nAn accessory is controlled;\nA sensor detects something.\nWhen creating a new automation, you won’t be presented with an intimidating workflow UI. Apple has nicely separated the individual steps behind an automation: first you’ll choose the accessory or trigger that will start an automation, then you’ll be shown a handful of options. If you want to turn off your lights when the door closes, for instance, you first choose from Door: Open/Closed, then move on to selecting scenes or lights.\nThere’s no complicated language to learn for automation. (Tap for full size)\nI set up some automation rules in the Home app a couple of months ago, and they’ve been running smoothly since. Every day at 5 AM, lights in my bedroom and kitchen are turned off because I’ve likely gone to sleep by then. In another automation, my bedroom light turns red if the humidity level rises over 60%.\nIn the future, I’d like to see the ability to create nested automations with support for presence recognition. Currently, I can’t tell the Home app to send me a notification if the main door opens and I’m not at home, or to turn off the lights if it’s after sunset and nobody’s home.\nLast, HomeKit is expanding to new types of accessories. With iOS 10, third-party manufacturers can create:\nAir accessories: conditioners, heaters, purifiers, and humidifiers.\nCameras: they can display live video streams and still images. HomeKit can control settings such as night vision, tilt, zoom, rotation, and mirroring, as well as built-in speaker and microphone preferences.\nDoorbells: both standard and camera-equipped doorbells are supported. 
These devices generate an event once the doorbell is pressed, sending a notification to HomeKit. In the case of doorbells with a built-in camera, iOS 10’s rich notifications can display a live video stream without opening an app, and they can embed an intercom button to start a two-way conversation with a person outside.\nCameras and doorbells were two highly requested enhancements to HomeKit. Third-party HomeKit cameras aren’t available on the market yet – which is unfortunate, as I couldn’t test them for this review – but I plan on buying one as soon as possible.\nApple’s plan for the connected home is coming together in iOS 10. Platform fragmentation has been a fundamental problem of third-party smart home devices and hubs: we’ve all heard the tales of devices being unable to talk to each other, being discontinued after a couple of years, or having to support external APIs to bring some communication into the mix.\n\nHome automation is best experienced as a tightly integrated extension of our smartphones.\n\nWith HomeKit, Apple’s closed and slower approach is paying off in consistency, durability, and integration with the OS. The Elgato sensors I bought nearly two years ago have worked perfectly with iOS 10 since the first beta. I don’t have to worry about companies supporting IFTTT, Wink, or other protocols as long as they work with HomeKit.\nIn Apple’s ecosystem, I can always extend my setup. When you consider extra functionalities such as rich notifications, Siri, remote hubs, and Control Center, it’s clear that home automation is best experienced as a tightly integrated extension of our smartphones.\nI want to believe that the rollout of HomeKit accessories will continue at a steady pace with a Home app front and center in iOS 10. 
Even if that’s going to be a problem for my wallet.\n\n\n \nApps\nAs is often the case with new versions of iOS, Apple added a variety of improvements to its suite of apps – some of which, for the first time, can also be deleted from the Home screen.\nMail\nOn the 12.9-inch iPad Pro, Mail has received a three-panel mode that shows a mailbox sidebar next to the inbox and message content in landscape.\nThree-panel view on the iPad Pro.\nThis extended mode is optional; it can be disabled by tapping a button in the top left of the title bar. If you were wondering why iPad apps couldn’t show more content like on a Mac, this is Apple’s answer. It’s the right move, and I’d like it to propagate to more apps.\nConversation threading has also been updated in iOS 10 to resemble macOS’ conversation view.\n\nIn iOS 10, messages in a thread are shown as scrollable cards. Each message can be swiped to bring up actions, and it can be viewed full-screen by tapping its header (or ‘See More’ at the bottom).\n\nYou can control the appearance of conversation view in Settings > Mail (Contacts and Calendars have received their own separate setting screens, too). Mail lets you complete threads (load all messages from a thread even if they’ve been moved to other mailboxes) and display recent messages on top. Conversation view makes it easier to follow replies without having to squint at quoted text. It’s nicer and more readable; I wish more third-party email clients had this feature.\nThis willingness to make Mail more desktop-like doesn’t apply to smart folders, which are still nowhere to be found on iOS. Instead, Apple hopes that filters will help you sift through email overload.\nFiltering an inbox.\nFilters can be enabled with the icon at the bottom left of the inbox. You can customize them by tapping ‘Filtered By’ next to the icon. 
Filters include accounts, unread and flagged messages, messages that are addressed to you or where you’re CC’d, and only mail with attachments or from VIPs.\nFilters aren’t a replacement for smart folders’ automatic filing, but they can still provide a useful way to cut down a busy inbox to just the most important messages. I wish it were possible to create custom filters, or that Apple added more of them, such as a filter for Today or the ability to include messages from a specific address (without marking it as VIP).\nLast, like Outlook, Mail now recognizes messages from mailing lists and lets you unsubscribe with one tap without opening Safari.\n\nTapping the Unsubscribe button will send an unsubscribe request as a message on your behalf, which you can find in the Sent folder. In my experience, Mail has done a solid job at finding newsletters and putting its Unsubscribe banner at the top.\nCompared to apps like Outlook, Airmail, and Google Inbox, Apple is advancing Mail at a deliberately slow pace. You can’t send an email message to extensions with the share sheet (more on this problem here); several macOS Mail functionalities are still missing from the iOS app; and Google is way ahead of Apple when it comes to smart suggestions and automatic message categorization.\nMail is a fine client for most people, but it feels like it’s stuck between longing for desktop features and adopting what third-parties are doing. There’s a lot of work left to do.\nLook Up and Dictionary\nApple’s system dictionary – built into every app via the copy & paste menu – has been overhauled as Look Up, a more versatile interface meshing Spotlight and Safari search suggestions.\nThe new Look Up in iOS 10.\nLook Up still provides dictionary definitions for selected words. The dictionary opens as a translucent full-screen view on the iPhone (a modal window on the iPad) with cards you can tap to read thorough definitions. 
New in iOS 10, the Italian and Dutch dictionaries can display multilingual translations in English, which I’ve found useful to expand my vocabulary without opening Google or a third-party dictionary app.\n\nWhat makes Look Up one of the best additions to iOS 10 is the expansion of available sources. Besides definitions, iOS 10 shows suggestions from Apple Music, Wikipedia, iTunes, suggested websites, web videos, news, Maps, and more. These are the same data providers powering suggestions in Safari and Spotlight, with the advantage of being available from any app as long as you can select text.\n\nLike in iOS 9, some results can be expanded inline, such as Wikipedia summary cards, while others take you to a website in Safari. The presentation style is also the same, with rich snippets and thumbnails that make results contextual and glanceable.\nSmart data detectors have also been updated with Look Up integration. If iOS 10 finds a potential result in text, it’ll be underlined to suggest it can be tapped to open Look Up.\nLook Up triggered from a data detector in an email subject.\nIn my tests, Look Up suggestions in text showed up in apps like Messages and Mail, and they often matched names of popular artists (e.g. “Bon Iver”) or movies.\nBy plugging into a broader collection of sources, Look Up is more than a dictionary. It’s Spotlight for selected text – an omnipresent search engine and reference tool that can take you directly to a relevant result without Google.\nI’ve become a heavy user of Look Up for all kinds of queries. I look up topics on Wikipedia54 from my text editor or Notes without launching Safari. I even use it for restaurant reviews and Maps directions: iOS can pop up a full-screen Maps UI with the location, a button to get directions, and reviews from TripAdvisor. Look Up is a useful, clever addition, and I wish it worked for more types of content. 
It’d be nice to have POIs from Foursquare and Yelp in Look Up, for example.55\nWe first saw the potential for deeply integrated search with Spotlight in iOS 9. It’s not only a matter of competition between Apple and Google – any suggestion that requires fewer interactions is a better experience for Apple and its users. Look Up makes web search a feature of any app; it’s an intelligent continuation of the company’s strategy.\nNotes\nNotes was, together with Safari, the crown jewel of Apple’s app updates in iOS 9. This year, Apple is building upon it with subtle refinements and a new sharing feature.\nLike Mail, Notes on the 12.9-inch iPad Pro offers a three-panel view. If you spend time moving between folders to manage notes, this should be a welcome change.\nThree-panel view in Notes.\nWhen using an external keyboard, you can now indent items in a bulleted list with the Tab key. The same can be done with the copy & paste menu; curiously, Apple labeled the opposite behavior ‘Indent Left’ instead of ‘Outdent’.\nWhen a note refreshes with content added on another device, the new bits are temporarily highlighted in yellow. This helps you see what has changed when syncing with iCloud.\nNotes and Link Snippets\n\nAllow me to nitpick and point out how Apple is shipping two different implementations of rich link previews in Messages and Notes. While the new Messages app has a superior snippet presentation with larger images and full text and images for tweets, Notes still comes with small snippets of text that are often cut off and don’t display photos from tweets or tappable video previews. Just as images inside a note can be switched from small to large previews, I’d like an option to load small and large link previews in Notes.\nNote sharing is the big change in iOS 10. Arguably the most requested feature since the app’s relaunch in iOS 9, collaboration puts Notes on the same playing field as two established competitors – Evernote and OneNote. 
In pure Apple fashion, collaboration has been kept simple: it’s based on CloudKit, and there’s an API for developers to implement the same functionality in their apps.\nIn iOS 10, every note has a button to start collaborating with someone. Tapping it opens a screen to share a note, which is done by sending a link to an app like Messages or Mail (you can also copy a link or send it to a third-party extension). Once you’ve picked how you want to share the note’s link, you can add people by email address or phone number.56 As soon as the recipient opens the iCloud.com link for the note and accepts it, the note will gain an icon in the main list to indicate that it’s a shared one.57\nSharing a note with someone on iMessage.\nCollaborating with someone else on the same note doesn’t look different from normal editing. Unlike more capable collaborative editing environments such as Google Docs, Quip, or Dropbox Paper, there are no typing indicators with participant names and you can’t watch someone type in real-time. The experience is somewhat crude: entire sentences simply show up after a couple of seconds (they’re also highlighted in yellow).\nApple doesn’t view Notes collaboration as a real-time editing service. Rather, it’s meant to offer multiple users a way to permanently store information in a note that is accessed regularly.\nI believe Notes collaboration will be a hit. I can see families sharing a grocery list or travel itinerary in Notes without having to worry about creating online accounts and downloading apps. Colleagues keeping a collection of screenshots and links, teams sharing sketches and snippets of text – the flexibility of Notes lends itself to any kind of sharing in multiple formats.58\nEven without the real-time features of Google and Dropbox (and the upcoming iWork update), Notes collaboration works well and is fast. 
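For third-party developers, that CloudKit sharing layer surfaces as CKShare and UICloudSharingController in iOS 10. A hedged sketch of how an app might offer the same kind of invitation flow (the note record and its setup are hypothetical; a real app would already have saved the record to its private database):

```swift
import CloudKit
import UIKit

// Hedged sketch of iOS 10's CloudKit sharing API – the same layer that
// powers Notes-style collaboration. `noteRecord` is a hypothetical CKRecord.
func presentSharing(for noteRecord: CKRecord, from viewController: UIViewController) {
    let share = CKShare(rootRecord: noteRecord)

    let sharingController = UICloudSharingController { _, completion in
        // Save the root record and its share together, then hand the share
        // back so iOS can generate an iCloud.com link to send to participants.
        let save = CKModifyRecordsOperation(recordsToSave: [noteRecord, share],
                                            recordIDsToDelete: nil)
        save.modifyRecordsCompletionBlock = { _, _, error in
            completion(share, CKContainer.default(), error)
        }
        CKContainer.default().privateCloudDatabase.add(save)
    }
    viewController.present(sharingController, animated: true)
}
```

The system controller then handles inviting people by email or phone number and managing per-participant permissions, much like the Home app's sharing screen.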
In my three months of testing, I haven’t run into conflicts or prompts to take action.\nI was skeptical, but Notes collaboration works. In a post-Evernote world, Notes is still the best note-taking app for every iOS user.\nApple News\nLike last year, we’re going to have a separate story on Apple News. I wanted to briefly touch upon a few changes, though.\nApple News is the third iOS 10 app to sport a redesign centered on bold headlines, sizeable sections, and a more varied use of color.\nThe app launches to a For You view that does away with a traditional title bar to show the date and local weather conditions. Top Stories is the first section, highlighting 4-5 stories curated by Apple editors. These stories tend to be general news articles from well-known publications, and there’s no way to turn them off even if you mute the channel.\n\nSections in the main feed are differentiated by color, whether they’re curated by Apple (such as Trending or Featured) or collected algorithmically for your interests. Bold headlines don’t help information density (on an iPhone 6s Plus, you’ll be lucky to see more than four headlines at once), but they don’t look bad either. The large, heavy version of San Francisco employed in the app makes it feel like a digital newspaper.\n\nBecause of my job and preferences in terms of news readers, I can’t use Apple News as a replacement for Twitter or RSS. I want to have fine-grained control over my subscriptions, and the power-user tools offered by services like NewsBlur and Inoreader aren’t a good fit for Apple News. There are also editorial choices I don’t like: the more I keep muting and disliking certain topics (such as politics and sports), the more they keep coming back from Apple’s editors or other publications. Apple’s staff believes those are the stories I should care about, but I’ve long moved past this kind of news consumption. 
I don’t have time for a news feed that I can’t precisely control and customize.\nAs a general-purpose news reader, Apple News does a decent job, and the redesign gives sections and headlines more personality and structure. At the same time, Apple News still feels less inspired than Apple Music; the changes in iOS 10 aren’t enough to convince me to try it again.\n\nClock\nThe Clock app has been updated with two new features aimed at people who use it at night: a dark theme (it looks nice) and Bedtime.\n\nWith Bedtime, Apple wants to give users with a morning routine an easy way to remember when it’s time to sleep. Like other sleep trackers on the App Store, Bedtime sends a notification a few minutes before bed, and it wakes you up with gentle melodies of growing intensity (you can choose from 9 of them, with optional vibration). The goal is to be consistent, always go to bed and wake up at the same time every day, and get a regular amount of sleep each night.\nBedtime has a good UI design with a dial you can spin to adjust when you’d like to sleep and wake up, and it’s integrated with HealthKit to save data under the Sleep Analysis category.\nI can’t use Bedtime because, as someone who works from home, I never wake up at the same time every day and I don’t have kids to drive to school. Bedtime is too optimistic for my bad habits. I think it’s a nice enhancement, though, and I bet it’ll be quite popular among iOS users.\nHealth and Activity\nIf all you ever wanted from the Activity app was a way to stay motivated by comparing your friends’ progress to yours, Apple has you covered in iOS 10.\nSharing is now built into Activity: once you’ve invited a friend to share data with you, the Sharing tab will display activity circles, total completion, and burned calories. 
You can tap through to see more details, hide your activity, and mute notifications; at any point, you can start an iMessage conversation with a friend – presumably to taunt or motivate them.\nIn my defense, I haven’t been wearing my Apple Watch for the past few weeks.\nThe “gamification” of Activity, combined with the Apple Watch, should help users push towards a daily goal and stay active. It’s a feature I plan to test in more depth once I get back into my exercise regimen.59 We’ll cover more workout and Activity changes in our review of watchOS 3.\nAs for Health, Apple has overhauled the app’s dashboard with four main sections represented by colorful artwork: Activity, Mindfulness, Nutrition, and Sleep. Each of these primary categories has an explanation video, and there’s also a general overview video about the Health app that you can watch by tapping a button at the bottom of the Health Data screen.\n\nIn an effort to make browsing Health less intimidating, Apple has simplified how you can view statistics recorded by your iPhone and Apple Watch. There’s a new Today page with a scrollable calendar ticker; you can tap any day to see all recorded data points as small previews (which support 3D Touch). Tapping one will take you to the category’s detail page, unchanged from iOS 9.\n\nIn the top right corner of the Health Data and Today pages, you’ll find a user icon to quickly access details such as date of birth, sex, and blood type. You can configure wheelchair use here, as well as export your recorded Health data as a .zip archive containing an XML backup. U.S. residents will be able to sign up to become organ donors with Donate Life (previously announced by Apple) in the Medical ID screen.\nAs I’ve been arguing for the past couple of years, the Health app will eventually have to find correlations between categories to help users understand how they’re living and what they should improve. 
Going beyond data collection and graphs should be the ultimate goal to turn Health into an assistant rather than a dashboard of data points. Until that’s the case, making the app prettier and easier to use is a good call.\nUniversal Clipboard\nApple is adding another feature as part of the Continuity initiative launched two years ago: clipboard transfer between devices.\nThe option, dubbed Universal Clipboard, is designed to have (almost) no interface and “just work” in the background. After you’ve copied something on one device, pasting on another nearby will fetch what you originally copied on the first device and paste it. Universal Clipboard works with text, URLs, images, and other data types that can be pasted on iOS.\nLike other Continuity functionalities, Universal Clipboard uses Apple IDs and peer-to-peer connectivity (Wi-Fi and Bluetooth) to determine devices in your proximity. Universal Clipboard is only engaged when you paste on a second device – it’s not constantly pushing your copied items to iCloud or broadcasting them to all devices nearby. Because Universal Clipboard is meant to quickly switch from one device to another, there’s a two-minute timeout on copied items – you won’t be able to paste an image you copied two days ago on your iPhone in a message thread on the iPad today.\nIn my tests, Universal Clipboard worked well. It takes about a second to paste text copied from another device. Pasting a photo was the only case where I came across a “Pasting from…” dialog that loaded for a couple of seconds.\nPasting an image with Universal Clipboard.\nUniversal Clipboard’s no-configuration approach may concern developers who don’t want data copied from their apps to propagate across devices. To ease those qualms, iOS 10 includes an API to restrict the pasteboard to the local device or set an expiration timestamp. 
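That API is UIPasteboard's new options dictionary. A minimal sketch of copying a sensitive string so it never syncs via Universal Clipboard and self-destructs after the system's two-minute window (option names per the iOS 10 SDK; the string is a placeholder):

```swift
import UIKit

// Sketch: restrict a copied item using the iOS 10 pasteboard options.
let secret = "a sensitive one-time password"  // hypothetical content
UIPasteboard.general.setItems(
    [["public.utf8-plain-text": secret]],
    options: [
        .localOnly: true,                                // never offered to nearby devices
        .expirationDate: Date().addingTimeInterval(120)  // cleared after two minutes
    ]
)
```

A password manager could apply both options to every copy, which is presumably how apps like 1Password would opt out of cross-device propagation.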
I suppose AgileBits and makers of other content-sensitive apps will provide settings to control the behavior of Universal Clipboard and disable it permanently.\nIn the latest update, Workflow lets you configure Universal Clipboard options.\nIt’s not a replacement for dedicated clipboard managers such as Copied and Clips, but Universal Clipboard is ideal if you don’t want to think about transferring clipboard contents between devices. When you need it, Universal Clipboard lets you paste a link copied on the iPad into a WhatsApp message on the iPhone, or a photo from the iPhone into Notes on a second device. There are no clipboard entries to organize and no persistent storage of information to worry about. Like opening apps through Handoff, it’s a nice option to always have with you.\nCalendar\nCalendar’s new features in iOS 10 are aimed at speed and location.\nData detectors for dates and times in iMessage conversations have been improved so they can pre-fill the event creation screen with details fetched from messages. If you’re planning a dinner with friends over iMessage and mention a place and time in the conversation, tapping the data detector should bring up the event creation UI with “dinner” as the title and location/time properly assigned. When it works, it’s a neat trick to save time in creating events.\nWhen creating an event in the Calendar app, iOS 10 suggests similar events so you can re-add them with one tap.\n\nIt’s not clear how far back into your history iOS 10 goes looking for old events. Event suggestions are handy – they’re not real event templates, but they pre-fill locations and times, too.\nSpeaking of locations, iOS 10’s Calendar can suggest a location to add to an event with one tap. If I had to guess, I’d say that iOS uses old events with the same name and frequent locations to suggest an address. 
And, if you create an event with travel time, Calendar will use the location of a previous event (not your current location) to calculate how long it’ll take you to get there.\nApple’s Calendar app isn’t as versatile or powerful as Fantastical or Week Calendar. But I’m not a heavy calendar user, and iOS 10’s proactive Calendar features have been smart in small and useful increments. I’m going to stick with Apple’s Calendar app for a while.\n“Removing” Apple Apps\niOS users have long clamored for the removal of pre-installed Apple apps from their devices. Such desire is understandable: Apple has kept adding built-in apps every other year, which hasn’t helped the perception that Apple itself is wasting users’ storage while also selling 16 GB iPhones as base models.60\niOS 10 adds the ability to remove the majority of pre-installed Apple apps61 from the Home screen. These are:\nCalculator\nCalendar\nCompass\nContacts\nFaceTime\nFind My Friends \nHome\niBooks \niCloud Drive\niTunes Store\nMail\nMaps\nMusic\nNews\nNotes\nPodcasts\nReminders \nStocks\nTips\nVideos\nVoice Memos\nWatch\nWeather \nThere’s a catch. By removing an Apple app, you won’t delete it from the system entirely – you’ll remove the icon and delete user data inside the app, but the core bits of each app – the actual binary – will still live in the operating system. According to Apple, removing every app listed above will only recover 150 MB from your device. If you were hoping to get rid of every Apple app and suddenly gain a couple of GBs of storage, you’ll be disappointed.\nDeleting a pre-installed Apple app works just like any other app on iOS: tap & hold on the icon, hit Delete, and you’re done. For each app, iOS will warn you that you’ll either lose your data or access to specific features, such as location sharing with Find My Friends, the Calculator icon in Control Center, or email data stored locally.62\n\nRemoving apps based on core system frameworks won’t delete data inside them. 
If you remove Contacts, your contacts won’t be deleted; the same applies to Reminders and Calendar, plus other iCloud data and documents. Effectively, you’re removing the shell of an app and its settings; deleting Mail, for instance, removes all your accounts from the app. If you remove Reminders and ask Siri to create a reminder, though, it’ll still be created and made available to third-party clients via EventKit.\nRestoring previously removed Apple apps could have been more intuitive. You have to open the App Store and search for the name of an app, or look for Apple’s developer profile page and tap the download button for each app you want to bring back. It would have been nice to have a dedicated area in Settings to view which apps have been removed with an easier way to restore them.\nRestoring Apple apps from the App Store.\nRestoring Apple apps is further confirmation of the fact that those apps aren’t actually gone – they’re just hidden. The download isn’t a download: it takes less than a second and it doesn’t even show a progress bar. Try this for yourself: remove an Apple app, find Apple’s developer page on the App Store, put your device in Airplane Mode, and hit download. The app will reappear on your Home screen without the need for an Internet connection.63\nDefault Links\n\nWith the current system, a removed Apple app can’t be replaced as the default choice with a third-party alternative. If you tap a mailto: link in Safari, iOS won’t ask if you want to use a different email client; it’ll tell you to open the App Store and re-download Mail (same with links to shared notes and other apps). Is Apple preparing for a future of customizable default apps?\nAs a company that prides itself on the tight integration of its hardware and software, caving to user pressure on the matter of pre-installed apps must have been, politically and technically, tough for Apple. 
The result is a compromise: Apple is letting users get rid of those Tips and Stocks apps (among others) that few seem to like, but they also can’t completely delete apps because of their ties with the OS.\nSome people will complain about this decision. I’m not sure users would like the opposite scenario where entire frameworks are deleted from iOS (if even possible without breaking Apple’s code signing), third-party apps lose access to essential APIs, and each download consumes several hundred MBs. Given the architectural complexities involved, the current solution seems the most elegant trade-off.\n\n\n \nPhotos and Camera\nA world-class portable camera is one of the modern definitions of the iPhone. Among many factors, people buy an iPhone because it takes great pictures. And Apple’s relentless camera innovation isn’t showing any signs of slowing down this year.\nBut the importance of the iPhone’s camera goes deeper than technical prowess. The Camera and its related app, Photos, create memories. Notes, Reminders, Maps, and Messages are essential iOS apps; only the Camera and Photos have a lasting, deeply emotional impact on our lives that goes beyond utility. They’re personal. They’re us.\niOS 10 strives to improve photography in two parallel directions: the capturing phase and the discovery of memories – the technical and the emotional. Each relies on the other; together, they show us a glimpse of where Apple’s hardware and services may be heading next.\nWide Color\nLet’s start with the technical bits.\nThe camera capture pipeline has been updated to support wide-gamut color in iOS 10. All iOS devices can take pictures in the sRGB color space; the 9.7-inch iPad Pro and the upcoming iPhone 7 hardware also support capturing photos in wide-gamut color formats.\nWhen viewed on displays enabled for the P3 color space, pictures taken in wide color will be beautifully displayed with richer colors that take advantage of a wider palette of options. 
That will result in more accurate and deeper color reproduction that wasn’t possible on the iPhone until iOS 10 (the 9.7-inch iPad Pro was the only device with a wide color-enabled display on iOS 9).\nThere are some noteworthy details in how Apple is rolling out wide color across its iOS product line, using photography as an obvious delivery method.\nWide color in iOS 10 is used for photography, not video. JPEGs (still images) captured in wide color fall in the P3 color space; Live Photos, despite the presence of an embedded movie file, also support wide color when viewed on the iPad Pro and iPhone 7 (or the Retina 5K iMac).\nApple has been clever in implementing fallback options for photos displayed on older devices outside of the P3 color space. The company’s photo storage service, iCloud Photo Library, has been made color-aware and it can automatically convert pictures to sRGB for devices without wide color viewing support.\nMore interestingly, wide-gamut pictures shared via Mail and iMessage are converted to an Apple Wide Color Sharing Profile by iOS 10. This color profile takes care of displaying the image file in the appropriate color space depending on the device it’s viewed on.\nEven as a tentpole feature of the iPhone 7, wide-gamut photography isn’t something most users will care (or know) about. Wide color is relevant in the context of another major change for iOS photographers and developers of photo editing software – native RAW support.\nRAW Capture and Editing\nApple used an apt and delicious analogy to describe RAW photo capture at WWDC: it’s like carrying around the ingredients for a cake instead of the fully baked product. 
Just as two chefs can use the same ingredients to produce wildly different cakes, RAW data can be edited by multiple apps to output different versions of the same photo.\nRAW stores unprocessed scene data: it contains more bits because no compression is involved, which leads to larger file sizes and higher performance requirements for capturing and editing RAW. On iOS 10, RAW capture is supported – rear camera only – on the iPhone SE, 6s and 6s Plus, 7 and 7 Plus (though not when using the dual camera), and the 9.7-inch iPad Pro, and it’s available as an API to third-party apps (Apple’s own Camera app doesn’t capture in RAW).\nTo store RAW buffers, Apple is using the Adobe Digital Negative (DNG) format; among the many proprietary RAW formats used by camera manufacturers, DNG is as close to an open, publicly available standard as it gets.64\nAt a practical level, the upside of RAW capture is the ability to revisit and modify specific values in post-production to improve photos in a way that wouldn’t be possible with processed JPEGs. On iOS 10, RAW APIs allow developers to create apps that can tweak exposure, temperature, noise reduction, and more after having taken a picture, giving professionals more creative control over photo editing.\nThings are looking pretty good in terms of performance, too. On iOS devices with 2 GB of RAM or more, the system can edit RAW files up to 120 megapixels; on devices with 1 GB of RAM, or if accessed from an editing extension inside Photos (where memory is more limited), apps can edit RAW files up to 60 megapixels.\nNative RAW support opens up an opportunity for developers to fill a gap on the App Store: desktop-class photo editing and management apps for pros. If adopted by the developer community, native RAW capture and editing could enable workflows that were previously exclusive to the Mac. 
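For developers, the capture side of this API lives in AVFoundation’s new AVCapturePhotoOutput class. A minimal sketch of how a third-party camera app might request a DNG capture follows – session setup, authorization, and file writing are omitted, the class and method names here are the iOS 10 AVFoundation APIs, and availability must be checked at runtime:

```swift
import AVFoundation

// Sketch only: requesting a RAW (DNG) capture with the AVCapturePhotoOutput
// API introduced in iOS 10. Session configuration and saving the DNG to disk
// or PhotoKit are omitted.
final class RAWCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func captureRAW() {
        // Not every device (or camera) supports RAW; check at runtime.
        guard let rawType = photoOutput.availableRawPhotoPixelFormatTypes.first else {
            print("RAW capture isn't available on this device")
            return
        }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawType.uint32Value)
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // iOS 10 hands back the RAW sample buffer through this delegate method.
    func capture(_ output: AVCapturePhotoOutput,
                 didFinishProcessingRawPhotoSampleBuffer rawSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard let rawBuffer = rawSampleBuffer else { return }
        // AVCapturePhotoOutput serializes the buffer into DNG data...
        let dngData = AVCapturePhotoOutput.dngPhotoDataRepresentation(
            forRawSampleBuffer: rawBuffer,
            previewPhotoSampleBuffer: previewPhotoSampleBuffer)
        _ = dngData // ...which the app can then write to a .dng file or save via PhotoKit.
    }
}
```
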
Imagine shooting RAW with a DSLR, or even an iPhone 7, and then sitting down with an iPad Pro to organize shots, flag pictures, and edit values with finer, deeper controls, while also enjoying the beauty and detail of wide-gamut images (which RAW files inherently are).\nShooting RAW in Obscura.\nI tested an upcoming update to Obscura, Ben McCarthy’s professional photo app for iOS, with RAW support on iOS 10. RAW can be enabled from the app’s viewfinder; after shooting with Obscura, RAW photos are saved directly into iOS’ Photos app.\nEditing RAW in Snapseed.\nGoogle’s Snapseed photo editor imported RAW files shot in Obscura without issues, and I was able to apply edits with Snapseed’s RAW Develop tool, saving changes back to Photos. I’m not a professional photographer, but I was still impressed by the RAW workflows now possible with third-party apps and iOS 10.\nThumbnails\nIt’s not related to RAW, but iOS 10 has also added an API to generate preview-sized images (thumbnails) from larger photos. The API gets a small version of an image directly from the camera, which is uncompressed and provided at a default size unless an app requests specific dimensions. This is another indication of how Apple wants to simplify the creation of desktop-level photo management apps – thumbnail generation through downscaling would be a memory-intensive task; iOS 10 solves it with a single API call.\nOn the other hand, while Apple has improved developer tools for RAW capture and editing, hurdles remain in terms of photo management. iCloud Photo Library, even at its highest tier, only offers 2 TB of storage; professional photographers have libraries that span decades and require several TBs. The situation is worse when it comes to local storage on an iPad, with 256 GB being the maximum capacity you can buy for an iPad Pro today. 
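As for the thumbnail API mentioned above: it hangs off the same AVCapturePhotoSettings object used for regular captures. A hedged sketch, using the documented iOS 10 property and CoreVideo keys (the 512×512 dimensions are arbitrary, chosen purely for illustration):

```swift
import AVFoundation

// Sketch only: asking iOS 10's AVCapturePhotoSettings for a pre-sized,
// uncompressed preview image to be delivered alongside the full photo.
// The 512x512 dimensions are arbitrary; omit them to get the default size.
func makeSettingsWithPreview() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    if let previewType = settings.availablePreviewPhotoPixelFormatTypes.first {
        settings.previewPhotoFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewType,
            kCVPixelBufferWidthKey as String: 512,
            kCVPixelBufferHeightKey as String: 512
        ]
    }
    return settings
}
```

The preview then arrives in the same delegate callback as the photo itself, as the previewPhotoSampleBuffer parameter – no manual downscaling required.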
Perhaps Apple is hoping that these limitations will push users to rely on cloud-based archival solutions that go beyond what’s offered by iCloud and iOS’ offline storage. However, it’s undeniable that it’s still easier for a creative professional to organize 5 TB of RAW files on a Mac than on an iPad.\nI have no reason to doubt that companies like Adobe will be all over Apple’s RAW APIs in iOS 10. I’m also curious to see how indie developers will approach standalone camera apps for RAW capture and quick edits. There’s still work to be done, but the dream of a full-featured photo capture, editing, and management workflow on iOS is within our grasp.\nLive Photos\nApple isn’t altering the original idea behind Live Photos with iOS 10: they still capture the fleeting moment around a still image, which roughly amounts to 1.5 seconds before a picture is taken and 1.5 seconds after. Photos have become more than still images thanks to Live Photos, and there are some nice additions in iOS 10.\nLive Photos now use video stabilization for the movie file bundled within them. This doesn’t mean that the iPhone’s camera generates videos as smooth as Google’s Motion Stills, but they’re slightly smoother than in iOS 9. Another nice change: taking pictures on iOS 10 no longer stops music playback.\nFurthermore, editing is fully supported for Live Photos in iOS 10. Apps can apply filters to the movie inside a Live Photo, with the ability to tweak video frames, audio volume, and size.65 To demonstrate the new editing capabilities, Apple has enabled iOS’ built-in filters to work with Live Photos, too.\nThe key advantage of Apple’s Live Photos is integration with the system Camera, which can’t be beaten by third-party options. I’d like to see higher frame rates in the future of Live Photos; for now, they’re doing a good enough job at defining what capturing a moment feels like.\n\nIt’s Called the Carousel\nThe photos on our devices are more than files in a library. 
They’re tiny bits of our past. The places we went to; the people we were; those we met. Together, they’re far more powerful than memory alone. Photos allow us to ache, cherish, and remember.\nWithout tools to rediscover and relive memories, none of that matters. A camera that’s always with us has enabled us to take a picture for every moment, but it created a different set of issues. There’s too much overhead in finding our old selves in a sea of small thumbnails. And what purpose does a photo serve if it’s never seen again?\nApple sees this as a problem, too, and they want to fix it with iOS 10. With storage, syncing, and 3D Touch now taken care of, the new Photos focuses on a single, all-encompassing aspect of the experience:\nYou.\nComputer Vision\nApple’s rethinking of what Photos can do starts with a layer of intelligence built into our devices. The company refers to it as “advanced computer vision”, and it spans elements such as recognition of scenes, objects, places, and faces in photos, categorization, relevancy thresholds, and search.\nSecond, Apple believes iOS devices are smart and powerful enough to handle this aspect of machine learning themselves. The intelligence-based features of Photos are predicated on an implementation of on-device processing that doesn’t transmit private user information to the cloud – not even Apple’s own iCloud (at least not yet).\nPhotos’ learning is done locally on each device by taking advantage of the GPU: after a user upgrades to iOS 10, the first backlog of photos will be analyzed overnight when a device is connected to Wi-Fi and charging; after the initial batch is done, new pictures will be processed almost instantaneously after taking them. 
Photos’ deep learning classification is encrypted locally, it never leaves the user’s device, and it can’t be read by Apple.\n\nYou’ll be surprised by how much Apple has accomplished with Photos in iOS 10.\n\nAs a Google Photos user, I was more than doubtful when Apple touted the benefits of on-device intelligence with iOS 10’s Photos app. What were the chances Apple, a new player in the space, could figure out deep learning in Photos just by using the bits inside an iPhone?\nYou’ll be surprised by how much Apple has accomplished with Photos in iOS 10. It’s not perfect, and, occasionally, it’s not as eerily accurate as Google Photos, but Photos’ intelligence is good enough, sometimes great, and it’s going to change how we relive our memories.\nMemories\nOf the three intelligence features in Photos, Memories is the one that gained a spot in the tab bar. Memories creates collections of photos automatically grouped by people, date, location, and other criteria. They’re generated almost daily depending on the size of your library, quantity of information found in photos, and progress of on-device processing.\nBrowsing Memories in iOS 10.\nThe goal of Memories is to let you rediscover moments from your past. There are some specific types of memories. For instance, you’ll find memories for a location, a person, a couple, a day, a weekend, a trip spanning multiple weeks, a place, or “Best Of” collections that highlight photos from multiple years.\nIn my library, I have memories for my trip to WWDC (both “Great Britain and United States” and “Myke and Me”), pictures taken “At the Beach”, and “Best of This Year”. There’s a common thread in the memories Photos generates, but they’re varied enough and iOS does a good job at bringing up relevant photos at the right time.\nDifferent types of memories.\nBehind the scenes, Memories are assembled with metadata contained in photos or recognized by on-device intelligence. 
Pieces of data like location, time of the day, and proximity to points of interest are taken into consideration, feeding an engine that also looks at aspects such as faces.\nScrolling Memories feels like flipping through pages of a scrapbook. Cover images are intelligently chosen from the app; if you press a memory’s preview, iOS brings up a collage-like peek with buttons to delete a memory or add it to your favorites.\n\nTapping a memory transitions to a detail screen where the cover morphs into a playable video preview at the top. Besides photos, Memories generates a slideshow movie that you can save as a video in your library. Slideshows combine built-in soundtracks (over 80), pictures, videos, and Live Photos to capture an extended representation of a memory that you can share with friends or stream for the whole family on an Apple TV.\nChoosing styles for Memories’ slideshows.\nEach video comes with quick adjustment controls and deeper settings reminiscent of iMovie. In the main view, there’s a scrollable bar at the bottom to pick one of eight “moods”, ranging from dreamy and sentimental to club and extreme. Photos picks a neutral mood by default, which is a mix of uplifting and sentimental; moods affect the music used in the slideshows, as well as the cover text, selection of media, and transitions between items. You can also change the duration of a movie (short, medium, and long); doing so may require Photos to download additional assets from iCloud.\nDeeper movie settings.\nTo have finer control over Memories’ movies, you can tap the editing button in the bottom right (the three sliders). Here, you can customize the title and subtitle with your own text and different styles, enter a duration in seconds, manually select photos and videos from a memory, and replace Apple’s soundtrack with your favorite music.66\nBelow the slideshow, Memories displays a grid of highlights. 
Both in the grid and the slideshow, Photos applies de-duplication, removing photos similar to each other.67 Apple’s Memories algorithm tends to promote pictures that are well-lit, or where people are smiling, to a bigger size in the grid. In Memories, a photo’s 3D Touch peek menu includes a ‘Show Photos from this Day’ option to jump to a specific moment.\nAs you scroll further down a memory’s contents, you’ll notice how Photos exposes some of the data it uses to build Memories with People and Places.\nThe memories you see in the main Memories page are highlights – the best memories recommended for you. In reality, iOS 10 keeps a larger collection of memories generated under the hood. For example, every moment (the sub-group of photos taken at specific times and locations) can be viewed as a memory. In each memory, you’ll find up to four suggestions for related memories, where the results are more hit-and-miss.\nRelated Memories\n\nRelated memories have rarely managed to find deeper connections between the main memory and photos related to it. In my library, a 2012 trip had a related memory of a trip in 2015 and a weekend in 2011. Related memories are inconsequential – they’re fillers.\nApple seems to know this, too, as related memories have an option to be saved in the main Memories screen. If the company itself doesn’t believe these memories are good enough to be shown in the highlights, I wonder why they’re available at all.\nI do, however, appreciate the inclusion of a setting to hide memories for holiday events in your home country. Excluding memories for events you don’t celebrate is a nice touch.\nIn many ways, Apple’s Memories are superior to Google Assistant’s creations: they’re not as frequent and they truly feel like the best moments from your past. Where Google Photos’ Assistant throws anything at the wall to see what you might want to save, I can’t find a memory highlighted by Photos that isn’t at least somewhat relevant to me. 
iOS 10’s Memories feel like precious stories made for me instead of clever collages.68\nMemories always bring back some kind of emotion. I find myself anticipating new entries in the Memories screen to see where I’ll be taken next.\nPeople and Groups\nAvailable for years on the desktop, Faces have come to Photos on iOS with the ability to browse and manage people matched by the app.\nThere are multiple ways to organize people recognized in your photo library. The easiest is the People view, a special album with a grid of faces that have either been matched and assigned to a person or that still need to be tagged.\n\nLike on macOS, the initial tagging process is manual: when you tap on an unnamed face, photos from that person have an ‘Add Name’ button in the title bar. You can choose one of your contacts to assign the photos to.\nAdding John as a recognized contact.\nAs you start building a collection of People, the album’s grid will populate with more entries. To have quicker access to the most important people – say, your kids or partner – you can drag faces towards the top and drop them in a favorites area.69\n\n \nMarking people as favorites\n\nAnother way to deal with faces is from a photo’s detail view. In iOS 10, you can swipe up on a photo (or tap ‘Details’ in the top right) to locate it on a map, show nearby photos, view related memories (again, mostly chosen randomly), and see which people Photos has recognized.\nSwipe up to view details of a photo, including people.\nThis is one of my favorite additions to Photos.70 Coalescing location metadata and faces in the same screen is an effective way to remember a photo’s context.\nNo matter how you get to a person’s photos, there will always be a dedicated view collecting them all. If there are enough pictures, a Memories-like slideshow is available at the top. Below, you get a summary of photos in chronological order, a map of where photos were taken, more related memories, and additional people. 
When viewing people inside a person’s screen71, iOS will display a sub-filter to view people and groups. Groups help you find photos of that person and yourself together.\nDue to EU regulations on web photo services, I can’t use Google Photos’ face recognition in Italy; therefore, I can’t compare the quality of Google’s feature with Photos in iOS 10. What I have noticed, though, is that local face recognition in Photos isn’t too dissimilar from the functionality that existed in iPhoto. Oftentimes, Photos gets confused by people with similar facial features such as beards; occasionally, Photos can’t tell that a photo of someone squinting belongs to a person it has already recognized. But then other times, Photos’ face recognition is surprisingly accurate, correctly matching photos from the same person through the years with different hairstyles, beards, hair color, and more. It’s inconsistently good.\nDespite some shortcomings, I’d rather have face recognition that needs to be trained every couple of weeks than not have it at all.\nYou have to train face recognition when things go wrong.\nYou can “teach” Photos about matched people in two ways: you can merge unnamed entries that match an existing person (just assign the same name to the second group of photos and you’ll be asked to merge them), or you can confirm a person’s additional photos manually. You can find the option at the bottom of a person’s photos.\nNot This Person\n\nThere are a couple of ways to tell the app a photo doesn’t belong to a recognized person. On iPhones with 3D Touch, you can press on a photo, swipe the peek upwards, and tap ‘Not This Person’.\nAlternatively, you can select a photo from the grid view, hit Share, and find the same option in the share sheet. This menu also contains a button to set a photo as the Key Face for that person, which will be used as the default face preview.\nThe biggest downside of face support in iOS 10 is the lack of iCloud sync. 
Photos runs its face recognition locally on each device, populating the Faces album without syncing sets of people via iCloud Photo Library. The face-matching algorithm is the same across devices, but you’ll have to recreate favorites and perform training on every device. I’ve ended up managing and browsing faces mostly on my iPhone to eschew the annoyance of inconsistent face sets between devices. I hope Apple adds face sync in a future update to iOS 10.\nConfirming faces in Photos is a time-consuming, boring process that, however, yields a good return on investment. It’s not compulsory, but you’ll want to remember to train Photos every once in a while to help face recognition. In my first training sessions, suggestions were almost hilariously bad – going so far as to suggest that pictures of Myke Hurley and me were of the same person. After some good laughs and taps, Photos’ questions have become more pertinent, stabilizing suggestions for new photos as well.\nFace recognition in iOS 10’s Photos is not a dramatic leap from previous implementations in Apple’s Mac clients, but it’s good enough, and it can be useful.\nPlaces\nDisplay of location metadata has never been Photos’ forte, which created a gap for third-party apps to fill. In iOS 10, Apple has brought MapKit-fueled Places views to, er, various places inside the app.\nIf Location Services were active when taking a picture, a photo’s detail view will have a map to show where it was taken. The map preview defaults to a single photo. You can tap it to open a bigger preview, with buttons to show photos taken nearby in addition to the current one.\nViewing nearby photos.\nWhen in full-screen, you can switch from the standard map style to hybrid or satellite (with and without 3D enabled). The combination of nearby photos and satellite map is great for visualizing clusters of photos taken around the same location across multiple years. 
When you want to see the dates of all nearby photos, there’s a grid view that organizes them by moment.\n\nNearby photos make 3D maps useful, too. I seldom use Flyover on iOS, but I like to zoom into a 3D map and view, for instance, photos taken around the most beautiful city in the world.\nYou can view all places at once from a special Places album. By default, this album loads a zoomed-out view of your country, but you can move around freely (like in Nearby) and pan to other countries and continents. It’s a nice way to visualize all your photos on a map, but it can also be used to explore old photos you’ve taken at your current location thanks to the GPS icon in the bottom left.\n\nAs someone who’s long wanted proper Maps previews inside Photos, I can’t complain. Nearby and Places are ubiquitous in Photos and they add value to the photographic memory of a picture. Apple waited until they got this feature right.\nSearch\nProactive suggestion of memories and faces only solves one half of Photos’ discovery. Sometimes, you have a vague recollection of the contents of a photo and want to search for it. Photos’ content search is where Apple’s artificial intelligence efforts will be measured up against Google’s admirable natural language search.\nPhotos in iOS 10 lets you search for things in photos. Apple is tackling photo search differently than Google, though. While Google Photos lets you type anything into the search field and see if it returns any results, iOS 10’s Photos search is based on categories. When typing a query, you have to tap on one of the built-in categories for scenes and objects supported by Apple. If there’s no category suggestion for what you’re typing, it means you can’t search for it.\nIntelligent search in Photos.\nThe search functionality is imbued with categories added by Apple, plus memories, places, albums, dates, and people – some of which were already supported in iOS 9. 
Because of Apple’s on-device processing, an initial indexing will be performed after upgrading to iOS 10.72\nThe range of categories Photos is aware of varies. There are macro categories, such as “animal”, “food”, or “vehicle”, to search for families of objects; mid-range categories that include generic types like “dog”, “hat”, “fountain”, or “pizza”; and there are fewer, but more specific, categories like “beagle”, “teddy bear”, “dark glasses”, or, one of my favorites, the ever-useful “faucet”.\nExamples of categories in Photos’ search. (Tap for full size)\nApple’s goal was to provide users with a reasonable set of common words that represent what humans take pictures of. The technology gets all the more impressive when you start concatenating categories with each other or with other search filters. Two categories like “aircraft” and “sky” can be combined in the same search query and you’ll find the classic picture taken from inside a plane. You can also mix and match categories with places and dates: “Beach, Apulia, 2015” shows me photos of the beach taken during my vacation in Puglia last year; “Rome, food” lets me remember the many times I’ve been at restaurants here. I’ve been able to concatenate at least four search tokens in the same query; more may be possible.\nSearch token concatenation for more precise results.\nAll this may not be shocking for tech-inclined folks who have used Google Photos. But there are millions of iOS users who haven’t signed up for Google’s service and have never tried AI-powered photo search before. To have a similar feature in a built-in app, developed in a privacy-conscious way, with a large set of categories to choose from – that’s a terrific change for every iOS user.\nApple isn’t storing photos’ content metadata in the cloud to analyze them at scale – your photos are private and indexing/processing are performed on-device, like Memories (even if you have iCloud Photo Library with Optimize Storage turned on). 
It’s an over-simplification, but, for the sake of the argument, this means that iOS 10 ships with a “master algorithm” that contains knowledge of its own and indexes photos locally without sending any content-related information to the cloud. Essentially, Apple had to create its computer vision from scratch and teach it what a “beach” looks like.\nIn everyday usage, Photos’ scene search is remarkable when it works – and a little disappointing when it doesn’t.\nWhen a query matches a category and results are accurate, content-aware search is amazing. You can type “beach” and Photos will show you pictures of beaches because it knows what a beach is. You can search for pictures of pasta and suddenly feel hungry. Want to remember how cute your dog was as a puppy? There’s a category for that.\nI’ve tested search in Photos for the past three months, and I’ve often been able to find the photo I was looking for thanks to query concatenation and mid-range descriptions, such as “pasta, 2014” or “Rome, dog, 2016”. Most of the time, what Apple has achieved is genuinely impressive.\nA Dictionary of Categories?\nA few months ago, I speculated that Apple used a combination of public images as well as databases licensed from someone else (think: stock photo companies) to develop its algorithm. But there’s probably no need for Apple to buy photos to train their algorithms. WordNet, the popular lexical database, offers a controlled vocabulary of every concept in the English language for free, for everyone. A corresponding hierarchy of WordNet nouns is mirrored on ImageNet, which provides 14 million publicly available images across ~22,000 synonym sets, with an average of 500–1000 photos per set. In most cases, 500 photos should suffice to train computer vision algorithms to recognize matched nouns in photos – whether they’re full-resolution pictures or thumbnails.\nA possible hint of Apple’s reliance on ImageNet is that categories in Photos are only nouns. 
There are no verbs to search for. Whatever Apple has picked to train their computer vision, though, they haven’t always done a great job at curating categories exposed to users. Have you ever wished you could search for “apparatus” in your photo library? What about “H2O” and “habiliment”? It’s nice that iOS 10 supports a broad range of search queries, but some of them feel like they’ve been lifted straight from a dictionary or a set of default tags.\nOn a few occasions, Photos’ categories didn’t contain results I was expecting to be in there, or they matched a photo that belonged to a different category (such as my parents’ border collie, recognized as a “bear”, or fireworks tagged as “Christmas tree”).\nThat’s one adorable bear.\nUnderstandably, Apple’s first take on scene search with computer vision isn’t perfect. These issues could be remedied if there was a way to fix false positives and train recognition on unmatched photos, but no such option is provided in iOS 10. The decision to omit manual intervention hinders the ability to let users help Photos’ recognition, and it makes me wonder how long we’ll have to wait for improvements to the algorithm.\nCompared to Google Photos’ search, Apple’s version in iOS 10 is already robust. It’s a good first step, especially considering that Apple is new to this field and they’re not compromising on user privacy.\nAn Intelligent Future\nWhat’s most surprising about the new Photos is how, with one iOS update, Apple has gone from zero intelligence built into the app to a useful, capable alternative to Google Photos – all while taking a deeply different approach to image analysis.\nAdmittedly, iOS 10’s Photos is inspired by what Google has been doing with Google Photos since its launch in May 2015. 200 million monthly active users can’t be wrong: Google Photos has singlehandedly changed consumer photo management thanks to automated discovery tools and scene search. 
Any platform owner would pay attention to a third party asking users to delete photos from their devices and archive them in a different cloud.\nApple has a chance to replicate the success of Google Photos at a much larger scale, directly in an app that millions of users open every day. It isn’t just a matter of taking a page from Google for the sake of feature parity: photos are, arguably, the most precious data for iPhone users. Bringing easier discovery of memories, new search tools, and emotion into photo management yields loyalty and, ultimately, lock-in.\nThis isn’t a fight Apple is willing to give up. In their first round, Apple has shown that they can inject intelligence into Photos without sacrificing our privacy. Let’s see where they go from here.\n\n\n \nDesign\nWhile iOS 10 hasn’t brought a sweeping UI redesign, changes sprinkled throughout the interface underscore how Apple has been refining the iOS 7 design language. On the other hand, a new direction for some apps appears to hint at something bigger.\nBold Typography, Larger Controls\nApple Music epitomizes a strikingly different presentation of full-screen views, content grids, and affordances that hint at user interaction.\nIn iOS 10, Apple Music eschews the traditional title bar in favor of large, bold headlines for first-level views such as For You and Browse.\nA new look for title bars in Apple Music.\nThe use of San Francisco bold in lieu of a centered title bar label is similar to a newspaper headline. The heavy typeface sticks out as an odd choice initially, but it clarifies the structure and increases the contrast of Apple Music – two areas in which the company was criticized over the past year.\nThe evolution of Apple’s Music app. (Tap for full size)\nTo group content within a view, or to label sub-views in nested navigation, Apple relies on Dynamic Type to scale text at different sizes. 
Dynamic Type doesn’t affect headlines.\nDynamic Type and Apple Music’s new design.\nThe text-based back button at the top of a sub-view isn’t gone, but titles are always displayed in bold next to the content the user is viewing. An album’s name, for instance, isn’t centered in the title bar anymore; instead, it sits atop the artist’s name.\nAlbum titles no longer sit in the title bar – they’re part of the content itself.\nThe combination of multiple font weights, color, and thicker labels provides superior hierarchy for content displayed on a page, separating multiple types of tappable items. By doing less, Apple achieves a set of stronger affordances.\nThe visual statement is clear: when you see a black headline or sub-title, it can’t be tapped. You’ll have to tap on the content preview (artwork, photos) or colored label (artist names, buttons) to continue navigation or perform a task.\nThis goes beyond fonts. To further limit confusion, Apple Music now displays fewer items per page. Every element – whether it’s an album, a text button, or a collection of playlists – is also larger and more inviting to the touch.\nFewer, bigger touch targets.\nThe trade-off is reduced information density and the perception that Apple is babysitting their users with albums and buttons that get in the way too much. It’s a case of overshooting in the opposite direction of last year’s button-laden Music app; Apple has a history of introducing new design languages and intentionally exaggerating them in the first version. 
The new Apple Music is a reset of visual expectations.\nThis is best exemplified by the Now Playing widget at the bottom of the screen: besides being taller (and hence more tappable), the contextual menu it opens blurs the background and is filled with large, full-width buttons that combine text and icons.\n\nIt’s impossible to misunderstand what each of these does, and selecting them doesn’t feel like playing a tap lottery, as was the case with the old contextual menu of iOS 9. Apple doesn’t appear too worried about breaking design consistency with other share dialogs on iOS as long as Apple Music’s works better.\n\nThe new Apple Music is a reset of visual expectations.\n\nThe company’s newfound penchant for big titles and explaining functionalities ahead of interaction doesn’t stop at Apple Music. Apple News makes plenty of use of bold headlines for article titles (where they feel like an appropriate fit) and multiple colors to distinguish sections.\nAnother non-traditional title bar in Apple News.\nThe Home app adheres to similar principles. There’s no fixed title bar at the top of the screen; rather, a customizable background extends to the top of a view, with a large title indicating which room is being managed.\nHome has no real title bars either.\nWe can also look outside of apps for a manifestation of Apple’s bold design sentiment. In Control Center, splitting features across three pages lets functionality stand out more with bigger, comfortable buttons that aren’t constrained by a single-page design. This is evident in the music control page, where album artwork can be tapped to open the app that is currently playing audio.\nFinally, let’s consider the Lock screen. In addition to redesigned notifications and widgets (which can simply be pressed for expansion), Apple is using thicker fonts and expanded audio controls.\nThe evolution of audio controls on the Lock screen. 
(Tap for full size)\nBigger song information, larger buttons, and a volume nub that can be grabbed effortlessly. I see these as improvements over the iOS 9 Lock screen.\nButtons\nAh, buttons. The much contested, derided aspect of the iOS 7 design isn’t officially changing with iOS 10. According to the iOS Human Interface Guidelines, this is still the default look of a system button in iOS 10:\n\nAcross iOS 10, however, we can see signs of Apple moving back to eye-catching buttons with borders and filled states in more apps.\nLet’s start with Apple Music again. In the iOS 10 version, there are numerous instances of button-y buttons that weren’t there in iOS 9.\n\nButtons that don’t have borders or a filled state are still present, but most of them have been redrawn with a thicker stroke to increase contrast with the app’s white background.\nMessages has an interesting take on buttons. Most of them are consistent with iOS 9, but the two buttons to open the Camera or pick a photo from the library are displayed as icons inside a square with rounded corners.\nThose are two big buttons.\nThese replace the textual buttons of the iOS 9 photo picker, where one of them could be mistaken for the label of the scrollable gallery shown above it.\nThe same look is used for HomeKit accessories. Device icons are contained in a square that shows the accessory’s name, its icon, and intensity level.\n\nThe use of highlights and colors helps distinguish on-off states for devices that are turned off (black text, translucent button) and on (colored icon, white-filled button).\nFilled circles with glyphs are a recurring button type in iOS 10. 
They’re used in a few places:\nCircular buttons in iOS 10.\nSpotlight: search shortcuts to FaceTime, message, or call a contact;\nCamera on iPad: the HDR, timer, and iSight buttons have been updated with the new circular design;\nContact cards: this is a notable change given the ability to add third-party messaging and VoIP app shortcuts for contacts. Apple has moved buttons to get in touch with a user to the top of the card;\nMaps: the detail card of a point of interest/address has new buttons to call, share, mark as favorite, and add to Contacts.\nOther examples of buttons redesigned for iOS 10 include the back button in the status bar to return to an app (it’s got a new icon) and variations of ‘Get Started’ buttons for apps like Calendar and Apple News, which are now filled rectangles.\nUpdates to buttons in iOS 10 may indicate that Apple heard the feedback that many users don’t realize text labels can be tapped to initiate an action, but we’ll have to wait until next year for further proof.\nCards and Stacked Views\nIn Apple Music, Messages, and Maps, the company has rolled out new types of views that could be interesting if ported to other apps.\nFirst, stacked views. In Music’s Now Playing screen and the iMessage App Store, iOS 10 features stacked panels that open on top of the current view, keeping a tiny part of it visible in the background. There’s a nice animation for the view that recedes and shrinks from the status bar.\nStacked views.\nStacked views are an intriguing way to show nested navigation. 
I wonder if more full-screen views that use a back button in the top left could be redesigned with this layout, perhaps using a vertical swipe to dismiss the foreground panel.\nThere are plenty of card-like interfaces being used in iOS 10 to supplant full-screen views, popups, and other kinds of panels.\nFrom Maps’ search suggestions that slide up from the bottom of the map to Control Center’s pages, the Apple TV (and AirPods) setup card, and, in a way, expanded notifications, it feels like Apple has realized it’s time to take advantage of bigger iPhone displays to drop modal popups and full-screen views.\nCards in iOS 10.\nCards enhance usability with definite boundaries and a concise presentation of content. I like where this is going.\nAnimations\nContextual animations and transitions have always been part of iOS’ visual vocabulary. iOS 10 brings several improvements on this front, including APIs that allow for interactive and interruptible animations.\nIf developers support the object-based animation framework added to UIKit with iOS 10, they’ll have deeper control over interrupting animations and linking them with gesture-based responses. These improved animations (based on UIViewPropertyAnimator) can be paused and stopped, scrubbed (moved forward and back), and reversed at any point in their lifecycle. In short, it means apps no longer have to finish an entire animation if they want to react to other changes.\nApps can feel more responsive – and faster – with interruptible animations. It’s not a major change per se, but it’s a welcome response to iOS 7’s indulgent animation curve.\nEmoji Updates\nNew emoji from the Unicode 9.0 spec aren’t available in iOS 10.0 (they’ll likely be added in a point release in the near future), but Apple still found a way to ship notable emoji updates that will entice users to upgrade.\nSeveral emoji have been redesigned with less gloss, more details, and new shading. 
This is most apparent in the Faces category where characters have a more accentuated 3D look. They remind me of emoticons from the original MSN Messenger, redrawn for the modern age.\nRedesigned Emoji in iOS 10. (Tap for full size)\nApple has implemented the ZWJ (zero-width joiner) technique to add more gender-diverse emoji. Technically, these are combinations of multiple existing characters (codepoints) joined in a ZWJ sequence. To users, they’ll look like new emoji added to the system keyboard, and Apple didn’t miss the opportunity to announce them with a press release.\nAlas, Apple’s emoji keyboard still doesn’t have a search field. If emoji suggestions fail to bring up the emoji you’re looking for, you’ll still want to keep Gboard installed as a fast way to search for emoji.\nSound Design\nIn addition to visual tweaks, Apple did some work in the aural department as well.\nThe keyboard in iOS 10 has distinctive new “pop” sounds for different kinds of keys, including letters, the delete key, and the space bar.73 Some people will find these sounds distracting and cartoon-ish; I think they add an extra dimension to typing on the software keyboard. Because the keyboard has multiple layers of “popping bubbles”, you can now hear what you type in addition to seeing it. I’m a fan.\n\n \n\nIt is the lock sound, though, that takes the crown for the most surprising sound effect of iOS 10. I still can’t decide what it is, but I like it.\n\n \n\nSound design is often underrated. An intelligent use of sound effects can augment the visual experience with context, personality, and just the right amount of whimsy. Whoever is behind the sound-related changes in iOS 10, I want to hear more from them.\nA State of Flux?\nApple is continuing to iterate on the design language they introduced two years ago, but they’re doing so inconsistently across the system, experimenting with new ideas without fully committing to them. There are multiple design languages coexisting in iOS 10. 
At times, it’s hard to reconcile them.\nThe most notable changes mentioned above – the bold look of Apple Music and the revised look of buttons – aren’t new guidelines for a system-wide refresh. They’re isolated test-drives scattered throughout the system without a common thread.\n\nThere are multiple design languages coexisting in iOS 10.\n\nMusic, News, and Home have little in common from a functional standpoint, and yet they share the same aesthetic. Does Apple consider these apps the baseline of iOS interfaces going forward? Or should we prepare for an increasingly diversified constellation of Apple apps, each built around a design specifically tailored for it? What types of apps should adopt the “big and bold” style? Should developers read the tea leaves in Apple’s app redesigns this year and prepare for updated guidelines twelve months from now?\nTaken at face value, what we have in iOS 10 is a collection of design refinements. We also have a clique of apps that look different from the rest of Apple’s portfolio, which may portend future change.\nUltimately, we’re left asking: where do we go from here?\n\n\nProactive\niOS’ Proactive assistant, introduced last year as a set of suggested shortcuts for apps based on user habits and context, is expanding to locations and contacts in iOS 10, and gaining a foothold in the system keyboard.\nIf you’re in an iMessage conversation and someone asks you for a contact’s phone number or email address, iOS will automatically put that suggestion in the QuickType keyboard for one-tap insertion. It doesn’t have to be a reply to an existing message: if you compose a new email and type “[Name]’s phone number is”, QuickType will also proactively suggest the phone number from your address book.\n\nEven more impressively, if someone asks “Where are you?” on iMessage, QuickType will show a button to send your current location. 
Tap it, and a Maps bubble will be sent; the other person can tap it to open a full-screen preview and get directions.74\nSharing your current location from QuickType in iMessage.\nNSUserActivity plays a role in proactive suggestions, too. Apps can push out activities for places and have them appear as suggestions in other apps.\nA Yelp suggestion in Maps.\nA restaurant listing from Yelp, for example, can be suggested in Maps’ search view automatically; an app that displays hotel reviews can mark the location the user is viewing, and if the user switches to a travel planning app, that address can be proactively suggested without the need to search for it again.\n\nRecently viewed places can even be suggested as shortcuts when switching between apps to open directions in Maps.\nMaps shortcuts for places viewed in third-party apps.\nThe system is an ingenious spin on NSUserActivity – a framework that developers were asked to start supporting last year for Spotlight search and Siri Reminders. By leveraging existing APIs and work developers have already put into their apps, iOS 10 can be smarter and use location-based activities as dynamic “bookmarks” in the system keyboard.\nWhen these suggestions work, they’re impressive and delightfully handy. In my tests, I received suggestions for addresses listed on webpages in Safari (and properly marked up with schema.org tags) and Yelp inside Maps; iOS 10 suggested addresses for stores and restaurants when I was switching between Yelp, Safari, Maps, and Messages, and it removed suggestions after I closed the webpages in Safari or the listings in Yelp.\nI’ve found other QuickType suggestions to be more inconsistent. When talking in English on iMessage, QuickType was pre-filled with suggestions for trigger sentences such as “Let’s meet at” or “We’re going to” because I was viewing a location in Maps or Yelp. 
I couldn’t get the same suggestions for different phrases like “Let’s have dinner at” or “See you in 10 minutes at”.\nI couldn’t get proactive QuickType suggestions to work in Italian at all. This is an area where Apple’s deep learning tech should understand how users share addresses and contact information with each other. I’d expect Proactive to gain more predictive capabilities down the road, such as Calendar or Apple Music integration.\nThere are more instances of Proactive suggestions in iOS 10 that are subtle, but useful. When searching in Spotlight, QuickType will offer suggestions for locations and other content as soon as you start typing. Previous searches are listed at the bottom of Siri suggestions (and I haven’t found a way to disable them, which could be problematic).\nProactive shortcuts and previous searches in Spotlight.\nIf you’re already looking at a location in Maps or apps that mark up addresses correctly, you can invoke Siri and say “get me there” to open directions to the address you’re viewing. ETA uses this feature to start directions to a place you’re viewing in the app.\nOpening directions from ETA with Siri.\nIt’s no Google Now on Tap, but it’s easy to see how Apple could soon replicate some of that functionality through various types of NSUserActivity.75\nApple is moving towards making Proactive more than a standalone page of shortcuts. Rather, Proactive is becoming an underlying feature of iOS, connecting an invisible web of activities when and where they make the most sense.\n\n\nKeyboards\nWhile last year’s software keyboard improvements focused on iPad productivity, iOS 10 brings pleasant enhancements that will benefit every iOS user.\nMultilingual Keyboard\nThe most unexpected change in iOS 10 will be as important as copy & paste for millions of international users. iOS 10 adds support for multilingual typing without switching between keyboards.\nThe annoyance of alternating keyboards isn’t an issue everyone can relate to. 
Most single-language speakers only deal with emoji as a separate “keyboard” that requires switching from the QWERTY layout. Those users probably don’t even see emoji as an additional keyboard but just as a special mode of the main system one. Millions of people have never seen iOS’ old keyboard system as a problem.\nHow most English speakers deal with the system keyboard.\nFor speakers of multiple languages, the experience couldn’t be more different. As soon as a third keyboard is added to iOS, the emoji face turns into a globe button to switch between keyboards. Tapping it repeatedly cycles between all keyboards; alternatively, holding the globe button brings up a list of them.\nHow international users switch between keyboards.\nAnyone who uses an iOS device to hold conversations in multiple languages is subject to a slower experience. When you’re constantly jumping between iMessage conversations, Twitter replies, Facebook, email, Slack, and Notes, and when you’re staying in touch with friends in multiple languages, and after you’ve been doing it every day for years, those seconds spent cycling through keyboards add up. Millions of people see this as one of the biggest flaws of iOS.\nIn iOS 10, Apple is taking the first steps to build a better solution: you can now type in multiple languages from one keyboard without having to switch between international layouts. You don’t even have to keep multiple keyboards installed: to type in English and French, leaving the English one enabled will suffice. Multilingual typing appears to be limited to selected keyboards, but it works as advertised, and it’s fantastic.\nThe idea is simple enough: iOS 10 understands the language you’re typing in and adjusts auto-correct and QuickType predictions on the fly from the same keyboard. 
Multilingual typing supports two languages at once; it doesn’t work with dictation, but it can suggest emoji in the QuickType bar for multiple languages as well.\nSwitching between English and Italian from the English keyboard.\nI started testing multilingual typing on my iPhone and iPad Pro on the first beta of iOS 10. The best part is that there’s very little to explain: suggestions retain the predictive nature of QuickType based on the context of the app or conversation, and you can even switch between languages within the same sentence. There’s no training or configuration involved: it’s as if two keyboards were rolled into one and gained the dynamic context-switching of a multilingual person.\nKnowing which languages can work with multilingual typing is a different discussion. Apple hasn’t updated their iOS feature availability page with details on multilingual typing yet. My understanding is that only keyboards with support for QuickType predictive suggestions and with a traditional QWERTY layout support multilingual typing. You should be able to mix and match Italian and English, or Dutch and French, or German and Spanish, for instance, but not Chinese and English within the same keyboard due to differences in alphabets and characters.\nI’ve been having conversations with my family in Italian while talking to colleagues and readers in English. I’m impressed with iOS 10’s ability to detect languages on a word-by-word basis. I assumed the system could be confused easily, particularly with typos or words that are similar between two languages, but that only happened a couple of times over three months. Switching mid-sentence between Italian and English (as I often do when talking about work stuff with my girlfriend, for example) is fast and accurate.\n\nMultilingual typing is as important as copy & paste for millions of international users.\n\nMost new iOS features take some time to get used to; multilingual typing isn’t one of them. 
After years spent fighting the keyboard switcher and auto-correct with multiple languages, multilingual typing is a huge relief. It’s an elegant solution to a difficult problem, and it makes conversations flow naturally. I’m happy to see Apple catering to users who speak multiple languages with a feature that others will never (understandably) care about.\nMultilingual typing has already become an essential feature of my iOS experience. I love it.\nEmoji Suggestions\nApple’s improvements to typing and QuickType don’t stop at text and Proactive – they include emoji as well.\nEmoji suggestions in QuickType. (Tap for full size)\nIf you’ve typed a word or expression that iOS 10 associates with an emoji, such as “pizza” or “not sure”, a suggested emoji will appear in QuickType. You can either put the emoji next to the word you’ve typed (by putting a space after the word and then tapping the emoji) or replace the word with the emoji itself (don’t add a space and tap the emoji). If emoji suggestions don’t immediately appear in an app, try inserting at least 5 emoji from the emoji keyboard first.76\n\n \nEmoji suggestions\n\nIn my tests, emoji suggestions have been good, often impressive. I’ve received emoji suggestions in both English and Italian, for a variety of common expressions (like “yum”, “love you”, or “I’m fine”) and with up to three suggestions for a single word (such as “lol”). Popular emoji like the thumbs up/down, clapping hands, and high five can be suggested if you know the trigger word/expression. From this point of view, emoji suggestions are visual text replacements – for instance, I now type “great” and replace the word whenever I want to insert a thumbs up in a message.\nHowever, because the predictive engine is young and there are so many different ways to describe an emoji, the dictionary is still growing. 
Italian doesn’t support as many suggestions as English (“think”, for instance, brings up the Thought Balloon emoji in English; the Italian equivalent, “penso”, doesn’t – but the infinitive form, “pensare”, does); some expressions don’t show an obvious emoji suggestion (try with “blue heart” or “friends”).77\nAccording to Apple, their Differential Privacy technology will be used to understand how iOS users type emoji. Hopefully, such a system can learn and improve its emoji definitions over time as it looks at how people in aggregate use emoji in the real world. If it works, it’s going to make one of the best tweaks to iOS even better.\nCustom Keyboards\nDespite the creativity shown by developers, third-party keyboards haven’t received much love from Apple since their debut in iOS 8. Even without meaningful improvements to the API, two small adjustments in iOS 10 make using custom keyboards slightly better than in iOS 9.\nThe first change sounds like an Apple engineer remembered a bug and found the time to fix it. In iOS 10, custom keyboards transition on screen with the same bottom-up slide of Apple’s keyboard. Thanks to this, opening a custom keyboard isn’t as jarring as before.\n\n \niOS 10’s new slide transition for custom keyboards\n\nFurthermore, iOS 10 lets third-party keyboards display the system keyboard switcher (the globe key menu) with the same options you get in the Apple keyboard.78\nGboard (left) with a custom keyboard switcher; TextExpander updated for iOS 10 (right) with the new system one.\nI still don’t think Apple is particularly invested in the idea of custom keyboards (the lack of any new features is telling), but at least they’ve done the bare minimum to ensure that a third-party keyboard can be used as a primary one without too much struggle. 
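Under the hood, adopting the system switcher is nearly a one-line change for keyboard developers: in iOS 10, UIInputViewController gains a handleInputModeList(from:with:) method that keyboards forward their globe key’s touch events to. A minimal sketch (the button setup is hypothetical; the API call is the real one):

```swift
import UIKit

// Sketch of a custom keyboard adopting iOS 10's system globe-key menu.
class KeyboardViewController: UIInputViewController {
    let globeKey = UIButton(type: .system) // hypothetical "next keyboard" key

    override func viewDidLoad() {
        super.viewDidLoad()
        globeKey.setTitle("🌐", for: .normal)
        // Forward all touch events so a tap switches keyboards and a long
        // press pops up the same switcher menu the stock keyboard shows.
        globeKey.addTarget(self,
                           action: #selector(handleInputModeList(from:with:)),
                           for: .allTouchEvents)
        view.addSubview(globeKey)
    }
}
```

Because the system handles both the tap and the long press, a third-party keyboard no longer has to reimplement the switcher list itself.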
Apple must have recognized the value of some custom keyboards for accessibility purposes, languages iOS doesn’t support, and sharing features for messaging apps that aren’t iMessage.\nThe likes of Google and Microsoft benefitting from these improvements is the kind of trade-off Apple will have to consider as they keep opening up iOS for everyone.\n\n\nThe iPad\niPad users who were craving the same attention as last year will be disappointed by iOS 10’s scarcity of iPad-only features. There are some iPad changes, but none of them have the impact of Split View or Picture in Picture.\nAs mentioned before, there are new three-panel modes for Mail and Notes, a Now Playing sidebar in Apple Music, and in-app split view for Safari. There’s also a different look for alarms in the Clock app. Everything else is a basic adaptation of iPhone layouts or a refinement of the same views in iOS 9.\nApple brought a few tweaks to the Camera viewfinder in iOS 10. On the iPhone, the camera flip button has been moved to the bottom, which makes it easier to switch between the rear and front-facing cameras as you don’t have to reach to the top of the screen. On the iPad, most of the interface has been redrawn with circular buttons on the right and a persistent zoom slider on the left.\nThe bigger iPad-only interface changes in iOS 10 can be collected in a single gallery:\n\nMoving on to other features, Spotlight search invoked from an external keyboard with Command-Space will now open on top of the app(s) you’re currently using without exiting back to the Home screen. When in Split View, this can be used as a quicker app switcher for the primary app on the left side.\nSpotlight now opens modally on top of the apps you’re using.\nIt’s nice to use a Spotlight that behaves more like the Mac. Unfortunately, apps (including Apple’s Messages, Mail, and Notes) don’t restore cursor position after dismissing Spotlight. 
If you’re typing a message, open Spotlight, and then close it, you’ll have to tap the screen to focus the cursor on the last active app and continue typing.\nThere are more external keyboard enhancements that are steps in the right direction. A Home screen icon has been added to the Command-Tab app switcher, so you can return to the Home screen without having to use Command-H. And, Command-tilde (~) can now move backwards in the app switcher, like on macOS.79 Last, you can take a screenshot with Command-Shift-3, which will be saved in the Photos app.\nA Home screen shortcut has been added to the Command-Tab app switcher.\nI’d be remiss if I didn’t mention Playgrounds. Apple hasn’t brought a full Xcode suite to the iPad, but the more streamlined Playgrounds environment feels like a better solution to introduce a new generation of iOS users to programming. Playgrounds isn’t a built-in app (it’s available from the App Store), and it’s got some surprising innovations in terms of code interactions and in-app multitasking. It’s also more powerful than you imagine if you know your way around Swift and native iOS frameworks. We’ll have a separate story on Playgrounds later this week.\nOn Hold\nThe lack of deeper iPad improvements in iOS 10 amplifies problems Apple still hasn’t fixed.\nOn the 12.9-inch iPad Pro, the Home screen is still a wasteland of icons that don’t take advantage of the space offered to them. This year, the contrast is especially harsh given how iPhones with 3D Touch have received Home screen widgets in addition to quick actions.\nManaging multiple files at once across different apps is still a task that will test the endurance of the most patient users. The Open In menu, untouched in iOS 10, continues to be limited to moving one file at a time from one app to another. 
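That one-at-a-time limitation is baked into the API: the Open In menu is presented through UIDocumentInteractionController, which wraps exactly one file URL per instance. A sketch, assuming a hypothetical host view controller:

```swift
import UIKit

// The Open In menu is built on UIDocumentInteractionController, which is
// created around a single file URL: a structural reason the flow moves one
// document at a time.
final class OpenInPresenter {
    // Keep a strong reference; the menu is dismissed if the controller
    // deallocates while it's on screen.
    private var interaction: UIDocumentInteractionController?

    func presentOpenIn(for fileURL: URL, from host: UIViewController) {
        let controller = UIDocumentInteractionController(url: fileURL)
        interaction = controller
        // Present the Open In menu anchored to the host's view.
        _ = controller.presentOpenInMenu(from: host.view.bounds,
                                         in: host.view,
                                         animated: true)
    }
}
```

Batch operations would require either repeating this flow per file or an API that accepts multiple URLs, which iOS 10 doesn’t offer.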
The new ‘Add to iCloud Drive’ extension doesn’t help when even a basic task such as saving multiple attachments from an email message isn’t supported.\nMore importantly, it’s obvious that Split View could be so much more. Having the clipboard and extensions as the sole data sharing mechanisms between two apps feels too limited when iOS is clearly suited for a system drag & drop framework. And that’s not to mention the Slide Over app picker – unchanged from last year and in desperate need of a redesign.\nApple says that “there are great iPad features” in iOS 10, but that’s not accurate. There are great iOS features in this update, and, sure, they also work on the iPad, but the iPad-only changes are minor and sparse – with the sole exception of Safari. iOS 10 doesn’t share the same commitment to the iPad as iOS 9, when Apple was willing to reinvent the device’s most fundamental aspects. In many ways, this feels like a regression to the days of iOS being barely “optimized” for the iPad.\niOS 10 is by no means “bad” on the iPad; it’s just not particularly exciting or what the platform deserves right now. If Apple is planning their own tick-tock schedule for iOS releases going forward, the iPad’s tock had better be a good one.\n\n\nMore Extensions\nFollowing last year’s focus on iPad, built-in apps, and performance, iOS 10 marks Apple’s return to opening up the platform to developers with extensions. After Messages, Maps, and Siri, iOS 10 has a few more significant extensibility tricks up its sleeve.\nMarkup\nAfter its debut in Mail with iOS 9, Apple’s Preview-like annotation tool has graduated to a system extension for images and documents.\nUsing Markup in Photos.\nThe tools available in Markup haven’t changed. You can draw colored lines of varying thickness80, add magnification loupes, and place text annotations. 
Notably, Markup can be used in Photos as an editing extension; it doesn’t offer the advanced tools of Annotable, but it should be enough for most users.\nAdd to iCloud Drive\nFollowing iOS 9’s inconsistent use of an iCloud Drive extension (which was only available for attachments in Mail), iOS 10 makes “Add to iCloud Drive” a system-wide option that can be used anywhere, for any file.\n\nAdd to iCloud Drive is an action extension that copies a file passed to it into iCloud Drive. It works for individual files shared from apps as well as media from Photos.\nUnfortunately, the extension is hindered by questionable design decisions. When saving a file, the dialog box shows every folder and sub-folder in your iCloud Drive without a way to collapse them. There’s no quick way to open a specific destination: you’ll have to scroll a seemingly endless list of folders every time you want to save a file. There are no recent locations, no bookmarks, no search. No person who deals with documents on iOS would ever want to save them with an interface like this.\nI appreciate Apple making iCloud Drive a system extension, but its design is amateur hour. It makes me wonder if anyone at Apple has ever used iCloud Drive with more than a handful of folders. It’s such an obvious misstep, it almost looks like a joke.\nVoIP Apps and CallKit\nApple is granting third-party developers access to another part of the OS through extensions: telephony.\nFor years, VoIP apps for audio and video calling have been relegated to a second-class experience. 
Apple created an API six years ago to bless VoIP apps with background execution privileges, but without a framework to integrate calls with the rest of the system, apps still needed to maintain their own contact lists and use standard push notifications for incoming calls.\niOS’ old VoIP calling experience.\nIt was too easy to miss a call from apps like Skype or WhatsApp; accepting a call from a third-party app was also slow and confusing (why would you pick up a call from a banner alert?). Plus, developers couldn’t get access to functionalities such as blocked contacts, which remained exclusive to Apple’s Phone app.\nAll this is changing with CallKit, a framework that elevates third-party VoIP apps to a front-seat spot on iOS, allowing them to plug into advanced controls that have complemented Apple’s Phone and FaceTime services for years.\nThe CallKit framework permits an incoming call from a third-party VoIP app to take over everything else (including the Lock screen) with a full-screen view, just like Apple’s Phone and FaceTime apps. In a prime example of dogfooding, Apple itself has adopted CallKit in all of their telephony services.\nCallKit’s interface and behavior are consistent with Phone and FaceTime calls on iOS, with some differences. The calling UI is the same as Apple’s, with a label that describes which app the call is happening with, and the icon of the app replacing the dialer button. Tapping the icon takes users directly to the app for additional features. Developers can customize the in-call UI with a camera icon that indicates whether an app supports video calling or not.\nLike Phone and FaceTime, CallKit boosts the priority of third-party VoIP apps. 
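In code, the Lock screen takeover comes down to reporting the call to a CXProvider. A minimal sketch (the app name and caller handle are invented; a real app would also set a CXProviderDelegate to handle answer and end actions):

```swift
import CallKit

// Sketch of reporting an incoming VoIP call through CallKit.
// "Acme Chat" is a hypothetical app name shown in the system call UI.
let config = CXProviderConfiguration(localizedName: "Acme Chat")
config.supportsVideo = true

let provider = CXProvider(configuration: config)

func reportIncomingCall(from caller: String, uuid: UUID) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: caller)
    update.hasVideo = true
    // This call is what triggers the system's full-screen
    // (or Lock screen) incoming-call interface.
    provider.reportNewIncomingCall(with: uuid, update: update) { error in
        if let error = error {
            // The system can refuse, e.g. for a blocked number
            // or while Do Not Disturb is active.
            print("Call not reported: \(error)")
        }
    }
}
```

Note that the error path is where the shared block list and Do Not Disturb integration surface: the system, not the app, decides whether the call is allowed to ring.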
Other apps can’t interrupt a call during a CallKit session; routing for Accessibility features, CarPlay, and Bluetooth connections is handled by the system automatically without developers having to optimize for them.\nA demo CallKit app on iOS 10.\nCallKit’s integration with iOS’ calling infrastructure goes beyond a shared UI. VoIP apps built with CallKit get access to the same block list and Do Not Disturb settings used by Apple’s apps, they can support switching between multiple calls, and they can even appear in Contacts via the Recents and Favorites views.\nApple doesn’t seem to be religious about pushing users to FaceTime anymore. If iOS 10 sees that the same contact is also registered with other VoIP services, buttons to initiate calls through third-party apps will be embedded in the contact card.81 Users only need to give an app permission to be used as a Service Provider, and it’ll be promoted to a first-class calling experience by iOS 10.82\nApple’s embrace of third-party services with CallKit isn’t an admission of defeat. Rather, it’s a recognition of the fact that millions of people use iPhones to communicate with their friends and families through apps that aren’t FaceTime – that the App Store has reinvented communications beyond FaceTime and iMessage.\nAs a platform owner, Apple understands that they have to help customers who are seeking alternative calling services. With CallKit, they’ve created a secure and consistent framework that takes advantage of every feature that makes an iPhone the ultimate communication device.\nUsing VoIP apps through CallKit feels and works like any other phone call. It’s refreshing to see this happen, and it’s a testament to the power of Apple’s extensibility APIs. I’m looking forward to seeing WhatsApp, Skype, and others update their apps for CallKit.\nCall Directory\nCall Directory is a surprising inclusion in the CallKit framework. 
With this extension type, apps can label phone numbers for incoming calls on the Lock screen.\nApple described the use case for call directory extensions at WWDC: spam calls. According to the company, robo-callers and spam calls are particularly problematic in China (though I can vouch for their annoyance in Italy, too), and they’ve set out to address this problem by letting developers maintain a database of phone numbers known to be spam.\nCraig will tolerate no spam.\nIn Apple’s examples, a company like Tencent could build a call directory extension. When a call from a spam number comes in, the extension could add a label that identifies it as potential spam so the user can decide to reject the call without answering it.\nCall Directory is another instance of Apple letting developers take over key bits of iOS in areas where the company doesn’t want to be involved.\n\n\nEverything Else\nWith the breadth and depth of iOS, it’s impossible to list every single change or new feature. Whether it’s a setting, a briefly documented API, or a subtle visual update, there are plenty of details and tidbits in iOS 10.\nDifferential Privacy\nDifferential privacy is a branch of cryptography and mathematics, so I want to leave a proper discussion of Apple’s application of it to folks who are better equipped to talk about it (Apple is supposed to publish a paper on the subject in the near future). See this great explanation by Matthew Green and ‘The Algorithmic Foundations of Differential Privacy’ (PDF link), published by Cynthia Dwork and Aaron Roth.\nHere’s my attempt to offer a layman’s interpretation of differential privacy: it’s a way to collect user data at scale without personally identifying any individual. Differential privacy, used in conjunction with machine learning, can help software spot patterns and trends while also ensuring privacy with a system that goes beyond anonymization of users. It can’t be mathematically reversed.
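To make the idea concrete, the textbook mechanism behind differential privacy is randomized response. This sketch illustrates the general principle only – it is not Apple's actual implementation, which adds noise in more sophisticated ways:

```swift
import Foundation

// Randomized response: each user flips a coin before answering, so any
// individual report is plausibly deniable, yet the true rate can still
// be estimated from a large sample. An illustration of the principle,
// not Apple's implementation.
func randomizedResponse(truth: Bool) -> Bool {
    if arc4random_uniform(2) == 0 {
        return truth                      // heads: answer honestly
    }
    return arc4random_uniform(2) == 0     // tails: answer at random
}

// If p is the observed rate of affirmative reports across many users,
// the estimated true rate is 2p - 0.5: the injected noise cancels out
// in aggregate, while no single report reveals anything for certain.
func estimatedTrueRate(reports: [Bool]) -> Double {
    let p = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * p - 0.5
}
```

The trade-off is the same in any variant of the technique: more noise per user means more privacy, but also more samples needed before a usable signal emerges.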
iOS 10 uses differential privacy in specific ways; ideally, the goal is to apply this technique to more data-based features to make iOS smarter.\nFrom Apple’s explanation of differential privacy:\n\n Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.\n\nIf Apple’s approach works, iOS will be able to offer more intelligent suggestions at scale without storing identifiable information for individual users. Differential privacy has the potential to give Apple a unique edge on services and data collection. Let’s wait and see how it’ll play out.\nSpeech recognition\niOS has offered transcription of spoken commands with a dictation button in the keyboard since the iPhone 4S and iOS 5. According to Apple, a third of all dictation requests comes from apps, with over 65,000 apps using dictation services per day for the 50 languages and dialects iOS supports.\niOS 10 introduces a new API for continuous speech recognition that enables developers to build apps that can recognize human speech and transcribe it to text. The speech recognition API has been designed for those times when apps don’t want to present a keyboard to start dictation, giving developers more control.\nSpeech recognition uses the same underlying technology of Siri and dictation. Unlike dictation in the keyboard, though, speech recognition also works for recorded audio files stored locally in addition to live audio. 
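Feeding a locally stored recording to the new Speech framework takes only a few lines. A sketch, assuming an existing audio file URL (the app must also declare the speech recognition usage key in its Info.plist):

```swift
import Speech

// A sketch: transcribing a recorded audio file with the Speech framework.
// Requires NSSpeechRecognitionUsageDescription in the app's Info.plist.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        recognizer.recognitionTask(with: request) { result, _ in
            guard let result = result, result.isFinal else { return }
            print(result.bestTranscription.formattedString)
            // Each segment carries its own confidence and timing data.
            for segment in result.bestTranscription.segments {
                print(segment.substring, segment.confidence, segment.timestamp)
            }
        }
    }
}
```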
After feeding audio to the API, developers are given rich transcriptions that include alternative interpretations, confidence levels, and timing details. None of this is exposed through the microphone button in the keyboard, but it can be implemented natively in an app’s UI.\nSame API, different interfaces.\nThere are some limitations to keep in mind. Speech recognition is free, but not unlimited. There’s a limit of 1 minute for audio recordings (roughly the same as dictation), with per-device and per-day recognition limits that may result in throttling. Also, speech recognition usually requires an Internet connection. On newer devices (including the iPhone 6s), speech recognition is supported offline, too. User permission will always be required to enable speech recognition and allow apps to transcribe audio. Apple itself is likely using the API in their new voicemail transcription feature available in the Phone app.\nVoicemail transcription in iOS 10, possibly using speech recognition as well.\nI was able to test speech recognition with new versions of Drafts and Just Press Record for iOS 10. In Drafts, my iPhone 6s supported offline speech recognition and transcription was nearly instantaneous – words appeared on screen a fraction of a second after I spoke them. Greg Pierce has built a custom UI for audio transcription inside the app; other developers will be able to design their own and implement the API as they see fit.
In Just Press Record, transcripts aren’t displayed in real-time as you speak – they’re generated after an audio file has been saved, and they are embedded in the audio player UI.\nI’m looking forward to podcast clients that will let me share an automatically generated quote from an episode I’m listening to.\nDo Not Disturb gets smarter\nDo Not Disturb has a setting to always allow phone calls from everyone while every other notification is being muted.\n\nAn Emergency Bypass toggle has been added to a contact’s editing screen for Ringtone and Text Tone. When enabled, it’ll allow sounds and vibrations from that person even when Do Not Disturb is on. If you enable Emergency Bypass, it’ll be listed as a blue button in the contact card to quickly edit it again.\nTap and hold links for share sheet\nApple is taking a page from Airmail (as I hoped) to let you tap & hold a link and share it with extensions – a much-needed time saver.\nParked car\nI couldn’t test this because I don’t have a car with a Bluetooth system (yet), but iOS 10 adds a proactive Maps feature that saves the location of your car as soon as it’s parked. iOS sends you a notification after you disconnect from your car’s Bluetooth, dropping a special pin in Maps to remind you where you parked. The feature also works with CarPlay systems.\nSpotlight search continuation\nSearches for app content that began in Spotlight can now continue inside an app with the tap of a button.\n\nDrafts uses the new Spotlight search continuation API to let users continue looking for content on the app’s own search page. Maps has also implemented search continuation to load places in the app.\nBetter clipboard detection\niOS 10 brings a more efficient way for apps to query the system pasteboard. 
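In code, the new query boils down to checking a type-inspection property before touching the pasteboard's contents. A sketch, where the import helper is hypothetical:

```swift
import UIKit

// New in iOS 10: ask what kind of data is on the pasteboard
// without actually reading it.
let pasteboard = UIPasteboard.general
if pasteboard.hasStrings {
    // Only now does the app read the clipboard's contents.
    if let text = pasteboard.string {
        offerToImport(text) // hypothetical helper in the host app
    }
}
// hasURLs, hasImages, and hasColors cover the other common types.
```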
Instead of reading clipboard data, developers can now check whether specific data types are stored in the pasteboard without actually reading them.\nFor example, a text editor can ask iOS 10 if the clipboard contains text before offering to import a text clipping; if it doesn’t, the app can stop the task before reading the pasteboard altogether. This API should help make clipboard data detection more accurate for a lot of apps, and it’s more respectful of a user’s privacy.\nPrint to PDF anywhere\nA hidden feature of iOS 9 was the ability to 3D Touch on the print preview screen to pop into the PDF version of a document and export it. iOS 10 makes this available to every device (with and without 3D Touch) by pinching on the print preview to open Quick Look.\nVideos cellular playback quality settings\nIf you use Apple’s Videos app to stream movies and TV shows, you can now choose from Good and Best Available settings. I wish this also affected playback quality of YouTube embeds in Safari.\nHLS and fragmented MP4 files\nApple’s HTTP Live Streaming framework (HLS) has added support for fragmented MP4 files. In practical terms, this means more flexibility for developers of video player apps that want to stream movie files encoded in MPEG-4.\nI tested a version of ProTube – the most powerful third-party YouTube client – with HLS optimizations for iOS 10. The upcoming update to ProTube will introduce streaming of videos up to 4K resolution (including 1440p) and 60fps playback thanks to changes in the HLS API.\n\nIf your favorite video apps use HLS and deal with MP4 files, expect to see some nice changes in iOS 10.\nTouch ID for Apple ID settings\nSettings > iTunes & App Store > View Apple ID no longer requires you to type a password. You can view and manage your account with Touch ID authentication. 
This one deserves a finally.\nNo more App Store password prompts after rebooting\nIn a similar vein, the App Store will no longer ask you for a password to download a new app after rebooting your device. You can just use Touch ID instead.\nContinuity Keyboard for Apple TV\nIf your iPhone is paired with an Apple TV, you’ll get a notification whenever the Apple TV brings up a text field (such as search on the tvOS App Store).\n\nYou can press (or swipe down) the notification on iOS to start typing in the quick reply box and send text directly to tvOS. A clever and effective way to reduce tvOS keyboard-induced stress.\nApp Store categories, iPad, and search ads\nThe App Store’s Explore section, launched with iOS 8 and mostly untouched since, has been discontinued in iOS 10. Categories are back in the tab bar, with the most popular ones (you can count on Games always being there) available as shortcuts at the top.\n\nApple had to sacrifice the Nearby view to discover apps popular around you, but categories (with curated sections for each one of them) seem like the most popular choice after years of experiments.\nOn the iPad, the App Store now supports Split View so you can browse and search apps while working in another app.\n\nThis has saved me a few minutes every week when preparing the App Debuts section for MacStories Weekly.\nApple is also launching paid search ads on the App Store. Developers will be able to bid for certain keywords and buy paid placements in search results. Ads are highlighted with a subtle blue background and an ‘Ad’ label, and they’re listed before the first actual search result – like on Google search.\nApp Store ads. I’m not sure about these.\nIt’s too early to tell how beneficial App Store ads will be for smaller studios and indie developers that can’t afford to be big spenders in search ad bids. 
Apple argues that the system is aimed at helping app discovery for large companies and small development shops alike, but I have some reservations.\nAs a user, I would have liked to see Apple focus on more practical improvements to App Store search, but maybe the company is right and all kinds of developers will benefit from search ads. We’ll follow up on this.\nNew ‘Add to Favorites’ UI\n\nSimilar to the 3D Touch menu for a contact card, the view for adding a contact to your favorites has been redesigned with icons and expandable menus.\nMore responsive collection views\nExpect to see nice performance improvements in apps that use UICollectionView. iOS 10 introduces a new cell lifecycle that pre-fetches cells before displaying them to the user, holding onto them a little longer (pre-fetching is opt-out and automatically disabled when the user scrolls very fast). In daily usage, you should notice that some apps feel more responsive and don’t drop frames while scrolling anymore.\nSecurity recommendations for Wi-Fi networks, connection status\nIf you connect to a public Wi-Fi network (such as a restaurant hotspot), iOS 10 will show you recommendations to stay secure and keep your wireless traffic safe. There’s also better detection of poor connectivity with an orange “No Internet Connection” message in the Wi-Fi settings.\nAccessibility: Magnifier and Color Filters\nThere are dozens of Accessibility features added to iOS every year. I want to highlight three of them.\nA new Magnifier app allows you to use the iPhone’s camera to magnify what’s around you and zoom into objects or text. The Magnifier isn’t another Apple app on the Home screen: if enabled in the Settings, a triple-click on the Home button will launch Magnifier as a custom app (it even shows up in the multitasking switcher) with options to control zoom level, apply color filters, invert colors, and turn on the camera flash.
You can opt to adjust brightness and contrast automatically based on ambient light.\niOS 10’s new Magnifier app.\nWhile in Magnifier, you can move the camera around and apply filters in real-time. If you don’t want to hold up your iPhone for more than a few seconds, you can capture a still frame to zoom into the image and adjust colors.\n\niOS 10’s Magnifier is technically impressive and it’s going to help millions of people with vision impairments. I’d suggest that everyone keep it enabled as a quick way to use the iPhone’s camera as a magnifier – it’s incredibly well done and convenient.\nUnder Display accommodations, a Color Filters menu can help users who are color blind or who have difficulty reading text on the display. Apple has included filters for grayscale, protanopia, deuteranopia, tritanopia, and color tint. It’s also a good reminder for developers that not all users see an app’s interface the same way.\n\nFinally, you can now define custom pronunciations to be used when iOS reads text aloud. Available in Settings > Accessibility > Speech > Pronunciations, you’ll be able to type a phrase and dictate or spell how you want it to be pronounced by the system voice.\n\nDictating a pronunciation is remarkable, as iOS automatically inserts it with the phonetic alphabet after recognizing your voice. You can then choose to apply a custom pronunciation to selected languages, ignore case, and pick which apps need to support it.\n\n\n10\niOS 10 is characterized by an intrinsic duality: an acknowledgement of the platform’s maturity, and a relentless, yet disciplined pursuit of what’s next. Both depend on each other, and they’re the lens through which iOS 10 is best explained.\nThe iMessage App Store, SiriKit, rich notifications, CallKit, and Maps extensions are a display of Apple’s willingness to let apps be more than disconnected silos.
iOS 10 is continuing what iOS 8 started: third-party apps are becoming system features.\nIt’s not just a matter of nurturing developer goodwill: the App Store ecosystem can be leveraged to increase the functionality of iOS, building features that appeal to how people want to use their iPhones and iPads. For Apple, such effort is a nod to the App Store’s strengths and progress. For developers and users, it means apps can have ramifications in the most important parts of iOS.\nAt the same time, allowing apps to reach further into iOS shows how the concept of “app” itself is evolving.\nWhen different features of an app can be experienced throughout the system, the app becomes more of a collection of services, broken into atomic units. They’re pervasive. Providing apps with more extensibility hooks results in moving more interactions away from the traditional app experience and into single-purpose mini interfaces. Whether it’s an interactive notification, a widget, an iMessage app, or a SiriKit extension, iOS 10 has a clear vision of apps as contextual helpers in addition to being standalone utilities. It’s only reasonable to expect Apple to follow this path going forward.\nSigns of maturity include fixing what isn’t working, too. The redesigned Apple Music makes the case for a simplified streaming interface that addresses what many found confusing in its debut release. The pagination of Control Center is a welcome enhancement to its capabilities as much as it’s an admission of its original complexity. I’d argue that letting users remove Apple apps falls under the same category.\nAlas, not every glaring problem has been remedied by iOS 10. File management continues to feel like a chore due to cumbersome document providers, and Apple managed to ship an incomprehensible iCloud Drive extension that doesn’t help at all. Mail is lagging behind a competition that is shipping useful integrations and modernized email features. 
The Slide Over app picker – one of the worst design decisions of iOS 9 – is still with us.\nThe most disappointing aspect of iOS 10, in fact, is the treatment the iPad received, with uninspired adaptations of iPhone UIs and a lack of attention that’s in stark contrast with last year. In iOS 10, the iPad feels like a second-class citizen again, left in the backseat, waiting for resources to be devoted to it. Perhaps all this will be resolved as Apple’s plans for the iPad are revealed, but we can’t know yet. Today, iOS 10 isn’t the big milestone for iPad users that iOS 9 was.\nAn acceptance of iOS’ grown-up status – and the responsibility that comes with it – isn’t the sole driver of its advancements. iOS 10 demonstrates how, at a fundamental level, change is the only constant in Apple’s software. Ironically, the company’s approach to change is what hasn’t changed at all: it’s iterative, divisive, farsighted, often surprising, and, frankly, never boring.\nLooking at iOS 10’s features in isolation, we can spot every shade of change that has steered Apple so far. The need to make iMessage a platform and rethink Control Center. The patient expansion of the extensibility framework, done gradually – some might say too slowly – to ensure good performance and security. The first steps towards AI as a feature of our devices, built in a unique Apple way around privacy and laying the groundwork for the future.\nBut these changes are more than discrete improvements. They’re no islands. As the tenth anniversary of the iPhone and its software draws closer, it’s time we take a holistic view of what iOS has become. iOS’ changes are simply a reflection of our own changes – whether it’s how much time we spend messaging with friends, how many pictures we take, the sensors we put in our homes, or the music we listen to.
The memories we cherish, our conversations, the songs we listen to.\nApple understands that, beyond technology, to improve iOS is to realize how much our lifestyles have changed. How software, after all, is nothing but an extension of ourselves. From that perspective, iOS is never quite finished – it can only be relevant.\nAnd even at its tenth version, iOS is still forging ahead.\n\nCredits\nThis review wouldn’t have been possible without the help, feedback, and existence of the following people, animals, beverages, and pieces of software:\nMy girlfriend Silvia, for her patience, love, and design skills\nMy two dogs, who are adorable\nAlessandro Vendruscolo, who squashed many bugs and brought this web layout to life\nJohn Voorhees\nGraham Spencer\nBrett Terpstra\nMyke Hurley\nStephen Hackett\nFrank Towers, who created 1-2-3 Trip Planner (don’t tell developer Myke Hurley, though)\nCGP Grey\nJeremy Burge\n_David Smith\nCasey Liss\nJohn Gruber\nDiego Petrucci\nWorkflow, Pythonista, Scrivener, iThoughts, and Editorial – essential apps that helped me create this story\nSketch and Meng To’s Angle Mockups\nEvery app developer who sent me betas\nEvery engineer at Apple who always makes reviewing iOS each summer fun\n@TiccisEspresso, for a daily dose of energy\nEvery Club MacStories member\nAnd finally, every MacStories reader, for allowing me to do what I love. Thank you.\n\n\nFor instance, deleting a Mail message from a notification on the Lock screen requires the user to authenticate with Touch ID or passcode. ↩︎\n\n\n\nIt has a thicker font in iOS 10, a subtle but noticeable change from iOS 9. ↩︎\n\n\n\nA detail you can't miss: swiping up on the Search screen will make the clock move to the status bar, next to the padlock. Simple and tasteful. ↩︎\n\n\n\nThere is one technical aspect I wish Apple handled differently.
There's no way for apps to request a temporary exception to programmatically expand a widget when it would be appropriate, collapsing it again when a task is finished. As an example, consider the Workflow widget and running a workflow from compact mode. If the workflow needs to display a longer list of items while it's executing, it won't be possible for the app to ask iOS to temporarily expand the widget until the workflow is complete, reverting it back to compact in the end. The user will have to tap a workflow, notice that the list is being cut off by compact mode, and manually toggle expanded mode. As of iOS 10.0, compact and expanded modes aren't dynamic, and I think Apple could add some flexibility to the API without taking control away from the user. ↩︎\n\n\n\nOn iPads, older iPhones, and if you have 3D Touch disabled in Settings, notification banners have a \"handle\" to suggest you can drag them downwards to expand them. ↩︎\n\n\n\nIn iOS 10, an action can also invoke the system keyboard to type a response. ↩︎\n\n\n\nExcept Messages notifications (where the conversation transcript can be scrolled) or the play button of videos embedded in notifications. ↩︎\n\n\n\nSome actions, such as replying with media or opening an iMessage app, require the notification to launch the Messages app. In the API, apps can specify which actions can be performed from a notification, and which ones need to be managed from the main app. ↩︎\n\n\n\nRotation Lock is the only toggle that doesn't carry the color of the screen in Settings where it belongs, but it looks nice and stands out nevertheless. ↩︎\n\n\n\nIf no audio is playing, Control Center shows a button for the last app that played audio. ↩︎\n\n\n\nInterestingly, Apple isn't using 3D Touch to expand accessories into detail views.
A long tap on a button triggers haptic feedback and pops into a view, but you can't apply multiple levels of force to watch the button expand and shrink (like you can in the first page of Control Center). It's not the only case of Apple coupling haptic feedback (once exclusive to 3D Touch) with long taps in iOS 10, though. ↩︎\n\n\n\nOur original iMessage review mentions BBM as a competing service. That's a long time ago. ↩︎\n\n\n\nYou can even hold a photo, drag it around, and drop it in a conversation to send it. ↩︎\n\n\n\nIn which case, the link preview will be smaller and only display the domain name. ↩︎\n\n\n\nIt's kind of harsh on the iPad as lasers don't affect the message list, which remains white. ↩︎\n\n\n\nThe only way I've found to disable screen effects is to enable Reduce Motion, which unfortunately deactivates other animations throughout the system (including bubble effects). When Reduce Motion is turned on and someone iMessages you with an effect, you'll get a separate text message saying \"Sent with [effect name]\". My friend Stephen does this ironically with fake effect names sometimes. ↩︎\n\n\n\nYou can also tap with two fingers to send kisses, which weren't available on watchOS before. ↩︎\n\n\n\nPerhaps even open up selfie effects to developers? ↩︎\n\n\n\nA detail that I love: look closely at how ink spreads out on the \"page\" once it's absorbed. Realistic. ↩︎\n\n\n\nYou can also tap & hold a message to show Tapback plus Copy and More buttons. ↩︎\n\n\n\nWhich is a nice use of a private API by Apple (the same is true when deleting recent handwritten messages in handwriting mode). Third-party apps can't override the standard behavior of the Home button, which always exits an app when clicked once. There's another Apple precedent for this: clicking the Home button while configuring Touch ID won't exit Settings, but it'll show you a message instead. ↩︎\n\n\n\nA long tap on the apps icon next to the input field would be a nicer way to open the grid. 
↩︎\n\n\n\nIn testing over 50 iMessage apps, I also ran into performance issues with the iMessage app drawer dropping frames when swiping between pages and other visual glitches. I don't think installing a lot of iMessage apps will be an edge case given their novelty factor. ↩︎\n\n\n\nA simple way to prove this: real emoji can coexist with text in the same string because emoji are Unicode characters. You can't send text and a KIMOJI in the same message on iOS because text and images are two separate entities. ↩︎\n\n\n\nThey did. ↩︎\n\n\n\nI'm sorry, Jeremy. ↩︎\n\n\n\nIf someone sends you a sticker from a pack you don't have installed, Messages will show a 'From' button underneath it to take you to the iMessage App Store. This should help discovery of sticker packs as they propagate across users. ↩︎\n\n\n\nNote: deleted stickers do not sync across devices with iCloud. ↩︎\n\n\n\nStickers can be static images or animated illustrations: iOS 10 supports PNG, APNG, JPEG, and GIF. Stickers can be displayed at three sizes (the default being medium at 136x136 points), they can't be smaller than 100x100 points, and they have a maximum file size of 500 KB. These options should give developers plenty of room for experimentation. ↩︎\n\n\n\nDevelopers can also include a sticker pack extension inside an existing iOS app. For example, KIMOJI could continue to ship their standalone app and offer both a custom keyboard and an iMessage sticker pack as separate extensions inside it. I'd expect apps that already offer custom \"emoji\" keyboards to go down this route. ↩︎\n\n\n\nI'm curious to see how Apple will handle the inevitable copyright claims for sticker packs featuring popular characters. ↩︎\n\n\n\nIt's already catching on. ↩︎\n\n\n\nSticker lock-in. It's a thing. 
↩︎\n\n\n\nIn our case, 1-2-3 Trip Planner wouldn't know about anyone's name – it'd only see their identifiers and the interactive message from the current session.Identifiers are unique to each user's device and they are scoped to the iMessage app currently in use; if John removes 1-2-3 Trip Planner from his device and reinstalls it, the app will attribute a different set of identifiers to each participant. Apps can store these identifiers and Messages will match them to local contact names – that's how 1-2-3 Trip Planner can use labels such as \"Stephen's Available Times\". The \"Stephen\" part is a decoded identifier. ↩︎\n\n\n\nIn practice, the use of local identifiers and the fact that apps don't see the contents of individual messages but only a representation of objects in a session could hinder the feasibility of collaborative apps. We'll have to see if Apple's privacy-conscious approach will allow developers to program collaborative environments spread across multiple devices and instances of the same conversation. ↩︎\n\n\n\nIt reminds me of when I was in high school and didn't pay attention in my physics class, playing tic-tac-toe with a friend on my notebook. It's a time-filler. ↩︎\n\n\n\nThere's actually an eighth domain – restaurant reservations – but it requires official support from Apple. It also works with Maps in addition to Siri. Intents for restaurant reservations include the ability to check available times, book a table, and get information for reservations and guests. Apple has set up a webpage to apply for inclusion here. ↩︎\n\n\n\nWhich is used to continue an intent inside an app if it can't be displayed with a Siri snippet, such as playing a slideshow in a photo app. ↩︎\n\n\n\nIf split view is active and you initiate drag & drop, a tab won't expand to the preview as you cross over the separator in the middle of the screen. 
You can, however, slide it horizontally and watch as existing tabs move on the X-axis to show you where the new tab will be placed. This also works to rearrange tabs when split view isn't active. ↩︎\n\n\n\nThe only action that takes over the other side is the share sheet, which can only be active in one Safari view at a time. ↩︎\n\n\n\nSuch a system would have to provide a unified UI for moving content across apps in Split View, consistently offer feedback to the user, and handle conversion of data formats between apps (say, dragging rich text into a plain text editor or a photo into Safari's address bar). It's probably nothing that Apple hasn't already figured out at least since the days of Drag Manager on System 7. ↩︎\n\n\n\nI came across articles where Reader couldn't fetch the author's name, for example. ↩︎\n\n\n\nThere's still no download manager like on macOS. ↩︎\n\n\n\nI'll refer to the app as Apple Music, even if it can be used without the streaming service, because I've been streaming music for years and I no longer have a local music library to manage. ↩︎\n\n\n\nThe only place where translucency lives on, given how content scrolls under it. ↩︎\n\n\n\nUnfortunately, Apple removed the progress bar from the bottom widget, which makes it harder to see your position in a song at a glance (you have to either use Control Center or open Now Playing). ↩︎\n\n\n\nI'm nitpicking, but I've found some mistakes in Apple's lyrics. Nothing major – things like inverted prepositions or abbreviated verbs that shouldn't have been – but worth mentioning. I noticed it about 10 times out of hundreds of songs I tried. It's probably something Apple can't fix because they're licensing lyrics (I'd love to know from whom). ↩︎\n\n\n\nWhen recording an iOS device's screen with QuickTime on macOS, the menu says \"System Capture\". ↩︎\n\n\n\nThe user retains the ability to preview an Uber car arriving at their location on the map and make a payment with Apple Pay.
↩︎\n\n\n\nProps to Apple for including cartoonish versions of the party and heart emoji. ↩︎\n\n\n\nNew in iOS 10, this allows you to group multiple accessories together as if they were a single accessory. ↩︎\n\n\n\nIf accessories require an additional wireless bridge, such as Philips' Hue lights, you'll be able to quickly open bridge settings, assign it to a room, and exclude it from favorites because it has no user-facing features of its own. ↩︎\n\n\n\nDepending on the speed of your Internet connection, iOS might need a few seconds to ping a remote HomeKit hub. ↩︎\n\n\n\nI've noticed that Wikipedia results aren't always suggested, even for topics that are available on Wikipedia. My understanding is that Look Up (and other search suggestions on iOS) attempt to find the most popular/relevant result for the current query. For instance, Look Up will suggest an artist, but not always an artist's album or single. ↩︎\n\n\n\nThese sources would have to be sanctioned by Apple. ↩︎\n\n\n\nOddly enough, Family Sharing hasn't been baked into Notes collaboration at all. ↩︎\n\n\n\nThere's a yellow badge in the note's list to tell you that a shared note has been modified since you last opened it. ↩︎\n\n\n\nShared notes can't be locked with a passcode or Touch ID. ↩︎\n\n\n\nThe summer's really not a good time for me to test new Activity functionalities. ↩︎\n\n\n\nWhich isn't the case anymore with the iPhone 7. ↩︎\n\n\n\nThe Game Center app, on the other hand, is gone for good. With iOS 10, Game Center is only a framework apps can use – you'll see Game Center appear inside games as a system feature. I wouldn't know how to explain Apple's decision other than that they never really paid much attention to Game Center and its many technical woes. ↩︎\n\n\n\nIf you try to remove the Apple Watch app and an Apple Watch is currently paired to your iPhone, iOS will tell you to unpair the Watch first.
↩︎\n\n\n\nApple has already confirmed that, due to iOS' security model, you won't be able to update individual apps through the App Store, as some claimed earlier this year. Updates to system apps will be bundled with OS updates, as it's always been. ↩︎\n\n\n\nIn addition to unprocessed RAW capture, iOS 10 supports simultaneous delivery of RAW and processed images in JPEG. ↩︎\n\n\n\nThe duration and timing of a Live Photo can't be edited by apps – drastic modifications to the nature of a moment aren't what Apple wants third-party apps to do. ↩︎\n\n\n\nFrom what I've been able to try, any Apple Music track downloaded for offline listening should be supported in Memories. ↩︎\n\n\n\nYou can turn this off in the grid view to show every item assigned to the memory by toggling Show All/Summary. ↩︎\n\n\n\nI do wish Apple's Photos could proactively generate animations and collages like Google does, but it's nothing that can't be added in the future. ↩︎\n\n\n\nYou can also 3D Touch a face and swipe on the peek to favorite/unfavorite or hide it from the People album. ↩︎\n\n\n\nIf you'd rather not use the Photos app to browse faces, you can ask Siri to \"show me photos of [person]\", which will open search results in Photos. These are the same results you'd get by typing in Photos' search field and choosing a person-type result. ↩︎\n\n\n\nWe need to go deeper. ↩︎\n\n\n\nAccording to Apple, Memories, Related, People, and Scene search are not supported on 32-bit devices – older iPhones and iPads that don't meet the hardware requirements for image indexing. ↩︎\n\n\n\nScrolling date pickers have received a subtle new sound effect, too. ↩︎\n\n\n\nUnder the hood, it's an Apple Maps URL with your coordinates. ↩︎\n\n\n\nIn the future, Apple could allow apps to mark activity types such as \"topic\", \"song\", \"actor\", \"movie\", and more to let Siri look up content displayed on screen.
↩︎\n\n\n\nMy interpretation is that iOS wants to make sure it'll suggest emoji in an app where you typically use them. I guess you wouldn't want emoji suggestions in your bank's iPhone app. ↩︎\n\n\n\nWhat's interesting about emoji suggestions is that Apple isn't only relying on Unicode names and annotations. They're maintaining their own list of definitions and expressions, which is likely the product of years of refinement. I'd love to see a full list of emoji trigger words and check how frequently it'll be updated. ↩︎\n\n\n\nApple is advising developers to position the globe key in the same spot as the default keyboard. They've also noticed that developers include a button to manage settings directly in the keyboard, and they're suggesting putting it where the system dictation key would be. I expect every custom keyboard to be updated with revised layouts after iOS 10. ↩︎\n\n\n\nIf you're European and don't have a tilde character on your keyboard, try it on one of the original, American-based Smart Keyboards for the iPad Pro. ↩︎\n\n\n\nWhen drawing in Markup, you can press (3D Touch) on the screen for thicker lines, though there's no haptic feedback to accompany the increase in pressure. ↩︎\n\n\n\nBased on the same Intents framework used by SiriKit. ↩︎\n\n\n\nApps that have requested permission will be displayed under Settings -> Phone. 
↩︎", "date_published": "2016-09-13T10:30:17-04:00", "date_modified": "2018-03-20T13:18:00-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iOS 10", "iOS Reviews", "stories" ], "summary": "Sometimes, change is unexpected. More often than not, change sneaks in until it feels grand and inevitable. Gradually, and then suddenly. iOS users have lived through numerous tides of such changes over the past three years." }, { "id": "https://www.macstories.net/?p=40226", "url": "https://www.macstories.net/stories/ios-9-review/", "title": "iOS 9: The MacStories Review, Created on iPad", "content_html": "With iOS entering the last stage of its single-digit version history, it’s time to wonder if Apple wants to plant new seeds or sit back, maintain, and reap the fruits of the work done so far.
\nLast year, I welcomed iOS 8 as a necessary evolution to enable basic communication between apps under the user’s control. With extensions based on a more powerful share sheet, document providers, widgets, and custom keyboards, I noted that iOS had begun to open up; slowing down wasn’t an option anymore.
\nIn hindsight, many of the announcements from last year’s WWDC were unambiguous indicators of a different Apple, aware of its position of power in the tech industry and willing to explore new horizons for its mobile operating system and what made it possible.
\nFollowing the troubled launch of iOS 6 and subsequent rethinking of iOS 7, Apple found itself caught in the tension between a (larger) user base who appreciated iOS for its simplicity and another portion of users who had elected iPhones and iPads as their primary computers. Alongside this peculiar combination, the tech industry as a whole had seen the smartphone graduate from part of the digital hub to being the hub itself, with implications for the connected home, personal health monitoring, videogames, and other ecosystems built on top of the smartphone.
\nWWDC 2014 marked the beginning of a massive undertaking to expand iOS beyond app icons. With Extensibility, HealthKit, HomeKit, Metal, and Swift, Tim Cook’s Apple drew a line in the sand in June 2014, introducing a new foundation where no preconception was sacred anymore.
\niOS’ newfound youth, however, came with its fair share of growing pains.
\nWhile power users could – at last – employ apps as extensions available anywhere, the system was criticized for its unreliability, poor performance, sparse adoption, and general lack of discoverability for most users. The Health app – one of the future pillars of the company’s Watch initiative – went through a chaotic launch that caused apps to be pulled from the App Store and user data to be lost. The tabula rasa of iOS 7 and the hundreds of developer APIs in iOS 8 had resulted in an unprecedented number of bugs and glitches, leading many to call out Apple’s diminished attention to software quality. And that’s not to mention the fact that new features often made for hefty upgrades, which millions of customers couldn’t perform due to storage size issues.
\nBut change marches on, and iOS 8 was no exception. In spite of its problematic debut, iOS 8 managed to reinvent how I could work from my iPhone and iPad, allowing me – and many others – to eschew the physical limitations of desktop computers and embrace mobile, portable workflows that weren’t possible before. The past 12 months have seen Apple judiciously fix, optimize, and improve several of iOS 8’s initial missteps.
\nEight years1 into iOS, Apple is facing a tall task with the ninth version of its mobile OS. After the changes of iOS 7 and iOS 8 and a year before iOS 10, what role does iOS 9 play?
\nIn many cultures, the number “10” evokes a sense of growth and accomplishment, a complete circle that starts anew, both similar and different from what came before. In Apple’s case, the company has a soft spot for the number 10: Mac OS was reborn under the X banner, and it gained a second life once another 10 was in sight.
\nWhat happens before a dramatic change is particularly interesting to observe. With the major milestone of iOS 10 on track for next year, what does iOS 9 say about Apple’s relationship with its mobile OS today?
\nAfter two years of visual and functional changes, is iOS 9 a calm moment of introspection or a hazardous leap toward new technologies?
\nCan it be both?
\nAn eBook version of this review is available to Club MacStories members for free as part of their subscription. A Club MacStories membership costs $5/month or $50/year and it contains some great additional perks.
\nYou can subscribe here.
\n(Note: If you only care about the eBook, you can subscribe and immediately turn off auto-renewal in your member profile. I’d love for you to try out Club MacStories for at least a month, though.)
\nDownload the EPUB files from your Club MacStories profile.
If you’re a Club MacStories member, you will find a .zip download in the Downloads section of your profile, which can be accessed at macstories.memberful.com. The .zip archive contains two EPUB files – one optimized for iBooks (with footnote popovers), the other for most EPUB readers.
\nIf you spot a typo or any other issue in the eBook, feel free to get in touch at club@macstories.net.
\nFrom an aesthetic perspective, iOS 9 doesn’t drift away from the focus on clarity, content, and color that debuted in 2013. iOS 9’s design builds upon the new course of iOS 7, with some notable differences that epitomize Apple’s penchant for refinement this year.
\nAfter a short and public affair with Helvetica Neue, iOS 9 brings a new system font called San Francisco.
\nDesigned in-house by Apple and introduced at WWDC as a family of typefaces that is both “inconspicuous and beautiful”, San Francisco is a sans-serif typeface which unifies the typographic voice of iOS, OS X, and watchOS, bringing visual consistency with two distinct sub-families and new APIs for developers to enhance the textual presentation of their apps.
\nFrom an average user’s perspective, the change to San Francisco may not appear as a drastic modification to the iOS interface. To people who are not familiar with the intricacies and details of type design, San Francisco may look subtly different, but mostly in line with the neutral and utilitarian nature of Helvetica Neue. This isn’t meant as a slight to design experts and type connoisseurs, but a lot of people won’t notice the technical detail and years of work behind San Francisco.
\nFrom the look of the clock and date on the Lock screen to the time in the status bar and bold labels in title bars, San Francisco refreshes the textual appearance of iOS without clamoring for undivided attention. With San Francisco, iOS 9 doesn’t suddenly look like a jailbreak tweak that changes the system font to impractical typefaces just for the sake of customization. San Francisco looks nice – and it is objectively better than Helvetica Neue in some cases – but it doesn’t stand out and shout “Look at me, I’m new!”, and I believe that’s exactly what Apple set out to attain in their effort to craft a modern typeface for all their platforms.
\nWith iOS 7 (and 2014’s work on OS X Yosemite), Apple’s design team strove to build a structure that could forgo realistic representations of objects and textures in favor of a new hierarchy of text and color. While different colors and the interplay of layers have been used to suggest interactivity and depth, the job of communicating hierarchies and relationships between interface elements has largely fallen upon text weights and sizes.
\nThat’s why, for instance, title bars feature non-tappable bold titles and regular-sized colored buttons next to them, or why the same font used across three lines, but in different weights, can lay out an email’s sender, subject line, and body text preview. Apple discovered that once you get rid of shadows and textures to embellish UIs and push people towards interaction, text itself can be the texture; possibly an even more versatile one, because of its programmability, scalability, and widely recognized properties.
\nWe’ve seen how third-party apps such as Overcast and Twitterrific gained an identity of their own thanks to the typefaces they employed. From this standpoint, is it really a surprise that Apple – a company with a knack for controlling the primary technologies of their products – chose to design a new family of typefaces to use everywhere?
\nThat’s where San Francisco is worth exploring. Understanding its technicalities can help us comprehend Apple’s decisions and appreciate the details that will be shared between the OS and apps.
\nSan Francisco comes in two sub-families: San Francisco, used on iOS 9 and OS X El Capitan, and San Francisco Compact, used on Apple Watch. The two variants are related but not equal, with similar designs but different proportions and shapes that can adapt to multiple screen sizes and UI needs.
\nEach family has, in Apple’s parlance, two cuts – Text and Display – with six weights for Text and nine weights for Display. Apple’s goal with San Francisco is to bring a consistent voice and reading experience to all platforms; while the typeface stems from a shared set of rules and guidelines, each flavor of San Francisco has been designed and engineered for the platform it’ll be displayed on.
\nAs an artistic expression and software feature, fonts stand at the intersection of craftsmanship and engineering. In designing San Francisco, Apple was tasked with creating a typeface that looked good and was consistent in and out of itself, but that could also scale across three OSes and, more importantly, provide users and developers with controls over font size and readability.
\nThis is one of the core aspects of San Francisco: with an increasing array of screen sizes and ways to personalize the appearance of fonts on iOS, a new family of typefaces ought to solve a problem bigger than nice looks. Text is everywhere, but there’s no perfect font size or style that can fit everyone.
\nConsider Dynamic Type, an Accessibility feature introduced in iOS 7 that allows users to change the size of the system font. With a single slider available in the iOS Settings, users can make text bigger or smaller at a system-wide level, achieving a more comfortable reading experience in Apple’s apps as well as third-party apps that integrate with Dynamic Type.
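\nA third-party app opts into Dynamic Type by asking the system for the preferred font instead of hard-coding a point size. Here’s a minimal sketch in iOS 9-era Swift (the label and the choice of the body text style are just for illustration):

```swift
import UIKit

// Ask for the user's preferred body font rather than a fixed size;
// the system maps the Settings slider to a concrete point size.
let label = UILabel()
label.font = UIFont.preferredFontForTextStyle(UIFontTextStyleBody)
label.numberOfLines = 0

// iOS 9 doesn't resize fonts automatically, so re-request the
// preferred font whenever the user changes the text size setting.
NSNotificationCenter.defaultCenter().addObserverForName(
    UIContentSizeCategoryDidChangeNotification,
    object: nil,
    queue: NSOperationQueue.mainQueue()) { _ in
        label.font = UIFont.preferredFontForTextStyle(UIFontTextStyleBody)
}
```

Because the font is requested by text style rather than by size, the Text/Display switch described below happens transparently to the app.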
\nSan Francisco expands upon the idea of ever-changing font sizes with Text and Display, two cuts which are intelligently applied by the system when needed. The main trick employed by Apple is that iOS 9 switches automatically from Display to Text at 20 points. When this threshold is met, Apple “cheats” by altering the font ever so slightly so that it remains readable at smaller sizes. This makes Text suitable for text content displayed at 19 points and below, and Display the best option for labels and other text content displayed at 20 points and above.
\nThe difference between Text and Display is subtle and it likely won’t be noticed by most users, but it contributes to keeping San Francisco readable at any size, in any app, with any Dynamic Type setting. This also means that Text and Display are not equal: when text gets smaller, details of round shapes (such as the terminal of a lowercase “a”) get shorter and simplified, while the arm of a lowercase “t” and the shoulder of a lowercase “r” get slightly wider or longer to ensure shapes of letters can be quickly identified when reading. Then, size-specific tracking (the space between letters throughout an entire word, not to be confused with kerning) makes text further spread apart, in order to avoid confusion when reading sentences at a smaller point size.
\nApple explained the decision as having to adjust visual perception through illusion. Sometimes you have to cheat to make text look good to the user, and altering some details of San Francisco Display in its dynamic transformation to Text allows the same font to always produce readable text no matter the size.
\nIf you really want to spot the differences, you can go into Settings > General > Accessibility > Larger Text and have fun moving the slider to see how San Francisco adapts to smaller and bigger sizes. Or, you can pick San Francisco in the new Safari Reader and tweak its size to see how Display and Text come into play and change according to your preference.
\nTo understand the importance of increased detail at a smaller size in a different context, think about the setup process of videogame consoles or accessories like a Chromecast or an Apple TV. When viewed from a distance, lowercase text on TV keyboards can be hard to recognize, especially when your eyesight isn’t as good as it used to be. Now, consider that millions of people with visual impairments or low vision may come across similar readability problems on their iOS devices on a daily basis. When bigger text or zoomed UIs aren’t an option, having smaller text that is still recognizable and legible becomes essential.
\nSan Francisco’s smaller optical size feels airy, with fewer or more details depending on the anatomy of each character. Some of its subtleties will be lost on the average user, but that’s the point. People don’t have to know how fonts work. They just have to find text readable. And this is true for all kinds of people, with all kinds of needs.
\nIn my tests with apps updated to support the new system font, I found San Francisco to be more comfortable and readable at smaller sizes than other sans-serif typefaces like Helvetica Neue and Avenir. More importantly, San Francisco strikes a good balance of adding fresh personality to the system and keeping an underlying familiarity with previous versions of iOS. It feels more lively, it’s not too sharp and geometric, and I find it legible enough at any size.
\nTwitterrific with Helvetica Neue (left) and San Francisco (right).
For developers, San Francisco offers APIs and features that can be enabled in apps without having to resort to Unicode tricks or fall into limitations of previous system fonts.
\nFeatures are behaviors embedded in San Francisco via code; developers can choose to use them or opt out. Some of these features include built-in support for fractions (so developers aren’t limited by the choice of fractions as Unicode glyphs and don’t have to write custom code to display them), native superscripts and subscripts, alternate 6 and 9 symbols for smaller sizes, proportional numbers by default (but developers can opt into monospaced if they want to), uppercase forms for various math symbols, and a vertically centered colon.
\nOn top of this, San Francisco has vertical metrics comparable to old system fonts for basic compatibility with existing third-party apps and UIKit; it covers Polish, Hungarian, Cyrillic script, Greek script, and more, allowing for consistent localization of apps for international users; and developers have been given new APIs to access all of the weights available in San Francisco.
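\nIn code, two of these features surface directly: monospaced digits become an explicit opt-in (iOS 9 adds a dedicated constructor), while others, like automatic fractions, are toggled through Core Text feature settings on a font descriptor. A sketch, using the iOS 9-era Swift spellings:

```swift
import UIKit
import CoreText

// Proportional numbers are now the default; opt back into fixed-width
// digits for spreadsheet columns, timers, and progress labels.
let timerFont = UIFont.monospacedDigitSystemFontOfSize(17, weight: UIFontWeightRegular)

// Automatic fractions are a font feature, enabled via the descriptor.
let descriptor = UIFont.systemFontOfSize(17).fontDescriptor()
    .fontDescriptorByAddingAttributes([
        UIFontDescriptorFeatureSettingsAttribute: [[
            UIFontFeatureTypeIdentifierKey: kFractionsType,
            UIFontFeatureSelectorIdentifierKey: kDiagonalFractionsSelector
        ]]
    ])
let fractionFont = UIFont(descriptor: descriptor, size: 17)
// With fractionFont, a string like "1/2" is drawn as a typeset fraction.
```

The point of the descriptor-based design is that these behaviors ship inside the font itself; apps only declare which ones they want.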
\nI’d like to point out three of the features available in San Francisco, as they have nice, observable properties that anyone can appreciate with enough attention.
\nThe vertically centered colon is used by Apple in typesetting the system clock, and it can always be seen in action in the iOS 9 Lock screen and timestamps in Messages. This is a nice detail which makes for a pleasing viewing experience.
\nAlternate symbols for 6 and 9 are also interesting as they’re already in use by Apple in the Stopwatch app for Apple Watch and on the back of the Watch itself for the serial number. At smaller sizes, the similarly curved shape of these two numbers can be easily confused, and the alternate look (opt-in for developers) enables flatter, discernible shapes with lower cognitive load.
\nMonospaced and proportional numbers aren’t exactly new in iOS 9: Helvetica Neue supported switching from monospaced to proportional in iOS 8, but the default behavior has changed in iOS 9. Now, the system defaults to displaying proportional numbers: in most cases, proportional numbers (with variable width) are more evenly spaced, with fewer unusual gaps between them than their monospaced (fixed width) counterparts. If a series of numbers is rendered proportionally, a skinny “1” won’t take up the same width as a larger “5”, leading to a more pleasant effect.
\nThere are instances in which monospaced numbers would be preferable (such as columns in spreadsheets or animations in progress bars), but for most cases of body text and labels (and the system clock), proportional numbers as the default option is the right move.
\nTo better showcase the capabilities of San Francisco in iOS 9, I asked Daniel Breslan, an independent developer who works on Departure Board for iPhone, to create a sample app for iOS 9 and compare San Francisco to the same text strings and layout of an iOS 8 app with Helvetica Neue.
\niOS 8 (left) and iOS 9 (right).
The same custom app, running on iOS 9 for iPad.
This was a fun experiment as it shows how tracking, Text and Display cuts, and font features affect apps in practice. In the screenshots above, you can see how San Francisco adds a bit of personality to an otherwise neutral typeface, with details such as fractions, superscripts, subscripts, and alternate 6 and 9 making for a superior reading and formatting experience in iOS 9.
\nToday, it seems obvious that Apple wanted to control the typographic destiny of its ecosystem. Our experience with iOS devices primarily involves reading text. Whenever we pick up an iPhone or iPad and we look at something, text is part of it. Text is communication, texture, and call to action. Text has established a new visual hierarchy since iOS 7, but Apple wasn’t directly in control of its appearance and functionality. With San Francisco, Apple has set out to design a typeface that’s familiar, flexible, elegant, and accessible.
\nThe details of San Francisco may go unnoticed in the general public, but by claiming control of the system font, Apple is allowing developers to spend less time dealing with code required to build features that are native in San Francisco, letting them focus on other parts of their apps. This is a common thread in iOS 9, and an expected next step given the maturity and reach of the iOS ecosystem in 2015.
\n\nAnother San Francisco-related change in iOS 9 that demands a standalone section is Apple’s response to criticism of the Shift key design in iOS 8.
\nThe solution proposed by the company in iOS 9 is twofold. As far as the Shift key is concerned, iOS 9 introduces a simpler design that makes its On/Off state more obvious. When turned off, the Shift key has a gray background with a hollow glyph that matches the adjacent keys. When turned on, the entire key turns white with a black, filled glyph.
\nThe new Shift key design in iOS 9.
The new design clearly indicates the activation state of the Shift key, and it goes a long way in removing doubts on whether Shift is enabled or not, solving a major usability issue of the iOS 7/8 keyboard.
\niOS 9 also brings a feature which has been a staple of keyboards on Android, other OSes, and even videogame consoles for decades: lowercase and uppercase letters on the keyboard.
\nApple has added lowercase and uppercase variations of San Francisco letters to the iOS keyboard, turning on the option by default. With a default configuration, this means that when typing on iOS 9 – on either the iPhone or iPad – keys will constantly switch from uppercase to lowercase letters.
\nCritics of this design choice will argue that switching between uppercase and lowercase characters looks garish and fiddly, that Android has featured a similar behavior for years, and that such keyboard design loses the purity of the original iPhone’s keyboard. I understand the conceptual premise of this argument, but it dismisses the actual benefit of the new keyboard design in iOS 9.
\nAnimating characters between uppercase and lowercase transitions may not look as polished and precise as an always-uppercase keyboard, but it’s more practical. I’m used to looking at uppercase and lowercase characters every day, and I don’t mind seeing this on the iOS keyboard. The affordance is simply stronger and more natural.2 It’s faster to look at lowercase keys and expect lowercase characters to be entered on screen as a result of my direct manipulation than having to guess the output of my typing based on the Shift key alone.3
\nThis is the very advantage of software keyboards: they can be updated via software. For eight years, the iPhone’s keyboard was stuck imitating the hardware keyboards of Macs, with uppercase keys that look bigger and better but can’t be updated or changed dynamically. Alternating between lowercase and uppercase characters is another case of doing the right thing because iOS doesn’t have to be like a PC.
\nApple is making this an option, at least for now. If you’re unhappy with the new keyboard design in iOS 9, you can go into Settings > General > Accessibility > Keyboard and turn off ‘Show Lowercase Keys’. This disables the alternating lowercase and uppercase letters, but it won’t apply to third-party custom keyboards from the App Store.
\nLeft: Character Preview turned on.
Apple has also added an option to disable Character Preview, the little popup that appears every time you tap a letter on your iPhone’s keyboard. This is likely a consequence of having lowercase and uppercase letters – it’s no longer of paramount importance to confirm the character that is being typed when you’re looking at a keyboard that adapts to each case.
\nIf you’re okay with this option, you can turn off Character Preview in Settings > General > Keyboard.4
\nWith the exception of San Francisco, the overall look of iOS isn’t changing in version 9.0, but floating menus and sheets have received rounder corners and stronger drop shadows that make them stand out against the background of an app.
\nAs a result, buttons in menus are taller and they look more like buttons, inviting users to tap on them.5
\nThe change is subtle but noticeable, and I’ve been thinking about why Apple has decided to bring this to iOS 9 when there doesn’t seem to be any official update to the design guidelines of iOS.
\nSweet, sweet buttons.
The explanation I came up with is that every major Apple software redesign tends to overshoot and dial back over time, dampening the most radical aspects to readjust what was probably exaggerated in the first version. Alerts and sheets had a serious depth problem in iOS 7 and iOS 8, and they could be confused with the rest of an app.
\nBut I also believe the iPad motivated this change: with the device now capable of showing multiple apps and sheets on screen at the same time, making each element independent from what’s underneath it is going to be crucial for clarity. Bigger, rounded menus with drop shadows look nicer and they also serve a purpose. I wouldn’t be surprised to see more updates along these lines in future versions of iOS.
\n\nSince switching to Safari as my default browser two years ago, I’ve often called it Apple’s best app on iOS. While the “best” moniker may be up for debate, Safari is my favorite Apple app: its elegant interface hides a considered collection of gestures and menus that make reading and browsing a pleasure, especially on the iPad. Alongside Editorial, Safari is where I get most of my work done on iOS. Unlike some publishers, I come at it as a heavy user first and a publisher second, and I believe that Safari showcases Apple at its best on iOS.
\nWith iOS 9, Apple isn’t rethinking any core aspect of Safari the app: they have brought some nice changes and additions to how you use Safari and what you can do with it, but together they make for an iterative update that doesn’t include any major visual tweaks.
\nWhere Apple has taken a bold new direction is in what you can see when using Safari and how the power of Safari can be extended to other iOS apps. Here lie two of the biggest surprises of iOS 9 – and, potentially, a harbinger of the future of web publishing.
\nSafari Reader, the company’s tool to clean up articles on the web for a clutter-free reading experience, has been updated in iOS 9 to include new font options and backgrounds. Functionally, Safari Reader is still the same: you tap a button in the address bar (if available) to clean up an article, and the app generates an elegant, text-and-images-only version that strips out all extraneous content.
\niOS 9 provides a total of eight fonts (including San Francisco), four backgrounds, and 12 font sizes you can use in Safari Reader. The configuration you pick applies to every instance of Safari Reader on your device.
\nWhile Safari Reader doesn’t offer the typographic controls of Instapaper (my preferred reading experience for web articles), the improvements in iOS 9 make it more versatile and capable of adapting to different users. Font options and backgrounds could help users with visual impairments tweak Safari Reader to yield a superior combination of text and colors for higher readability.
\nI don’t use Safari Reader every day – I prefer reading long-form articles in dedicated apps like Instapaper and Pocket – but when I come across the occasional article that I want to read right away, I remember how nice Safari Reader is. I may not be a heavy user of it, but Reader is a solid Safari feature and the changes in iOS 9 are good ones.6
\nTap & hold, Paste and Go.
A great little touch in Safari for iOS 9 is a way to quickly open a URL or a new search query. If you have a link copied in the clipboard, you can tap & hold the address bar and hit “Paste and Go” to navigate to the copied link directly. This is a fantastic shortcut that has made me save several seconds every day: before iOS 9, I had to select the address bar, select all text, paste, and hit Return. In iOS 9, the same result can be achieved in two taps.
\nSafari is also smart about recognizing whether your clipboard holds a URL or plain text when you tap & hold the address bar. If what you’ve copied isn’t a URL, a “Paste and Search” option appears instead, letting you look up the copied text with Safari’s default search engine.
\nEven if you don’t need to paste URLs or search terms, you can still tap & hold the address bar to get a ‘Copy’ button that lets you copy the current webpage to your clipboard.
\nWith iOS 9, Apple has also changed two other Safari features – “Request Desktop Site” and “Find on Page”. Both options are now available as action extensions in the share sheet; the updated placement makes them more visible and puts the spotlight on the share sheet, which too many users are still unaware of.
\nBoth features can also be activated in other ways. “Find on Page” can still be accessed by typing any word in the address bar and tapping the ‘On This Page’ result at the bottom of the list.
\nApple’s designers got more creative with the alternate location for “Request Desktop Site”: if you tap & hold the refresh icon in the address bar, you’ll get a shortcut to get the desktop version of the webpage you’re viewing. This is another handy time-saving tweak, and it makes sense to bundle it with the refresh icon as requesting a desktop site does refresh the current page.
\nAn important addition to iOS 9’s Safari (that is hopefully not going to be seen a lot) is detection of phishing attempts. When visiting a website that has been reported as phishing and is likely trying to trick users into disclosing personal information, Safari will turn red. The browser’s not blushing – it’s automatically preventing the website from loading. In this special view, Apple has included links to learn more about phishing scams, report an error, go back in navigation, or continue anyway.
\nPro tip: never visit this site. Unless you have to write a review of iOS 9.
This is a good move, and changing the color of the UI helps in paying attention to what’s happening on a potentially malicious webpage. The Safari team put a lot of thought into this: the share sheet will be disabled as well, preventing users from sharing the URL with others or mistakenly adding the website to their bookmarks.
\nThe other changes in Safari for iOS 9 are in line with the “small and nice to have” nature of updates in this release. If you’re using Safari with an external keyboard, you can select results in the address bar with the Up/Down arrows and confirm your choice using Return – this change alone allows me to use Safari with my Belkin keyboard more consistently as I rarely have to touch the screen.
\nIf you have multiple usernames and passwords saved for a website, Safari’s autofill will put up a new Passwords button to view the logins you’ve stored and pick a password to log into the current webpage.
\nUnfortunately, this collection of minor changes is all there is to Safari in terms of core app updates. There’s still no way to properly manage file downloads – truly an inexplicable decision now that iOS 9 offers plenty of choice with extensions and an iCloud Drive app to manage files. It’s absurd that, in 2015, if you want to download a bunch of files from the web on an iPad, you need a third-party app.
\nUnlike its counterpart on El Capitan, Safari on iOS 9 doesn’t offer the ability to pin tabs to the leftmost side of the tab bar – a feature I would have liked to see on the iPad. Among all the new keyboard shortcuts in the app (there’s even one to show and dismiss Safari Reader), there still isn’t one to activate the share sheet and use extensions with the keyboard – an oversight I’d like to see fixed in the near future.
\nThankfully, what Apple didn’t bring to Safari in iOS 9 is offset by features to extend the browser in new ways.
\n\nSafari on iOS 9 supports a new type of extension: Content Blockers. By content blocking, Apple means the ability for Safari to identify subsets of content or resources on a webpage to hide them or prevent them from being loaded altogether. Through an extension that provides a set of triggers and actions, Safari can block cookies, images, scripts, pop-ups, and other content with CSS overrides and full resource blocking.
\nOn the surface, Content Blockers may be viewed as ad blockers for iOS, and that’s likely going to be their most popular use case, but they can do much more than simply block ads on webpages.
\nContent Blockers, like any other extension on iOS, are installed from apps you download from the App Store. An app can offer a Content Blocker extension, which you can activate in Settings > Safari > Content Blockers.
\nContent Blockers are based on a fast, memory-efficient model that informs Safari of what to hide or block ahead of time, rather than requiring Safari to consult with an app while loading a webpage. To create a Content Blocker, developers have to provide a JSON object that contains dictionaries of triggers and actions; this is then compiled into bytecode, which can be evaluated very efficiently by Safari. For performance reasons, Content Blockers are only available for apps compiled to run on 64-bit devices.
\nIn addition, Content Blockers are supported in Safari and the modern Safari View Controller; they’re not available in custom web views built using UIWebView or WKWebView.
\nThe model behind Content Blockers is aimed at being faster and more private than existing content blocking extensions for desktop browsers. In addition to not having to consult with a full app while loading a webpage – a task that would increase RAM and battery consumption – what the user does inside Safari is never exposed to Content Blockers. The URLs of webpages that users view in Safari with a Content Blocker installed are never passed by Safari to the Content Blocker itself. The only job of a Content Blocker is to provide a dictionary of rules; Safari takes care of everything else.
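That division of labor shows in code: a Content Blocker extension’s principal class does nothing but vend the rule file. A minimal sketch in iOS 9-era Swift (the `blockerList.json` file name and class name follow Xcode’s template conventions; treat this as an illustration, not Apple’s exact implementation):

```swift
import Foundation

// The extension's only job: hand Safari the JSON rule file.
// Safari compiles it to bytecode once and applies it on every page load;
// the extension is never consulted again while the user browses.
class ContentBlockerRequestHandler: NSObject, NSExtensionRequestHandling {

    func beginRequestWithExtensionContext(context: NSExtensionContext) {
        let ruleFile = NSBundle.mainBundle().URLForResource("blockerList", withExtension: "json")!
        let attachment = NSItemProvider(contentsOfURL: ruleFile)!
        let item = NSExtensionItem()
        item.attachments = [attachment]
        context.completeRequestReturningItems([item], completionHandler: nil)
    }
}
```

Because Safari only ever receives this static rule list, there is no channel through which the extension could observe browsing activity.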
\nThe actual rules that build a Content Blocker are organized in triggers and actions. Triggers can be filters for specific URLs, resource types (such as scripts or media), load types (first-party for resources from the same domain, or third-party from external domains), as well as filters to apply to a specific domain or to other domains except a specific one. Filters can be combined in a dictionary by a Content Blocker: for example, an app could assemble a trigger for all third-party scripts on amazon.com or a trigger that uses a regular expression to hide instances of the word “espresso” on every website except macstories.net.
\nAs for actions, Content Blockers can either apply a CSS override (using css-display-none) to hide content, or block matched content defined in the trigger. When blocking a resource, Safari won’t just load it and hide it in the background: it’ll completely prevent the matched resource from loading. From a technical standpoint, the Safari team has done a great job at making Content Blockers easy to build, fast, and private. Even non-developers can look at the JSON object powering a Content Blocker and grasp what’s going to happen once it’s activated.
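To make the trigger/action structure concrete, here’s a hypothetical rule file along the lines of the examples above – one rule blocking third-party scripts on amazon.com, and one hiding page elements (the CSS selectors are made up) everywhere except macstories.net:

```json
[
    {
        "trigger": {
            "url-filter": ".*",
            "resource-type": ["script"],
            "load-type": ["third-party"],
            "if-domain": ["amazon.com"]
        },
        "action": { "type": "block" }
    },
    {
        "trigger": {
            "url-filter": ".*",
            "unless-domain": ["macstories.net"]
        },
        "action": {
            "type": "css-display-none",
            "selector": "#comments, .social-buttons"
        }
    }
]
```

Even at a glance, the intent of each rule is readable: the trigger says where it applies, and the action says what happens.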
Content Blockers are a byproduct of the modern web. Given the astronomical rise of web advertising and of businesses based on tracking user behavior and personal information through scripts embedded on webpages, the average size of webpages has grown considerably in recent years. It’s not uncommon to come across a webpage that weighs over 10 MB once you count all the external scripts, resources, and media a browser needs to load – even when they aren’t necessarily part of the content the user is interested in.
\nThe name ‘Content Blocker’ is something of a misnomer, since these extensions aren’t meant to hide or block content as most people would define it. Instead, they act on ads or scripts that often do block content themselves, such as a popup image covering what you’re trying to read.
\nContent Blockers can also hide any page content with CSS overrides. If you’ve ever wanted to be able to hide comments, social buttons, sidebars you don’t care about, or interactive galleries from your favorite websites, Content Blockers will offer a way to do so with a few rules.
\nRight or wrong, this is what Content Blockers are: Safari extensions that can hide or block content with good performance and user privacy in mind. Rather than dwelling on the moral aspects of this technology, I set out to discover if Apple’s claims were reflected in Content Blockers built by developers for iOS 9.
\nSince June, I’ve been testing eight Content Blockers for iPhone and iPad. There was a lot of overlap between them: most developers I spoke with were building Content Blockers aimed at blocking ads and trackers, using public rules available from EasyList and EasyPrivacy. Content Blockers that only focused on blocking ads and scripts through those databases worked fine, but they were largely interchangeable – and I suspect we’re going to see a lot of Content Blockers that use EasyList and EasyPrivacy with a minimal UI to wrap it all together.
\n1Blocker is a highly customizable Content Blocker.
Curious to get a better understanding of Content Blockers, I tried some that went beyond simple ad/tracker blocking and incorporated CSS overrides, specific resource blocking, and user-defined filters. I ended up using two of them for this review.
\nWith these Content Blockers installed, I ran tests on a few websites I frequently visit that heavily rely on third-party ads, tracking networks, and external resources. For some of them, I also applied custom CSS overrides to make the content more pleasing to my eyes, which involved hiding sidebars, comment sections, footers, and other boxes of content I didn’t care about.
\nApple makes it easy to test Content Blockers and measure their performance: with a Content Blocker installed, a tap & hold on the refresh icon in the Safari address bar will reveal a second option to reload the current webpage without Content Blockers. This shortcut comes in handy if a site becomes problematic after blocking some of its content, but it can also be used with Safari’s Web Inspector on OS X to measure the number and size of loaded resources and the time until the load event fires, with and without Content Blockers.
\nAll of the following tests were performed on an iPad Air 2, on a Wi-Fi network, with caching enabled.
\nFor context, you can also take a look at screenshots of a few websites before and after applying Content Blockers. Changes include both blocked resources and elements hidden via CSS.
\nLeft: Content Blockers disabled.
The numbers above speak for themselves. I started using Content Blockers on my iPhone and iPad in June. Two weeks ago, out of curiosity, I decided to disable them for a few days to monitor my reaction. That’s when I realized I’m not going to disable Content Blockers if the web doesn’t improve in the foreseeable future.
\nI’ve gotten used to seeing webpages load in two seconds again. For the first time in years, I haven’t exceeded my monthly data cap, with hundreds of MBs of data left in my plan. In the long run, Content Blockers will give me back the dozens of minutes I would otherwise have spent waiting for content to load alongside ads and trackers.
\nEffectively, using Content Blockers on iOS 9 feels like getting a new web. Lighter, faster, more private, and less cluttered.
\nAnd that’s where I feel bad, and where the entire discussion about Content Blockers lies. I don’t feel good knowing that, by using Content Blockers, I’m not supporting my favorite sites through the ad and tracking-based business model they opted for. After all, if an article on the web is given to me for free, what right do I have to complain about the means used to monetize it?
\nOn the other hand, though, this is my time and my cellular data – money out of my pocket – and the ad-supported web has gotten out of hand. As a user and website owner, I believe a simple article with three paragraphs of text and one image shouldn’t weigh 5 MB and take 20 seconds to load. I can’t accept webpages that continue processing network requests in the background for over a minute, consuming battery on a device that I may need at the end of the day for an emergency. I’m all for ad-based business models – this very website is partially supported by sponsors – but is there really no better, more tasteful, less intrusive way to run a website than what we’re seeing on the web today?
\nAs a user, am I supposed to feel bad about Content Blockers and not use them at the risk of wasting data and battery life, or should I fight back?
\nSome websites can change a lot with Content Blockers.
Apple’s answer seems clear. With iOS 9, Apple is not pre-installing any Content Blockers in Safari, and I’d be surprised if they ever actively promoted one on the App Store. But they have made it ridiculously easy to install and use one, with benefits that aren’t conceptual or limited to power users – everyone understands more battery and less data. When the majority of iOS users realize that Content Blockers make Safari and the web faster and lighter, I can’t imagine why they wouldn’t want to use one. And, I bet, most of them won’t feel as bad as I did when writing this review. They won’t add sites to whitelists to enable ads on them, like I did. They won’t care about the business model of a website. They’ll just enjoy the cleaner, lighter web made possible by Content Blockers.
\nApple is killing the proverbial two birds with one stone here. By building Content Blockers as extensions bundled inside apps and prioritizing user privacy and performance over anything else, they are leveraging the App Store to let users discover and install Content Blockers, which can’t observe what a user does in Safari. This is quite a change from popular desktop blockers that, by evaluating rules while a webpage is being loaded, can track user behavior and URLs.
\nAlso, by making them exclusive to Safari, Content Blockers will likely create a powerful network effect: once users of third-party browsers on iOS (possibly made by companies whose business is ads) learn that only Safari supports Content Blockers, I bet millions of them will switch to Apple’s browser.
\nApple described Content Blockers as a way to add something to the experience of a webpage “by taking something away”. I’m okay with what’s being taken away, and my devices are better because of it. Content Blockers are a win-win scenario for Apple and users. As for web publishers…The times, they are a-changin’.
\n\nWith iOS 9, Apple wants to reimagine in-app web views using the interface, features, performance, and security of Safari. To do so, they’ve built Safari View Controller – a Safari view that lives inside an app and that runs in a secure, isolated process.
\nLeft: standard Safari View Controller. On the right, Safari View Controller with a custom tint color set by an app.
With Safari View Controller, Apple is giving developers a way out from custom web views, allowing them to stop building miniature web browsers with less functionality than the system browser. As I detailed earlier this year, this has major implications for how the web is embedded and experienced in iOS apps.
\nSafari View Controller has been closely modeled after Safari, with consistency and quick interactions in mind: when users tap a web link in an app that uses it, they’ll be presented with a Safari view that displays the address bar at the top and other controls at the bottom or next to it.
\nSafari View Controller looks a lot like Safari.
Safari View Controller mostly looks like Safari for iPhone and iPad: on the iPad, navigation controls, the address bar, and a share icon live in a toolbar at the top; on the iPhone, navigation and share buttons are displayed at the bottom. Both flavors of Safari View Controller come with a Done button to dismiss the browser, in a consistent position at the top right.
\nThere are two minor differences from Safari. First, when opened in Safari View Controller, the URL in the address bar will be grayed out to indicate it’s in read-only mode; while users can navigate to other links in a webpage displayed in Safari View Controller, they can’t tap the address bar to manually type a URL or start a new search. Second, a Safari button is available in the toolbar, so users can jump to Safari if they want to continue viewing the current webpage in the full browser.
\nIn addition, Apple is making sure that user privacy and security are highly valued in how Safari View Controller operates. Safari View Controller runs in a separate process from the host app, which is unable to see the URL or the user navigation happening inside it (the same concept behind Content Blockers). Therefore, Apple has positioned Safari View Controller as entirely safe: private user data stays in Safari and is never exposed to a third-party app that wants to open a link with Safari View Controller. This is an important difference from custom web views (built using APIs such as UIWebView and last year’s WKWebView) that are adopted by millions of apps, including those of big companies like Twitter, Facebook, and Google.
\nThe considerations behind Safari View Controller have allowed Apple to port much of the polish and functionality that users know (and expect) from Safari to any app that uses Safari View Controller in iOS 9.
\nSafari View Controller in Twitterrific. Notice the WordPress toolbar because I’m already logged into MacStories.
Safari View Controller shares cookies and website data with Safari. If a user is already logged into a specific website in Safari and a link to that website is opened in Safari View Controller, the user will already be logged in. Safari View Controller has access to iCloud Keychain’s password autofill, plus contact card and credit card autofill, Safari Reader, and the system share sheet for action and share extensions. If Private Browsing mode is enabled in Safari and a user opens a webpage in Safari View Controller, it’ll open in Private Browsing as well.
\nSafari View Controller in Private Browsing mode, showing Safari Reader and the system share sheet.
Like Safari, Safari View Controller supports detection of phishing websites, it has a nice built-in progress bar, and it also provides custom views for errors and other issues in webpages. To top off everything that’s shared between Safari and Safari View Controller, Content Blockers you activate in Safari will be enabled for Safari View Controller, too – by itself, a huge incentive to start using it.
\nSafari View Controller isn’t without its fair share of issues and questionable design decisions. First and foremost, it doesn’t give developers too much control over the visual appearance and behavior of the in-app web browser.
\nDevelopers can style Safari View Controller with a custom tint color: they can change the color of buttons to match their app’s main color, and that’s it. They can’t change the color of the entire UI or insert custom interface elements into the toolbars. This helps with consistency in that Safari View Controller closely resembles Safari – therefore giving users a sense of security and familiarity – but the effect can be somewhat jarring if the host app’s interface wholly differs from the look of Safari.
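In practice, adopting Safari View Controller takes only a few lines. A sketch in iOS 9-era Swift (the class and method names are hypothetical, and tinting the controller’s view is one common approach rather than an official theming API):

```swift
import UIKit
import SafariServices

class TimelineViewController: UIViewController, SFSafariViewControllerDelegate {

    // Open a tapped link in Safari View Controller instead of a custom web view.
    func openLink(url: NSURL) {
        let safari = SFSafariViewController(URL: url)
        safari.view.tintColor = UIColor.purpleColor() // match the app's accent color
        safari.delegate = self
        presentViewController(safari, animated: true, completion: nil)
    }

    // Called when the user taps Done; the host app dismisses the modal controller.
    func safariViewControllerDidFinish(controller: SFSafariViewController) {
        controller.dismissViewControllerAnimated(true, completion: nil)
    }
}
```

Note that the app never receives the URL of pages the user subsequently visits inside the controller – it only learns that the user tapped Done.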
\nAt a technical level, because Safari View Controller doesn’t expose anything about the webpage being loaded, apps that use it have no control over caching, and they can’t use webpage information in any way besides letting users share what they’re viewing with a share sheet (which is also sandboxed). Again, this was done in the name of security and performance: these are the trade-offs involved in providing users with a way to open webpages in a Safari view with shared credentials, the Nitro JavaScript engine, Content Blockers, and everything else that comes with Safari.
\nSafari View Controller opens modally on top of the host app, with the only way to dismiss it being a ‘Done’ button at the top of the screen. Developers can’t use custom gestures to dismiss Safari View Controller – such as the popular swipe from the left edge – and they can’t put it inside an app’s view so users can move back and forth between other views and Safari View Controller. If you’re used to being able to open a web view in Tweetbot and switch to another tab while the webpage is open, prepare to say goodbye to that ability if Tweetbot implements iOS 9’s new web view. Safari View Controller is modal, and the webpage completely takes over the host app functionally and visually.
\nAside from the shortcomings I’ve already mentioned for Safari, my main problem with Safari View Controller is that it lacks one essential Safari shortcut I’ve come to rely upon. Unlike Safari, you can’t tap & hold the refresh icon in the Safari View Controller address bar to request a desktop site or reload a webpage without Content Blockers.
\nAnother issue is that, currently, the 1Password extension won’t show up in Safari View Controller as it’s unable to fill the contents of a webpage with user information. This can be a major annoyance if you don’t use iCloud Keychain in addition to 1Password, but it hasn’t been a problem for me as I always save passwords filled by 1Password in iCloud Keychain as well. If you’re already logged in via Safari, you will be logged in via Safari View Controller.
\nDespite these limitations, the benefits of Safari View Controller are too convenient for developers to ignore.
\nI’ve been able to test beta versions of my favorite apps with Safari View Controller support, including Twitterrific, Dispatch, and Terminology. After getting used to the speed and convenience of Safari View Controller, I don’t want to go back to custom web views.
\nWhen I tap a link in my Twitter stream and it opens the same webpage I’d see in Safari, it feels like the web is better integrated across the entire OS rather than split in a bunch of proprietary renditions of it. Webpages open fast; I’m already logged into websites I know; and, I have native access to features like Safari Reader, extensions, and Content Blockers. With Safari View Controller, I don’t have to learn a new web view every time I use a different app. Safari – and everything I like about it – comes along for the trip, giving me performance and functionalities I expect from my iOS browser.
\nThere are apps in which Safari View Controller won’t be a suitable option. Editorial’s built-in web browser, for instance, supports multiple tabs and access to webpage content with JavaScript, neither of which can be implemented with Safari View Controller. I wouldn’t want to see apps like Editorial abandon their advanced web views just for the sake of adopting Safari View Controller. But in most cases, apps that only display one webpage at a time will be better off using Safari View Controller, as it offers a consistent interface, speed, and a safer environment for users. At this point, I’m just annoyed whenever I come across an iOS 9 app that doesn’t use Safari View Controller to open links.
\nEditorial’s browser has tabs and custom integrations that won’t be available in Safari View Controller.
Though I’d like Apple to address them, the potential issues of Safari View Controller haven’t been showstoppers in daily usage. At the cost of losing custom branding and custom menus in web views, I now prefer following links that open in Safari View Controller because I feel safe knowing that it’s Safari, and because I know the webpage will be displayed exactly like I want it to be.
\nIf anything, the biggest problem of Safari View Controller is that I doubt companies with a vested interest in custom features and user tracking will use it. Just like iOS 8 extensions didn’t provide developers with data about user behavior, which resulted in the likes of Facebook, Twitter, Google, and Pinterest not using them (or taking a long time to support them – with a different tactic), I wonder if Safari View Controller adoption will be slowed down by an unwillingness to support a sandboxed web view.
\nWe’re already seeing the kind of argument these companies have against Safari View Controller, and I’m afraid it could remain a niche feature of iOS 9 for millions of users. I’ll be happy to be proven wrong here.
\nSafari View Controller solves two problems. Apple wants to give developers a way to focus on other parts of their apps’ code, leaving the job to display web content to a browser that has been refined and improved over the years. At the same time, their goal is to give users a consistent, safe, and fast way to open webpages with the functionality they already know.
\nContent Blockers and Safari View Controller draw a line in the sand between web companies and users. Apple has chosen to give users the power to fight back and enjoy a faster, safer web. The ball is in the developers’ court now.
\n\nLast year’s most notable addition to iOS – action and share extensions – came from an unexpected place: the share sheet Apple had long used to let users share photos and text with preinstalled actions.
\nIt wasn’t an optimal solution, though: the one-size-fits-all approach of the share sheet meant developers had to cope with issues related to how different inputs were passed to extensions; the share sheet’s design caused discoverability concerns as many users couldn’t understand how to enable extensions. Apple built a framework for action and share extensions, tied it to the share sheet, and left developers and users to deal with figuring out how it worked.
\niOS 9 only partially addresses these issues. For developers, iOS 9 has a new dictionary API that lets extensions show up even if they only accept one of multiple input representations shared by an app. This is aimed at fixing the paradox of developers who, to be good platform citizens, made their apps share data in multiple formats and ended up punishing users because iOS 8 extensions had to support every single input format to appear in the share sheet.
\nImagine an app that can share text in multiple formats such as TXT, PDF, and HTML. In iOS 8, an extension needed to support all formats to be an option in the share sheet triggered in the host app; in iOS 9, the new dictionary API enables extensions to be available in the share sheet even if they only support one of multiple representations shared by the host app.
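From the extension’s perspective, nothing needs to change: it still declares and loads only the representation it understands. A hedged sketch of an action extension that handles plain text (class name and control flow are illustrative, in iOS 9-era Swift), which under the relaxed matching can appear even when the host also shares PDF or HTML versions of the same content:

```swift
import UIKit
import MobileCoreServices

class ActionViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let textType = kUTTypePlainText as String
        // Look for an attachment offering a plain text representation;
        // other representations (PDF, HTML) shared by the host are ignored.
        if let item = extensionContext?.inputItems.first as? NSExtensionItem,
           provider = item.attachments?.first as? NSItemProvider {
            if provider.hasItemConformingToTypeIdentifier(textType) {
                provider.loadItemForTypeIdentifier(textType, options: nil) { text, _ in
                    // Work with the plain text representation here.
                }
            }
        }
    }
}
```

The iOS 9 change is on the matching side, not the loading side: extensions written this way simply become eligible in more share sheets.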
\nIf adopted by developers (the feature is opt-in), iOS 9 extensions should be available in more places. For users, this could mean fewer mysterious disappearances of icons from the share sheet depending on the app they use. Alas, I wasn’t able to test this during the summer as I couldn’t get my hands on enough betas of apps with support for the new extension dictionary API.
\nAnother share sheet-related change in iOS 9 is the ability to open it from a new “Share” option of the copy & paste menu. This way, apps that chose to avoid share sheet integration in iOS 8 (such as Apple’s own Mail and Reminders) will feature extension support in iOS 9 as long as they allow users to select text. It also makes it more obvious that it’s possible to share selected text through extensions instead of having to reach for a separate share icon.
\nThis is a welcome tweak, with some reservations. The new menu-triggered share sheet doesn’t expose all the information that is available from the regular share sheet. For instance, if you select some text on a webpage in Safari and hit the share icon in the top toolbar, extensions will be able to see a variety of data about text selected on the webpage, the URL, HTML code, and more.
\nA share sheet in Mail for iOS 9.
If you select text in a webpage and share with the new menu button, only selected text will be passed to extensions as plain text, with no additional webpage information embedded in it. The same happens for any app that works with formatted text: you can try this yourself by sharing some rich text from an email message to the Mail extension itself (Mail inside Mail is now possible, too).
\nApple isn’t backing away from the share sheet, nor are they making considerable changes to the way extensions can be discovered, managed, and activated. Developers still have no way to programmatically trigger specific extensions or take users directly to the share sheet’s configuration screen; users need to manually open the share sheet’s setting view to toggle action and share extensions on and off.
\nExtensions remain one of the most powerful technologies of iOS. But as I’ve argued before, they’ll have to break out of the share sheet sooner or later to embrace a wider audience. I was hoping iOS 9 would mark the framework’s maturity with the ability to customize how extensions are activated while keeping everything safe and consistent, but I’ll have to postpone my hopes until next year.
\nApple’s Reminders app hasn’t witnessed a meaningful alteration in iOS 9, carrying over the same design introduced with iOS 7 in 2013 (paper texture included) and keeping a structure based on lists. What Reminders does gain in iOS 9, though, is a new way to capture any kind of information from apps through a universal interaction layer – Siri.
\nIn iOS 9, you can create smart reminders in any app by summoning Siri and saying “remind me about this”. Through that sentence, Siri will capture the state of your current app activity – a webpage in Safari, a message in Mail, a view in a third-party app – and it’ll save it as a reminder that includes the app icon and a link to reopen the view from the originating app. Smart reminders are supported across every iOS app, and a due date or time can be appended to the Siri query, too. After issuing the command, Siri will immediately display the newly created reminder, carrying the icon of the app where you’ll want to go back to.
\nThe way Apple has implemented this feature is clever, and it shows quite a bit of foresight on their part. Smart reminders via Siri use Handoff, introduced in iOS 8, to capture app state through the NSUserActivity API: the same technology employed to continue an activity from one device onto another can now be used to save that activity as a reminder that contains a deep link to an app.
\nDevelopers that started supporting Handoff last year have already done most of the work required to expose their apps to Reminders via Siri: the app icon and relevant section are fetched by NSUserActivity; optionally, developers can add a title to any activity available to the API. The title will become a todo’s name in Reminders if the activity is captured via Siri.
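For apps that already support Handoff, exposing a view to “remind me about this” can be as simple as this sketch (the activity type, class, and identifiers are hypothetical, in iOS 9-era Swift):

```swift
import UIKit

class NoteViewController: UIViewController {

    var noteID = 42
    var noteTitle = "Meeting notes"

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        let activity = NSUserActivity(activityType: "com.example.notes.viewing")
        activity.title = noteTitle              // becomes the todo's name in Reminders
        activity.userInfo = ["noteID": noteID]  // enough to restore this exact view later
        userActivity = activity                 // UIResponder keeps the activity alive
        activity.becomeCurrent()                // mark it as what Siri would capture right now
    }
}
```

When the user says “remind me about this”, Siri saves the current activity, and the same `userInfo` dictionary is handed back to the app when the reminder’s deep link is tapped.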
\nA note opened from Reminders.
The beauty of this system is that iOS 9 makes virtually any app interaction a potential source for Reminders. Any app view can (theoretically) be an indexable user activity. The latest message in an iMessage conversation can be saved as a reminder. A user profile in Twitterrific is an activity you can save. A point of interest in Maps is something you can be reminded about. An article from Pocket or Instapaper is a view that Siri understands because it’s a user activity. The list goes on and on, and developers don’t have to adopt any new technology to restructure their apps around this concept. As long as they use Handoff and set the right properties, apps can give Siri a way to create reminders off of them.
\nAs a user, it’s nice to be able to open Siri, create a reminder for anything I’m doing in an app, and rest assured that it’ll be easy to get back to it later. In the Reminders app, todos created this way display an app icon on the right, which can be tapped to reopen the app in the view you asked to be reminded about. Similarly, if a smart reminder has a due date, its alert will have a button to launch the app with one tap.
\nSometimes, Siri can’t remind you about apps.
Smart reminders won’t work everywhere7, but they’re as close to being a universal glue between Reminders and apps as they can possibly be. I don’t use Reminders as my primary task management system (I like Todoist, and I’ve been experimenting with 2Do lately), but I recognize the utility of capturing the context of my activity for a task.
\nTreating app views as points of interest that can become todos and be restored at any time is a powerful idea, unavailable to other todo apps on iOS. It facilitates a new type of interaction: rather than saving some generic text as a task, you’ll be saving a reminder that takes you back to the rich experience of an app.
\nI have two problems with smart reminders in iOS 9. The first one is that you can only create them with Siri: if you find yourself in a situation where you can’t talk to Siri but still want to create a smart reminder, you can’t. It’s an odd omission, especially given the new Reminders share extension available to any app that wants to use it.
\nThe new extension can create normal reminders with the same details screen (for dates, location, etc.) of the Reminders app, with no concept of captured user activities and deep links whatsoever. Creating new reminders from Safari through the extension does put a Safari icon in the reminder, but it’s an exception.
\nThis leads me to my second problem. The deeper Safari integration of the Reminders extension and a lack of public APIs for smart reminders suggest that the feature is exclusive to Apple’s app.
\nSmart reminders in Reminders (left) and Fantastical, without deep link metadata.
Developers of Calendar and Reminders clients such as Fantastical or Sunrise won’t be able to read the deep link information (app icon and linked activity) stored in smart reminders; instead, they’ll display them as text-only items. There might be workarounds8, but they’re not official solutions – which makes the entire system a little less useful if you want to use the Reminders backend without the Reminders app.
\nThe other addition to Reminders in iOS 9 is the ability to create reminders for when you’re getting in or out of your car. These reminders use a Bluetooth connection in your car to determine when your device has entered or left the vehicle, which can be useful to remember to do something before driving or immediately after stopping. To this day, Apple hasn’t clarified whether car reminders in iOS 9 require CarPlay or can work with generic car Bluetooth and third-party accessories.
\nIn my tests and informal polls on Twitter, iOS 9 car reminders didn’t work with Bluetooth dongles used to add Bluetooth support to a car (such as my Aukey one), but they did work with built-in Bluetooth systems (from the car manufacturer) as well as CarPlay ones (of course).
\nBecause of that, I wasn’t able to test this feature in practice, but I’m intrigued by the idea of Reminders getting more savvy about user context and location. I wouldn’t be surprised to see HomeKit devices, iBeacons, and Apple Watches from friends nearby becoming future “reminder points” for the app.
\nReminders continues to be a basic todo app that works for a lot of people but that falls short of advanced options for users like me. That’s okay: Apple doesn’t need to provide a full-fledged task manager for the majority of iOS users. The idea of turning app activities into reminders is interesting, and I guess that it’ll become a fixture of many people’s habits going forward. It hasn’t been enough to lure me back into Reminders, but, between Siri reminders and car alerts, its smart simplicity is starting to look a lot more attractive.
\nFor years, certain circles of tech observers argued that Apple should have added a visible filesystem on iOS to make it more like OS X. And for years, Apple went in the opposite direction, doubling down on sandboxing and secure communications between apps.
\nLast year, we saw the culmination of those efforts with document provider extensions and a revamped document picker that enabled users to pick (via an extension) any app as a storage service. Even so, don’t be fooled by the new iCloud Drive app in iOS 9: this is not Apple relenting and bringing a Finder to iOS. It’s an app with a document provider extension – and a mediocre one at that.
\nThe iCloud Drive app in iOS 9 is a wrapper for the iCloud Drive documents that users have been able to view in the iOS document picker since last year. There is nothing new or surprising about it: it’s the same interface you know from iOS 8 document pickers and OS X, with app folders, various sorting options, and buttons to create folders and move files around. The app is installed by going to Settings > iCloud > iCloud Drive > Show on Home Screen, but in testing I also received a prompt to install it as soon as I made a modification to a file on my Mac and the change propagated to iOS.
\nI guess the reason the app exists is to give people a simple way to view and manage iCloud Drive documents – and that’s good, given Apple’s baffling decision not to offer an iCloud Drive app last year. But everything else in the app is mostly unsurprising, confusing, or frustrating.
\niCloud Drive versions in MindNode.
The app is extremely basic: it doesn’t let you search from the root view into subfolders, there’s no way to view and restore versions of files9 (or recover deleted files, for which you’ll need the iCloud website), and you can view tags, but you can’t create them because they’re still exclusive to OS X.
\nTo my knowledge – and I’ve looked around – there is no way to add photos from the Photos app to iCloud Drive. If you take a screenshot and want to organize it in a folder outside of Photos, I don’t know how you would do it. This stems from the fact that there is no system-wide iCloud Drive extension to save files from anywhere. Well, there is one, but it’s only available for some types of attachments in Mail, and you can’t use it anywhere else.
\nIn iOS 9, document-based apps can support a new LSSupportsOpeningDocumentsInPlace
property that adds the ability to open files from other apps “in place” to edit them without creating a copy. For the iCloud Drive app, this means that documents from iCloud-enabled apps can be tapped in iCloud Drive and they’ll automatically open in the associated app without generating a duplicate file.
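As a rough sketch of what adoption looks like on the developer side (the key name comes from the feature described above; the surrounding markup is simply the standard Info.plist property list format), an app declares support with a single Boolean entry:

```xml
<!-- Excerpt from an app's Info.plist: opting into open-in-place, -->
<!-- so documents can be edited where they live instead of copied. -->
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```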
I ran some tests with an iOS 9 version of MindNode, a popular mind-mapping app for iPhone and iPad. Tapping a MindNode document in iCloud Drive opened it in place in the MindNode app, taking me straight to editing. This can also be done by long-pressing a file and tapping the ‘Info’ button to open it in an associated app.
\nFrom iCloud Drive to editing in MindNode. Will this work with other apps?
However, soon after installing MindNode, iCloud Drive started automatically opening Numbers files into MindNode. My understanding is that, given the lack of an iOS 9 version of Numbers, iCloud Drive sent Numbers documents to an app that supported the file format and the open-in-place feature. What’s going to happen once multiple apps support open-in-place and the same file type? Will iCloud Drive display a prompt to pick an app for each file? I had no way to test this.
\nIn theory, this makes for a good user experience. The iCloud Drive app is a container of files that lets users jump to editing documents in other apps, avoiding copies and confusion with duplicates. Will the system work once (and if) more developers adopt the open-in-place functionality? It’s a good addition, but it’s too early to tell.
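On the receiving end, an app can tell whether a document was handed over in place by inspecting the options passed to its app delegate. The following is a minimal sketch in Swift – the delegate method and the open-in-place options key are real UIKit API, but the surrounding logic is illustrative, not a definitive implementation:

```swift
import UIKit

// Sketch of an app delegate method handling a document sent from iCloud Drive.
// If the open-in-place key is set, the URL points at the original file and
// edits happen in place; otherwise the file was copied into the app's sandbox.
func application(_ app: UIApplication,
                 open url: URL,
                 options: [UIApplication.OpenURLOptionsKey: Any]) -> Bool {
    let inPlace = options[.openInPlace] as? Bool ?? false
    if inPlace {
        // In-place access must be bracketed with security-scoped calls.
        guard url.startAccessingSecurityScopedResource() else { return false }
        defer { url.stopAccessingSecurityScopedResource() }
        // ... open the document at its original location ...
    } else {
        // ... the file landed in Documents/Inbox as a copy; import it ...
    }
    return true
}
```

This sketch is framework-bound code and only runs inside a real app target; the point is simply that the same delegate entry point distinguishes the two delivery modes.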
\nCan you guess what this means?
The user experience of document provider extensions is still problematic, with too many steps required to install and navigate external services and a confusing mix of modes for copying and opening files. In addition to open-in-place, another change in this area of iOS is a new label for document providers that don’t let other apps open and edit their documents (such as Dropbox and Google Drive). I’m not sure, though, that users will understand what this means, and, like open-in-place, I haven’t been able to properly test this functionality with different apps.
\nThe iCloud Drive app joins a complicated array of features, and it does little to improve iOS file management workflows beyond putting an icon on the Home screen and supporting iOS 9’s new search APIs.
\nApple has a lot of work to do with iCloud Drive on iOS – for both the app and the extension. iOS 9’s iCloud Drive doesn’t offer the same document management and search features as the Finder on OS X, or the clarity and reliability of competing services such as Dropbox, Google Drive, and OneDrive. I know I’m not going to put critical work documents – such as this review – in iCloud Drive any time soon. The tools it offers aren’t enough to make me feel safe about it.
\nDon’t believe anyone who says the iCloud Drive app for iOS 9 marks the arrival of proper file management on iOS: the road ahead is long and winding.
\nSince moving past tape reels and realistic buttons in 2013, Apple’s Podcasts app has received a variety of incremental updates that have made it a decent solution for people who are not seeking the advanced options of third-party clients such as Overcast and Pocket Casts. Apple’s work on Podcasts culminated in September 2014 with the app being pre-installed on iOS 8, further cementing it as the default option for millions of people looking for a way to listen to their favorite shows. Apple’s Podcasts didn’t destroy the market for third-party alternatives – many of which are still thriving – but it brought a good-enough solution for everyone.
\nWith iOS 9, Apple seems to acknowledge that the renaissance of podcasting has to be sustained by a more capable listening experience, and they’re enhancing the Podcasts app with welcome improvements on both the iPhone and iPad, particularly for viewing new episodes and controlling playback.
\nThe main addition to the app is a Music-inspired mini player that sits at the bottom of the screen and can be used to pause and play episodes, see what’s playing, and access options without going into the full-screen Now Playing view. The mini player has worked well in iOS 8.4’s Music app in that it provides a convenient shortcut for playback controls; on big iPhones, it’s easier to reach the mini player than to stretch your thumb to tap a Now Playing button at the top.
\nThe mini player works well for Podcasts, too: it can be tapped or dragged to reveal a modal Now Playing view, which, just like Music, comes with a semi-translucent bottom half and tappable artwork in the upper half that reveals episode descriptions and show notes (where Music would show lyrics). Like Music, Podcasts for iOS 9 offers queue management with Up Next/Play Next to build a list of episodes that can be controlled separately.
\nThe sleep timer has been moved to the center of the Now Playing screen, sitting between a share icon and a contextual menu to save an episode or remove it from downloads, view its description, and share it.
\nUnfortunately, Podcasts also borrows the bad parts of Music’s Now Playing view: there are two ways to share an episode’s link (with an obnoxious “Check out this cool episode” prepended message), the progress bar is too thin to be dragged comfortably on smaller screens, and the view doesn’t take advantage of the iPad’s big screen. Episode descriptions aren’t displayed next to an episode’s artwork or below it on the iPad, so you’ll still be tapping a large artwork in the middle of the display to show a small box containing text and links – not exactly a good use of the device.
\nMusic-based changes aside, the second major addition to Podcasts is in the show organization department, with a cleanup of the My Podcasts and My Stations views. In iOS 9, Apple has removed the ability to show all podcasts in a grid view and has rolled stations into My Podcasts. List view is now the only option in the app, and stations are available at the top of the list with a play button to start listening. You can still tap a station to view the episodes inside it and tweak settings, which offer the same options as iOS 8.
\nWith stations becoming part of My Podcasts, Apple freed up a slot in the app’s tab bar, which is now used by an Unplayed view that is my favorite change in this release. Like other podcast clients before it, Podcasts defaults to showing a reverse chronological list (newest to oldest) of all episodes from all podcasts, grouped by day, regardless of their download status.
\nSupertop’s Castro (left) and Podcasts on iOS 9.
Heavily inspired by Supertop’s Castro, the Unplayed view of iOS 9 provides an easy way to see what’s new and unplayed without dabbling in stations and search. As a result, finding episodes to play is faster, and their presentation is better: an icon indicates which episodes have already been downloaded.
\nA nice touch of this view is that, as you keep scrolling into past episodes, they will be grouped by extended periods of time such as This Month, Last 3 Months, Last 6 Months, and This Year instead of standalone weeks and days. Also, I like how you can search for text in descriptions – a great option for those who remember specific episodes by the links mentioned in them.
\nTo cap it all off, Apple has refreshed individual show pages with more options as well. When a podcast has downloaded episodes on the device, a Saved tab will sit next to Unplayed and Feed to group all episodes stored locally. Podcast and storage settings can be accessed with a new gear icon, and you can choose to remain subscribed to a podcast without receiving notifications of new episodes from it.
\nI love Marco Arment’s Overcast, and I find its many thoughtful details and listening features unparalleled in any podcast app for iOS. After using Podcasts extensively, though, I admit that I like its mix of simplicity and moderately advanced options. The Unplayed view is a terrific addition, and the Music-inspired design and interactions create a cohesive audio experience on iOS 9 that makes the two apps feel like part of the same ecosystem. Even when searching on the iTunes Store, iOS 9’s Podcasts displays a new tab to get individual podcast episodes, which is fairly similar to Apple Music in that you can start streaming or downloading right there from search.
\nI’ve been listening to my favorite shows with the Podcasts app since June, and I’m fine, but I’ll probably go back to Overcast because of its audio effects and upcoming features. The temptation to keep using Podcasts and its system integrations – like Siri, or controls on Apple Watch – is strong, and I can safely recommend the app to anyone looking for a default podcast client. Apple’s work on Podcasts for iOS 9 is solid, and the app is a good option for everyone at this point.
\nMail updates in iOS 9 are far from what I wished for, but there are two changes worth mentioning.
\nSaving an attachment from Mail in iOS 9.
You can now add attachments to new messages from any document provider extension on your device, including iCloud Drive. To do so, you need to tap & hold in the message body to show the copy & paste menu and choose the new ‘Add Attachment’ option.10 This will show the document picker to select files from installed apps such as Dropbox and Google Drive, which will be inserted as inline attachments in a message.11 It’s another example of the idea that, in lieu of a traditional filesystem, extensible apps are the modern filesystem of iOS.
\nThe second addition is Markup, a feature borrowed from OS X that brings the power of Preview annotations to iOS. Available for PDF files and images and exclusive to Mail, Markup lets you annotate a file when sending a message or reply with Markup to a file sent by someone else. To annotate a file, you can select it or place the cursor next to it and choose Markup from the copy & paste menu; for replies, you can use an action extension or hit the toolbox icon in the lower right corner of a Quick Look preview in Mail.
\nMarkup works like the eponymous feature introduced in Yosemite last year: you can add text and shapes, choose between various colors and stroke sizes, add a magnification loupe with adjustable size and zoom, and add a signature. Two nice details that I appreciate: iOS detects freehand shapes and offers to replace them with precise versions, and multiple signatures can be added to Markup and previewed in a menu so you can choose one. Markup signatures will also be synced with iCloud across devices.
\nMarkup on iOS 9 is good and it works well for annotating images and documents, pointing out ideas and issues to someone over email. The feature has been built with collaboration in mind: annotations won’t be flattened onto the document after sending it, so you’ll be able to remove Markup annotations from another person, add your own, and send the document back with your notes.
\nThe only problem with Markup, in fact, is that it’s too good to be limited to Mail. Apple needs to make Markup compatible with Photos; better yet, it should be an option in native Quick Look previews or an action extension for system-wide annotations. I wish I could use Markup anywhere, but I can’t.
\nIn terms of minor enhancements, Mail’s swipe gestures have received new icons to more easily distinguish actions. In-app search has been slightly updated as well, with new search tokens that can be tapped to reveal additional filters. There’s also a progress bar that indicates when search is loading (unfortunately, it’s still slow for Gmail accounts).
\nTap search tokens to reveal options.
Mail is a fine default client, affected by the same problems I’ve been covering for the past two years. Search inside the app is slow, there’s no way to make messages actionable with extensions, and the inbox lacks the smart organizational tools found in popular third-party clients such as Outlook, Spark, Inbox, and CloudMagic.
\nMail for iOS is a desktop-class app, but that’s starting to become a liability. I’m hoping to see more iOS-first features next year.
\nApple News, the company’s vision for the future of Newsstand mixed with a response to Flipboard and Instant Articles, is launching this week in the U.S., U.K., and Australia. While I was able to try the service in Italy by changing my device’s region format to American, that’s proven to be utterly inconvenient12, and I’ve chosen to leave the job of reviewing Apple News to my Australian colleague Graham Spencer. I still have a brief thought on the app, though.
\nAn example of the content I am recommended in Apple News.
Apple News lets you follow individual sources (websites) and topics; the latter is reminiscent of Zite (acquired by Flipboard). Supposedly, the app learns from your reading habits by monitoring what you read on-device, but it also syncs your data via iCloud for convenience. Thanks to machine learning, Apple News should, in theory, understand what an article is about, leading to further exploration of topics via tags, and it should also give you more articles to read in a Music-inspired For You section. The majority of websites in News work by sharing an RSS feed with the service (like MacStories does), which is reformatted to look nice in the app; others are composing special articles for Apple News with a native format that offers more control and customization.
\n“Discover Card”?
In my experience, Apple News has been comically bad at recognizing topics from articles and it has provided me with recommendations that rarely piqued my interest. At one point I thought it was getting better, but that was just the luck of two interesting articles that appeared in For You. Unlike Apple Music, you can only like stories in Apple News with no way to say “I don’t like this”, and that’s proven to be a major issue for me. I can easily ignore 80% of the content recommended to me in the For You section, but there’s no way to tell the app about it.
\nIt’s not like I haven’t tried to like News or use it every day. I have added dozens of blogs and topics to my favorites. I have read articles on both the iPhone and iPad, liked them, and shared them with people. I have explored topics and checked out native articles, which are pretty cool (but the examples were limited). In spite of my dedication, Apple News just didn’t get any better as a recommendation engine. And if I have to use it as an RSS reader, then I’m just going to keep using NewsBlur, which gives me more control over my sources.
\nAh, yes, interactive media.
I am an RSS power user, but the problems I noticed in Apple News aren’t minor annoyances that only people who subscribe to 200 feeds will notice. You can’t reorder items in the Favorites view. You cannot teach the app what is good and what’s not interesting. You come across awkwardly computer-generated topics such as Central Processing Unit, Mobile App, Interactive Media, and the all-encompassing Business, which is often complemented by a cover photo of an American businessman with a peculiar hairstyle. The app doesn’t use Safari View Controller for viewing articles on the web, which means that Content Blockers aren’t supported either.
\nThere may be a market for Apple News, but this first version feels too unfinished, slow, cluttered, and computer-y for me to take it seriously when it comes to my daily news workflow. I suppose you could appreciate Apple News as a way to browse a few favorite sites and topics in a simple, visual fashion, and I continue to be intrigued by the Apple News Format, which I’ll experiment with for MacStories. But for now, I’m back to my trusted NewsBlur.
\n\nWhenever Apple announces new features and improvements for iOS’ built-in Messages app, they like to brag about its status as the “most used” app on the iPhone and iPad. While that’s probably true given the popularity of iMessage and the importance of mobile messaging in our lives, I wouldn’t be surprised to see another Apple app as the runner-up in a daily usage chart: Notes.
\nEverybody takes notes, but the concept of a “note” varies deeply from user to user. A note can be exactly what the name says – a short text annotation jotted down for later – but it can also be intended as a list of things to remember, a collection of products to buy, reference material, and more. The versatility of apps and data types supported by iOS has spurred the creation of an entire ecosystem of note-taking apps that can serve different purposes. There are apps to save notes as text, locations as notes, images as notes, and even create notes you can share with others or automate for yourself. In the age of iOS, a note is more than text.
\nFor all the third-party apps that promise a superior management of notes, though, I’m willing to bet that Apple’s pre-installed Notes app takes the crown for the note-taking app used by millions of users on a daily basis. And unsurprisingly so: the Notes app offers basic formatting and note creation functionalities that most people are okay with, and the integration with the system (namely through Siri) and cross-platform availability via iCloud makes it a good-enough choice for the average iOS user. I couldn’t use Apple’s Notes app for what I needed to do in the past year with MacStories and Relay FM, but I understood why most of my friends were perfectly content with it. In spite of its awkwardly retro interface, Notes is dependable.
\nWith iOS 9, Apple has taken aim at note-taking apps that allow users to expand notes beyond text and is supercharging its Notes app with brand new features that make it a serious player in the game and a better option for all users. While it was Messages’ turn to receive an overhaul with iOS 8, Notes is getting much deserved attention this year, with some surprising and unexpected results. I’ve switched to Notes full-time since the first beta of iOS 9, and I don’t see myself having to use another note-taking app any time soon.
\nThe first visible change in the new Notes app is the ability to organize notes in folders13 and, like the Photos app, access recently deleted notes for 30 days with an option to restore them.
\nThe organizational revamp is made possible by Apple migrating from the old, IMAP-based backend of Notes (which relied on an email protocol to sync notes across devices) to a modern, faster CloudKit-enabled infrastructure that gives the company more control and flexibility.
\nWhile it’s still possible to sync Notes with an email account configured in the iOS Settings, the ability to organize notes in folders is only exposed to users who set up Notes in local mode (no sync) or iCloud. It’s safe to assume the latter will turn out to be the most popular option: once migrated to the new Notes app, iCloud accounts will be able to create folders and keep them in sync between devices – which is obviously not available with local mode.
\nThe ability to create folders for notes is hinted at on the main screen of the app: named ‘Folders’ and featuring a ‘New Folder’ button at the bottom of the account list, this is where you’ll be able to create a new folder, give it a title, rename it, and delete it when it’s no longer needed. On the Folders page, accounts are grouped by name, each listing the folders contained inside it. In this review, I’m going to use iCloud as the main example, as it’s what I’ve been using for the past three months and what I believe the majority of iOS users will upgrade to after iOS 9.
\niCloud puts every note in the ‘Notes’ folder – the default destination for new notes created via Siri (this can’t be changed as there’s no setting for a different default folder) and a top-level folder that can’t be deleted. Notes can be moved across folders by swiping on an individual note and revealing a Move menu. The interface for this is simple enough and gets the job done, but it lacks the polish of Mail’s swipe menu.
\nFrom an organizational perspective, folders in Notes are likely to serve most users sufficiently well. I’ve created folders for Home and MacStories, and I found myself being okay with the ability to have notes in distinct places and access them with one tap from the main screen of the app. For the average iOS user who relies on Notes for short bits of text, folders will be a small revolution – and yet another case of iOS users deriving the greatest joy from the simplest features adopted from OS X.
\nThis doesn’t change the fact that folders in Notes will be too limiting for advanced users who are accustomed to deeper management in alternative note-taking apps. Most notably, Notes’ search feature (available by swiping down on a folder to reveal a search bar) can’t restrict search to a single folder, and even when looking for a string of text that belongs to a note in a folder, the app will simply match that text as part of the top-level ‘Notes’ folder.
\nPerhaps more perplexingly, Notes’ search always looks for every note across every folder and every account. Typing into the search bar of the general All iCloud “folder” (a filter that is created by default and that collects all notes from all folders in iCloud) has the same effect as trying to search in a folder that only contains a subset of notes.
\nWhile I understand why Apple may not want to put advanced search options in the search bar, starting a new search from inside a folder should at least attempt to limit the scope to that folder. The ability to view recent searches and limit search to the current account somewhat helps in retrieving specific notes more quickly, but the aforementioned misreporting of the source folder in search only adds insult to injury for users who are going to keep several dozen notes. This is only one of the many issues with Notes for users who would like to do more with the app.
\nIn the grand scheme of things, where you can move notes is, after all, one of the less significant changes in the Notes app. Apple has put great emphasis on what you can do with Notes in iOS 9, and that’s where the update feels most impressive – in some cases, even when compared to third-party alternatives.
\nA note in iOS 9 can contain images, lists and checklists, sketches, link snippets, files, and more. The new Notes app wants to be more than a single-purpose container of text – it aims to become an everything bucket for the iOS user who doesn’t want to forget anything. This is accomplished with a refreshed set of controls and system integrations, with a few missteps along the way but, overall, with a new direction for the app that feels like the right move at the right time.
\nSince version 5.0, iOS has provided system-wide formatting controls for text in the copy & paste menu to make text bold, italic, or underlined in any app that supported rich text, including Mail and Notes. In iOS 9, Apple is taking a page from its iWork suite (specifically, Pages – no pun intended) to offer a brand new formatting view on the iPhone and iPad that considerably extends the text style options available in Notes.
\nOn the iPhone, new formatting controls are revealed by tapping a “+” button above the keyboard that turns into a special row with a cute animation reminiscent of Dashboard’s +/x transition of yore.14 This bar contains additional controls for checklists, photos, and sketches (more on this in a bit); on the iPad, it’s integrated as part of the new Shortcut Bar that features customizable shortcuts on both sides of the keyboard’s QuickType bar. On both devices, the Formatting view is accessed by tapping the “Aa” button next to checklists.
\nFormatting options in Notes include Title, Heading, Body, three types of lists (bulleted, dashed, and numbered, which can be indented), plus shortcuts for bold, italic, and underlined text. On the iPhone, these controls are displayed in the lower half of the screen, as in Pages, with a scrollable list of styles and visible shortcuts for bold, italic, and underline. While it’s fairly apparent that Apple modeled this screen after Pages and the old Google Docs app, there are some key differences.
\nFormatting controls in Pages and Notes.
First, Notes doesn’t have the same wealth of controls available in Pages and Docs – Apple doesn’t believe users should be able to tweak the font size of body text or the line spacing in Notes, which is meant to be a one-size-fits-all note-taking app for quick interactions and no concept of custom layout whatsoever. Secondly, Apple seems to have learned from its mistakes and put the formatting button towards the bottom of the UI (just above the keyboard) instead of opting for the title bar as they did with Pages. It’s clear that Notes has been rethought for the age of big screens, while Pages shows the last vestiges of a pre-iPhone 6 era. The result is that accessing formatting controls in Notes on an iPhone 6 Plus is easier than trying to do the same in Pages.
\nOn the iPad, formatting controls are displayed with a popover floating on top of the Shortcut Bar.
\nWhile bold, italic, and underline still require text selection to change the appearance of a text string, making some text a heading or a list can be done by tapping next to it and picking an option in the formatting list without selecting it first. In the months I’ve spent using Notes for research and personal notes, this has dramatically sped up the process of going from a list of plain text lines to a nicely formatted note with a clear structure. In the new Notes, I can start typing to get thoughts out of my head, then open formatting, tap to place the cursor next to lines I want to make different, and tap away on titles, headings, and lists to make something more relevant or structured.
\nNotes may not have all the formatting options of Pages, but that’s the point. By not being too complex, Notes can appeal to users who don’t want or need Pages but who would still like the ability to mark up notes easily.
\nSomething I’ve always noticed when taking a look at how people I know in real life use iOS is that a vast portion of them uses the Notes app as a todo system. In spite of iOS having its own Reminders app with support for alerts and geofences, a lot of people jot down things they have to do or remember in Notes. With iOS 9, Apple is catering to this use case with the ability to create checklists.
\nThe implementation of checklists is straightforward: a new button next to formatting controls allows you to start a checklist or convert selected lines of text into one. As you’re adding items to a checklist, Notes offers automatic list continuation and it can also convert other types of lists into a checklist. You can check off items in a checklist, which gives you an indication of things that have been completed.
\nThis list may or may not be real.
The way Notes treats checklists isn’t how a todo app would handle lists and tasks. There’s no concept of dates or reminders in Notes. It’s not a smart database that remembers completed items when you convert back and forth between formats and styles. Checklists are just another formatting option in Notes with a stronger visual cue that makes text lines look like a todo list even if, after all, it’s just text with a checkbox.
\nThe key difference to keep in mind is that Apple isn’t seeking to replace Reminders with Notes in iOS 9. Checklists in Notes can’t be given dates or any type of task-related metadata – if you want to organize your todos in proper lists with alerts and sharing settings, you’ll still need Reminders.
\nThat is precisely why I believe checklists are such a clever, cunning idea. Apple may not be looking to replace Reminders directly, but a lot of people are going to be ecstatic about the addition of checklists in Notes. Those people already use the app to save things they need to act on, but until iOS 9 they’ve saved them as lines of text that later needed to be manually selected and deleted. This may sound absurd to tech-inclined folks (myself included) who often use multiple dedicated reminder and task management apps, but the reality is that millions of people don’t need the overhead of our systems.
\nI’d be willing to bet that a lot of folks don’t want or have to attach metadata like priorities, locations, dates, and notes to tasks; they just want to type them out and get to them eventually. What better system than an app where there are no strict format requirements and where a note can be an image, a list of rich text, a drawing, or an interactive checklist?
\nFor those people, I’d argue that the richness of Notes in iOS 9 will be superior to Reminders, with checklists being the epitome of Apple adjusting to unexpected use cases and the way people use their apps. By definition, something that needs to get done should be saved in Reminders, even without a date. But if millions of customers prefer to mix and match notes with text that loosely represents a todo and if that system can scale to incorporate nicer formatting for todos alongside other media, then it’s only fair to make Notes more versatile and yet easier to use than Reminders.
\nNotes’ improved text abilities are complemented by a set of image and video-related features aimed at letting users capture more types of information.
\nPhotos and videos can be interspersed between text and lists in Notes for iOS 9: at any point during editing, you can tap the camera button to choose an item from your photo library or take a new photo or video. Whatever you pick will be displayed inline between text and other media in a note, so you can tap on a video to play it back inside Notes or tap an image to view it in full-screen.
\nNotes is smart enough to reformat text when an image is inserted (for instance, a checklist is discontinued if you insert an image after some text) and you can also paste images copied from other apps – thus making Notes an ideal companion for Safari when you want to reference images from the web without saving them to Photos.
\nTo keep things simple, Notes doesn’t have any sort of image resizing or text reflowing options to build more complex layouts as you can in Pages and other word processors. Again, Apple’s goal is to offer a more powerful note-taking app and not necessarily a slimmer Pages, so this makes sense to me. I didn’t miss such options in my tests, and I like that I can’t go wrong by inserting a bunch of pictures alongside text in a note.
\nMessages’ photo picker (right) is a superior implementation.
Alas, the image picker itself leaves a lot to be desired: rather than adopting the useful picker found in Messages (which displays recent media in a swipeable tray), Notes comes with a dull menu to open a picker or take a new picture.
\nMore interestingly, Apple has built a sketching mode into Notes, enabling users to mix text, media, and other content with interactive sketches that can be updated at any time and exported as images.
\nSketching is accessed by tapping the drawing icon in the bottom toolbar, which will kick Notes into a drawing screen that offers a pen, a sharpie, a pencil, a ruler, an eraser, and a color picker. A drawing’s background defaults to the same paper texture used throughout the app, and pages can be flipped horizontally or vertically. Sketches can be shared with other apps by tapping the share icon in the top right, which will export a static image.
\nYes, I am an artist.
When playing around with sketches in Notes, you’ll likely notice two things: the limited tools available when compared to standalone apps like Paper, and the solid performance of drawing on screen in Apple’s app.
\nSimplicity shouldn’t come as a surprise: the entire app hinges on the idea of enriching a traditional note-taking environment with just enough more stuff, and, overall, Apple is doing a pretty good job at covering the basics. Performance, though, is a whole other topic.
\nIn any form of interactive video output, latency is an issue to consider. Whether it’s controlling videogames with buttons or touching an iPad’s screen to tap something, the relationship between humans and software depends on latency – the delay in how long it takes for input to translate to output. And because our fingers are better input methods than we tend to believe, even the slightest amount of higher latency can lead to a disruptive user experience when the human eye is able to discern a visible delay.
\nEver swiped quickly on a multitouch display and noticed virtual ink struggling to catch up with your finger sliding across the screen? That’s latency.15
\nVisual latency can be ascribed to various factors, but most notably in modern software, CPUs are to blame for the delay we observe between our actions and the expected result. With iOS 9, Apple has set out to drastically reduce latency to make apps more responsive, cutting down the amount of time required to compute user touches and render them on screen. This initiative – which comes with new APIs for developers – will have the biggest impact on games (which are heavily reliant on fast multitouch gestures) and drawing apps – not to mention the upcoming Pencil for iPad Pro.
\nOne of the downsides of drawing apps on iOS (and particularly on iPad) is the noticeable delay between swiping and seeing ink come up on screen. While developers have written entire custom engines dedicated to making ink appear as naturally as possible, laws of physics and intrinsic iOS limitations have made it nearly impossible to replicate the feeling of a real pen on a multitouch display.
\nApple’s goal with iOS 9 isn’t to make drawing on iOS exactly like using a physical pen, but to get very close to it.
\nApple is introducing Touch Coalescing, an API that leverages the iPad Air 2’s twice-as-fast 120Hz touch scan update rate to double the number of touches registered by the OS and therefore the touch information exposed to apps. Thanks to the higher touch scan rate of the Air 2 (other iOS devices can scan for touches at 60Hz), iOS 9 can accumulate twice the number of touches per second, but also coalesce those updates without wasting computational work in an app. Coalesced touches enable a single frame on the iPad Air 2 to scan for two touches, which are available to developers on demand via an API.
\nBut that’s not all. On top of doubling the number of touches iOS can recognize on each refresh, Apple has built a predictive touch engine that can look into the future of user touches and guess where a user’s finger may be going next. Using some highly tuned algorithms, iOS 9 can provide developers with a fresh set of predicted touches all the time, which can be used to further decrease latency as they’re added to the iOS graphics pipeline, preceding the work needed to scan for touches, animate them, and pass them to an app. Built into UIKit, predicted touches are independent from coalesced touches, but together they can be used to make latency on iOS even lower to get closer to the idea of direct manipulation and fast performance.
\nBy Apple’s estimates, the work done on iOS 9 has allowed input recognition to go from four frames to a frame and a half. If the above paragraphs are too technical: this is a massive performance improvement for drawing apps and games on the iPad Air 2, and it bodes well for the iPad Pro and Pencil accessory.
\nDrawing in Notes for iOS 9 is fast, smooth, and natural. As you swipe across the screen using the pen tool or the pencil, ink renders smoothly (this is the only area where the app’s paper texture is justified) and, more importantly, animates quickly and follows the tip of your finger unlike any other drawing app. In testing Notes drawing on my iPad Air 2 and iPhone 6 Plus, I noticed no visible difference between the two, with solid performance on both devices in terms of rendering speed and animations. What Apple has done for advanced touch input in iOS 9 can be noticed when drawing in the Notes app, even if saying “it’s fast” doesn’t do justice to the fascinating complexity behind it.
\nAs a user, it seems clear that drawing in Notes isn’t aimed at artists. Notes doesn’t want to replace Paper: rather, sketches are used as complements to text and images, useful in those occasions where the human finger can express shapes and ideas that would take too long with apps and images.
\nThe ruler is an obvious candidate for education and for whoever is seeking to use an iPad to plan future house redecorations: once placed on the screen, the ruler can be rotated with two fingers to tilt its angle, then you can pick a tool and swipe across it to draw a straight, precise line that wouldn’t be possible without it. It’s an incredibly fun and reliable implementation and one of Apple’s finest details in iOS 9.
\nSketches in Notes are a good example of why Apple likes to control the entire stack. If the company wasn’t in charge of every single aspect of its hardware and software, it wouldn’t have been able to optimize iOS to take advantage of the display of the iPad Air 2 and build a predictive touch engine aimed at reducing latency.
\nWhen you control every facet of the experience, you can focus on seemingly unimportant aspects of software such as going from four frames to one and a half for input recognition, even if most users only want to produce crummy drawings in Notes. And that’s okay, because knowing how that works and why it performs so well is part of the fun in these articles, and I’m excited to see what developers do with it.
\nApple’s willingness to turn Notes into iOS’ everything bucket is perhaps best exemplified by its new share extension. From anywhere on iOS, you can now capture text, links, and files and save them into a new note or, even better, append them to an existing note. This is, alongside iPad multitasking, one of my favorite features of iOS 9 and it has allowed me to drop several workflows I built for iOS 8.
\niOS 9’s Notes extension lives in the share sheet, and it’s a share extension that lets you capture anything you come across that can be shared by an app. The extension is a floating popup that carries the same paper texture of the Notes app, with the same yellow UI accents and letterpress effect on text. The Notes extension defaults to saving shared items in a new note, but you can tap the “Choose Note” button at the bottom to pick an existing note where you’d like to save something into. The extension will also remember the last note you saved an item into if you bring it up after a few seconds.
\nIf you’ve used extensions such as Evernote’s or 2Do’s since their debut last year, you’ll be familiar with the thinking behind the Notes extension. Any text, URL, or file you can share on iOS through the native share sheet can be passed to the Notes extension, which will preview it inline whenever possible. If you’re sending text to the extension, it will prefill the Notes sheet; an image will get a thumbnail preview on the right; a web link will get a nice snippet preview with a title and the first image found on the webpage.
\nThere are two ways the Notes extension will attempt to save content in a note. For data that can be rendered inline such as videos, images, and text, the extension will either show an editable text field (text can always be edited manually upon saving through the extension) or a thumbnail preview. You can try this out by saving a photo or a video from Photos, or some text selected from Mail via the new Share button in the copy & paste menu: both media and text will be tappable or editable in a note – as if you created them from Notes in the first place.
\nAlternatively, Notes will save links or files it can’t render inline as small units of content that appear as standalone, tappable items in between body text. In some cases, tapping these note attachments will show a Quick Look preview, or it’ll open Safari, or – and this is where the extension and the app get more creative – it’ll adjust the UI to preview the attached content with native media controls.
\nLet’s start with web links. For me, saving links from apps like Twitter clients or news readers accounts to one of the activities I perform the most on a daily basis on my devices. I save links because I want to cover them on MacStories, or because I need to share them with someone, or perhaps they’re reference material I’ll have to find again in the future. The apps I use to manage this daily avalanche of links tend to treat them for what they are: hyperlinks that open Safari. This is the case for apps like 2Do, Messages, Mail, NewsBlur, and countless others. The richness of the web doesn’t apply to its resource locator, which is just a link.
\nNotes’ extension doesn’t work like that. In the new app, Apple has devised a way to offer a basic preview of the information available at the source URL with rich link snippets that display the associated webpage’s title, description, and lead image. Web link previews in Notes are further proof of Apple’s commitment to web metadata technologies such as Open Graph16, and they provide a fantastic way to give meaning to a URL.
\nHey, Underscore!
When saving a link with the Notes extension from any iOS app capable of sharing URLs (it doesn’t have to be Safari), the extension will fetch the link’s metadata and display them in the compose sheet. This is a good way to preview URLs without opening a web view: if you’re in, say, a Twitter client and want to know what a link is about without giving the website a page view, you can send the link to the Notes extension and it’ll fetch the webpage’s title and description for you.17
\nThe extension’s parser is capable of following multiple domain redirects, too: if you give the extension a shortened URL, it won’t try to resolve it to its source domain, but it’ll still follow all redirects until it can preview the final webpage’s information. In my tests, the extension took less than a second to present a link snippet for unfurled links, but it could take a couple of seconds for shortened links with multiple redirects (such as Bitly or Buffer links).
\nAs far as I know, no other note-taking app on iOS offers a smart web capture feature that can parse a link’s metadata to give more context to a link. Apple’s implementation could have used an indicator to show when Notes is trying to parse a link’s title and thumbnail preview18, but, overall, it’s an invisible, it-just-works kind of feature that performs admirably.
\nI’m in love with the way the Notes extension saves links. As a writer on the web, the link is my currency, but sometimes its value can’t be easily assessed because URLs are fundamentally meaningless. With the Notes extension, I can assemble a note with a bunch of links and be presented with a series of small previews that have titles, two-line descriptions, and image thumbnails. And because the Notes extension can create new notes or append content to the bottom of an existing note, I can keep my lists of links as separate notes where web previews are saved in chronological order, without formatting issues, without having to create complex workflows to manage it all.
\nThis seemingly minor addition fixes a serious annoyance of mine. Every week, I collect links in separate lists for this site, our MacStories Weekly newsletter for Club MacStories members, Connected, and Virtual. Until a few months ago, I used to go through each list to evaluate the importance of each link, which usually meant reopening it in Safari to recall what it was about. This was a time-consuming process, especially because apps like Evernote would sometimes fail at opening a link in Safari when tapping it.
\nWith Notes, I now keep lists of links in the main Notes folder and going through the list simply involves taking a look at a link preview, then long-tapping to delete or copy. Thanks to the built-in link previews, my workflow has been reduced to a nimble visual reassessment and a tap & hold, which is better than what I used to do to process links.
\nI didn’t think having a preview of a link’s title and a thumbnail would do much, but, in practice, the way Notes presents links saved from the extension is a superior solution to other note-taking apps for iOS in every way. I’ve been spoiled by web link previews in Notes, and now I want them everywhere.
\nAnd that’s exactly the problem: this new behavior is exclusive to Notes, and specifically to the Notes extension. If you copy a link from, say, Messages (which, like other Apple apps, still doesn’t display the system share sheet for tap & hold) and paste it into a note, it won’t be expanded to become a rich snippet – it’ll be an old fashioned tappable URL. Similarly, while Apple has built a way to better preview URLs by attaching visual metadata to them, this system isn’t used across other Apple apps on iOS 9, which are still limited to displaying links as URLs with no extra information.
\nThere’s a clear winner here.
As services such as Twitter, Slack, and Facebook have shown, there’s value in presenting URLs as cards of content that push information from the web to the user. Apple is thinking along these lines with rich snippets in Notes and Spotlight for iOS 9, but while the results are commendable, such effort isn’t consistent throughout the OS. I’m hoping iOS 10 will offer new APIs to present URLs as rich snippets like Notes does today.
\nThe other benefit of a smarter, more versatile Notes app is a wider array of options for saving attachments into it. Images and videos aside, the Notes extension is capable of accepting any file that can be shared via the share sheet on iOS 9, making it an intriguing solution for folks who have relied on database-style apps such as OneNote and Evernote for similar workflows. I have some reservations on Notes’ attachment handling, which ranges from “very good” to “mysteriously half-baked”.
\nAttaching files to Notes.
In theory, you should be able to send any file to the Notes extension and choose whether you want to create a new note for it or append it to an existing one – all while retaining the ability to add a text comment from the share sheet. Files saved to Notes via the share extension will appear as links: units displayed inline within a note, with filename, size, and a thumbnail preview whenever possible. In spite of Notes’ rich engine, some file types will be rendered as Quick Look attachments that need to be previewed in a separate modal window. A .txt file won’t be attached with text, but you’ll get a .txt icon you have to tap to view plain text in Quick Look; PDFs will be previewed in the body of a note, but you’ll have to tap them to swipe through pages.
\nWhat’s even stranger is the inconsistency of the attachment/preview system and fantastic little touches pitted against glaring omissions. You can search for text contained in PDFs saved in Notes, but you can’t use Markup from Quick Look previews. iOS has an underlying engine to render rich text consistently between apps, but selecting some formatted text in Mail to share it with the Notes extension will strip all formatting and save it as plain text in the app.
\nThen, you come across voice memos, which can be saved into Notes as attachments and that have a Play button that transforms the app’s title bar in a media player with playback controls.
\nWait, what?
I understand why Apple may want to rely on extensions to extend Notes beyond its advertised capabilities. In avoiding buttons to attach voice recordings, documents, and other files from apps, Apple isn’t only ensuring the Notes UI isn’t too cluttered (unlike Evernote) – they’re also reducing the potential cognitive load of having to know what all those buttons do.
\nThis argument doesn’t preclude Apple from having some basic consistency for Quick Look previews in Notes or properly teaching users that they can save a variety of file types into the app. How should a college student know that a voice memo can be saved (and played back) inside Notes? Why do PDFs come with different preview features in Mail and Notes?
\nApple has done good work with the Notes extension in iOS 9. It’s been a fantastic addition to my workflow for saving links and appending content to existing notes. But if Apple truly wants to make Notes a versatile repository for all kinds of user content without an overbearing UI, the extension and Quick Look frameworks need to be reworked to always maintain formatting and have better previews for files inside Notes. Today, you can save files to Notes, but its previews are lacking in several ways.
\nWith Notes taking on new attachment-handling duties, Apple has chosen to give the app an additional view to browse all attachments saved across all folders. Accessed by tapping the grid icon in the bottom left corner of the notes sidebar, the Attachments Browser lets you view a cluster of photos and videos, sketches, websites, audio clips, and documents in a single screen.
\nYou can tap each item to preview it and go to its associated note, or you can tap & hold it to go to the note directly. The main screen displays the most recent attachments for each category, with a ‘See All’ button on the right to view the full grid in a separate view.
\nI have mixed feelings about the Attachments Browser. On one hand, it’s the perfect showcase for Notes’ attachment abilities as it can cull non-text items from all notes and present them in a view that brings them front and center. On the other hand, that’s also its biggest downside: you can’t view attachments per folder – you can only view all attachments from all notes. If you’re the kind of user who adds a lot of images to notes and would like a way to filter them by folder, you won’t be able to do that in this version of Notes.
\nThe Attachments Browser also shares the same limitations of the extension when it comes to file types it doesn’t understand: .zip archives saved from other apps will be categorized under ‘Documents’, for instance. On the other hand, audio clips are playable from the main view of the browser by tapping a large Play button (cool) and links can be opened in Safari with a single tap.
\nIn spite of its shortcomings, the Attachments Browser is a clever addition to Notes. While the entire app relies on blurring the difference between text and non-text content for a seamless experience, the Attachments Browser allows you to filter out everything that isn’t text and that you can interact with.
\nThe few issues I have with Notes’ search and the extension don’t change my overall take on this update. With iOS 9, Notes isn’t just a powerful alternative to third-party note-taking apps – it joins Safari on the podium of Apple’s best work on iOS, period.
\nI’ve been an Evernote user for years. I’ve often talked about the service’s adaptability to rich text and file attachments. I’ve relied on integrations with external apps through the Evernote API. I’ve shared notes and entire notebooks with others. Notes doesn’t have any of this. There’s no API for third-party apps and services to plug into; no sharing of notes and folders, not even with family members; no sorting options, no tags, no subfolders. From a power user’s perspective, Notes is the wrong choice. So why all this enthusiasm?
\nAt some point, consistency and reliability trump automation and feature richness. Should I use something that offers the potential benefit of dozens of features, or am I better served by an app that covers the basics elegantly, works expectedly, but that has very clear limitations for automation and advanced use cases? Do I like the thought of power user features in my notes more than their actual practicality?
\nAfter years of Evernote changes, feature additions, and Work Chat prompts, the simplicity of Notes is refreshing. It doesn’t cover many aspects of what Evernote is capable of, and many will be perfectly happy to stick with Evernote because they truly need all of its features.
\nBut I don’t. In using Notes for iOS 9, I realized that I’m okay with the ability to intermix rich text and images, file attachments and sketches, all while taking advantage of one of the best share extensions on iOS, Siri integration, Spotlight search, and multitasking on iPad. All the workflows I created to append links to a note pale in comparison to the effectiveness of Notes’ extension and link previews. Evernote’s sync can’t be as fast or frequent as Notes’ iCloud backend. The bloat that Evernote accumulated through the years has been replaced by a basic yet powerful note-taking app that does everything I need, and I feel relieved knowing I no longer have to fight Evernote’s tendency for more. This is all there is, and it’s okay.
\nThat’s not to say Notes can’t get better. Besides the aforementioned problems with search, the extension, and file attachments, I miss a way to pin specific notes at the top of my list, or to sort them alphabetically when I want to. Siri can’t delete notes, which I don’t understand. The entire app is still stuck on a paper texture and letterpress text that makes sense for sketches, but that, like Reminders, no longer has a reason to exist on iOS and that makes text harder to read sometimes.
\nIt’s been three months since I started using Notes on iOS 9, and I can’t imagine going back to any other app for my needs, which involve rich text, images, links, and documents. As users increasingly rely on iPhones and iPads as their primary (and often only) computers, the decision to turn Notes into a central location for all kinds of content was a good one. Notes on iOS 9 is an extremely intelligent, focused, and useful update.
\n\nAfter years of little attention paid to the user interface and features of iOS for iPad, Apple wants to correct its course with iOS 9. A combination of the OS’ maturity and willingness to reignite interest in the platform amid declining sales, Apple’s initiative encompasses app interaction, multitasking, text input, and external keyboard integration.
\nThe result is the most important iOS release for iPad to date, as well as a stepping stone for the future of the device as an everyday computer.
\nWhen the iPad launched in 2010, few in the tech press knew what to make of it. If it’s a tablet, why does it run iPhone OS instead of a desktop OS? Is it a big iPod touch or a small Mac?
\nAs it turned out, the preoccupations of tech bloggers were the very factors that contributed to the record-breaking first years of iPad. It was a bigger iOS device that ran familiar software specifically designed to make you feel like you were holding and using a physical object. The iPad could be a book and a newspaper. A calculator on your desk and a portable typewriter. An agenda. A diary. By design, the full-screen nature of apps on the iPad had been engineered to convince you of one simple truth: This device can be anything. And because millions already knew how to use it thanks to iPhone OS, it did offer something for everyone.
\nThe biggest problem that has affected the iPad in the past three years stems from Apple itself. After the launch of iOS 6, the company began a long and tortuous journey towards a new identity for iOS. During this period, iPad got the short end of the redesign stick: while Apple was busy rethinking the core structure and visual appareance of the iPhone, the iPad got unimaginative adaptations and other UI leftovers.
\nThree years after the iPad’s launch, Apple didn’t seize the opportunity to make iPad features and apps unique and tailored to the platform. They just scaled them up. The same consistency that was a smart move in 2010 didn’t make much sense in 2013 after iOS 7 and the chance of a fresh start.
\niOS 7 wasn’t just a visual disappointment for iPad users who were craving for attention. From a functional standpoint, the iPad had evolved to an appealing computer replacement for many, albeit with too many compromises. Tasks that were trivial on a PC were too difficult, if not downright impossible, on an iPad. iOS apps were unable to communicate with each other. Apple had ushered users in the post-PC era with the original iPad and then left them halfway there.
\nOn the iPad, iOS 7 felt like a rushed conversion that had forgotten about the promise of a revolution.
\nBig changes, however, often come in small doses. With last year’s iOS 8, we caught a glimpse of what Apple’s thought process might have been: if iOS 7 laid a new visual foundation, iOS 8 was going to spread a stronger technical layer on top of it. We witnessed how Apple was willing to modularize the concept of app – the long-sacred silo – into multiple functionalities and screen sizes connected by a common, secure thread. iOS 8 came out as the yin to iOS 7’s yang: free of their (sometimes forced, frequently derided) photorealistic appearance, apps were granted an out-of-sandbox permission, too.
\nIt’s not uncommon to rely on hindsight to understand the iterative changes behind Apple’s products. iOS 7 brought a new, subdued look. iOS 8 introduced a framework to extends apps. These are not features designed in a vacuum.
\nExtensions make more sense with a design language that focuses on color rather than heavy textures and 3D graphics. Imagine if all your apps still looked like distinct objects and you had to interact with panels of leather on top of wooden backgrounds, metal slates, and paper sheets. Similarly, consider the new iPhones and iPads: without a design that eschews pixel-perfect object recreations, many developers would have to target new screen sizes with bitmap graphics that take time away from actual app development.
\niOS 7 and iOS 8 were deeply intertwined, two sides of the same coin that Apple revealed in the span of a year. In the iPad’s case, they still weren’t enough to complete the vision of what Apple had in store for the future of the device.
\nBut as they say: third time’s the charm.
\nApple’s big bet on the iPad with iOS 9 involves deep changes in multitasking and productivity enhancements that are both obvious and unexpected. To understand the gravity and consequential paradigm shift of these new features, it’s important to observe the iPad’s role today and reflect on why Apple is turning its attention to the device now.
\nThe iPad in 2015 is an incredible computer at the top of the line, powered by a more flexible OS that still struggles to accommodate some basic use cases and workflows. This is key to understand the changes Apple is bringing to the iPad this year. Everything new in iOS 9 for iPad ultimately comes down to this idea:
\nThe iPad is a computer in search of its own OS.
\nAs I noted in my review, the iPad Air 2 is a dramatically faster and more capable iPad than older generations, to the point where it’s fair to wonder why such power is needed at all.
\nIn the same product line, though, lies the ever-surviving iPad 2, a second-generation device released in 2011 and that can still run the latest version of iOS. The longevity of iPad hardware and Apple’s policy to support old devices with software updates has created a curious dichotomy for the company: the latest iPad, more powerful than traditional computers in some instances; and the iPad 2, still receiving updates but far from the user experience of the Air 2.
\nThe tension between new and old, modern and traditional is also quite apparent in iOS itself. With iOS 8, Apple debuted user features and developer frameworks that allowed an iPad to handle tasks that wouldn’t be possible on a Mac. For some people, an iPad running iOS 8 is preferable to a Mac with OS X. This is exactly why I elected the iPad Air 2 as my primary computer: besides form factor advantages, I like iOS and its app ecosystem better.
\nAt the same time, iOS 8 is still behind OS X when it comes to performing tasks that involve switching between apps, working with files, and editing text. These are the tentpoles of any personal computing experience from the past two decades and the functionalities added in iOS 8 have done little – if nothing – to address the concerns expressed by iPad users about them. Action and share extensions have helped in exchanging data between apps, but they’re not the solution to look at two things at the same time; custom keyboards have provided a novel way of input and data extraction from apps, but what the iPad needs is a faster way to select and edit text.
\nThe problem that Apple needs to solve with iOS 9 for iPad is complex. How can Apple make good of the post-PC promise with features that are drastically different from what came before – without the overhead and inherent complexity of forty years of desktop computers – but also capable of addressing modern user needs and workflows?
\nApple’s answer comes as a cornucopia of changes, with new Slide Over, Split View, and Picture-in-Picture features for multitasking, better support for external keyboard shortcuts, enhancements to the software keyboard, and even a gesture to navigate and select text using multitouch.
\nThe recurring theme of contrast finds its zenith in the multitasking and productivity additions to the iPad in iOS 9: some of them are brand new ideas previously unseen on OS X; others borrow heavily from the company’s desktop OS. Some of them are exclusive to the powerful Air 2; others have made their way to older iPads as well.
\nPrior to inspection, such peculiar mix begs the question: does Apple know new ways to think about old problems, or is this too much for an iPad to handle?
\nOne thing’s for sure: Apple is finally making what the iPad was looking for.
\n\nApple’s first big change to iPad multitasking requires a single swipe from the right edge of the screen.19 Called Slide Over, this is what you’ll want to use to view and interact with another app without leaving the app you’re in.
\nSafari and Notes in Slide Over.
Slide Over works by putting a secondary app on top of the app you’re currently using, called the primary app. It’s based on compact and regular size classes, and it works in both portrait and landscape orientations. Slide Over is supported on the following iPad models:
\nSlide Over can be activated from any app, regardless of whether the app you’re using has been updated for iOS 9 or not. The fact that the app you’re in may not support Slide Over in iOS 9 doesn’t have any influence on the secondary app that you’ll be able to invoke. Slide Over is all about the secondary app and cycling through apps that support it.
\nThere are two ways to activate Slide Over with a swipe from the right edge of the screen. You can swipe from the area around the middle of the screen (vertically centered) to open Slide Over directly; or, you can swipe from above or below the center of the screen to reveal a pulling indicator that you can then grab to fully reveal Slide Over.
\nThe right side of the iPad’s screen, showing the Slide Over pulling indicator.
The pulling indicator is the same that is used for Control Center and Notification Center when the app you’re in is running in full-screen mode (a common occurrence for games and other video apps). Slide Over joins Control Center and Notification Center in being a UI layer that is activated by swiping from the edge of the screen and that sits atop any running app.20
\nSlide Over also comes with its own app switcher to cycle through apps. Slide Over’s app switcher is a dark overlay with app icons contained inside light gray boxes; only apps that support Slide Over will be shown in this view.
\nSlide Over’s app switcher.
Think of Slide Over as a subset of recently used apps, specifically (and exclusively) those updated for iOS 9 multitasking. You can’t quit apps in Slide Over: you can only tap to open an app and make it the secondary app running on top of an app you’re already in.
\nThe cleverness of Slide Over lies in how its design dictates the experience of using it. When you pick a secondary app, it opens in what may be described as an iPhone app layout, stretched vertically to fit the iPad’s screen. In both landscape and portrait mode, a secondary app is resized to a compact size class that resembles an iPhone app: in Slide Over, Safari moves the top toolbar buttons to the bottom of the screen like it does on the iPhone; Messages, Mail, and other Apple apps look exactly like their iPhone counterparts, only taller.
\nPictured above: Calendar, Mail, and Podcasts in Slide Over next to Safari.
To achieve this, iOS 9 uses size classes (a technology that developers have started supporting to make iOS apps responsive for multiple display sizes) to show a UI that’s appropriate for a narrow and elongated mode. This makes Slide Over easy to use and familiar (most apps feel and work like iPhone apps) and a great way to interact with another app without taking it full-screen.
\nDesign serves the experience in Slide Over, and it works. If you swipe to reveal Mail in Slide Over, you’ll be presented with a familiar view of messages in your inbox, resized to fit the Slide Over panel. If you open Notes, you’ll see a list of your notes; if you tap one, Slide Over will move to the subview required to display the note. Mail, Messages, Calendar, and other Apple apps rely on adaptive UIs and compact size classes to split app sidebars and navigation points into layouts that can be displayed in a single column with Slide Over.
\nTo truly appreciate Slide Over, we need to look back at Apple’s past iOS SDKs. Since 2012, the company has been advocating for APIs to create apps capable of responding to any screen size, orientation, or localization. With a greater matrix of iOS screen sizes available to customers in more countries, Apple felt it was appropriate to rethink the design and development process with a focus on adaptivity: Auto Layout, Dynamic Type, and Size Classes were seen as signs of smaller iPads and bigger iPhones back then; today, they provide the context necessary to understand iPad multitasking in iOS 9.
\nDevelopers who have been paying attention to Apple’s announcements and advice have already done most of the work required to support iPad multitasking: Slide Over uses the same compact size class that developers have grown accustomed to using on the iPhone. It’s not just easier to support Slide Over this way: it’s the best option when you’re dealing with this type of layout.
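\nIn practice, an app that already adapts to the iPhone's compact width gets most of Slide Over for free. Here's a minimal sketch in modern Swift syntax – the view controller and its layout methods are hypothetical, but the size class check is the actual mechanism apps respond to:

```swift
import UIKit

// A hypothetical article viewer that adapts its layout the way Apple's
// apps do in Slide Over: when the horizontal size class turns compact,
// it switches to an iPhone-style single-column layout.
class ArticleViewController: UIViewController {

    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        // In Slide Over, the secondary app is always presented with a
        // compact horizontal size class, in both portrait and landscape.
        if traitCollection.horizontalSizeClass == .compact {
            showSingleColumnLayout()   // iPhone-style: single column, bottom toolbar
        } else {
            showFullWidthLayout()      // regular iPad layout with a sidebar
        }
    }

    private func showSingleColumnLayout() { /* collapse sidebars, move toolbars */ }
    private func showFullWidthLayout()    { /* restore the full iPad UI */ }
}
```

An app structured this way doesn't need separate code paths for Slide Over at all – the compact layout it already ships for the iPhone is simply reused.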
\nSlide Over rethinks the idea of looking up information or acting on something without leaving an app. Think of it as having an iPhone next to any app you’re using without the inconvenience of juggling multiple devices. Need to look up a word on Google while you’re reading a document? Open Safari in Slide Over, search, and return to what you were doing. Want to type out an email without closing Twitter? Slide Over, Mail, compose, send. Same for keeping a conversation going on iMessage, checking your schedule in Calendar, or glancing at how many emails are in your inbox.
\nThanks to the iPad’s large screen, you no longer need to launch apps to interact with them. A swipe is all it takes to get things done and be more efficient. This is Apple’s pitch for Slide Over.
\nAgile Tortoise’s Drafts and Terminology.
An important aspect to note about Slide Over is that while a secondary app doesn’t take over the primary app visually21, it does take over functionally. When Slide Over is open, you can’t interact with the primary app and the secondary app simultaneously: only the secondary app is active and able to receive touch input, with the primary one being dimmed in the background. A single tap outside the Slide Over area immediately dismisses the secondary app22. When Slide Over is shown, the software keyboard is exclusive to the secondary app. From a user’s perspective, the primary app is inactive underneath Slide Over.
\nKeyboard input is another interesting decision Apple had to settle on when designing Slide Over. When you tap into a text field that shows the keyboard in a Slide Over app, you’ll get the full-screen iPad keyboard, but it’ll only work with the secondary app. The layering makes sense – having a smaller, iPhone-sized keyboard just for Slide Over would be terrible on an iPad – but it introduces a new level of visual complexity that poses new challenges for Apple and developers.
\nSlide Over’s app switcher can be activated by swiping down from this indicator (pictured: Twitterrific).
Slide Over grants surprising freedom in terms of app switching. Slide Over uses a persistent app switcher indicator that can be dragged to move between the secondary app and the picker for other apps that support Slide Over. The indicator is aligned with the clock in the status bar and it can be swiped to move from app to app switcher. Effectively, this is another status bar indicator: this portion of the screen behaves like a traditional status bar in that you can tap on it to scroll to the top of lists in apps. The similarities end here, as you’ll primarily interact with the Slide Over status bar to swipe it and switch between apps.
\nThe animation to move across apps in Slide Over is some of Apple’s finest visual work in iOS 9. As you begin to pull down, the secondary app starts shrinking while following your gesture – first by adopting rounded corners, then by fitting the contents of the screen to a smaller box that sits below more app icons that come down from the top of the app switcher. It’s a smooth, rewarding animation that is fast and intuitive – exactly the kind of sloppy and comfortable gesture that can be performed in a second without looking. It feels right, and it doesn’t skip a single frame on the iPad Air 2.23
\nAn inconsistency I’d point out is that some Apple apps haven’t been updated with support for Slide Over. I can accept App Store and iTunes not having a compact mode (although it would be welcome), but why doesn’t Music support Slide Over? It’d be a good showcase of the feature’s raison d’être: swipe to open Music, pick a song, play, and you’re back in the primary app. This is an oversight that I’m expecting Apple to rectify soon.
\nSlide Over is a terrific addition to iPad multitasking. It’s easy to activate and it doesn’t compromise on the full functionality of a secondary app: when you open it, you’re not presented with a lite version of another app – you’re given the whole experience, with its full feature set, only in a compact layout.
\nThis is a powerful idea, as it noticeably cuts down the time required to jump between apps on an iPad. It makes the iPad’s screen feel like a large canvas of opportunities rather than a wasteland of bright pixels.
\nSlide Over is so good, I wish notifications could always open in it, and I wish I could have it on my iPhone 6 Plus as well. Double-clicking the Home button feels so passé when you can swipe to peek at apps.
\nSlide Over is only a sliver of the iPad’s multitasking rebirth. What Slide Over enables is an even bigger change for iPad users, and a drastic new approach to app interaction on iOS.
\n\nIf there was still any doubt on the iPad graduating from utility to computer with iOS 9, Split View clears it all. Split View is a fundamental re-imagination of the iPad’s interaction model five years after its launch. More than any other productivity enhancement in iOS 9, Split View is the iPad’s coming-of-age feature.
\nAs the name suggests, Split View is Apple’s take on split-screen multitasking, letting the iPad display two apps side by side so users can interact with both at the same time. Because of its toll on hardware and system resources, Split View is exclusive to the latest-generation iPads.
\nSplit View can be considered Slide Over’s offspring: it can only be activated by entering Slide Over first and tapping a vertical divider that will put both apps (primary and secondary) side by side, active at the same time. In Split View, the app switcher for the secondary app is the same one used for Slide Over, too.
\nIn Slide Over, the divider can be tapped to enter Split View.
Safari in Split View, and the Split View app picker.
There is no other way to activate Split View in iOS 9: the feature is entirely based on Slide Over, both in terms of design and user manipulation. If you want, you can move between Slide Over and Split View by tapping the divider and iOS 9 will cycle through the two modes.
\nSplit View uses regular and compact size classes for three possible layouts. Before iOS 9, iPad apps always used regular size classes for both vertical and horizontal orientations as they ran in full-screen mode all the time. With Split View, the vertical size class is always regular, but the horizontal size class can change to compact. The diagram below shows how Split View affects size classes for iPad apps.
\nSize classes.
When in Split View, the user can control the size of the app window by dragging the divider to switch between layouts. This is best experienced with Split View in landscape, where the secondary app can be resized to use 25% or 50% of the screen.
\nFor the 75/25 layout in landscape, Apple apps that are primary tend to keep roughly the same full-size layout they’d normally have, shrinking and putting some buttons closer together where necessary; in the 50/50 mode, though, apps tend to resize more and switch to iPhone-inspired hybrid layouts, usually by moving some top buttons to a bottom toolbar (Safari) or by turning sidebars into cards (Reminders).
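\nThe layouts map onto size classes in a predictable way. As a rough model – the enums below are simplified stand-ins for illustration, not UIKit's actual types:

```swift
// A simplified model of how iOS 9 assigns horizontal size classes in
// Split View on the iPad Air 2. The vertical size class is always
// regular on iPad, so it's omitted here.
enum HorizontalSizeClass { case compact, regular }
enum Layout { case fullScreen, landscape75_25, landscape50_50 }

// Returns the horizontal size classes for the primary and (optional)
// secondary app in a given layout.
func sizeClasses(for layout: Layout)
    -> (primary: HorizontalSizeClass, secondary: HorizontalSizeClass?) {
    switch layout {
    case .fullScreen:     return (.regular, nil)       // no secondary app on screen
    case .landscape75_25: return (.regular, .compact)  // primary keeps its iPad layout
    case .landscape50_50: return (.compact, .compact)  // both adopt compact layouts
    }
}
```

This is why a full-screen iPad app can suddenly find itself in a compact width when the user drags the divider to 50/50 – the same transition an iPhone app goes through, but triggered by multitasking rather than by the device.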
\nThe most important difference between Slide Over and Split View is that while Slide Over forces a compact app on top of the one you’re in with no effect on the primary app that stays in the background, Split View requires both apps to support multitasking with compact and regular size classes. Split View needs two iOS 9 apps updated for multitasking; otherwise, you won’t be able to split the screen in two.
\nIf Split View isn’t supported in an app you’re using, you’ll notice that Slide Over won’t have a vertical divider running along the left side of the secondary app. While writing this review, I used Ole Zorn’s Editorial: the app didn’t support iOS 9 multitasking, so I could use Slide Over to interact with a secondary app, but I couldn’t enter Split View.
\nIf Split View is supported, you can tap the divider and iOS will bring the primary app to the foreground, prepare its layout, and present you with two apps on screen at the same time. The process takes less than a second on the iPad Air 2; after tapping and before entering Split View, the primary app (which was in the background) is blurred and its icon and name are shown on top of it to indicate which app you’re about to use Split View with.
\nWhen resizing apps in Split View, both app screens will be blurred (iOS doesn’t want to show you the app-resizing process in real-time), and you’ll get the Split View counterparts upon releasing the divider – again, it takes less than a second.
\nThere’s a nice detail worth mentioning about resizing apps. When you’re dragging the divider to resize apps in Split View, it will turn white and both apps will slightly recess in the background to communicate they’re being manipulated by the user.
\nDragging the divider is also how apps are dismissed and how you can return to a full-screen app. To leave Split View manually, you have to grab the divider and swipe right to put the secondary app back into the app switcher, or swipe left to dismiss the primary app and make the secondary app full-screen (primary). The process works the same way in both directions, with an app undocking from Split View as you reach the edge of the screen through an animation that pulls it away from the adjacent app and dims it. It’s a delightful transition, smoothly animated on the iPad Air 2.
\nIn my experience, Apple’s approach has been working well: when they have to adopt compact layouts in Split View, landscape apps are somewhat reminiscent of an iPad mini inside an iPad Air 2 – you can tell they’re using iPad layouts, only smaller.
\nOther times, Apple uses a few tricks to mix and match elements of iPhone interfaces with iPad UIs to save on space, but the end result is not annoying thanks to conventions of the iOS platform. And when they’re using narrow layouts for Slide Over and Split View, iPad apps almost transform into smaller iPhone versions with longer layouts, which is fine for quick interactions. The constraints that Apple has put in place ensure you never end up with odd or uncomfortable app layouts, and that is the best design decision behind Split View and the new multitasking initiative as a whole.
\nSplit View is an option. You’re not relinquishing control of the traditional iOS experience when switching to Split View, and you don’t have to use it if you don’t want to. Apps don’t launch in Split View mode by default: while on a Mac apps launch in windowed mode and full-screen is optional, the opposite is still true on iOS 9. Apps are, first and foremost, a full-screen affair on iOS, and the way Apple designed Split View doesn’t suggest that is changing any time soon.
\nSplit View isn’t like window management on a desktop: Apple wanted to eschew the complexities of traditional PC multitasking and windowing systems, and by rethinking the entire concept around size classes and direct touch manipulation, they’ve largely achieved this goal. They have created new complexities specific to iOS and touch, but it’s undeniable that Slide Over and Split View are far from the annoyances inherent to window management on OS X. The iPad is fighting to be a computer and Split View epitomizes this desire, but it doesn’t want to inherit the worst parts of multitasking from desktop computers.
\n“User control without unlimited freedom” would be a good way to describe Split View. While the user controls when Split View is activated and when it should be dismissed, Apple has (rightly) shied away from granting users the ability to resize app screens manually, have multiple overlapping windows on screen, or overcomplicate the divider with additional menus.
\nThe competition (and some jailbreak tweaks) had set a precedent for this; instead, Apple has shown remarkable restraint in building a split-screen feature which requires minimal management.
\nAs a result, apps in Split View don’t feel like windows at all: users never get to choose an app’s position beyond its primary or secondary state; they can’t drag an app in a corner and put three apps on screen with draggable resize controls. The screen splits to accommodate two apps – and that’s it. It’s easy to grasp, fast, and it feels natural on the Air 2.
\nFor a long time, I thought that Apple wouldn’t bring this kind of multitasking to the iPad because of the complexity involved, but Split View changed my mind. Unlike OS X, I don’t have to worry about overlapping windows or managing their positions, which is liberating. I may not be able to look at five apps at once, but how often do I need that many apps at the same time anyway?
\nThe fact that Split View is going to be available on El Capitan as well speaks for itself. Windows are great, but managing them usually isn’t. Starting fresh with Split View feels like the best option for the iPad. This isn’t a case of Stockholm Syndrome: limitations and an intuitive design truly open up new possibilities that aren’t weighed down by confusion or complexity.
\nTake the iPad’s camera, for example. In theory, by putting two apps capable of taking pictures and videos in Split View, you should be able to view the camera feed in two places. But that’s not how it works in iOS 9. If you try to use the camera in two apps while Split View is on, iOS will pause any existing instance of the camera, so you won’t be able to look at the same scene from two different apps.
\nThis feels right given the relationship between hardware and software on iOS: there is only one camera that takes advantage of an iPad’s processor and volume controls to take pictures. Rather than bringing additional complexity to the camera APIs24, Apple designed Split View with clear boundaries that serve the user experience.
\nAt the same time, they’ve also given developers the ability to opt out of Split View if they don’t want their apps to support it. While I suspect that most iPad apps updated for iOS 9 will work with Split View, it’s likely that games meant to be played in full-screen with deeper access to the iPad’s hardware and resources won’t feature Split View integration. This is fair when you consider the increased memory pressure on a complex 3D game that needs to share the screen with another advanced app or game, but I’d still love to see a Split View-enabled Minecraft running alongside a guide to the game or a live chat.
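\nThe opt-out is declarative: an app that must always run full-screen sets the `UIRequiresFullScreen` key in its Info.plist, and iOS 9 excludes it from Slide Over and Split View entirely.

```xml
<!-- Info.plist fragment: declare that the app always runs full-screen,
     which opts it out of Slide Over and Split View on iOS 9. -->
<key>UIRequiresFullScreen</key>
<true/>
```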
\nSplit View breeds a new set of limitations and complexities, but it doesn’t fall into the trap of imitating PC multitasking. Taken at face value, Split View really is just a way to use two apps at the same time. As we’ll explore later on, that “just” is part of a bigger picture that goes beyond the idea of multiple app screens, with exciting possibilities worth addressing.
\n\nIf you ask any average tech blogger (or YouTuber), they’ll tell you that the iPad is all about “consumption”. While this simplistic reduction of the iPad’s role in millions of people’s lives has been proven inaccurate time and time again, it is true that people love to consume videos on their iPads. And for good reason: the iPad – even in the mini version – makes for a compelling portable player with a fantastic screen and a vast selection of video apps from YouTube and Netflix to VEVO, Plex, HBO, MLB, and thousands more. It wouldn’t be absurd to say that the iPad ushered the Internet into the era of modern, Flash-free video streaming and portable playback. But it also wouldn’t be too outlandish to argue that the iPad’s video player is a relic from five years ago. iOS 9’s Picture in Picture wants to address this problem.
\nThese days, we check our Facebook feeds while we listen to songs on YouTube, and we tweet while we’re streaming the latest Game of Thrones or a live baseball game. Video is not a one-app experience anymore. This has been the case on the desktop for years since the advent of media players and YouTube, but our primary sources of video entertainment are now smartphones, tablets, and TVs. Due to the absence of a great app experience on the TV, we’ve come to rely on phones to multitask while watching video. Shouldn’t a better option be available on the iPad, a device that is big enough to host simultaneous video playback and other apps?
\nEven if we move beyond the tweet-while-watching-TV use case, there is an argument to be made about the utility of a video player that can coexist with apps in a multitasking environment. How many times have you wished you could rewatch any kind of technical session recorded on video while taking notes on your iPad at the same time? Follow an Apple live stream while tweeting, without having to use a Mac for that? How about watching a Minecraft tutorial while playing the game itself? For all the advances of the iPad and iOS platform in recent years, playing video on iOS is still a disruptive experience that requires complete attention. On iOS, video takes over everything else, and it’s easily interrupted as soon as you go back to the Home screen or tap a notification.
\nWith Picture in Picture, Apple is taking a page from Google’s YouTube app and they’re bringing a floating video player to the iPad.
\nIntegrated with the native media player and FaceTime in iOS 9, Picture in Picture turns a video into a resizable box that floats on top of everything and that follows you around everywhere. If you watch a lot of video on iPad, PiP (as it’s also affectionately called by Apple) is easily going to be one of your favorite features in this update.
\nLike Slide Over, Picture in Picture is available on the following iPad models:
\nFor developers, Picture in Picture support can be enabled in apps that use the AVKit, AVFoundation, or WebKit frameworks for video playback.
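\nFor apps built on AVKit, adoption is a matter of configuration. A minimal sketch in modern Swift syntax – the function and video URL are hypothetical placeholders, and it assumes the app has the audio background mode enabled in its capabilities:

```swift
import AVKit
import AVFoundation

// A minimal sketch of adopting Picture in Picture via AVKit.
// The function name and URL are placeholders, not real API or content.
func makePlayerController() throws -> AVPlayerViewController {
    // PiP needs an active playback audio session so video can keep
    // playing when the app leaves the foreground.
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback)
    try AVAudioSession.sharedInstance().setActive(true)

    let url = URL(string: "https://example.com/video.m3u8")!  // placeholder stream
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: url)
    // On by default for supported iPads; shown here for clarity.
    controller.allowsPictureInPicturePlayback = true
    return controller
}
```

With this in place, the system supplies the PiP button in the standard player chrome and handles the shrink-and-float transition itself.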
\nPicture in Picture works like this: when you’re playing a video in an app updated for iOS 9 (such as Videos) or from Safari, a Picture in Picture button will appear in the lower right corner of the standard media player. Tap it, and the video will shrink into a floating player that you can drag around and that docks to the edges of the screen (or on top of Control Center and the keyboard if shown).
\nPicture in Picture shows a progress bar and offers buttons to play/pause, close the video player, and take the video back into the original app, dismissing the floating player. You can’t scrub through a video in Picture in Picture – you’ll have to go back to the player app for that.
\nPicture in Picture doesn’t only work by manually activating it: whenever you leave an app that’s playing a video by clicking the Home button or tapping a notification, the video will shrink to Picture in Picture and it’ll follow you across apps.
\nThe behavior is slightly different for FaceTime video calls. Depending on the orientation of the caller’s device, FaceTime’s PiP will be taller than regular video; it’ll also have buttons to mute, return to the FaceTime app, and end a call instead of controlling playback. Like normal Picture in Picture, FaceTime PiP is activated when leaving a video call on the FaceTime app by clicking the Home button or opening a notification.
\nUnder the hood, Picture in Picture works by integrating with three of iOS’ media player frameworks (including WebKit, so Safari and web views should get PiP support out of the box) and pushing the video player into an upper layer of iOS’ structure, giving the user control over positioning, size, and playback (as you can see, a recurring theme in iPad multitasking). Because it’s meant to be treated as always-playing background media, Picture in Picture floats on top of every app and menu; it sits on the same layer as Control Center, and it can only be obscured by Notification Center.
\nThe only places where PiP doesn’t show up are the system app switcher (although it does float over the Slide Over/Split View one) and the Lock screen. Picture in Picture has been designed to be available anywhere you go: it joins the likes of background audio, VoIP, navigation, and Phone/FaceTime calls as the only features capable of continuing until completion following a click of the Home button.
\nLike Slide Over and Split View, one of Picture in Picture’s tenets is the lack of complex management required to operate it. Playback controls disappear in the video player after a few seconds (you can reveal them again by tapping the PiP). You can move the PiP around by dragging it, and you’ll notice that it’ll snap to a corner of the screen as soon as you release your finger. This is meant to remove the burden of precise positioning that affects desktop apps: there’s no concept of grid spacing or pixel-snapping for windows here; just loose gestures and automatic docking to the closest edge and available space on screen.25
\nYou can pinch and rotate the PiP, but it’ll always reposition itself to the correct orientation; you can also pinch out to enlarge the player and pinch in to make it smaller. You cannot make the PiP as big as you want – just like you don’t have that kind of precise control over the layouts of Split View.
\nHidden Picture in Picture, automatically placed above Control Center.
If you want to keep watching a video but it doesn’t require your undivided attention or if Picture in Picture is getting in the way, you can swipe it out of view and it’ll attach to the edge of the screen. When hidden, video will continue playing in the background, and the Picture in Picture box will display a pulling indicator along the side of the screen that you can grab to bring the video in the foreground again. This is a handy addition for all those times when you’d want a video to keep playing just for the audio while retaining the ability to watch it if needed.
\nWith a total of four screen regions where it can be shown26, four where it can be hidden, and five possible sizes, iOS 9’s Picture in Picture gives you the freedom to watch video anywhere while also ensuring this flexibility doesn’t make the iOS experience cumbersome and confusing.
\nPicture in Picture embodies many of the post-PC principles: it’s uniquely built for touch and it’s not burdened by the expectations of traditional PC window management.
\nPicture in Picture benefits from the clean slate of iOS and the direct interactions of multitouch. You can throw it around and it’ll gain momentum and stick to the closest corner in the direction of your swipe. It feels natural and credible.27 The combination of gestures, intelligent layering, drop shadows, believable physics, and, more importantly, great performance makes Picture in Picture feel joyfully material and, ultimately, practical.
\nWith Picture in Picture, my iPad has gone from being far worse than a Mac for watching videos to drastically superior in one fell swoop. The simplicity and cohesiveness of Picture in Picture are remarkable: whenever I come across a video in Safari, I can click the Home button knowing that it’ll stay with me. I can even start watching another video, and Picture in Picture will automatically pause and resume later. If I tried to do the same on a Mac, I’d have to manually resize windows, perhaps install some third-party apps, and learn a combination of keyboard shortcuts. On the iPad, it’s just a button.
\nIt didn’t take me long to realize that Picture in Picture was going to be a terrific reimagination of video playback on iPad. I was watching John Gruber’s interview with Phil Schiller for The Talk Show at WWDC, and I noticed that I could put the HTML5 video player in Picture in Picture. That led to a fantastic experience: as I was watching and listening, I could open Twitter and tweet a few comments about it without stopping the video, and I could put Safari and Notes in Split View while also playing the video to simultaneously watch, research, and take notes about Schiller’s comments. That was true multitasking, and it helped me be part of a conversation around a live event without regretting my use of an iPad to watch video. The same experience wouldn’t have been possible with iOS 8.
\nI’ve been enjoying Picture in Picture to keep FaceTime video calls going while I do something else, too. Like other types of video, FaceTime used to require my complete attention: if I wanted to have a video call with someone on my iPad, I had to stop whatever I was doing. Now I’m always putting FaceTime video calls in Picture in Picture, which has turned out to be a fantastic lifehack to help my parents out with iOS issues (I can look at their computer screen and my iPad’s web browser at the same time).
\nPicture in Picture doesn’t only level the playing field between iPad and other platforms for video playback – it makes the iPad substantially better thanks to its integration with other multitasking features.
\nI still have some questions and concerns about Picture in Picture, though. Because of Apple’s implementation, developers can choose not to support it. Will YouTube support Picture in Picture if it lets users avoid ads and annotations on a video? Will Netflix and HBO?
\nThere’s always a tension between new iOS features and the best interests of large companies: the incentives of a system integration aren’t always aligned with the companies’ business model. Ideally, Picture in Picture will become so popular that it’ll be impossible for YouTube to ignore it, but as we’ve seen with Twitter before, big companies can be exceptionally resilient and shortsighted when the question turns to being good platform citizens. In the short term, I’m also curious to see if Apple itself will support Picture in Picture with its Music app: right now, videos available in Apple Music don’t support it at all.
\nPicture in Picture is to video what Control Center is to audio. As we increasingly rely on our iPads to watch video, Apple realized it was time to make video playback its own layer, always available across the system, always under the user’s control but free of the complexities that such design would entail on a desktop computer. Picture in Picture is uniquely suited for touch and iOS; I wonder if it’ll ever come to the iPhone as well.
\n\nWhen the iPad was introduced in 2010, Apple praised the full-screen, laptop-like keyboard that allowed for a comfortable and familiar typing experience. For years, the company stuck to that ideal, modeling the iPad’s keyboard after how a Mac’s keyboard would work, with some exceptions made possible by software.
\nApple brought over popup keys (for accented letters and other symbols) from the iPhone; they added a dictation button; they even got creative by demonstrating the awkward efficiency of the almost-forgotten split keyboard for iPad, introduced in 2011. Last year, we began to see the first cracks in the software keyboard wall with custom keyboards for iOS 8, which enabled users to use non-Apple keyboards on their devices.
\nApple has always taken its software keyboards seriously, and that’s resulted in a slow evolution compared to keyboards on other platforms. As we’ve seen with San Francisco, though, 2015 Apple is more open to the idea of tweaking the keyboard they’ve long held in high regard.
\nFirst up is the Shortcut Bar, an extension of the QuickType bar that enables apps to put actions and menus next to QuickType suggestions. Available as monochromatic glyphs, these shortcuts are used by Apple across the OS with Undo, Redo, and Paste actions, and they’re often customized for specific apps to offer access to features that aren’t as easily accessible in the UI – or that sometimes aren’t accessible at all.
\nAt its simplest, the Shortcut Bar acts as a way to undo/redo operations and paste (text as well as other data). Because shortcuts are programmable, they can change depending on context: when you select text, for instance, undo and redo become cut and copy so they can work alongside paste.
\nBecause shortcuts can be disabled, copy & paste options are still available in the classic popup menu, so you may find the same options in two places.
\nIt gets more interesting when Apple ties entire menus and iOS integrations to the Shortcut Bar. In Mail, the right side has buttons to bring up a popover for text styling, one to pick a photo or video, and another to show the iOS document picker to attach a file to a message. Both attachment options are also available in the copy & paste menu, but icons in the Shortcut Bar make them more visible and obvious: they’re always displayed even if you’re not selecting text.
\nIn Notes, the Shortcut Bar is used to offer a unified redo/undo/copy shortcut (it uses a popover), a button to create a checklist, and another to show a text style popover on the left side of the keyboard. On the right side, there are shortcuts to add a photo or video to a note and create a sketch.
\nI’m a fan of the Shortcut Bar. Fast access to a subset of app menus and actions trumps similar shortcuts available in the copy & paste menu, especially because they only require one tap and can present interfaces right above the keyboard. I wasn’t using QuickType suggestions before iOS 9, but the Shortcut Bar pushed me to enable the additional keyboard row.
\nThird-party developers will be able to provide their own buttons for the Shortcut Bar, and I’m curious to see what they’ll do in their apps.
\nIn iOS 9, the UITextInput protocol has been enhanced with the ability to provide a UIBarButtonItem within a UITextInputAssistantItem object, which will be displayed in the Shortcut Bar. Developers can specify the placement of shortcuts, choosing between left (leading) and right (trailing). Any action can be associated with a button in the Shortcut Bar, as it behaves like a UIBarButtonItem typically found in a toolbar or a navigation controller.
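\nAs a rough sketch of what this looks like in code – using Swift 2-era syntax, with a hypothetical textView outlet and insertPhoto: action – an app could attach a trailing button group to a text view’s input assistant item:

```swift
import UIKit

class NoteViewController: UIViewController {
    @IBOutlet var textView: UITextView!  // hypothetical text view

    override func viewDidLoad() {
        super.viewDidLoad()

        // A bar button that will show up in the Shortcut Bar,
        // next to QuickType suggestions.
        let attachButton = UIBarButtonItem(
            barButtonSystemItem: .Camera,
            target: self,
            action: "insertPhoto:")

        // Group it and attach it to the trailing (right) side of the bar.
        let group = UIBarButtonItemGroup(
            barButtonItems: [attachButton],
            representativeItem: nil)
        textView.inputAssistantItem.trailingBarButtonGroups = [group]
    }

    func insertPhoto(sender: UIBarButtonItem) {
        // Present a photo picker, a share sheet, etc.
    }
}
```

A representativeItem can also be supplied so that several buttons collapse into a single item when horizontal space is tight.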
\nA demonstration of custom buttons in the Shortcut Bar.
The share sheet can be tied to the Shortcut Bar, too.
The few betas of iOS 9 apps I could test sported shortcuts to navigate between text fields, add attachments to a note, or share content with the tap of a button. Developer Daniel Breslan showed me a demo of popups and share sheets triggered from the Shortcut Bar; in Drafts 4.5, users will be able to turn custom keys into Shortcut Bar items, which will be displayed alongside QuickType to perform actions on the current draft.
\nThe Shortcut Bar is an ingenious way to use the extra space of the iPad keyboard to save a bit of time with preconfigured actions. Apple is providing some good examples in their default apps; hopefully, third-party developers will use their imagination for this aspect of iOS 9 as well.28
\nI’ve been covering iOS for a few years now, and I’ve regularly lamented the lack of faster text selection and editing controls on iPad. Since its inception in 2007 and update for iPad support in 2010, text selection on iOS has left much to be desired, particularly for those looking to compose and edit long pieces of text.
\nWhen it was posted on YouTube in 2012, the Hooper Selection (according to the creator, the single most duped radar at Apple) wasn’t just clever, as most concept videos are: it felt inevitable, and it embodied the multitouch nature of iOS. As I wrote back then, it made a lot of sense.
\nThree years after Hooper’s popular concept, Apple has listened to the community and used the Hooper Selection as the basis for the new trackpad mode on iOS 9 for iPad.
\nSelecting text in trackpad mode with a two-finger tap & hold + swipe.
The core proposition of trackpad mode is that you can swipe with two fingers on the keyboard to freely move the cursor around and control its placement in a text field. It works like you’d expect from a trackpad on your Mac: characters disappear from keys as soon as you start swiping, indicating that you’re free to use the whole area as a trackpad.
\nThere’s no sound effect accompanying cursor placement in trackpad mode, but there is one subtle visual cue that hints at the connection between the native iOS cursor and trackpad mode. When first placing two fingers on the keyboard, the cursor will animate to split in two: the main cursor (which can have a custom color set by the developer) will loosely follow your swiping direction even if it’s outside the bounds of a text field, and a smaller gray cursor will more precisely track your intention and be attached to characters. It sounds more complex than it actually is in practice – a testament to the fact that intuitive touch text selection is a tricky problem to solve.
\nTrackpad mode isn’t limited to cursor placement, as it can also be used to control text selection. Tap with two fingers on the keyboard and iOS will select the word closest to the cursor; keep swiping left or right, and you’ll extend your selection. Or, hold two fingers on the keyboard, wait for the cursor to transform into the text selection controls (with another nice transition), and then start extending your selection by swiping.
\nEasier text selection through swipes on the keyboard fixes a major annoyance of editing text on iPad. While the direct relationship between words, selection, and touch could be appreciated in the early days of iOS, it slowed down the entire process after people figured it out and just wanted to be more efficient when selecting and editing text.
\nApple’s new trackpad mode works well with the Shortcut Bar: the iPad keyboard now encompasses selecting, editing, and performing actions with a unified interface. This shows how Apple is staying true to the multitouch promise of the original iOS keyboard: the unique advantage of touch keyboards is that you can always update them. Trackpad mode is a good example of that kind of mindset. It’s still the same keyboard, but now you can do more with it.
\nIn my tests, trackpad mode performance has been solid and gesture recognition fairly accurate, with some instances of accidental text selections and the cursor becoming stuck in a text field. Nothing that couldn’t be fixed by swiping again or tapping a letter to “reset” the keyboard.29
\nI’ve been using trackpad mode to edit posts I publish to MacStories every day, and I believe it is superior to what we had before. Trackpad mode doesn’t make the iPad more like a Mac: it is only available for text editing, and rather than “putting a mouse on the iPad” it uses multitouch to bring a new behavior for the software keyboard. I find trackpad mode to be smooth and natural: I’m particularly fond of its precise character control, which has been a boon to fix typos in Editorial and manage text selections in Notes.
\nApple may have Hooper to thank for inspiring trackpad mode years ago, but iOS 9’s implementation is all theirs. Trackpad mode is well suited for the large iPad display, and it’s good to see Apple trying new things with multitouch again.
\n\nWith iOS 7, Apple introduced support for programmable shortcuts on external Bluetooth keyboards. While iOS supported system-wide commands for text formatting and undo back in the early days of the iPad, iOS 7 allowed developers to add custom shortcuts to their apps.
\nAdoption of the feature didn’t work out as expected. Apple was inconsistent in their usage of keyboard shortcuts: some apps didn’t support them at all; others had full sets of shortcuts matching those available in the same app for OS X; others, like Messages30, only supported some of the shortcuts available on the Mac. Even worse, there was no API to inform the user about keyboard shortcuts: while Mac apps could use the menu bar as a place where users could click around and learn shortcuts, the same wasn’t the case on the iPad. Support for keyboard shortcuts in iPad apps was so sporadic and poorly documented that we even tried to create a webpage to showcase shortcuts supported by popular apps. It’s fair to say that the class used to register keyboard shortcuts, UIKeyCommand, failed to gain traction among users and developers in 2013.
\nWith iOS 9, Apple has reworked the OS’ support for external keyboard shortcuts. In the process, they’ve given developers a unified way to teach users about shortcuts and they’ve also brought over some great time-saving commands from OS X.
\nKeyboard shortcuts can be specific to a single view controller and apps can now display a cheat sheet with a list of supported shortcuts. The cheat sheet, called Discoverability, is an overlay that appears in the middle of the screen upon holding the Command key on a keyboard. In Discoverability, each app will be able to list the shortcuts it supports with labels and required key combinations. Developers can choose to assign labels to keyboard shortcuts with the optional discoverabilityTitle
property of UIKeyCommand; while an unlimited number of shortcuts can be set in an app, only those with an associated label will be displayed (according to the order set by the developer) in the Discoverability overlay.
Discoverability is a notable change, as it helps expose keyboard shortcuts in a consistent way: instead of having to read through an app’s About page, you can press a single key on the keyboard to get a system-wide cheat sheet for each app that exposes keyboard shortcuts.
\nAssigning shortcuts on a per-view controller basis is also a welcome change from iOS 7. In iOS 9, developers can program subsets of shortcuts that are enabled in specific views of their apps, which will then appear in Discoverability only when you’re in that section. In Safari, for instance, the list of shortcuts will be slightly different depending on whether you’re on a webpage or the browser’s Favorites view31; in Notes, the Find Note shortcut will only be supported in a list of notes and not in the main view of the app.
\nDifferent shortcuts depending on an app’s view.
This is another instance of iOS 9 adopting the best features of OS X without their burden: because shortcuts are only listed in Discoverability when they can be used, there’s no concept of unavailable, grayed-out shortcuts in iOS 9. If you see a shortcut in Discoverability, you can try it and it’ll do something. In the months I’ve spent exploring the keyboard shortcuts supported by Apple in iOS 9, this has proven to be a better implementation than the often-unavailable, non-contextual shortcuts listed in the menu bar drop-downs of Mac apps.
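\nTo illustrate the per-view behavior described above – a hedged sketch in Swift 2-era syntax, with a hypothetical view controller, actions, and state flag – a set of shortcuts with Discoverability labels might look like this:

```swift
import UIKit

class NoteListViewController: UIViewController {
    var isShowingList = true  // hypothetical state flag

    // Returning commands from a computed property lets each view
    // controller expose only the shortcuts that make sense for it.
    override var keyCommands: [UIKeyCommand]? {
        var commands = [
            UIKeyCommand(input: "n", modifierFlags: .Command,
                         action: "newNote:",
                         discoverabilityTitle: "New Note")
        ]
        if isShowingList {
            // Listed in Discoverability only while the list is on screen.
            commands.append(UIKeyCommand(input: "f", modifierFlags: .Command,
                                         action: "findNote:",
                                         discoverabilityTitle: "Find Note"))
        }
        return commands
    }

    func newNote(sender: UIKeyCommand) { /* create a note */ }
    func findNote(sender: UIKeyCommand) { /* show search UI */ }
}
```

Commands created without a discoverabilityTitle still work; they simply don’t appear in the overlay.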
\nThe increased flexibility of UIKeyCommand and Discoverability should make for a quicker adoption of keyboard shortcuts among third-party apps. As far as Apple goes, they’ve done a good job at supporting shortcuts in their apps and being more in line with what they offer on OS X. Messages can now finally send by pressing the Return key; Mail, Safari, and Calendar offer a good selection of shortcuts to navigate the UI, switch between views, and perform actions; some apps don’t have shortcut support (Podcasts, Photos, App Store), but it’s no big deal.
\nWhere Apple also surprised me is in the Home screen shortcuts that are actually available as system-wide options. Go to the Home screen, press Command, and notice how iOS 9 supports Command-Tab and Command-Space? Those aren’t just commands you can use on the Home screen: you can Command-Tab through apps and open Search from anywhere on iOS 9.
\nModeled after the equivalent shortcut for OS X, Command-Space allows you to launch iOS’ Spotlight. However, unlike OS X’s Spotlight, the search panel on iOS doesn’t come up as a system-wide popup; instead, pressing Command-Space will take you to the Spotlight section that is normally accessed by swiping down on the Home screen (more on this later). Essentially, Command-Space acts as a more traditional implementation of the dedicated Spotlight key that iPad keyboards from third-party companies have exhibited for the past few years. It would have been nice to see a modal Spotlight available across the entire OS; for now, this will do – if only in terms of muscle memory for those coming from OS X.
\nThe Command-Tab app switcher is a real treat. If you’re used to Command-Tabbing on your Mac, you’ll feel right at home on iOS 9 for iPad. Pressing the shortcut will bring up an overlay with app icons (nine in landscape, six in portrait), which you can cycle through by pressing Tab again; like on OS X, Command-Shift-Tab cycles through apps in reverse order.
\nThere is one key difference between OS X and iOS 9 for Command-Tab: while OS X employs the Command-Tab switcher to enable users to move across open apps, the concept of “open” app is more blurred on iOS. For this reason, Apple chose to bring Command-Tab to iOS as a way to jump between recently used apps.
\nJust as the iOS app switcher puts the last app you used front and center in the UI, hitting Command-Tab on iOS 9 immediately highlights that app, so you can lift your fingers off the keyboard to switch back to it. Therefore, it’s best to think of iOS 9’s Command-Tab as a faster, keyboard-based version of the system app switcher, limited to the most recent apps and aimed at letting you jump between them, regardless of their state.
\nThe improved support for external keyboards in iOS 9 is a step in the right direction. iOS 7 was a timid and inconsistent attempt at offering shortcuts for apps. iOS 9 feels like a more complete thought around external keyboards and their relationship with OS features and apps.
\nHowever, for every two steps forward, there’s one step back for Apple here: new multitasking functionalities such as Picture in Picture, Slide Over, and Split View don’t have any sort of integration with external keyboards. I suppose that these new features have led to more questions for Apple engineers, but this is not an impossible problem to fix. Keyboard shortcuts to activate Slide Over, Split View, and Picture in Picture would help a lot.
\nThere’s still a long way to go for iOS to gain full independence from touch when a physical keyboard is connected, and we’ll probably never get to that point. iOS is, ultimately, a multitouch platform and external keyboards are an option – they’re not meant to be treated as the only input system of an iPad.
\nFrom that standpoint, it’s easy to pardon Apple for not supporting the new multitasking features, Control Center, Notification Center, navigation in the search page, or actionable notifications with keyboard shortcuts. The deeper you get into iOS’ architecture, the harder it becomes to justify an input method that doesn’t involve touch.
\n\nWith new multitasking features, Apple had to rethink parts of the core structure of iOS for iPad. This has introduced novel challenges and complexities, some of which haven’t been addressed in this release.
\nWhile Slide Over reinforces the idea of primary and secondary app by dimming the primary app in the background, such distinction isn’t available for Split View. There’s a reason for that: Split View is meant to let you use two apps at the same time, and downgrading one of them to a lesser state would diminish the idea that you’re able to interact with two apps on screen. But this poses a question when the keyboard is shown: where are you typing?
\nHint: look at the ‘Done’ button.
With the current Split View design, there is no strong indicator of which app is receiving input from the iOS keyboard. There is an indicator – the blinking cursor – but it’s not persistent or clear enough when you need to discern which app the keyboard is going to type into.
\nI don’t believe this will turn out to be a major problem in practice: in using iOS 9 on my iPad, occasionally typing into the wrong app in Split View hasn’t made me long for the simpler times of full-screen apps. Still, as a mere design critique, I believe Apple could figure out ways to better indicate the relationship between apps in Split View and the keyboard. In the meantime, I’d advise developers to strongly consider unique tint colors for the text cursor to help increase visual recognition.
\nThe concept of active app state (or lack thereof) gets worse when you’re using an external keyboard. Let’s play this game again: which app is listening for keyboard shortcuts now?
\nA problem that is somewhat eased by the blinking cursor turns into a bigger usability concern when the cursor isn’t displayed but input is still accepted through an external keyboard. With Discoverability and keyboard shortcuts, you’ll end up with a case of Schrödinger’s Split View: an app is both active and inactive at the same time, and your perspective is all that matters. Or, rather, a single touch matters: the app receiving keyboard shortcuts will be the one where the keyboard was last shown. If you put two apps with a text field side by side, you can tap one after the other to change what Discoverability thinks is the active app.
\nThis is not obvious, and you can test it by putting two apps in Split View, connecting a Bluetooth keyboard, and holding Command while tapping both apps’ text fields. The Discoverability overlay will move between the two according to what iOS interprets as the last-used – and therefore actively-receiving-external-input – app. Thankfully, there are plenty of directions Apple could take to improve the visual structure of Split View – for example, the shared status bar could be a good place for an active app indicator.
\nYou can “fake” multiple instances of Safari by putting Safari View Controller next to Safari in Split View.
One of the key aspects of Slide Over and Split View is that they cannot show two sections of the same app at once. Only individual apps can be displayed concurrently on screen: you can’t split Safari in multiple views and display both views on screen at the same time. If you were hoping to manage multiple Safari tabs or Pages documents in Split View, you’re out of luck.
\nSplitting apps into multiple atomic units for standalone views and documents seems like an obvious next step going forward. If Apple wants to do this, the redesigned system app switcher and its heavily card-influenced design32 could be used to group multiple instances of an app together in a stack.
\nToday, Split View doesn’t support multiple views of the same app. I wouldn’t bet on that always being the case in the future.
\nWhere Apple’s multitasking architecture gets more questionable is in the thread that runs through Slide Over, Split View, and the classic app switcher.
\nIn iOS 9, there isn’t a 1:1 relationship between the app switcher and other multitasking features. The idea of “recent” app is muddled when the system has to account for multiple apps displayed on screen, but Apple could have handled some parts of this differently.
\nWhen you put a secondary app in Slide Over or Split View, that app disappears from the system app switcher. Instead of being displayed as a smaller card to the right of the current app, it’s completely hidden from the switcher’s UI. When in Split View, the secondary app is an empty rectangle on the right side of the primary app.
\nThat empty area next to Twitterrific is a secondary app in Split View.
I assume that Apple didn’t want to bring distinct touch targets to each card in the switcher; my issue is that this looks like a bug, and users should have the ability to resume Split View through a preview of both apps. Hiding the secondary app in the switcher is also a mistake as it prevents users from retrieving an app the traditional way.
\nIn an ideal state, iOS would honor the placement of primary and secondary apps. They would both be displayed in the app switcher when active; their position would always be the same as you move across the OS and all of its different ways to launch apps.
\nOn some level, iOS does exactly this: when you’re in Split View and return to the Home screen, opening another app that supports it will launch that app in Split View automatically, bringing in the secondary app from the right side again. This is good design: clicking the Home button when in Split View (and Slide Over) pushes the secondary app into the right edge of the screen before going to the Home screen, a transition that highlights the spatiality of multitasking and the division between primary and secondary app. iOS even remembers Split View if you quit the primary app and open it again, or after a device restart.
\nApple made a good call in making Slide Over and Split View as “sticky” as possible, and especially Split View feels like a feature that can be left on all the time. It could be debated whether iOS should offer settings to launch apps and notifications in the secondary app pane33, but this is a good start.
\nAnother problem occurs when the order of recent apps is overridden by the app switcher used in Slide Over and Split View. It took me a long time to figure this out, but here goes: the app switcher used in these two modes doesn’t show the same list of recent apps you see in the system app switcher (the one that comes up with a double-click on the Home button). Instead:
\nI couldn’t make out the reason behind this choice initially. Now, I see Apple’s motivation and the kind of experience they’re going for – but it could be a confusing one for those accustomed to a certain spatiality in the app switcher.
\nOne order here…
…another here.
Apple thinks that most users will tend to frequently use the same apps in Slide Over and Split View. Whether it’s an iMessage next to a webpage in Safari or a mind map alongside a podcast episode, the new multitasking app switcher is built on the assumption that frequency of use is more convenient than hierarchy and spatiality at all costs. From a design perspective, it’s an interesting conundrum: would it be better to always display the same order of apps when double-clicking the Home button and swiping from the right edge of the screen, or does the new app switcher deserve its own layer of “recently used” apps that can override whatever you recently used outside of Slide Over and Split View?
\nIt took me a while to warm to this idea, but after a few months I see Apple’s perspective, and they have a point. When I’m doing something that requires interacting with two apps at the same time, I tend to go back to the same pair of apps over and over: Mail next to Safari to open links in a web browser (and great for unsubscribing from unwanted newsletters), Messages next to Calendar, iThoughts in Slide Over when I’m writing in Editorial. I have developed my own workflows and routines for multitasking on iOS 9, and, while in principle the spatiality of the main app switcher should be respected, in practice altering the first three apps to be the ones you last used in Slide Over and Split View leads to a faster, more efficient way to multitask. Moving between apps in Slide Over and Split View is faster because of this design.
\nThe redesigned app switcher puts the last used app front and center.
The system app switcher itself has been redesigned with cards that display an app’s icon and name at the top of a preview of its last state; along with the new look, the order has been reversed, so you’ll now be swiping right to go back in the list of recently used apps instead of going left.
\nI’ve been thinking about the redesigned app switcher a lot, and for a while I couldn’t understand why Apple would want to change the app switcher other than to spice it up. From a design standpoint, I find the new switcher to be nicer and smoother than the old one34, but, on a spatial level, it would make more sense to keep the old left-to-right order to have the Slide Over/Split View app picker fit with the rest of the UI and spatiality of the feature (everything stays to the right).
\nI still can’t find a single, perfectly solid argument in favor of the redesigned app switcher – just a few possible explanations.
\nWith Slide Over and Split View, Apple has allocated new UI layers to the limited space of an iPad. For the most part, the company has done a good job at keeping the resulting interactions as obvious as possible, but they’ve created peculiar edge cases and questions, too. As I’ve explored above, some of them haven’t been addressed in this first version of iOS 9.0; given their extremely specific nature (arguably a philosophical problem more than a practical inconvenience), I wouldn’t be surprised if they took a long time to be “fixed”.
\nThe concept of direct manipulation in iOS makes considerations on the spatiality and physical consistency of software intriguing – Slide Over and Split View exist in their own UI and UX space, but they also have to coexist with features and layers that predate iOS 9. Spatially speaking, Slide Over and Split View have managed to find their place on iOS. Can the same be said at a visual level?
\nBack in the old iPad days, there was a certain romance about opening an app and knowing it would be the one and only app you could look at while it was active. It was a consistent, pure iteration of the iPhone’s app model, glorified for a 10-inch screen and embellished by baroque textures of antique leather materials, fine exotic woods, and futuristic robot servants. It was, in many ways, beautiful.
\nToday, this is what you can achieve if you try really hard to push iOS and Split View to their limits:
\nOver 10 layers of user interface displayed in a multitasking environment that involves two apps being used at the same time. Is this the same device that launched to reviews praising how fantastic it was to be forced to use one app at a time?
\nNew features always bring new complexity, and true utility is found in the balance of increased functionality and added confusion. iOS 9 multitasking is no exception: it’s powerful, but it also dramatically increases the amount of information being displayed on the iPad’s screen.
\nWe have to ask: is it too much?
\nIf you consider Picture in Picture, Split View, Control Center, extensions, notifications, app menus, Command-Tab, and the system keyboard, an iPad can be as visually cluttered as a traditional desktop computer. Unlike a Mac, the iPad’s screen real estate is limited and the software keyboard is part of the UI, which leaves even less room for app content to be displayed. Because of its touch constraints, iOS on an iPad Air 2 can even seem busier than OS X.
\nConsider Split View and Picture in Picture. Let’s say you’re working with Notes on the left and Safari on the right while watching a video via Picture in Picture. With both apps shown in landscape mode and the keyboard active, there isn’t much room left for the video overlay, but you can keep it on either side if you don’t always need to look at what you’re writing or researching. There’s already a lot going on.
\nNow you’re in the middle of your note-taking session, Picture in Picture is playing, and you realize you need to work with an extension in Safari. Picture in Picture is displayed on top of Safari, and you hit the share icon to bring up the share sheet. What happens next?
\nBecause Picture in Picture sits on top of anything that is displayed underneath, the share sheet you just activated is fully hidden by the video player, which you’ll need to move to the left – the only possible location at this point – if you want to use extensions. Fair? Yes, because Picture in Picture is purposefully designed to always follow you around. More complex than what we’re used to on iOS? Also a “yes”.
\nAnd it’s not just Picture in Picture that raises questions on the visual clutter now possible on an iPad with iOS 9. What happens when you combine the split keyboard – it still exists! – with Picture in Picture and a bunch of share sheets? How about a custom keyboard, Split View, and Command-Tab? There’s a lot of different styles and UI elements claiming their own space at once.
\nAfter using iOS 9 on my iPad for the past three months, I’ve wondered if Apple had gone too far in the name of productivity. The iPad used to be the friendly device that could be a calculator, a book, a newspaper, or the web in the palm of your hands. With iOS 9, we’re presented with a device that can show dozens of layers of content at the same time, often from three different apps, with a UI design that doesn’t bring back any feeling of familiarity from physical objects that you hold in your hands.
\nAnd that’s okay.
\nFor the iPad to grow up, new challenges and complexities ultimately have to be accepted. The entire point of this discussion – and the thread I’ve tried to develop in this chapter – isn’t to argue that Apple should have built multitasking features with no complexity, but to observe how they reacted to the complexities they were destined to introduce all along.
\nThe iPad can’t be more without becoming more complex, but the way that complexity doesn’t become complicated is under Apple’s control. With no restraint, we might have ended up like this. And to truly appreciate the fine balance struck between functionality and confusion, we need to look back at iOS 7.
\nIf Apple hadn’t redesigned iOS from the ground up with a focus on text, color, clarity, and content, I don’t think iOS 9 multitasking would be as usable as it is today. Modern apps may have lost some of the craft and skills required to recreate realistic visuals, but as a result designers and developers now have to pay attention to aspects of functionality, accessibility, and inter-app communication that were largely ignored or glossed over before.
\nI’ve been writing about iOS apps for a while, and I’m not affected by any kind of nostalgia for the good old days of skeuomorphic trends. This isn’t about taste: from a mere functional standpoint, iOS today is more legible, clear, and better prepared to support apps that work together.
\nBy hitting the reset button on a design pendulum that had swung too far in favor of photorealism, Apple has created a more uniform, scalable iOS app ecosystem that feels and works as a cohesive force instead of a bunch of fancy toys. It may be less “fun” for designers who don’t have to spend weeks setting up a particular leather texture in Photoshop. But iOS today is better equipped for users of all types and needs, and true care for design is found in how it serves people, not in how much it’s appreciated by fellow designers on Dribbble.
\nThis is evident with new multitasking features in iOS 9 for iPad. When using two apps in Split View, you’re likely not going to end up with a wooden shelf on the left side and a metallic cabinet on the other, all while a small video player with rounded corners floats around the screen projecting heavy shadow effects on top of everything. Instead, using Apple’s apps (and third-party ones) in Split View feels like a unified, consistent experience: two pieces of software based on the same design language, not clashing with each other, distinct but homogenous. The more subdued, unassuming nature of the modern iOS makes sheets of content as well as entire apps feel like part of a whole system – multiple components of the same machine and not competing protagonists of the same stage.
\nVisual uniformity plays an essential role in making iPad multitasking appear simple and devoid of additional clutter, and it suggests that Apple has been preparing for the increased complexity of iOS 9 for a long time. I wouldn’t be surprised if bigger iPhones and the prospect of iPad multitasking ended up being key factors in the decision process of iOS 7 back in late 2012. Today’s iPad multitasking wouldn’t be possible without an OS that can scale and adapt depending on a user’s task, screen size, language, or preferred device orientation. With the system that Apple has put in place – from San Francisco and the Retina display to the keyboard and extensions – every piece works together.
\nThere is an added complexity to the overall iPad experience when using new multitasking features in iOS 9, but it’s always mitigated by understandable constraints. Whether it’s the two possible sizes for Split View or the magnetic behavior of Picture in Picture, the increased capabilities of iOS 9 on the iPad come with acceptable limitations. There are some instances of Apple trying to be too clever or redesigning features for reasons that aren’t completely clear yet; for the most part, though, Apple has been judicious in its implementation of simultaneous apps and system features.
\nOn paper, iPad multitasking can be visually complex. In practice, you’re not going to always be dealing with overcrowded Split Views displaying multiple menus. iOS 9 multitasking strikes a great balance of optional power tools and the underlying simplicity that has characterized the iPad for the past five years.
\nI bet it’s going to be even better on a 13-inch device.
\n\nMy work for MacStories and Relay FM involves writing posts, doing research in Safari, taking notes, managing email, and communicating with others. I use a lot of apps every day, and while I try to automate tedious, repetitive tasks as much as possible, a fair amount of switching from app to app is still required, even after the extensibility brought by iOS 8.
\niOS 9 has profoundly changed the way I work from my iPad. This has been true for Apple apps, as well as third-party ones I’ve been testing this summer.
\nI’ve come to use Apple’s Notes app every day. Thanks to its built-in support for Slide Over and Split View, I’m using Notes as a persistent scratchpad next to Safari, Mail, and Messages. While the use case is the same as before – I’m taking notes or reading text from a note – the speed and the simplicity granted by multitasking are completely new. I’ve been able to read Apple’s documentation for iOS 9 on one side of the screen and take notes at the same time with Notes in Split View; I can scroll my Twitter client, take a quick note with Slide Over, and later reopen that note while I’m talking to Myke so we can go through the thoughts I saved.
\nNotes and multitasking have also been phenomenal additions when testing new apps and updates for MacStories reviews. When I’m trying a new app, I can use multitasking to save first impressions about the interface and user experience. Later, when I have to send feedback to the developer, I can put Mail and Notes in Split View, copy and paste text, and I’m done.
\nThe time savings granted by iPad enhancements in iOS 9 may seem small individually, but they add up over time. If I’m listening to an episode of Mac Power Users and I want to open show notes in the browser, I can just put Safari next to Podcasts, tap all the links I want to check out, and Split View will open those links in Safari on the right side.
\nIf I’m researching a topic and I realize I need to visualize it with a mind map, I can fire up the excellent iThoughts, split the screen to have Safari next to it, and turn my iPad into a more capable research tool that is both a mind map and a browser at the same time.
\nIn fact, this review has been entirely researched, composed, and edited on my iPad Air 2 thanks to iOS 9’s new iPad features.
\nSoon after Apple seeded the first beta of iOS 9 to developers, I installed it on my iPad and started working on iOS 9 research. Since then, I only used a Mac twice for tasks related to the review that couldn’t be done on iOS. First, I had to install the beta OS on my iPad and transfer WWDC session videos to the Videos app with iTunes.35 Then, I had to use El Capitan’s Safari Web Inspector to measure performance of Content Blockers in iOS 9’s Safari. These two occasions aside, OS X was never involved in any part of the writing or editing process on my end.
\nWith Videos and Picture in Picture, I was able to play a WWDC session while looking up information in Safari and taking notes at the same time. With the new Shortcut Bar and cursor control gesture, formatting and editing text in a note (to fix typos, delete lines, or create headings) was quick and painless. If Picture in Picture was getting in the way, I could dock it to the side and hide it from view. For about two weeks, I only took notes, watched videos, took more notes, and copied and pasted links and images from Safari36 into Notes in Split View. I saved tweets about interesting iOS 9 features from Twitterrific into Notes with the share extension, too.
\nThen, when the time came to turn my notes into an outline with some kind of structure, I used a beta version of iThoughts to enter Split View so I could look at the notes and mind map side by side. This way, I was able to have my stream of notes on the right and paste them or rewrite them on the left in the mind map. iThoughts has fantastic shortcuts to create new nodes and sub-nodes with the Return and Space keys of the software keyboard, and that (combined with swiping) was a great way to create branches in seconds. If I needed to insert images in the mind map, I could use the built-in image picker or copy an image from Safari in Split View and paste it into the map. This went on for about a month.
\nWhen I started writing the actual review in mid-July, I used Slide Over extensively because Editorial – my text editor of choice – didn’t support iOS 9 multitasking. This is the unsung benefit of Slide Over – it brings the convenience of iOS 9 multitasking to any app, regardless of its support for the more powerful Split View.
\nI’ve used everything in Slide Over while writing in Editorial. I used it to reply to iMessages without leaving Editorial so Myke and my girlfriend didn’t think I was ignoring them (I was just very focused). Safari, iThoughts, Photos, Notes, and Mail were excellent to look up my mind map and reference material while assembling chapters. Later on, as I got more betas with support for iOS 9 multitasking, I used apps such as PCalc for quick calculations in Slide Over, Terminology to look up words in the dictionary (not even Apple has something like that), and Dispatch to instantly turn new email messages into tasks.
\nSafari is the perfect example for Slide Over.
Because of iOS 9 multitasking and enhancements to the iPad’s software, for the first time this year I was able to produce a full review on iOS and be happy about it. Picture in Picture, Slide Over, Split View, and keyboard changes were great additions to my writing process, and their effects touched every area of my iPad workflow.
\niOS 9 is a watershed moment for iPad users, and a game changer for the iPad platform. I’ve been using the iPad every day for the past three years, and iOS 9 brings a radical new way to work with apps on the device.
\nFor a long time, I thought that I didn’t want split screen multitasking on the iPad. I was afraid that such a feature would make the iPad less focused – that eschewing the principle of one app at a time would result in the iPad losing its way. After using iOS 9, I can say that I was abundantly wrong.
\nWorking with Dispatch and Workflow in Split View.
With iOS 9 multitasking, I feel more focused when working on my iPad because I’m switching between apps less. It sounds absurd, but it’s not: because I no longer lose the context of what I’m doing by clicking the Home button, multiple apps aren’t a distraction. I’m not focused on one app at a time anymore. I’m focused on a task. And if that involves multiple apps, the iPad can handle it.
\nWith iOS 9, an app on the iPad no longer necessarily commands the entire screen. This makes the iPad more comparable to a traditional computer, with the screen being used not to mimic a single utility but as a canvas for software, in multiple shapes and forms.
\nThe complexities created by the ability to manage concurrent apps have been largely kept at bay by Apple’s design choices, which include limited compact sizes, selected corners for Picture in Picture, and, like extensions last year, a lack of tools to programmatically activate multitasking – meaning, you’ll always have to touch the screen to initiate Slide Over and Split View.
\niOS 9 multitasking’s most evident pitfall is the lack of drag & drop between apps in Split View. You would think that, given two apps side by side and a platform based on touch, Apple would have built a system to move information from one app to another. This isn’t available yet, and it’s my biggest wish for future iterations.
\nOther improvements I’d like to see in multitasking would be a faster way to flip the secondary and primary app in Split View, as well as support for external keyboard shortcuts in multitasking. Right now, there’s no way to show Slide Over or manage Split View with an external keyboard, which slows me down when I don’t want to touch the screen.
\nThe inconsistencies of the classic app switcher and what’s still missing don’t change the underlying premise. iPad multitasking on iOS 9 transforms how the device can be used as a computer every day. I run MacStories entirely from my iPad, and multitasking has dramatically sped up how I work. But what Apple has done with iOS 9 on the iPad goes beyond multitasking alone.
\nFor the first time since its launch in 2010, the iPad is ready to have its own unique OS. Apple’s focus on familiarity and consistency with iPhone OS was a selling point of the iPad five years ago, but with time it became a liability.
\nFor the past four years, iOS for iPad has mostly felt like a rushed adaptation of the main iOS for iPhone, with uninspired designs scaled up from the smaller screen. iOS 9 shows some progress on this front – such as a redesigned Notification Center with two widget columns – but it’s possible to come across UIs that have been enlarged from the iPhone without proper consideration. However, the sheer amount of what’s new and exclusive to the iPad in iOS 9 offsets what is left over from the previous era.
\nToday, we’re seeing what iOS for iPad should be. A version of iOS that shares the same underlying technologies and design language of the iPhone, but optimized for the different hardware and interactions of the iPad. There’s still work to be done, but what we have today is an impressive step up from iOS 8. With multitasking, keyboard changes, bigger folders, and Picture in Picture, all past mistakes are forgiven.
\niOS 9 is the first version of iOS that isn’t afraid to let the iPad be the iPad. Consistent with the iPhone, willing to take its own risks, and reminiscent of a Mac without the baggage of OS X.
\nWith iOS 9, the iPad has entered adulthood.
\n\niOS 9 introduces a supercharged Spotlight that, under the umbrella of Search, aims to lay a new foundation for finding app content and connecting apps.
\niOS 9 Search: accessed by swiping down (center) or from a dedicated page (right).
iOS Search is accessed by swiping down on any Home screen (like the existing Spotlight) or as a standalone page to the leftmost side of the first Home screen in a return to form that does more than just search. When swiping down from the Home screen, the cursor is immediately placed in the search box, ready to type; if opened from the dedicated page, you’ll have to tap the search box or swipe down to bring up the keyboard – an important difference geared at showcasing other features available in this screen. As far as searching is concerned, both modes lead to the same results.
\nOn the surface, iOS 9 Search augments the existing Spotlight by extending its capabilities beyond launching apps and searching for data from selected partners. With iOS 9, you’ll be able to look for content from installed apps, such as documents from iCloud Drive, events from a calendar app, or direct messages from a Twitter client. This can be done by typing any query that may be relevant to the content’s title and description; results display rich previews in iOS 9, optionally with buttons to interact with content directly from search.
\niOS Search is more than a fancier Spotlight. Changes in this release include new APIs for local apps and the open web, highlighting Apple’s interest in web search as an aid to apps. Some aspects of it aren’t clear yet – and Apple has been tweaking quite a few things over the summer – but the nature of the change is deep and intriguing.
\nA key distinction to note in Apple’s implementation of Search is that there are two different indexes powering results that appear in Spotlight and Safari. A local, on-device index of private user content and data that is never shared with anyone or synced between devices; and a server-side, cloud index that is under Apple’s control and fed by the company’s Applebot web crawler.
\nIn iOS 9.0, the focus is mostly on the local index, which will power the majority of queries on user devices.
\niOS 9 can build an index of content, app features, and activities that users may want to get back to with a search query. The index is built with two APIs: CoreSpotlight, an index of user content built like a database that can be periodically updated in the background; and our friend NSUserActivity, this time employed to index user activities in an app as points of interest.
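As a rough sketch of the first of those APIs, here is what indexing a single piece of content with CoreSpotlight looks like in Swift. The identifiers, titles, and domain are hypothetical, and the spelling follows current Swift rather than the Swift of the iOS 9 era:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe the content with searchable attributes (hypothetical values).
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Meeting notes"
attributes.contentDescription = "Notes from Monday’s planning call"

// Wrap the attributes in an item with an app-level unique ID.
let item = CSSearchableItem(
    uniqueIdentifier: "note-42",            // hypothetical per-item ID
    domainIdentifier: "com.example.notes",  // hypothetical grouping domain
    attributeSet: attributes
)

// Hand the item to the on-device index; results become searchable in Spotlight.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

The `uniqueIdentifier` is what iOS hands back to the app when the user taps the result, so the app can restore the corresponding view.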
\nResults from Maps and WhereTo have buttons to open directions.
From a user’s perspective, it doesn’t matter how an app indexes content – using the new search feature to find it always works the same way. Search results from apps in iOS 9 are displayed with the name of the app they’re coming from, a title, description, an optional thumbnail image, and buttons for calling and getting directions if those results include phone numbers or addresses. Visually, there is no difference between results powered by CoreSpotlight and those based on user activities: both display rich previews in iOS 9 and can be tapped to open them directly into an app.
\nOn a technical level, the difference between CoreSpotlight and NSUserActivity for developers is that while activities are intended to be added to the on-device index as the user views content in an app, CoreSpotlight entries can be updated and deleted in the background even if the user isn’t doing anything in the app at the moment. For this reason, a todo app that uses sync between devices may want to adopt CoreSpotlight to maintain an index of up-to-date tasks: while entries in the index can’t be synced, developers can update CoreSpotlight in the background, therefore having a way to check for changes in an app – in this example, modified tasks – and update available results accordingly.
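The background maintenance described above might look something like this, assuming a hypothetical sync engine that reports the IDs of tasks deleted on another device:

```swift
import CoreSpotlight

// Hypothetical IDs reported by a background sync pass.
let deletedTaskIDs = ["task-7", "task-9"]

// Purge stale index entries so Spotlight doesn’t surface deleted tasks.
CSSearchableIndex.default().deleteSearchableItems(withIdentifiers: deletedTaskIDs) { error in
    if let error = error {
        print("Could not update the index: \(error.localizedDescription)")
    }
}
```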
\nApple is also giving developers tools to set expiration dates for CoreSpotlight entries (so the database can be purged after some time to prevent the archive from growing too large), and they can combine iOS’ existing background refresh APIs with CoreSpotlight background changes to keep the local index fresh and relevant.
\nApple’s own apps make use of both CoreSpotlight and NSUserActivity to build an on-device index of user content. Mail, Notes, Podcasts, Messages, Health, and others allow you to look for things you’ve either created, edited, organized, or seen before, such as individual notes and messages, but also the Heart Rate pane of the Health app or an episode available in Podcasts.
\nSearch results from Drafts and Dispatch.
I’ve also been able to try apps with support for local indexing on iOS 9. Drafts 4.5 enables you to search for text from any draft stored in the app. Clean Shaven Apps has added CoreSpotlight indexing support to Dispatch, allowing you to look for messages already stored in the inbox and a rolling archive of the latest messages from another mailbox. On iOS 9, iThoughts lets you search for any node in a mind map and jump directly to it from search, bypassing the need to find a file in the app, open it, and find the section you’re looking for.
\nWhereTo’s iOS 9 update was perhaps the most impressive search-related update I tested: the app lets you search for categories of businesses nearby (such as restaurants, coffee shops, supermarkets, etc.) as points of interest, but you can also get more detailed results for places you’ve marked as favorites, with a button to open directions in Maps from search with one tap.
\nApple has given developers a fairly flexible system to index their app content, which, with a proper combination of multiple APIs, should allow users to find what they expect to see in their apps.
\nHowever, this isn’t all that Apple is doing to make iOS Search richer and more app-aware. Apple is building a server-side index of crawled web content that has a connection to apps – and that’s where their plans get more confusing.
\nIn building iOS 9 Search, Apple realized that apps often have associated websites where content is either mirrored or shared. For the past several months, Apple has been crawling websites they deemed important to index their content with a crawler called Applebot; now, they’re ready to let every website expose its information to Applebot via web markup. The goal is the same: to provide iOS Search with rich results – in this case culled from a much larger source.
\nThe server-side index is a database in the cloud of public content indexed on the web. Unlike a traditional search engine like Google, though, Apple’s primary motivation to keep a cloud index is to find web content that has an app counterpart, so users can easily view it in a native app.
\nThink of all the services that have native apps for content that is also available on the web: from music services to online publications and websites like Apple’s online store or Foursquare, many of the apps we use every day are based on content that comes from the web and is experienced in an iOS app. From that perspective, Apple’s goal is simple: what about content that can be viewed in an app but that hasn’t been experienced yet by the user? What about a great burger joint near me listed in Foursquare that I still haven’t seen in the Foursquare app, or an article about Italian pasta that I haven’t read in my favorite site’s app yet?
\nInstead of having to search Google or use each app’s search feature, Apple is hoping that the iOS Search page can become a universal starting point for finding popular content that can also be opened in native apps.
\nBecause the web is a big place, to understand the relationship between websites and apps Apple has started from an unexpected but obvious place: iTunes Connect. When they submit an app to the App Store, developers can provide URLs for marketing and support websites of an app; Apple can match those websites with the app in their index and crawl them with Applebot for content that could enrich Search. These pieces of content – such as listings from Airbnb or places in Foursquare – will then be available in the Search page and Safari (the browser’s search feature can only search this type of content, as it doesn’t support local app search) and will open in a native app or on the indexed webpage if the app isn’t installed.
\nHow Applebot “sees” a MacStories article.
To teach Applebot how to crawl webpages for iOS Search and give results some structure, Apple has rolled out support for various web markup technologies. Developers who own websites with content related to an app will be able to use Smart App Banners, App Links, and Twitter Cards to describe deep links to an app; the schema.org and Open Graph standards are used to provide metadata for additional result information.
\nApple calls these “rich results”. With schema.org, for instance, Applebot is able to recognize tagged prices, ratings, and currencies for individual listings on a webpage, while the Open Graph image tag can be used as an image thumbnail in search results. The goal is to make web-based results as rich in presentation as their native counterparts. When you see rich descriptions and previews for links shared on Twitter, Facebook, or Slack, Open Graph and schema.org are usually behind them. Apple wants the same to be true for iOS search results. They’ve even put together a search API testing tool for developers to see how Applebot crawls their webpages.
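As a sketch, markup on a hypothetical listing page combining these technologies might look like this (the app ID, URLs, and values are invented for illustration):

```html
<head>
  <!-- Smart App Banner: associates this page with a native app;
       app-argument becomes the deep link iOS opens -->
  <meta name="apple-itunes-app"
        content="app-id=123456789, app-argument=https://example.com/listings/42">

  <!-- Open Graph tags Applebot can use for a rich result preview -->
  <meta property="og:title" content="Cozy Apartment in Rome">
  <meta property="og:description" content="Two-bedroom apartment near Prati.">
  <meta property="og:image" content="https://example.com/images/listing-42.jpg">
</head>
<body>
  <!-- schema.org microdata for structured details like price -->
  <div itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">120.00</span>
    <meta itemprop="priceCurrency" content="EUR">
  </div>
</body>
```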
\nIn practice, it’s been nearly impossible for me to test the server-side index this summer. In my tests from June to early September, I was never able to consistently find results from popular online services (Airbnb, Foursquare, eBay, or Apple’s own online store) that were relevant to my query or capable of enriching the native experience of an app on my device. In the majority of my tests, web results I managed to see in Search were either too generic, not relevant anymore, or simply not performing as advertised.
\nFor example, when typing “Restaurants Rome Foursquare” without having the Foursquare app installed on my device, I got nothing in iOS Search. My assumption was that popular Foursquare results would be available as Applebot-crawled options in Search, but that wasn’t the case. Same for Airbnb, except that I occasionally managed to see listings for apartments fetched from the web, but they weren’t relevant to me (one time I typed “Airbnb Rome Prati”, and I somehow ended up with an apartment in France. The result was nicely displayed in Search though, with a directions button for Maps).
\nI’ve started seeing some web results show up in iOS Search over the past couple of days under a ‘Suggested Website’ section. Starting Monday (September 14th), I began receiving results from the Apple online store, IMDb, and even MacStories. I searched for content such as “House of Cards”, “iPad Air 2”, and “MacStories iPad”, and web-based results appeared in Search with titles, descriptions, and thumbnails. In all cases, tapping a result either took me to Safari or to the website’s native app (such as the IMDb app for iOS 9 I had installed). Results in Search were relevant to my query, and they populated the list in a second when typing.
\nThe issues I’ve had with web results in iOS 9 Search this summer and the late appearance of “suggested websites” earlier this week lead me to believe that server-side results are still rolling out.
\nThe most notable example is that Apple’s own demonstration of web results in search, the Apple online store, isn’t working as advertised in the company’s technical documentation. The web results I saw didn’t appear under a website’s name in Search, but they were categorized under a general ‘Suggested Website’. In Apple’s example, Beats headphones should appear under an ‘Apple Store’ source as seen from the web, but they don’t. My interpretation is that proper server-side results with rich previews are running behind schedule, and they’ll be available soon.
\nA change in how NSUserActivity was meant to enhance web results adds further credence to this theory. As I explored in my story from June, Apple announced the ability for developers to tag user activities in their apps as public to indicate public content that was engaged with by many users. According to Apple, they were going to build a crowdsourced database of public user activities, which could help Applebot better recognize popular webpages.
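In code, the public-activity tagging in question is a matter of a few property flags on NSUserActivity. A minimal sketch, with a hypothetical activity type and URL, and using the current Swift property names:

```swift
import Foundation

// Hypothetical activity representing a publicly viewable article in an app.
let activity = NSUserActivity(activityType: "com.example.reader.viewArticle")
activity.title = "A Guide to Italian Pasta"
activity.webpageURL = URL(string: "https://example.com/articles/pasta")
activity.isEligibleForSearch = true           // include in the on-device index
activity.isEligibleForPublicIndexing = true   // tag engagement as public
activity.becomeCurrent()                      // report it as the current activity
```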
\nHere’s how Apple updated its documentation in August:
\n\n\n Activities marked as eligibleForPublicIndexing are kept on the private on-device index in iOS 9.0, however, they may be eligible for crowd-sourcing to Apple’s server-side index in a future release.\n
Developers are still able to tag user activities as public. What is Apple doing with those entries, exactly? Another document explains:
\n\n\n Identifying an activity as public confers an advantage when you also add web markup to the content on your related website. Specifically, when users engage with your app’s public activities in search results, it indicates to Apple that public information on your website is popular, which can help increase your ranking and potentially lead to expanded indexing of your website’s content.\n
If this sounds confusing, you’re not alone. To me, this points to one simple explanation: Apple has bigger plans for web results and the server-side index with a tighter integration between native apps (public activities) and webpages (Applebot), but something pushed them back to another release. The result today is an inconsistent mix of webpages populating Search, which, as far as web results alone are concerned, is far from offering the speed, precision, and dependability of Google or DuckDuckGo in a web browser.
\nThere’s lots of potential for a search engine that uses web results linked to native apps without any middlemen. If it works as promised, iOS Search could become the easiest way to find any popular content from the web and open it in a native app, which in turn could have huge consequences for app discoverability and traffic to traditional search engines – more than website suggestions in Safari have already done.
\nHowever, this isn’t what iOS Search is today, and it’s not fair to judge the feature based on the merit of its future potential. The server-side index is clearly not ready yet.
\n\nToday, iOS 9 Search is useful to find content from installed apps. The ability to look for specific pieces of content and quickly get to them has been a major addition to my workflow and, if anything, the biggest hurdle in using Search has been remembering that I can now look for everything on my devices. After years of being used to opening apps first and finding content second, it’s hard to kick a habit entrenched in muscle memory.
\nFor their own apps, Apple has done a good job at ensuring content found in Search is properly displayed and highlighted once opened in the relevant app. When you open a message from Search, the message bubble in the Messages app is darkened to indicate it’s the selected result; a reminder opened from Search gets a bold title in the Reminders app; podcast episodes and Mail messages are also shown as selected items in the respective apps after you open them from Search. Part of this was already in place with iOS 8, and it’s been extended to more apps with iOS 9.
\nAs for third-party developers, it’s up to them to figure out ways to restore their app’s state when selecting search results, but most of the apps I tried with Search support – Dispatch, Drafts, iThoughts, and others – used similar techniques to update their UIs and restore results.
\nI’m still learning how to remember that I can now find information and documents more quickly thanks to Search. I’ve become a fan of the ability to look for songs and playlists in My Music and play them right away from Search – and I like how iOS adjusts the ranking of songs based on those I’ve been listening to recently. Searching for messages in Mail has been considerably faster and more accurate when done from Spotlight than the app’s own search feature. I love how the Podcasts app exposes show descriptions and notes to search, and I’ve grown accustomed to jumping to specific sections of the Health app and iThoughts via Search. I can’t wait to see what apps like Slack, Dropbox, Editorial, and Pocket will do with Search and how that will speed up the way I move across apps and tasks.
\nMy main concern with new data sources available for Spotlight is that Apple hasn’t built more advanced controls to choose how app content ends up in there. In iOS 9, it’s possible to turn off apps that populate results in Settings > General > Spotlight Search, but there’s no way to reorder apps and make sure that, for instance, Mail results are always at the top. This, combined with the way iOS 9 dynamically ranks results based on engagement and puts some of them in a Top Hits section, has caused me some confusion from a spatial perspective, as results aren’t always in the same position or in the same order.
\nAlso, because indexing local app content can be a CPU-intensive task, background updates to the database may not be immediate (though this has been sporadic in my tests) and the search functionality of NSUserActivity and CoreSpotlight is not supported on the iPhone 4s, iPad 2, iPad (3rd generation), iPad mini, and iPod touch (5th generation). Developers will have to carefully consider how to index their app content to avoid consuming too many resources, but I’m optimistic the system will scale gracefully on the latest hardware. Results on my iPad Air 2 come up almost instantly as I start typing a query, and they continue to update in real time as I add keywords.
\nApple hasn’t built a traditional document-based search feature in iOS 9. For the past two years, the company has been enhancing its Spotlight search tool with external integrations such as Wikipedia results, movie showtimes, and snippets of web results from Bing. iOS 9 expands that to account for the richness of data inside apps.
\nWhile users will be able to launch apps and look for documents in the traditional way, Search in iOS 9 is aware of the unique nature of apps, which may include activities, points of interest, sections of a document, and other subsets of content. In a post-PC world, it makes sense to have a new kind of search that focuses on what’s inside apps rather than filenames alone.
\nModern apps aren’t static containers of files. They’re rich experiences, and iOS 9 can index the activity that takes place inside them. App search in iOS 9 has lived up to my expectations. I’m waiting to see what Apple has in store for their server-side index.
\n\nDeep linking is where the pieces of Apple’s app search and navigation puzzle come together. iOS 9 marks Apple’s long-awaited foray into native deep linking, and the company is betting heavily on deep links as a superior way to launch apps, navigate them, index content, and share results.
\nDeep linking refers to the ability to link to a specific location within an app. Deep links are URIs that open apps into discrete navigation points, such as the Steps screen in the Health app, a profile view in Tweetbot, or an email message in Mail. With proper support from developers, any iOS app screen or activity can have its own deep link. Over the years, a number of third-party companies have attempted to establish cross-platform standards for deep links in apps; with iOS 9, Apple aims to provide deep linking support in every app natively.
\nDeep links provide structure. With deep links, individual app sections and activities can be restored when a user opens a link to them from Search. They power smart reminders so Siri can create todos that contain a deep link to reopen an app’s view. Smart App Banners, used by developers to match their websites to native apps, enable Applebot to associate web links with deep links and prioritize certain webpages over others when indexing webpages.
\nUnder the hood, deep links lay a new foundation for launching and navigating apps on iOS. This starts from the app launching animation itself: alongside a revised multitasking switcher, iOS 9 features a new transition for going from one app to the other that pushes the current app off to the left as the new one slides in from the right.
\nThanks to this, opening apps on iOS feels more like navigating pages of the OS – a metaphor that reinforces the idea of deep links capable of connecting specific sections of apps together. The animation feels faster than iOS 8 and it makes sense within the spatiality of iOS 9.
\nThe new app launching animation does more than offer visual eye candy – it showcases iOS 9’s deep linking capabilities in the status bar. New in iOS 9, every time you leave an app to open another one by following a link or a notification, you get a back button in the upper left corner of the status bar to return to the “launcher” app.
\nThe back button has long been a staple of Android devices; at least initially, it was surprising to see Apple follow a similar approach for iOS 9. The similarities with Android’s back navigation feature are only superficial: iOS’ back button is built into the status bar and it offers a shortcut to return to the app that launched the one you’re in; it’s overridden every time an app launches another one. If you open a link from Messages into Safari, the status bar will show a back button to return to Messages; if you tap a notification when in Safari and open Twitter, the back button will only return you to Safari.
\niOS 9’s back button doesn’t go back into the entire navigation stack of recent apps. It sticks to the status bar even if you move around an app, but it’ll disappear after two minutes. It’s not a persistent back button – it’s a temporary shortcut.
\nThe back button is useful when combined with deep links that open apps into specific views. When used from Search, the back button makes it easy to view a result, go back to Search with one tap, pick another, and so forth. It also makes tapping notifications less disruptive to the user experience, as returning to what you were doing is one tap away. With it, the role of the Home button is considerably diminished in iOS 9, as returning to the previous app is easier and more contextual.
\nThe placement of the back button is problematic. When following a link or a notification into another app, the button is displayed in the left corner of the status bar, hiding Wi-Fi and carrier information – an essential detail that tells us how our devices are connecting to the Internet. On more than one occasion, I found myself following links and wondering why an app wasn’t loading – I couldn’t tell whether the app was having problems or my 4G network had poor reception, because the back button was covering everything up. I wish Apple had thought of a gesture to manually dismiss the back button; on the iPad, they could have at least placed it next to Wi-Fi and carrier information given the bigger display.
\nI sympathize with the struggle to find a placement for this button. Ultimately, an OS with superior deep linking features benefits from a system-wide shortcut that lets users navigate back and forth between apps as they would with webpages, and that corner of the status bar is the least intrusive option. It’s not perfect but it could have been worse; in practice, it’s useful and consistent.
\nApple’s plans for deep linking extend beyond search, smart reminders, and the back button. With iOS 9, Apple is introducing Universal Links, a way to launch and link to apps with regular web links. With Universal Links, Apple is letting websites and apps communicate through the common thread of a URL, with verifiable ownership, graceful fallbacks, and cross-platform support.
\nA Universal Link opened from Safari (back button) in MeisterTask.
Universal Links are meant to offer a superior option to custom URL schemes for launching apps and sharing links to them. A Universal Link is a web link that iOS 9 can open in a native app instead of the app’s website, provided developers upload a configuration file to their server and enable the capability in Xcode. Upon first launch of an app with Universal Links support in iOS 9, the system checks for the configuration file on the app’s server; from that point on, whenever possible, HTTP links to that domain will open in the app, showing the deep linked view.
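The server-side piece is an apple-app-site-association file that declares which apps may open which paths on the domain. As a hedged sketch – the team identifier, bundle ID, and paths below are placeholders for a hypothetical twitter.com integration – the file looks something like this:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.example.twitterclient",
        "paths": [ "/*/status/*", "/i/*" ]
      }
    ]
  }
}
```

The file is served over HTTPS from the domain’s root, which is how iOS verifies that the domain owner actually vouches for the app; paths not listed keep opening in Safari.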
\nTo understand how Universal Links work, imagine that Twitter will start supporting them for twitter.com URLs and their iOS app. Every time you tap a link to a tweet on iOS 9, that link will open the tweet in the native app instead of Safari if you have it installed. If you don’t have the app or share the link with someone who doesn’t have the app, the link will open in the browser as a fallback because it’s a normal URL.
\nImagine this for links to shared projects in a todo app, songs on a streaming service, Slack uploads, Overcast podcast episodes, or Google Docs files. With a regular HTTP link, iOS 9 will take you to the content you’re looking for inside a native app. Universal Links are platform neutral: if the app isn’t installed, they go straight to the web anyway.
\nThe flow of a Universal Link from Google to IMDb’s app and webpage.
While writing this review, I was able to test a version of IMDb with Universal Links. When opened from Safari, Google, Search, or any other app, imdb.com links automatically opened in the native IMDb app, showing me the content – such as trailers or movie pages – that I would have seen on the web by default in iOS 8.
\nUniversal Links are meant to provide a safe, cross-platform way to share links to content in apps without relying on custom URL schemes. Instead of linking to a user profile in Twitter with the custom twitter:// URL scheme, you’ll be using the same twitter.com links you see in the browser every day. With a custom URL scheme, tapping the URL does nothing if you don’t have the app installed. URL schemes are local; Universal Links are global and local at the same time.
Universal Links carry important benefits over the old way to link to specific areas or features of apps. Universal Links are always mapped to the right app: while different apps can claim the same URL scheme on iOS, Universal Links work by matching an app with a JSON file on the app’s server; this ensures that links from a certain domain can only open in its associated app. In Twitter’s case, this could mean that, if installed, twitter.com links will always launch the official Twitter app, and The Iconfactory and Tapbots won’t be able to do anything about it as they can’t control the twitter.com server.
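On the app side, a matched link arrives as a continued user activity rather than through a URL scheme handler. A minimal sketch of the receiving end in the app delegate, assuming a hypothetical MyRouter that maps URLs to the right view (Swift 2-era API names, as shipped with iOS 9):

```swift
import UIKit

// iOS 9 hands a Universal Link to the app as an NSUserActivity of type
// NSUserActivityTypeBrowsingWeb, carrying the original web URL.
func application(application: UIApplication,
                 continueUserActivity userActivity: NSUserActivity,
                 restorationHandler: ([AnyObject]?) -> Void) -> Bool {
    if let url = userActivity.webpageURL
        where userActivity.activityType == NSUserActivityTypeBrowsingWeb {
        // Route, e.g., https://twitter.com/viticci to the profile view.
        return MyRouter.showViewForURL(url) // hypothetical router
    }
    // Returning false tells iOS the app couldn't handle the activity.
    return false
}
```

Because the payload is a plain web URL, the same routing logic can serve Universal Links, Handoff, and Spotlight results.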
\nWeb URLs were the obvious choice for linking to apps: they’re omnipresent in today’s communications, they work everywhere, and they’re the common language of the web.
\nUniversal Links are designed to not be noticed and to feel as seamless as possible. For the most part, that’s exactly what using them is like – you tap a link and, if it’s a Universal one, it’ll open in an app.
\nThere are some aspects of the process that you can control. When opening a Universal Link, iOS 9 will display a forward button on the right side of the status bar (opposite to the back button) to give you the option to view the link in Safari instead. The same issues mentioned for the back button apply here as well, as the shortcut takes over battery and Bluetooth icons (and looks comically alone on the iPad). However, the ability to jump from native app to webpage with one tap is convenient, and I couldn’t imagine any other place for it.
\nIf you choose to view a Universal Link in the browser, a banner will sit atop the webpage with an Open button to return to the native app if you change your mind. This is the equivalent of a small Smart App banner, but it’s not as obtrusive. It’s a nice idea, and it lets you cycle through native app and web view for a Universal Link with one tap.
\nAltogether, iOS 9’s new deep linking features make for a unified app experience built with speed, security, and consistency in mind. They’re also signs of a mature OS and app ecosystem that are ready to talk to each other with links that connect app content to system features, eschewing the numerous hacks and workarounds of custom URL schemes.
\nGoing back to iOS 8’s app switching design and limitations feels cumbersome after trying iOS 9. Deep links and Universal Links dramatically speed up moving between apps – and they reduce the Home button to a mere hardware option for going back to the Home screen. The back button, while not perfect and perhaps a bit inelegant at times (reaching it also requires a certain dexterity), is a great shortcut, and I can’t imagine using iOS without it now. I wasn’t able to try many apps with Universal Links support, and while they won’t be suitable for apps that don’t rely on web content, I believe they offer a superior option to URL schemes in every way.
\niOS apps are starting to feel less and less like silos. Aided by the back button, deep links and Universal Links are another step towards more interconnected apps.
\n\nFor many of us, iOS devices are the most important computers in our lives. With iOS 9, Apple is rolling out a series of features for proactive recommendations and intelligent suggestions aimed at making the devices we use every day smarter, more contextual, and personal. The results are mixed, but the beginning of something new is afoot.
\nThe source of all of iOS 9’s proactive and intelligent features is our data and daily routine. By using information we store in system apps such as Mail and Calendar and by observing our habits, iOS 9 can discover patterns in the apps we use, when and where we tend to use them, and it can offer shortcuts to show us what we’re most likely going to need next. In broad strokes, this is at the core of Apple’s proactive initiative: by learning from our habits and data, iOS can be a more helpful assistant in everyday life. Unlike similar efforts by other companies (namely Google), Apple has prioritized user privacy when building these functionalities into iOS 9, a design choice with a deep impact on what the OS is capable of suggesting.
\nIntelligent and proactive recommendations are scattered throughout iOS 9 with shortcuts in various places. Some of them are labeled as Siri features, while others are new options in existing apps that use content from other apps to save time with suggestions.
\nAs a starting point, the new Search page features Siri suggestions for contacts and apps. Displayed at the top of the page and replacing the old contact shortcuts of the iOS 8 app switcher, these shortcuts don’t reflect contacts explicitly marked as favorites or apps you’ve recently used. Rather, these suggestions are based on what iOS thinks you’re going to need.
\nRecommendations are informed by different variables – frequency of use, time of day, day of the week, current location, and other patterns the OS spots – that are used to build suggestions for apps and people. There’s a chance you’ll see an app you only use on Thursdays and friends you only contact during the weekend. One time, I got a suggestion for my Shopping list in Reminders, and I later realized it was because I was at the grocery store and iOS had memorized the list I frequently used when shopping there.
\nI also saw a recommendation for this review’s EPUB in iBooks (with a read completion status) when I was assembling the eBook and constantly checking it out in the app.
\nWhen working as advertised, Siri suggestions are handy because they bring serendipitous discovery to the Search page. And because they’re not a static set of favorites but a dynamic, continuously updating list of shortcuts that learn from you, they adjust alongside your routine and what you’re likely to do next.
\nFor example, I use the Do Button app every night before sleep to tell my girlfriend the exact time I went to bed with an email. Now, iOS 9’s Search page shows me a shortcut to that app every night between 3 and 5 AM, when I typically go to bed. I’ve seen suggestions for Google Maps at specific places where iOS knew I was going to start navigation in the app, and a couple of friends pop up on Saturdays because I tend to text them and ask them to go out for dinner (when tapping a contact, iOS displays phone, message, and FaceTime actions for it). iOS 9 has picked up some of my habits, and when I come across a suggestion that is accurate, I’m glad iOS is helping me save time.
\nThat’s not always the case, though. The patterns I described above are fairly easy to spot, as they’re repeatable, discrete routines that stand out from everything else. But I use my iOS devices all day every day, and I switch between apps and conversations a lot. The average result is that, for me, Siri suggestions in the Search page are mostly a random selection of shortcuts, with the occasional gem that appears at regular intervals when needed. That’s the problem with a general purpose suggestion feature based on “patterns”: when you use your iOS device too much (as I do), app and contact suggestions tend to feel like a lottery. iOS can’t spot many distinguishable patterns just by looking at which apps I launch.
\nApps like Slack, Twitterrific, and Twitter are always listed in my Search page, but that’s because I always use them. When I don’t see Twitter and Slack, I see a repeat of apps I’ve recently used – the same entries from the app switcher. Do these make sense as persistent suggestions intermixed with shortcuts based on specific times of the day and locations? Wouldn’t it be better to identify the apps I’m constantly using and display them in some kind of separate Top Hits view? Of course I know I’m going to be reading Twitter and Slack. There’s no point in iOS telling me to do so.
\nAnother issue, I believe, is that Siri suggestions in this screen don’t have any explanation attached to them. iOS 9 doesn’t say “Good morning, here’s what happened in Slack last night and I think you’re going to need CityMapper next because you take the subway on Tuesdays”; it just brings up a bunch of shortcuts, leaving it up to you to figure out if they’re useful or not. Sometimes, they are, and it feels nice. Most of the time, they are too generic to warrant a top spot in the Search page, and I think of turning them off entirely. I haven’t yet because I’ve seen how they can be useful at times, but I’d like them to be more than recently used apps.
\nAlong the same lines, the Search page offers shortcuts for Nearby businesses in Maps. Besides the fact that, for my area in Rome, Apple’s business database continues to be outdated and lackluster, I don’t understand the kind of Maps suggestions iOS gives me.
\nI would expect these recommendations to account for my habits (as tracked by iOS’ Frequent Locations feature) and likely needs, showing me businesses relevant to my routine and the time of day. Instead, iOS’ Nearby section has shown me all sorts of business suggestions at the most disparate times: convenience stores at 9 PM alongside “Nightlife” POIs; coffee shops and restaurants at 3 AM; “Fun” and “Transport” suggestions in the afternoon, which usually don’t go well together.
\nIn three months, I have never found the Nearby suggestions in the Search page to be useful. The categories are too broad for me to understand at a glance whether the place I’m looking for is “Fun” or “Nightlife”, and the time of the day when they are displayed is usually not in line with what I do on a daily basis. I would prefer iOS to provide me with practical advice for individual places I frequently visit, such as the current traffic to get to my neighborhood supermarket or weather conditions at my favorite beach. Alas, this kind of detail and personalization isn’t available for Nearby suggestions, and that’s disappointing.
\nThe other system-wide proactive mechanism of iOS 9 is standalone app recommendations. In this case, iOS will suggest an app to launch in the same area where Handoff for apps is displayed in the Lock screen (bottom left) and app switcher (at the bottom in iOS 9).
\nApp suggestions can be displayed in the Lock screen (left), or in Handoff.
These shortcuts, like the Search page’s, account for different variables to recommend an app you’re likely going to need. Because of their placement in the UI, they’re more easily noticed when your device enters a scenario that iOS identifies as a pattern.
\nIn addition to time of the day and location, iOS 9 can monitor Bluetooth and audio connections and guess which app you may need when that happens. If you tend to open the Music app after plugging in your EarPods, iOS will bring up the Music icon and media controls in the Lock screen, or it’ll display a shortcut in the app switcher telling you that you can open Music because an audio connection has been detected (I like how these suggestions come with an explanation). Or, if you like to watch Netflix with your Beats Wireless on, iOS will also spot that pattern and recommend Netflix as soon as a Bluetooth connection is established.
\nHandoff has a new location in iOS 9.
The same variable can lead to different recommendations at different times of the day. Listen to audiobooks on your way to work in the morning but to podcasts when going back home? iOS 9 will show different apps for those two scenarios, learning and adjusting over time.
\nApple also added contextual awareness support for getting in and out of the car in iOS 9, combining that with proactive suggestions. This isn’t well documented by Apple, but as seen with Reminders, iOS 9 has the ability to recognize user presence in a car by looking at connections to generic car Bluetooth devices as well as CarPlay. The car becomes another dimension for smart suggestions in iOS 9, which can give you app shortcuts based on what you do – such as listening to podcasts or music when driving – but also traffic notifications for where you’re most likely going. The idea of using the car as another layer of user patterns is an intriguing one, and while I couldn’t test this because I don’t have Bluetooth in my car, impressions from hundreds of users I polled on Twitter were positive.
\nI find individual app suggestions to be nice, and generally more timely and relevant than what I see on the Search page. While not revolutionary, it’s nice to be able to quickly open Music or Overcast when my Beats are connected via Bluetooth, and I’ve been surprised by how iOS picked up that I was going to need Google Maps or Nuzzel at specific times of the day, putting them on the Lock screen. As more and more sensors fill our homes and clothes going forward, I fully expect iOS to gain support for deeper context recognition – imagine suggestions powered by HomeKit devices, proximity to an Apple TV, or beacons.
\nThat’s not to say that app suggestions in the Lock screen and app switcher are perfect: I’d still like to see more targeted suggestions for Music, as iOS hasn’t figured out that I like to listen to Death Cab for Cutie every night before bed. The granularity and timeliness granted by audio and Bluetooth connections have led to more useful app suggestions in my experience, but they can improve.
\nNext up is Mail, which iOS 9 uses as an information database in three ways. In Contacts, you can search for people found in Mail and add them as new contacts with some fields already filled in, or you can add new email addresses to an existing contact, as iOS will match the same person between Contacts and Mail. This has been useful to update old contacts with new email addresses from my own correspondence.
\nSecondly, when receiving a phone call from a number that’s not in your contacts (we all dread those phone calls), iOS 9 tries to discover who it is by looking into Mail messages. I only had that happen once, but it worked as expected.
\nLast, iOS is more proactive in offering to create new calendar events or contacts from messages that contain such information. A new banner displayed at the top of a message provides a shortcut to create new events and address book entries with one tap, which is a more visible option than iOS’ existing support for smart data detectors in message bodies. iOS 9 can also detect events from messages that contain flight details or restaurant reservations and put them in Calendar for your consideration; you can choose to turn off event suggestions based on Mail in Settings.
\nUsing Mail as a repository of information for other apps is an interesting idea: despite its somewhat archaic nature, a lot of our communications and notifications still come through email, and scanning Mail to bring up shortcuts and suggestions seems like a good idea to me (and I would like to see more of it). If anything, these features highlight the benefit of using Apple’s native Mail and Calendar apps over third-party clients, which will get none of these integrations. I wouldn’t be surprised to see users keeping Mail fetching messages in the background without using it just to take advantage of suggestions.
\niOS 9’s proactive suggestions for Contacts and Calendars don’t stop at Mail. When I was on vacation with my girlfriend, we used Apple Maps in Positano to browse restaurants nearby and call them to make a reservation. I didn’t have their phone numbers in my address book, but I noticed that iOS 9 used the restaurant’s name (taken from Yelp, I’d guess) in the Recent Calls screen, which was a nice touch.
\nIn Calendar, events that contain an address now offer the option to receive a Time to Leave notification, which alerts you when it’s time to leave for an event based on your current location and traffic. When the notification arrives, you can snooze it for later or tap it to see directions and get going.
\nAt 2 AM, it takes me 5 minutes to get there, not 16.
This is another interesting idea, but it hasn’t worked well for me in practice. Events that I knew would take me 10 minutes to get to a location consistently sent Time to Leave notifications 25-40 minutes ahead of time. Not even after “teaching” the system that my driving style and traffic weren’t as imagined by its intelligence did iOS learn that there was no need to send a notification 40 minutes early. I’ve wondered if the wiggle room between the notification and event time could be cultural: perhaps Americans like to arrive early at their events and don’t mind waiting 20 minutes while sipping on their ventis. But in Rome, there’s no such thing as arriving early or spending 20 precious minutes to wait for someone else. Time to Leave is a cool idea, not suited for my habits and local traffic.
\nThere are more intelligent and proactive features throughout the Search page and the OS, but I’ve found their realization to be dull or their impact minimal. You can ask Siri to bring up photos from specific time periods and albums. The Search page has a News section at the bottom, which, as a European using an iOS device with a US region format, I found to be an unappealing mix of news about presidential elections, football, and TV spoilers; in theory, iOS should be able to display news relevant to my location, but given that the topics have always been American-heavy, I imagine iOS bases its news collection on the user’s region format. You can also ask for weather, calculations, and sports results in the Search page, but only calculations and unit conversions worked for me (I’d call them an expected utility more than an intelligent feature).
\nOur devices are becoming smarter and more context-aware every year, but Apple’s foray into intelligence and proactive suggestions doesn’t substantially alter the user experience of an iOS device. Instead, what Apple has put together is a mix of sometimes-working, nice-to-have additions that feel unfinished or poorly realized. Suggestions in the Search page leave much to be desired, with useful patterns obfuscated by generic and unmotivated shortcuts. Standalone app recommendations based on location and audio connections, along with Mail’s intelligent scanning, are where Apple’s vision feels clear and coherent, with delightful discoveries that can save some time every day.
\nIf you’re accustomed to the level of automated intelligence in Google services such as Google Now and Inbox, iOS 9 won’t offer that kind of experience. The Search page is far from the uncanny precision of Google Now when plugged into all of your Google data, and Mail’s new shortcuts pale in comparison to the automated processing and organization tools found in Inbox.
\nThis is by design: while Google can pull it off thanks to their expertise and investment in looking at patterns across all of your data (which happens to be their business model), Apple has decided to prioritize user privacy as much as possible. That’s why proactive suggestions are (mostly) processed directly on-device, with Apple never syncing any of your usage patterns between devices or matching data from one Apple service to the other to build a more complete profile of you.
\nIt comes down to personal preference and the level of potential creepiness you allow in your computing life. Google tends to deliver impressive intelligent features and shortcuts, at the expense of a wealth of data given in return. Apple’s efforts in iOS 9 are more modest in scale, but deeply integrated with the OS and built with privacy in mind.
\nIt’s not about arguing who’s better; it’s about choosing what works better for you. Are you comfortable with Google’s impressive intelligence and suggestion tools knowing that they need as much data about you to power them, or do you prefer Apple’s more private but also less effective approach?
\nMy stance on these issues has changed a lot over the past two years, especially after trying Inbox and Google Now, and having to go back to Google Apps’ Gmail due to slow IMAP sync and search. While I conceptually don’t like the fact that my data is being used by an army of algorithms, the service I get in return is useful and lets me work faster every day. When it comes to faster work, measurable efficiency trumps ideological stances. As an Italian movie once said, you can’t buy groceries with ideals.
\nIn Google’s Inbox, search is crazy fast, the app detects sentences that look like reminders, and it categorizes emails for me. When I was in San Francisco earlier this year, the Google app automatically pulled in my flight and hotel information from Gmail, displayed it with handy cards in the main view, and it figured out when I arrived at SFO and showed me weather reports and currency exchange rates.
\nThat was pretty amazing and useful, and it’s not something that iOS 9’s intelligence is able to provide just yet. And I have to wonder if it ever will, given that what Google does – the depth of its user tracking and cross-service integration – could only be possible with constant, cloud-based data collection that doesn’t fit with today’s Apple.
\nImagine, though, if Apple was willing to look for patterns inside apps, understanding what we write in private communications and what we search for to spot more useful patterns than app launches and EarPods connections. Would they have the skills required to build such intelligence? Would they want to?
\nUltimately, it’s not fair to compare iOS 9’s intelligence to Google services: Google will never have this kind of access to device hardware and daily user patterns. iOS 9 delivers on small, periodic proactive enhancements that are meant to save time and surprise users. Their impact is not dramatic: some of them are nice shortcuts, but by not deeply aggregating user data from multiple sources, most of them are generic and stale.
\nCaught between the tension of respecting user privacy and deepening data collection for proactive features, will Apple be able to ship more useful suggestions in the future? And how will they build it all?
\nThere’s a lot of work to do. It’s up to Apple to figure out what their ideals can allow.
\n\nEveryone struggles with battery life. Talk to any iPhone owner, and you’ll never hear them wish for shorter battery life on their device. There’s no such thing as enough battery. With iOS 9, Apple is taking some steps toward improving battery life on all devices, with particular attention to the iPhone.
\nAcross the entire OS, Apple claims to have optimized apps and key technologies to be more efficient and consume less energy. On top of this, iOS 9 can use the iPhone’s proximity and ambient light sensors to detect when it’s lying facedown on a surface, and it won’t turn on the screen when a notification comes in. By itself, this sounds like a small change, but if you receive a lot of notifications every day, every drop counts.
\nApple has increased the information displayed for battery usage, too. In the new Settings > Battery page, the list of apps and system functions consuming energy on your device includes some new entries (such as energy consumed by recently deleted apps) as well as additional detail that can be displayed by tapping the list of apps or the clock icon in the top right. This will list the minutes spent by apps in the foreground and background for the selected time range, which makes it easier to assess the consumption of an app depending on actual usage. More importantly, it simplifies the process of determining whether keeping background app refresh turned on for an app that consumes a lot of energy is worth it.
\nThe protagonist of Apple’s battery life improvements is Low Power Mode. Available exclusively on the iPhone, Low Power Mode temporarily reduces power consumption by reducing or turning off Mail fetch, background app refresh, automatic downloads, and “some visual effects”. CPU and GPU performance may also be reduced, and a 30-second Auto-Lock is enforced. When Low Power Mode is active, the iPhone’s battery icon turns yellow, and the battery percentage is displayed even if you normally keep it off.
\nLow Power Mode can be activated in two ways. If you want to activate it manually, you can go into Settings > Battery and toggle it. There’s (almost) nothing stopping you from running your device in Low Power Mode all the time. The most effective way to activate Low Power Mode, though, is to wait for your device to reach 20% or 10% of battery left: when that happens, an alert will offer to turn on Low Power Mode and save energy until you can charge your iPhone.
\nLow Power Mode isn’t some kind of placebo effect. During the iOS 9 beta period, I activated Low Power Mode every time my iPhone 6 Plus reached 20% of battery left, and I noticed how it gave it a few extra minutes (about 30 in my experience) I could use to get home via Maps navigation or to go back to my car and charge it.
\nSurprisingly enough, Apple has thought about the possibility of users keeping Low Power Mode always on, and to discourage such usage – which could lead to an inferior user experience when there’s no need for it – iOS turns it off automatically once an iPhone reaches ~80% of charge (Apple calls it a “sufficient level”). This doesn’t make it impossible to use an iPhone in Low Power Mode when it isn’t running short of battery, but it should be a deterrent.
\nIn practice, I didn’t notice any major functional difference when Low Power Mode was on. The “visual effects” that iOS 9 turns off are the animations of Dynamic Wallpapers and the parallax effect in Perspective Zoom, neither of which I use, so disabling them doesn’t change the visual appearance of my iPhone. If other effects are reduced, I can’t notice: translucencies and other zoom animations are still available in Low Power Mode, so it’s not like iOS turns into a static, opaque mix of colors when Low Power Mode is enabled. As Apple states, background app refresh and Mail message fetch are also disabled in this mode – a fair trade-off when battery life is at stake. For the most part, Low Power Mode is unobtrusive, and it yields the results promised by Apple.
\nWhat’s going to be interesting, I think, is how third-party apps will react to Low Power Mode to reduce their own energy consumption. iOS 9 adds a new Foundation API: a lowPowerModeEnabled property on NSProcessInfo that changes its state when Low Power Mode is enabled. Developers can also listen for a NSProcessInfoPowerStateDidChangeNotification that, as the name implies, informs an app of the state of Low Power Mode. Apple is advising developers to check the state of Low Power Mode in their apps, reducing costly computations such as frequent network activity and high frame rates to save energy.
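As a hedged sketch of what that adoption might look like – the FeedRefresher type and its polling methods are hypothetical, but the NSProcessInfo API is the one iOS 9 ships (Swift 2-era names):

```swift
import Foundation

// Hypothetical refresher that polls a server periodically; only the
// NSProcessInfo calls below are real API – the rest is illustrative.
class FeedRefresher {
    init() {
        // Re-evaluate the polling interval whenever the power state flips.
        NSNotificationCenter.defaultCenter().addObserverForName(
            NSProcessInfoPowerStateDidChangeNotification,
            object: nil,
            queue: NSOperationQueue.mainQueue()) { _ in
            self.adjustPolling()
        }
        adjustPolling()
    }

    func adjustPolling() {
        if NSProcessInfo.processInfo().lowPowerModeEnabled {
            // Be a good citizen: poll every 5 minutes, skip image downloads.
            setPollingInterval(300)
        } else {
            setPollingInterval(60) // normal one-minute refresh
        }
    }

    func setPollingInterval(seconds: NSTimeInterval) { /* schedule timer */ }
}
```

Checking the property once isn’t enough – the user can toggle Low Power Mode at any time, which is why the notification observer matters.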
In Breslan’s case, his app Departure Board could support Low Power Mode by checking its train departures API less frequently than once a minute, letting an iPhone consume less energy by pinging the server less often. I tested a few apps with Low Power Mode integration in iOS 9: the email client Dispatch, for instance, is going to disable CoreSpotlight indexing and profile pictures in Low Power Mode to avoid indexing new messages in the inbox and fetching avatars from the Internet.
\nMax Litteral’s Television Time, a TV show tracker, takes similar measures to reduce CPU consumption when the user enables Low Power Mode: the app will stop animations when loading images into table view cells, it’ll disable downloading and animating thumbnails in search results, and it’ll also disable face detection on show posters. Litteral is looking into disabling show sync in Low Power Mode as well, which would further reduce energy consumption by stopping network activity.
\nThe work that Apple has done at a system level to prolong battery life with more efficient apps and Low Power Mode should already be enough to give most users a few minutes of extra battery. But in an ideal world, additional savings could be granted by third-party developers being good platform citizens and supporting Low Power Mode by adjusting their apps’ behavior to consume less energy whenever possible.
\nPerplexingly, Low Power Mode isn’t available on the iPad – a missing feature in stark contrast with the iPad focus of iOS 9. I don’t understand what pushed Apple to make Low Power Mode an iPhone-only perk, and I’d like to see it come to the iPad as soon as possible. Even if the iPad isn’t as essential as an iPhone in emergency scenarios, Low Power Mode could give iPad users some extra minutes of battery life every day.
\nThe last two major versions of iOS have been riddled with bugs and performance issues in their first releases, and it’s good to see Apple prioritizing efficiency, storage, and stability this year. In addition to battery-themed enhancements, iOS 9 introduces smaller software updates for consumers (which I couldn’t measure with developer betas), an option to install iOS updates at a later stage, and App Thinning, a set of optimization tools (slicing, bitcode, and on-demand resources) to decrease the footprint of apps by delivering only the resources needed by a user’s device. I couldn’t test App Thinning either, but I expect it to have major implications for the future of local device storage, App Store downloads, and gaming content.
\nStability-wise, iOS 9 is a step up from iOS 8. In my experience, I have seen only a couple of Home screen crashes and random reboots in three months of iOS 9 betas (down from at least twice a week, both on the iPhone and iPad), fewer interface glitches, and increased stability on the iPad Air 2 with Apple apps, using the app switcher, and extensions. New multitasking features on the iPad Air 2 have been rock solid, with a smooth Slide Over, stable Split View, and flawless Picture in Picture.
\nUnfortunately, memory pressure continues to be an issue on the iPhone 6 Plus, which can still lag behind the Air 2 in terms of frame rate and which often purges apps from memory to free up resources for the current app. On the 6 Plus, I’m still seeing Safari frequently reload tabs after switching to another app, a slower-than-usual Camera, and slightly delayed touch responses when moving between apps.
\nApple’s house cleaning in iOS 9 offers a superior experience to older versions of iOS – at least on recent hardware – and it removes much of the cruft that had accumulated over the past two years. The real effect of features such as App Thinning and Low Power Mode for apps will only be seen in the coming months as developers opt into them, and it’s intriguing to imagine a future where they’ll be the norm.
\n\nThere are dozens of other features in iOS 9 that, either in Apple apps or developer APIs, make iOS devices faster and more efficient. A few highlights:
\nUpdated setup process. By default, iOS 9 prompts users to create a complex (6-digit) passcode when setting up a device for the first time. Other passcode types (such as a 4-digit numeric code) are still available by tapping an options button. iOS 9’s setup also features a new option to move data from an Android device with a dedicated Android app made by Apple.
\nYes, these are glorious.
Notification quick replies for third-party apps. In iOS 9, apps can implement quick replies in actionable notifications. Introduced with Messages last year, Apple has opened up the API to developers this year, allowing them to mix custom buttons with a text field to type a reply inside the notification without opening the app. I tested this feature with beta versions of Twitter clients and other apps, and it’s incredibly convenient. I expect all messenger-type apps to adopt this.
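At the API level, a messaging app registers a text-input action alongside its notification category. The sketch below uses the real iOS 9 `UIUserNotificationAction` API (the `.textInput` behavior is the new addition); the identifiers are hypothetical:

```swift
import UIKit

// Build a notification action that presents a text field (new in iOS 9).
let replyAction = UIMutableUserNotificationAction()
replyAction.identifier = "reply"           // hypothetical identifier
replyAction.title = "Reply"
replyAction.behavior = .textInput          // the iOS 9 addition
replyAction.activationMode = .background   // handle the reply without opening the app

// Group the action into a category that incoming notifications reference.
let category = UIMutableUserNotificationCategory()
category.identifier = "message"            // hypothetical category
category.setActions([replyAction], for: .default)

// Register the category with the system. The typed text later reaches the
// app delegate's handleActionWithIdentifier callback through
// responseInfo[UIUserNotificationActionResponseTypedTextKey].
let settings = UIUserNotificationSettings(types: [.alert, .badge, .sound],
                                          categories: [category])
UIApplication.shared.registerUserNotificationSettings(settings)
```

Because the action runs in background activation mode, the app can post the reply to its server without ever coming to the foreground.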
\nTrust developer certificates. Installing beta versions of third-party apps from outside TestFlight gets a little more convoluted (but also more secure) in iOS 9 with a new system that requires you to manually trust enterprise developer certificates. To do so, you’ll need to go to Settings > General > Profiles > Enterprise Apps and select an installed app to trust; otherwise, it won’t launch.
\nSafari can save PDF to iBooks. In iOS 9, you no longer need a third-party app to save a webpage as PDF. A new iBooks share extension lets you save any webpage as a PDF document in the app – perfect for reading articles later or sharing PDFs via email and annotating them with Markup.
\nWi-Fi Assist. Available in Settings > Cellular Data, this new toggle allows an iOS device to use its mobile network when Wi-Fi connectivity is poor. A lot of people have welcomed this option, but I prefer to keep it turned off, as I like to know exactly which network I’m using, and the automatic switch between cellular and Wi-Fi isn’t made clear in the UI.
\nReplayKit for recording gameplay. For the first time, Apple is offering game developers an API to natively record gameplay without using a third-party SDK. ReplayKit can be initiated automatically by an app as the user is playing, or it can be started manually. Apps can ask users to record the screen only or also access the microphone to include game commentary. A video file generated with ReplayKit is then passed to the system share sheet in-app, so users can save it to the Photos app, send it to other apps, or share it on social networks. With ReplayKit, creating Let’s Plays and video reviews for iOS games should be easier and better integrated with the system.
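A minimal sketch of how a game might start and stop a ReplayKit session follows; `RPScreenRecorder` and its start/stop calls are the real iOS 9 API, while the view controller wiring is assumed and error handling is abbreviated:

```swift
import ReplayKit
import UIKit

class GameViewController: UIViewController {
    let recorder = RPScreenRecorder.shared()

    func beginRecording() {
        // Ask to record the screen, optionally capturing microphone commentary.
        recorder.startRecording(withMicrophoneEnabled: true) { error in
            if let error = error {
                print("Could not start recording: \(error)")
            }
        }
    }

    func endRecording() {
        recorder.stopRecording { previewController, error in
            // ReplayKit hands back a preview controller where the user can
            // trim the clip and pass it to the system share sheet.
            if let preview = previewController {
                self.present(preview, animated: true, completion: nil)
            }
        }
    }
}
```

Note that the user, not the app, ultimately decides what happens to the video: the recording never leaves the preview controller unless they choose to save or share it.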
\nShared Links extensions. Apps in iOS 9 can plug into Safari’s Shared Links section to give users the ability to view links alongside the RSS and Twitter links already supported in the browser. I suspect that news readers, social apps, and RSS apps will adopt this extension type to make their items available in Safari, but I haven’t been able to test any of them yet.
\nMaps Nearby and route delays. In addition to public transit information, Maps for iOS 9 features Nearby categories (a more complete section of the same options in Search) to browse businesses nearby and explore places around you. Also, during navigation, Maps will show banners at the top of the screen for upcoming closed roads, roadwork, and faster routes.
\nSwipe down to dismiss photo previews. In iOS 9, swiping down when viewing a photo in Photos or Messages lets you close the preview instead of reaching for the Done button at the top of the screen. It’s another example of a sloppy gesture that simplifies interaction and cuts down the number of taps required to operate the OS. I’m using this every day now.
\niPad Today view gets bigger. The Today view of the iPad in landscape mode has received a new two-column layout split into a Today view and a Widgets view. Widgets can still be added as before, but now you can choose to display them in a bigger column on the left (Today) or in a narrow column on the right. This brings better visual organization of widgets without any functional difference between the two columns, and it takes advantage of the bigger screen to show more widgets at the same time. I’m looking forward to trying this improved layout on the iPad Pro.
\nNew widgets. Speaking of the Today view, iOS 9 adds new widgets for Find My Friends and for checking the battery level of the device itself and of connected Bluetooth accessories. The new ‘Batteries’ widget is automatically installed once you pair an Apple Watch with your iPhone – and it’s been a convenient way to check on my Watch battery level directly from my iPhone. Plus, iPhone, Apple Watch, iPad, and even Beats wireless headphones get nice icons to complement the widget’s appearance.
\nSpotlight unit conversions and calculations. The improved Spotlight of iOS 9 goes beyond Search, borrowing from OS X to let you perform quick calculations and unit conversions. You can convert currencies (type or dictate “10 USD to EUR”), perform operations (including with values such as “pi”), convert temperatures, and more. I’ve been using this regularly, and it reduces the number of apps I have to keep on my devices for basic conversions and operations.
\nApple Music gets a new contextual menu. The Music app went through a major redesign for Apple Music and Beats 1 in June, and iOS 9 brings a revamped contextual menu that packs more options into a cleaner presentation. The menu now displays larger album artwork at the top that clearly indicates you can tap it to go to the selected item. Underneath it, new icons allow you to love a song, start a station, and share. A good improvement, as Apple continues to fix and clean up the issues that have affected the new Music app since launch.
\nLast, because of limitations in my country, unavailable hardware, or features that can only be tested after the public launch of iOS 9, here’s a recap of what I couldn’t test for this review:
\nIn some ways, iOS 9 feels like the third and final installment of the iOS 7 saga.
\nWith San Francisco, a revised keyboard, and an interface that’s been polished across the OS, Apple’s new design language has moved past its awkward (and problematic) teenage years to accept its own style and voice. Interfaces aren’t diamonds: they’re not forever, and iOS will change again. For now, iOS 9 is, visually speaking, a culmination of the work started two years ago, ready for what comes next.
\nChanges to default apps and new system features are also iterative improvements that complete Apple’s post-iOS 7 vision and get rid of problems accumulated so far. Podcasts, Mail, iCloud Drive, and new features in Safari don’t revolutionize those apps, but they make them substantially better for everyone. Deep links and Universal Links show that Apple has long been thinking about doing more than URL schemes to simplify opening and linking to apps. Safari View Controller extends Safari to web views in any app. User activities and app indexing for Search prove that Apple often plays the long game, showing its hand only when every technology is in place for an arguably basic utility (Spotlight) to become something new.
\niOS 9 isn’t Apple’s new Snow Leopard. Just because some additions and changes may not be as massively popular or be instantly recognizable as a new design and custom keyboards were, it doesn’t mean they don’t exist. The company has put more resources into optimizing iOS, and early results are encouraging. Low Power Mode makes a difference when battery is running low, and developers will be able to support it in their apps; the entire OS feels snappier and more stable in daily usage.
\nThis isn’t the year of No New Features: under-the-hood optimizations and app enhancements walk together in iOS 9, with Notes being a prime example of it. Apple isn’t taking its foot off the pedal: they’re just being more careful behind the wheel.
\nAlong the way, there are some missteps and disappointments that will have to be taken care of in the future. Mail is lagging behind the innovation we’re seeing in email apps from large companies and indie studios. iCloud Drive is far from the functionality offered by Apple’s OS X Finder and other companies’ iOS apps. The intelligence of Apple News and proactive suggestions leaves a lot to be desired, casting doubts on whether it can dramatically improve. And as others are reimagining mobile messaging with integrations and services that were unimaginable a few years ago, major changes are suspiciously absent from Apple’s Messages app this year.
\nAs a result, iOS 9 for iPhone is a more efficient version of iOS that many will appreciate with time as they discover what’s new. Its improvements aren’t as easily marketable as iOS 7 and iOS 8, but they’re not any less important. What Apple hasn’t done this year doesn’t make iOS 9 worse: it just adds to a list of low-hanging fruit for next year.
\nIf you’re an iPhone user and you’re on the fence about upgrading, my recommendation couldn’t be easier this time around: iOS 9 is better than iOS 8 in every way, and you should upgrade.
\nAnd then there’s the iPad.
\nThis year, the iPad is getting the first version of iOS truly made for it. After too many unimaginative releases, Apple has understood the capabilities of the iPad’s display and its nature as a modern portable computer. Free of dogmas and preconceptions about what an iPad ought to be, iOS 9 fundamentally reinvents what an iPad can become going forward.
\nPicture in Picture rethinks watching video on the device. Slide Over cuts down the time required to jump between apps. New shortcuts make working with Bluetooth keyboards on the iPad a joy. And by using two apps at once, Split View reimagines the role of the iPad in the iOS ecosystem, positioning it between an iPhone and a Mac for people like me who need exactly that.
\nI’ve been working on the iPad for the past three years, and the changes in this year’s iOS release have done more to make me work faster than iOS 6, 7, and 8 combined. Apple has created a new beginning for the iPad’s software, and while it’s not perfect and the experience will have to be refined in places, the impact of iOS 9 for iPad users – especially iPad Air 2 users – will be felt in the following years.
\nIt’s not easy to blend tradition with a fresh start, but that’s what iOS 9 does to the iPad. The iPad is still the device where you can immerse yourself in one app at a time. But now you can also do more, using the large screen to switch between multiple apps and tasks with multitouch. And in doing so, you’ll discover that iOS 9 multitasking for iPad doesn’t carry over the complexities and rules that applied to traditional desktop OSes.
\nApple has leveraged years of design and interaction constraints to give more freedom to iPad users, creating an experience that can be more complex but still intuitive. The iPad is not a Mac, but the same argument works in reverse, too: a Mac is not an iPad – it can’t have its portability, it doesn’t have its app ecosystem, and it’s not a screen you can hold in your hands. With iOS 9, there are even more reasons to consider an iPad as a new kind of computer: capable of true multitasking, and built with the strengths of iOS in mind.
\nWith iOS 9, Apple is ready to admit that the iPad is a computer without the baggage of the PC. I know that I’m not going back to a Mac, and this review is meant to be tangible proof of that. Writing it wouldn’t have been possible without iOS 9, and I’ve never felt more focused.
\niOS 9 shows where the future of the iPad is. The leap has been taken.
\nFive years later, it’s just like starting over.
\nUIFieldBehavior
, new in iOS 9 as an enhancement to UIKit Dynamics. According to Apple, \"a field behavior defines an area in which forces such as gravity, magnetism, drag, velocity, turbulence, and others can be applied\". ↩︎\nUIFieldBehavior
. ↩︎\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.
The Health app – one of the future pillars of the company’s Watch initiative – went through a chaotic launch that caused apps to be pulled from the App Store and user data to be lost. The tabula rasa of iOS 7 and the hundreds of developer APIs in iOS 8 had resulted in an unprecedented number of bugs and glitches, leading many to call out Apple’s diminished attention to software quality. And that’s not to mention the fact that new features often made for hefty upgrades, which millions of customers couldn’t perform due to storage size issues.\nBut change marches on, and iOS 8 was no exception. In spite of its problematic debut, iOS 8 managed to reinvent how I could work from my iPhone and iPad, allowing me – and many others – to eschew the physical limitations of desktop computers and embrace mobile, portable workflows that weren’t possible before. The past 12 months have seen Apple judiciously fix, optimize, and improve several of iOS 8’s initial missteps.\nEight years1 into iOS, Apple is facing a tall task with the ninth version of its mobile OS. After the changes of iOS 7 and iOS 8 and a year before iOS 10, what role does iOS 9 play?\nIn many cultures, the number “10” evokes a sense of growth and accomplishment, a complete circle that starts anew, both similar and different from what came before. In Apple’s case, the company has a sweet spot for the 10 numerology: Mac OS was reborn under the X banner, and it gained a second life once another 10 was in sight.\nWhat happens before a dramatic change is particularly interesting to observe. With the major milestone of iOS 10 on track for next year, what does iOS 9 say about Apple’s relationship with its mobile OS today?\nAfter two years of visual and functional changes, is iOS 9 a calm moment of introspection or a hazardous leap toward new technologies?\nCan it be both?\neBook Version\n\nAn eBook version of this review is available to Club MacStories members for free as part of their subscription. 
A Club MacStories membership costs $5/month or $50/year and it contains some great additional perks.\nYou can subscribe here.\n(Note: If you only care about the eBook, you can subscribe and immediately turn off auto-renewal in your member profile. I’d love for you to try out Club MacStories for at least a month, though.)\nDownload the EPUB files from your Club MacStories profile.\nIf you’re a Club MacStories member, you will find a .zip download in the Downloads section of your profile, which can be accessed at macstories.memberful.com. The .zip archive contains two EPUB files – one optimized for iBooks (with footnote popovers), the other for most EPUB readers.\n\nIf you spot a typo or any other issue in the eBook, feel free to get in touch at club@macstories.net.\nTable of Contents\nDesign\nSan Francisco\nKeyboard\nRounded Corners\n\nSafari\nWelcome Changes\nContent Blockers\nSafari View Controller\n\nThe State of Share Sheets\nThe State of Reminders\niCloud Drive\nPodcasts\nMail\nApple News\nNotes\nFormatting\nChecklists\nImages and Sketches\nShare Extension and Links\nLinks\nFiles\n\nAttachments Browser\nNotes as Rich Documents\n\niPad\nGrowing Pains\nThe Importance of Being iPad\nSlide Over\nSplit View\nPicture in Picture\nSoftware Keyboard\nHardware Keyboards\nThe Spatiality of iPad Multitasking\nThe Visual Complexities of iPad Multitasking\nThe Utility of iPad Multitasking\n\nSearch and Deep Linking\nLocal Search: CoreSpotlight and User Activities\nApple’s Server-Side Index\nThe Flow of Search\n\nDeep Links, Back Links, Universal Links\nIntelligence\nPerformance and Low Power Mode\nEverything Else and What I Couldn’t Test\nFive Years On\n\n\nDesign\nFrom an aesthetic perspective, iOS 9 doesn’t drift away from the focus on clarity, content, and color that debuted in 2013. 
iOS 9’s design builds upon the new course of iOS 7, with some notable differences that epitomize Apple’s penchant for refinement this year.\nSan Francisco\nAfter a short and public affair with Helvetica Neue, iOS 9 brings a new system font called San Francisco.\nDesigned in-house by Apple and introduced at WWDC as a family of typefaces that is both “inconspicuous and beautiful”, San Francisco is a sans-serif typeface which unifies the typographic voice of iOS, OS X, and watchOS, bringing visual consistency with two distinct sub-families and new APIs for developers to enhance the textual presentation of their apps.\n\nFrom an average user’s perspective, the change to San Francisco may not appear as a drastic modification to the iOS interface. To people who are not familiar with the intricacies and details of type design, San Francisco may look subtly different, but mostly in line with the neutral and utilitarian nature of Helvetica Neue. This isn’t meant to sound as a slight to design experts and type connoisseurs, but a lot of people won’t notice the technical detail and years of work behind San Francisco.\n\nFrom the look of the clock and date on the Lock screen to the time in the status bar and bold labels in title bars, San Francisco refreshes the textual appearance of iOS without clamoring for undivided attention. With San Francisco, iOS 9 doesn’t suddenly look like a jailbreak tweak that changes the system font to impractical typefaces just for the sake of customization. San Francisco looks nice – and it is objectively better than Helvetica Neue in some cases – but it doesn’t stand out and shout “Look at me, I’m new!”, and I believe that’s exactly what Apple set out to attain in their effort to craft a modern typeface for all their platforms.\nWith iOS 7 (and 2014’s work on OS X Yosemite), Apple’s design team strove to build a structure that could forgo realistic representations of objects and textures in favor of a new hierarchy of text and color. 
While different colors and the interplay of layers have been used to suggest interactivity and depth, the job of communicating hierarchies and relationships between interface elements has largely fallen upon text weights and sizes.\nThat’s why, for instance, title bars feature non-tappable bold titles and regular-sized colored buttons next to them, or why the same font used across three lines, but in different weights, can lay out an email’s sender, subject line, and body text preview. Apple discovered that once you get rid of shadows and textures to embellish UIs and push people towards interaction, text itself can be the texture; possibly an even more versatile one, because of its programmability, scalability, and widely recognized properties.\nWe’ve seen how third-party apps such as Overcast and Twitterrific gained an identity of their own thanks to the typefaces they employed. From this standpoint, is it really a surprise that Apple – a company with a knack for controlling the primary technologies of their products – chose to design a new family of typefaces to use everywhere?\nThat’s where San Francisco is worth exploring. Understanding its technicalities can help us comprehend Apple’s decisions and appreciate the details that will be shared between the OS and apps.\nSan Francisco comes in two sub-families: San Francisco, used on iOS 9 and OS X El Capitan, and San Francisco Compact, used on Apple Watch. The two variants are related but not equal, with similar designs but different proportions and shapes that can adapt to multiple screen sizes and UI needs.\nEach family has, in Apple’s parlance, two cuts – Text and Display – with six weights for Text and nine weights for Display. 
Apple’s goal with San Francisco is to bring a consistent voice and reading experience to all platforms; while the typeface stems from a shared set of rules and guidelines, each flavor of San Francisco has been designed and engineered for the platform it’ll be displayed on.\n\nAs an artistic expression and software feature, fonts stand at the intersection of craftsmanship and engineering. In designing San Francisco, Apple was tasked with creating a typeface that looked good and was consistent in and out of itself, but that could also scale across three OSes and, more importantly, provide users and developers with controls over font size and readability.\nThis is one of the core aspects of San Francisco: with an increasing array of screen sizes and ways to personalize the appearance of fonts on iOS, a new family of typefaces ought to solve a problem bigger than nice looks. Text is everywhere, but there’s no perfect font size or style that can fit everyone.\nConsider Dynamic Type, an Accessibility feature introduced in iOS 7 that allows users to change the size of the system font. With a single slider available in the iOS Settings, users can make text bigger or smaller at a system-wide level, achieving a more comfortable reading experience in Apple’s apps but also third-party apps which integrate with Dynamic Type.\nSan Francisco expands upon the idea of ever-changing font sizes with Text and Display, two cuts which are intelligently applied by the system when needed. The main trick employed by Apple is that iOS 9 switches automatically from Display to Text at 20 points. When this threshold is met, Apple “cheats” by altering the font ever so slightly so that it remains readable at smaller sizes. 
This makes Text suitable for text content displayed at 19 points and below, and Display the best option for labels and other text content displayed at 20 points and above.\n\nThe difference between Text and Display is subtle and it likely won’t be noticed by most users, but it contributes to keeping San Francisco readable at any size, in any app, with any Dynamic Type setting. This also means that Text and Display are not equal: when text gets smaller, details of round shapes (such as the terminal of a lowercase “a”) get shorter and simplified, while the arm of a lowercase “t” and the shoulder of a lowercase “r” get slightly wider or longer to ensure shapes of letters can be quickly identified when reading. Then, size-specific tracking (the space between letters throughout an entire word, not to be confused with kerning) makes text further spread apart, in order to avoid confusion when reading sentences at a smaller point size.\nApple explained the decision as having to adjust visual perception through illusion. Sometimes you have to cheat to make text look good to the user, and altering some details of San Francisco Display in its dynamic transformation to Text allows the same font to always produce readable text no matter the size.\n\nIf you really want to spot the differences, you can go into Settings > General > Accessibility > Larger Text and have fun moving the slider to see how San Francisco adapts to smaller and bigger sizes. Or, you can pick San Francisco in the new Safari Reader and tweak its size to see how Display and Text come into play and change according to your preference.\nTo understand the importance of increased detail at a smaller size in a different context, think about the setup process of videogame consoles or accessories like a Chromecast or an Apple TV. When viewed from a distance, lowercase text on TV keyboards can be hard to recognize, especially when your eyesight isn’t as good as it used to be. 
Now, consider that millions of people with visual impairments or low vision may come across similar readability problems on their iOS devices on a daily basis. When bigger text or zoomed UIs aren’t an option, having smaller text that is still recognizable and legible becomes essential.\nSan Francisco’s smaller optical size feels airy and with fewer or more details depending on the anatomy of a character. Some of its subtleties will be lost to the average user, but that’s the point. People don’t have to know how fonts work. They just have to find them readable. And this is true for all kinds of people, with all kinds of needs.\nIn my tests with apps updated to support the new system font, I found San Francisco to be more comfortable and readable at smaller sizes than other sans-serif typefaces like Helvetica Neue and Avenir. More importantly, San Francisco strikes a good balance of adding fresh personality to the system and keeping an underlying familiarity with previous versions of iOS. It feels more lively, it’s not too sharp and geometric, and I find it legible enough at any size.\nTwitterrific with Helvetica Neue (left) and San Francisco (right).\nFor developers, San Francisco offers APIs and features that can be enabled in apps without having to resort to Unicode tricks or fall into limitations of previous system fonts.\nFeatures are behaviors embedded in San Francisco via code; developers can choose to use them or opt out. 
Some of these features include built-in support for fractions (so developers aren’t limited by the choice of fractions as Unicode glyphs and don’t have to write custom code to display them), native superscripts and subscripts, alternate 6 and 9 symbols for smaller sizes, proportional numbers by default (but developers can opt into monospaced if they want to), uppercase forms for various math symbols, and a vertically centered colon.\nOn top of this, San Francisco has comparable vertical metrics to old system fonts for basic compatibility with existing third-party apps and UIKit; it covers Polish, Hungarian, Cyrillic script, Greek script, and more, allowing for consistent localization of apps for international users; and, developers have been given new APIs to access all of the weights available in San Francisco.\nI’d like to point out three of the features available in San Francisco, as they have nice, observable properties that anyone can appreciate with enough attention.\nThe vertically centered colon is used by Apple in typesetting the system clock, and it can always be seen in action in the iOS 9 Lock screen and timestamps in Messages. This is a nice detail which makes for a pleasing viewing experience.\nAlternate symbols for 6 and 9 are also interesting as they’re already in use by Apple in the Stopwatch app for Apple Watch and on the back of the Watch itself for the serial number. At smaller sizes, the similarly curved shape of these two numbers can be easily confused, and the alternate look (opt-in for developers) enables flatter, discernible shapes with lower cognitive load.\n\nMonospaced and proportional numbers aren’t exactly new in iOS 9: Helvetica Neue supported switching from monospaced to proportional in iOS 8, but the default behavior has changed in iOS 9. 
Now, the system defaults to displaying proportional numbers: in most cases, proportional numbers (with variable width) are more evenly spaced with fewer unusual gaps between them than monospaced (fixed width) counterparts. If a series of numbers is rendered proportionally, a skinny “1” won’t take up the same width of a larger “5”, leading to a more pleasant effect.\nThere are instances in which monospaced numbers would be preferable (such as columns in spreadsheets or animations in progress bars), but for most cases of body text and labels (and the system clock), proportional numbers as the default option is the right move.\nTo better showcase the capabilities of San Francisco in iOS 9, I asked Daniel Breslan, an independent developer who works on Departure Board for iPhone, to create a sample app for iOS 9 and compare San Francisco to the same text strings and layout of an iOS 8 app with Helvetica Neue.\niOS 8 (left) and iOS 9 (right).\nThe same custom app, running on iOS 9 for iPad.\nThis was a fun experiment as it shows how tracking, Text and Display cuts, and font features affect apps in practice. In the screenshots above, you can see how San Francisco adds a bit of personality to an otherwise neutral typeface, with details such as fractions, superscripts, subscripts, and alternate 6 and 9 making for a superior reading and formatting experience in iOS 9.\nToday, it seems obvious that Apple wanted to control the typographic destiny of its ecosystem. Our experience with iOS devices primarily involves reading text. Whenever we pick up an iPhone or iPad and we look at something, text is part of it. Text is communication, texture, and call to action. Text has established a new visual hierarchy since iOS 7, but Apple wasn’t directly in control of its appearance and functionality. 
With San Francisco, Apple has set out to design a typeface that’s familiar, flexible, elegant, and accessible.\nThe details of San Francisco may go unnoticed by the general public, but by claiming control of the system font, Apple is allowing developers to spend less time dealing with code required to build features that are native in San Francisco, letting them focus on other parts of their apps. This is a common thread in iOS 9, and an expected next step given the maturity and reach of the iOS ecosystem in 2015.\n\nKeyboard\nAnother San Francisco-related change in iOS 9 that demands a standalone section is Apple’s response to criticism of the Shift key design of iOS 8.\nThe solution proposed by the company in iOS 9 is twofold. As far as the Shift key is concerned, iOS 9 introduces a simpler design that makes its On/Off state more obvious. When turned off, the Shift key has a gray background with a hollow glyph that matches the adjacent keys. When turned on, the entire key turns white with a black, filled glyph.\nThe new Shift key design in iOS 9.\nThe new design clearly indicates the activation state of the Shift key, and it goes a long way in removing doubts about whether Shift is enabled or not, solving a major usability issue of the iOS 7/8 keyboard.\niOS 9 also brings a feature that has been a staple of keyboards on Android, other OSes, and even videogame consoles for decades: lowercase and uppercase letters on the keyboard.\n\nApple has added lowercase and uppercase variations of San Francisco letters to the iOS keyboard, turning on the option by default. 
With a default configuration, this means that when typing on iOS 9 – on either the iPhone or iPad – keys will constantly switch from uppercase to lowercase letters.\nCritics of this design choice will argue that switching between uppercase and lowercase characters looks garish and fiddly, that Android has featured a similar behavior for years, and that such a keyboard design loses the purity of the original iPhone’s keyboard. I understand the conceptual premise of this argument, but it dismisses the actual benefit of the new keyboard design in iOS 9.\nAnimating characters between uppercase and lowercase may not look as polished and precise as an always-uppercase keyboard, but it’s more practical. I’m used to looking at uppercase and lowercase characters every day, and I don’t mind seeing this on the iOS keyboard. The affordance is simply stronger and more natural.2 It’s faster to look at lowercase keys and expect lowercase characters to be entered on screen as a result of my direct manipulation than to guess the output of my typing based on the Shift key alone.3\nThis is the very advantage of software keyboards: they can be updated via software. For eight years, the iPhone’s keyboard was stuck imitating the hardware keyboards of Macs, with uppercase keys that look bigger and better but that can’t be updated or changed dynamically. Alternating between lowercase and uppercase characters is just another example of the right thing to do, because iOS doesn’t have to be like a PC.\nApple is making this an option, at least for now. If you’re unhappy with the new keyboard design in iOS 9, you can go into Settings > General > Accessibility > Keyboard and turn off ‘Show Lowercase Keys’. 
This will turn off alternate lowercase and uppercase letters, but it won’t apply to third-party custom keyboards from the App Store.\nLeft: Character Preview turned on.\nApple has also added an option to disable Character Preview, the little popup that appears every time you tap a letter on your iPhone’s keyboard. This is likely a consequence of having lowercase and uppercase letters – it’s no longer of paramount importance to confirm the character that is being typed when you’re looking at a keyboard that adapts to each case.\nIf you’re okay with this trade-off, you can turn off Character Preview in Settings > General > Keyboard.4\nRounded Corners\nWith the exception of San Francisco, the overall look of iOS isn’t changing in version 9.0, but floating menus and sheets have received rounder corners and stronger drop shadows that make them stand out against the background of an app.\n\nAs a result, buttons in menus are taller and they look more like buttons, inviting users to tap on them.5\nThe change is subtle but noticeable, and I’ve been thinking about why Apple has decided to bring this to iOS 9 when there doesn’t seem to be any official update to the design guidelines of iOS.\nSweet, sweet buttons.\nThe explanation I came up with is that every major Apple software redesign tends to overshoot and dial back over time, dampening the most radical aspects to readjust what was probably exaggerated in the first version. Alerts and sheets had a serious depth problem in iOS 7 and iOS 8, and they could be confused with the rest of an app.\nBut I also believe the iPad motivated this change: with the device now capable of showing multiple apps and sheets on screen at the same time, making each element independent of what’s underneath it is going to be crucial for clarity. Bigger, rounded menus with drop shadows look nicer, and they also serve a purpose. 
I wouldn’t be surprised to see more updates along these lines in future versions of iOS.\n\nSafari\nSince switching to Safari as my default browser two years ago, I’ve often called it Apple’s best app on iOS. While the “best” moniker may be up for debate, Safari is my favorite Apple app: its elegant interface hides a considerable collection of gestures and menus that make reading and browsing a pleasure, especially on the iPad. Alongside Editorial, Safari is where I get most of my work done on iOS. Unlike some, as a heavy user first and a publisher second, I believe that Safari showcases Apple at its best on iOS.\nWith iOS 9, Apple isn’t rethinking any core aspect of Safari the app: they have brought some nice changes and additions to how you use Safari and what you can do with it, but together they make for an iterative update that doesn’t include any major visual tweaks.\nWhere Apple has taken a bold new direction is in what you can see when using Safari and how the power of Safari can be extended to other iOS apps. Here lie two of the biggest surprises of iOS 9 – and, potentially, a harbinger of the future of web publishing.\nWelcome Changes\nSafari Reader, the company’s tool to clean up articles on the web for a clutter-free reading experience, has been updated in iOS 9 to include new font options and backgrounds. Functionally, Safari Reader is still the same: you tap a button in the address bar (if available) to clean up an article, and the app generates an elegant, text-and-images-only version that strips out all extraneous content.\n\niOS 9 provides a total of eight fonts (including San Francisco), four backgrounds, and 12 font sizes you can use in Safari Reader. 
The configuration you pick applies to every instance of Safari Reader on your device.\nWhile Safari Reader doesn’t offer the typographic controls of Instapaper (my preferred reading experience for web articles), the improvements in iOS 9 make it more versatile and capable of adapting to different users. Font options and backgrounds could help users with visual impairments tweak Safari Reader to yield a superior combination of text and colors for higher readability.\nI don’t use Safari Reader every day – I prefer reading long-form articles in dedicated apps like Instapaper and Pocket – but when I come across the occasional article that I want to read right away, I remember how nice Safari Reader is. I may not be a heavy user of it, but Reader is a solid Safari feature and the changes in iOS 9 are good ones.6\nTap & hold, Paste and Go.\nA great little touch in Safari for iOS 9 is a way to quickly open a URL or a new search query. If you have a link copied in the clipboard, you can tap & hold the address bar and hit “Paste and Go” to navigate to the copied link directly. This is a fantastic shortcut that has saved me several seconds every day: before iOS 9, I had to select the address bar, select all text, paste, and hit Return. In iOS 9, the same result can be achieved in two taps.\n\nSafari is also smart about recognizing whether your clipboard contains a URL or some other text when you hold down the address bar. If what you’ve copied isn’t a URL, “Paste and Search” will be an option instead. This lets you look up what you’ve copied on the default Safari search engine.\n\nEven if you don’t need to paste URLs or search terms, you can still tap & hold the address bar to get a ‘Copy’ button that lets you copy the current webpage’s URL to your clipboard.\nWith iOS 9, Apple has also changed two other Safari features – “Request Desktop Site” and “Find on Page”. 
Both options are now available as action extensions in the share sheet; the updated placement makes them more visible and puts the spotlight on the share sheet, which too many users are still unaware of.\nBoth features can also be activated in other ways. “Find on Page” can still be accessed by typing any word in the address bar and tapping the ‘On This Page’ result at the bottom of the list.\n\nApple’s designers got more creative with the alternate location for “Request Desktop Site”: if you tap & hold the refresh icon in the address bar, you’ll get a shortcut to get the desktop version of the webpage you’re viewing. This is another handy time-saving tweak, and it makes sense to bundle it with the refresh icon as requesting a desktop site does refresh the current page.\nAn important addition to iOS 9’s Safari (that is hopefully not going to be seen a lot) is detection of phishing attempts. When visiting a website that has been reported as phishing and is likely trying to trick users into disclosing personal information, Safari will turn red. The browser’s not blushing – it’s automatically preventing the website from loading. In this special view, Apple has included links to learn more about phishing scams, report an error, go back in navigation, or continue anyway.\nPro tip: never visit this site. Unless you have to write a review of iOS 9.\nThis is a good move, and changing the color of the UI helps in paying attention to what’s happening on a potentially malicious webpage. The Safari team put a lot of thought into this: the share sheet will be disabled as well, preventing users from sharing the URL with others or mistakenly adding the website to their bookmarks.\nThe other changes in Safari for iOS 9 are in line with the “small and nice to have” nature of updates in this release. 
If you’re using Safari with an external keyboard, you can select results in the address bar with the Up/Down arrows and confirm your choice using Return – this change alone allows me to use Safari with my Belkin keyboard more consistently as I rarely have to touch the screen.\n\nIf you have multiple usernames and passwords saved for a website, Safari’s autofill will put up a new Passwords button to view the logins you’ve stored and pick a password to log into the current webpage.\nUnfortunately, this collection of minor changes is all there is to Safari in terms of core app updates. There’s still no way to properly manage file downloads – truly an inexplicable decision now that iOS 9 offers plenty of choice with extensions and an iCloud Drive app to manage files. It’s absurd that, in 2015, if you want to download a bunch of files from the web on an iPad, you need a third-party app.\nUnlike its counterpart on El Capitan, Safari on iOS 9 doesn’t offer the ability to pin tabs to the leftmost side of the tab bar – a feature I would have liked to see on the iPad. Among all the new keyboard shortcuts in the app (there’s even one to show and dismiss Safari Reader), there still isn’t one to activate the share sheet and use extensions with the keyboard – an oversight I’d like to see fixed in the near future.\nThankfully, what Apple didn’t bring to Safari in iOS 9 is offset by features to extend the browser in new ways.\n\nContent Blockers\nSafari on iOS 9 supports a new type of extension: Content Blockers. By content blocking, Apple means the ability for Safari to identify subsets of content or resources on a webpage to hide them or prevent them from being loaded altogether. 
Through an extension that provides a set of triggers and actions, Safari can block cookies, images, scripts, pop-ups, and other content with CSS overrides and full resource blocking.\nOn the surface, Content Blockers may be viewed as ad blockers for iOS, and that’s likely going to be their most popular use case, but they can do much more than simply blocking ads on webpages.\n\nContent Blockers, like any other extension on iOS, are installed from apps you download from the App Store. An app can offer a Content Blocker extension, which you can activate in Settings > Safari > Content Blockers.\nContent Blockers are based on a fast, memory-efficient model that tells Safari what to hide or block ahead of time rather than requiring Safari to consult with an app while loading a webpage. To create a Content Blocker, developers have to provide a JSON object that contains dictionaries of triggers and actions; this is then compiled into bytecode, which can be evaluated very efficiently by Safari. For performance reasons, Content Blockers are only available for apps compiled to run on 64-bit architectures, supported by the following devices:\niPhone 5s\niPhone 6\niPhone 6 Plus\niPhone 6s\niPhone 6s Plus\niPad Air\niPad Air 2\niPad Pro\niPad mini 2\niPad mini 3\niPad mini 4\niPod touch 6\nIn addition, Content Blockers are supported in Safari and the modern Safari View Controller; they’re not available in web views built using UIWebView or WKWebView.\nThe model behind Content Blockers is aimed at being faster and more private than existing content blocking extensions for desktop browsers. In addition to not having to consult with a full app while loading a webpage – a task that would increase RAM and battery consumption – what the user does inside Safari is never exposed to Content Blockers. The URLs of webpages that users view in Safari with a Content Blocker installed are never passed by Safari to the Content Blocker itself. 
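To make the format concrete, here’s a hedged sketch of what such a JSON object looks like, with two rules – one blocking third-party scripts everywhere, one hiding an element via CSS. The domains and the selector are purely illustrative:

```json
[
    {
        "trigger": {
            "url-filter": ".*",
            "resource-type": ["script"],
            "load-type": ["third-party"]
        },
        "action": {
            "type": "block"
        }
    },
    {
        "trigger": {
            "url-filter": ".*",
            "unless-domain": ["example.com"]
        },
        "action": {
            "type": "css-display-none",
            "selector": "#comments"
        }
    }
]
```

Each rule pairs one trigger with one action; Safari compiles the whole array once, so evaluating it per-page costs almost nothing.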
The only job of a Content Blocker is to provide a dictionary of rules; Safari takes care of everything else.\nThe actual rules that build a Content Blocker are organized in triggers and actions. Triggers can be filters for specific URLs, resource types (such as scripts or media), load types (first-party for resources from the same domain, or third-party from external domains), as well as filters to apply to a specific domain or to other domains except a specific one. Filters can be combined in a dictionary by a Content Blocker: for example, an app could assemble a trigger for all third-party scripts on amazon.com or a trigger that uses a regular expression to hide instances of the word “espresso” on every website except macstories.net.\nAs for actions, Content Blockers can either apply a CSS override (using css-display-none) to hide content or block matched content defined in the trigger. When blocking a resource, Safari won’t just load it and hide it in the background: it’ll completely prevent the matched resource from loading. From a technical standpoint, the Safari team has done a great job at making Content Blockers easy to build, fast, and private. Even non-developers can look at the JSON object powering a Content Blocker and grasp what’s going to happen once activated.\nContent Blockers are a byproduct of the modern web. Given the astronomic rise of web advertising and businesses based on tracking user behavior and personal information through scripts embedded on webpages, there has been a considerable growth in the average size of webpages in recent years. It’s not uncommon to come across a webpage that goes over 10 MB of data for all the external scripts, resources, and media that a browser needs to load even if they aren’t necessarily part of the content the user is interested in.\nThe name ‘Content Blocker’ is something of a misnomer, since this feature isn’t meant to hide or block content, as most people would define it. 
Instead, they act on ads or scripts which often do block content, such as a popup image which covers what you are trying to read.\nContent Blockers can also hide any page content with CSS overrides. If you’ve ever wanted to be able to hide comments, social buttons, sidebars you don’t care about, or interactive galleries from your favorite websites, Content Blockers will offer a way to do so with a few rules.\nRight or wrong, this is what Content Blockers are: Safari extensions that can hide or block content with good performance and user privacy in mind. Rather than dwelling on the moral aspects of this technology, I set out to discover if Apple’s claims were reflected in Content Blockers built by developers for iOS 9.\nSince June, I’ve been testing eight Content Blockers for iPhone and iPad. There was a lot of overlap between them: most developers I spoke with were building Content Blockers aimed at blocking ads and trackers, using public rules available from EasyList and EasyPrivacy. Content Blockers that only focused on blocking ads and scripts through those databases worked fine, but one was comparable to the other – and I suspect we’re going to see a lot of Content Blockers that use EasyList and EasyPrivacy with a minimal UI to wrap it all together.\n1Blocker is a highly customizable Content Blocker.\nCurious to have a better understanding of Content Blockers, I tried some that went beyond simple ad/tracker blocking and that incorporated CSS overrides, specific resource blocking, and user-defined filters. I ended up using two Content Blockers for this review:\n1Blocker, an excellent all-in-one Content Blocker that can block ads, trackers, social widgets, Disqus comments, web fonts, adult sites, and that lets you create your own rules for URLs, cookies, and page elements to hide or block. 
1Blocker is Universal and it comes with over 7000 built-in rules, which you can individually turn on and off.\nShut Up, an iOS version of the popular stylesheet by Steven Frank that hides comments from hundreds of websites and commenting systems.\nWith these Content Blockers installed, I ran tests on a few websites I frequently visit that heavily rely on third-party ads, tracking networks, and external resources. For some of them, I also applied custom CSS overrides to make the content more pleasing to my eyes, which involved hiding sidebars, comment sections, footers, and other boxes of content I didn’t care about.\nApple makes it easy to test Content Blockers and measure their performance: with a Content Blocker installed, a tap & hold on the refresh icon in the Safari address bar will reveal a second option to reload the current webpage without Content Blockers. This shortcut comes in handy if a site becomes problematic after blocking some of its content, but it can also be used with Safari’s Web Inspector on OS X to measure resources, size of resources, and time until the load event fired with and without Content Blockers.\nAll of the following tests were performed on an iPad Air 2, on a Wi-Fi network, with caching enabled.\n\nFor context, you can also take a look at screenshots of a few websites before and after applying Content Blockers. Changes include both blocked resources and elements hidden via CSS.\nLeft: Content Blockers disabled.\n\n\nThe numbers above speak for themselves. I started using Content Blockers on my iPhone and iPad in June. Two weeks ago, out of curiosity, I decided to disable them for a few days to monitor my reaction. That’s when I realized I’m not going to disable Content Blockers if the web doesn’t improve in the foreseeable future.\nI’ve gotten used to seeing webpages load in two seconds again. For the first time in years, I haven’t exceeded my monthly data cap, with hundreds of MBs of data left in my plan. 
In the long run, Content Blockers will give me back dozens of minutes I would otherwise have spent waiting for content to load alongside ads and trackers.\nEffectively, using Content Blockers on iOS 9 feels like getting a new web. Lighter, faster, more private, and less cluttered.\nAnd that’s where I feel bad, and where the entire discussion about Content Blockers lies. I don’t feel good knowing that, by using Content Blockers, I’m not supporting my favorite sites through the ad and tracking-based business model they opted for. After all, if an article on the web is given to me for free, what right do I have to complain about the means used to monetize it?\nOn the other hand, though, this is my time and my cellular data – money out of my pocket – and the ad-supported web has gotten out of hand. As a user and website owner, I believe a simple article with three paragraphs of text and one image shouldn’t weigh 5 MB and take 20 seconds to load. I can’t accept webpages that continue processing network requests in the background for over a minute, consuming battery on a device that I may need at the end of the day for an emergency. I’m all for ad-based business models – this very website is partially supported by sponsors – but is there really no better, more tasteful, less intrusive way to run a website than what we’re seeing on the web today?\nAs a user, am I supposed to feel bad about Content Blockers and not use them at the risk of wasting data and battery life, or should I fight back?\nSome websites can change a lot with Content Blockers.\nApple’s answer seems clear. With iOS 9, Apple is not pre-installing any Content Blockers in Safari, and I’d be surprised if they’ll ever actively promote one on the App Store. But they have made it ridiculously easy to install and use one, with benefits that aren’t conceptual or limited to power users – everyone understands more battery and less data. 
When the majority of iOS users realize that Content Blockers make Safari and the web faster and lighter, I can’t imagine why they wouldn’t want to use one. And, I bet, most of them won’t feel as bad as I did when writing this review. They won’t add sites to whitelists like I did, to enable ads on them. They won’t care about the business model of a website. They’ll just enjoy the cleaner, lighter web made possible by Content Blockers.\nApple is killing the proverbial two birds with one stone here. By building Content Blockers as extensions bundled inside apps and prioritizing user privacy and performance over anything else, they are leveraging the App Store to let users discover and install Content Blockers, which can’t observe what a user does in Safari. This is quite a change from popular desktop blockers that, by evaluating rules while a webpage is being loaded, can track user behavior and URLs.\nAlso, by making them exclusive to Safari, Content Blockers will likely create a powerful network effect: once users of third-party browsers on iOS (possibly made by companies whose business is built on ads) learn that only Safari supports Content Blockers, I bet that millions of them will switch to Apple’s browser.\nApple described Content Blockers as a way to add something to the experience of a webpage “by taking something away”. I’m okay with what’s being taken away, and my devices are better because of it. Content Blockers are a win-win scenario for Apple and users. As for web publishers…The times, they are a-changin’.\n\nSafari View Controller\nWith iOS 9, Apple wants to reimagine in-app web views using the interface, features, performance, and security of Safari. To do so, they’ve built Safari View Controller – a Safari view that lives inside an app and that runs in a secure, isolated process.\nLeft: standard Safari View Controller. 
On the right, Safari View Controller with a custom tint color set by an app.\nWith Safari View Controller, Apple is giving developers a way out of custom web views, allowing them to stop building miniature web browsers with less functionality than the system browser. As I detailed earlier this year, this has major implications for how the web is embedded and experienced in iOS apps.\nSafari View Controller has been closely modeled after Safari with consistency and quick interactions in mind. Safari View Controller looks a lot like Safari: when users tap a web link in an app that uses it, they’ll be presented with a Safari view that displays the address bar at the top and other controls at the bottom or next to it.\nSafari View Controller looks a lot like Safari.\nOn both iPhone and iPad, the layout closely follows Safari’s: on the iPad, navigation controls, the address bar, and a share icon live in a toolbar at the top; on the iPhone, navigation and share buttons are displayed at the bottom. Both flavors of Safari View Controller come with a Done button to dismiss the browser in a consistent position at the top right.\nThere are two minor differences from Safari: when opened in Safari View Controller, the URL in the address bar will be grayed out to indicate it’s in read-only mode; while users can navigate to other links in a webpage displayed in Safari View Controller, they can’t tap the address bar to manually type a URL or start a new search. And, a Safari button is available in the toolbar, so that users can jump to Safari if they want to continue viewing the current webpage in the browser.\nIn addition, Apple is making sure that user privacy and security are highly valued in how Safari View Controller operates. Safari View Controller runs in a separate process from the host app, which is unable to see the URL or user navigation happening inside it (the same concept as Content Blockers). 
Therefore, Apple has positioned Safari View Controller as entirely safe: private user data stays in Safari and is never exposed to a third-party app that wants to open a link with Safari View Controller. This is an important difference from custom web views (built using legacy APIs such as UIWebView and last year’s WKWebView) that are adopted by millions of apps, including those of big companies like Twitter, Facebook, and Google.\nThe considerations behind Safari View Controller have allowed Apple to port much of the polish and functionality that users know (and expect) from Safari to any app that uses Safari View Controller in iOS 9.\nSafari View Controller in Twitterrific. Notice the WordPress toolbar because I’m already logged into MacStories.\nSafari View Controller shares cookies and website data with Safari. If a user is already logged into a specific website in Safari and a link to that website is opened in Safari View Controller, the user will already be logged in. Safari View Controller has access to iCloud Keychain’s password autofill, plus contact card and credit card autofill, Safari Reader, and the system share sheet for action and share extensions. If Private Browsing mode is enabled in Safari and a user opens a webpage in Safari View Controller, it’ll open in Private Browsing as well.\nSafari View Controller in Private Browsing mode, showing Safari Reader and the system share sheet.\nLike Safari, Safari View Controller supports detection of phishing websites, it has a nice built-in progress bar, and it also provides custom views for errors and other issues in webpages. To top off everything that’s shared between Safari and Safari View Controller, Content Blockers you activate in Safari will be enabled for Safari View Controller, too – by itself, a huge incentive to start using it.\nSafari View Controller isn’t without its fair share of issues and questionable design decisions. 
First and foremost, it doesn’t give developers much control over the visual appearance and behavior of the in-app web browser.\nDevelopers can style Safari View Controller with a custom tint color: they can change the color of buttons to match their app’s main color, and that’s it. They can’t change the color of the entire UI or insert custom interface elements in the toolbars. This helps with consistency in that Safari View Controller closely resembles Safari – therefore giving users a sense of security and familiarity – but the effect can be somewhat jarring if the host app’s interface wholly differs from the look of Safari.\nAt a technical level, because Safari View Controller doesn’t expose anything about the webpage being loaded, apps that use it have no control over caching and they can’t use webpage information in any way besides letting users share what they’re viewing with a share sheet (which is also sandboxed). Again, this was done in the name of security and performance: there are trade-offs involved in providing users with a way to open webpages with a Safari view using shared credentials, the Nitro JavaScript engine, Content Blockers, and everything that comes with Safari.\nSafari View Controller opens modally on top of the host app, with the only way to dismiss it being a ‘Done’ button at the top of the screen. Developers can’t use custom gestures to dismiss Safari View Controller – such as the popular swipe from the left edge – and they can’t put it inside an app’s view so users can move back and forth between other views and Safari View Controller. If you’re used to being able to open a web view in Tweetbot and switch to another tab while the webpage is open, prepare to say goodbye to that ability if Tweetbot implements iOS 9’s new web view. 
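For what it’s worth, adopting the new view takes only a few lines of SafariServices code. A hedged sketch, assuming iOS 9’s Swift APIs – the host view controller and method names are hypothetical:

```swift
import SafariServices
import UIKit

// Hypothetical host view controller that opens links in Safari View Controller.
final class ArticleListController: UIViewController, SFSafariViewControllerDelegate {

    func openLink(url: NSURL) {
        let safari = SFSafariViewController(URL: url)
        safari.delegate = self
        // The only sanctioned customization: tinting the buttons
        // to match the host app's accent color.
        safari.view.tintColor = UIColor.orangeColor()
        // Safari View Controller is presented modally on top of the app.
        presentViewController(safari, animated: true, completion: nil)
    }

    // Called when the user taps Done.
    func safariViewControllerDidFinish(controller: SFSafariViewController) {
        controller.dismissViewControllerAnimated(true, completion: nil)
    }
}
```

Note how little the host app sees: it hands over a URL, gets told when the user is done, and nothing else.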
Safari View Controller is modal, and the webpage completely takes over the host app functionally and visually.\nAside from the shortcomings I’ve already mentioned for Safari, my main problem with Safari View Controller is that it lacks one essential Safari shortcut I’ve come to rely upon. Unlike Safari, you can’t tap & hold the refresh icon in the Safari View Controller address bar to request a desktop site or reload a webpage without Content Blockers.\nAnother issue is that, currently, the 1Password extension won’t show up in Safari View Controller as it’s unable to fill the contents of a webpage with user information. This can be a major annoyance if you don’t use iCloud Keychain in addition to 1Password, but it hasn’t been a problem for me as I always save passwords filled by 1Password in iCloud Keychain as well. If you’re already logged in via Safari, you will be logged in via Safari View Controller.\nDespite these limitations, the benefits of Safari View Controller are too convenient for developers to ignore.\nI’ve been able to test beta versions of my favorite apps with Safari View Controller support, including Twitterrific, Dispatch, and Terminology. After getting used to the speed and convenience of Safari View Controller, I don’t want to go back to custom web views.\nWhen I tap a link in my Twitter stream and it opens the same webpage I’d see in Safari, it feels like the web is better integrated across the entire OS rather than split in a bunch of proprietary renditions of it. Webpages open fast; I’m already logged into websites I know; and, I have native access to features like Safari Reader, extensions, and Content Blockers. With Safari View Controller, I don’t have to learn a new web view every time I use a different app. Safari – and everything I like about it – comes along for the trip, giving me performance and functionalities I expect from my iOS browser.\nThere are apps in which Safari View Controller won’t be a suitable option. 
Editorial’s built-in web browser, for instance, supports multiple tabs and access to webpage content with JavaScript, neither of which can be implemented in Safari View Controller. I wouldn’t want to see apps like Editorial abandon their advanced web views just for the sake of adopting Safari View Controller. But in most cases, apps that only display one webpage at a time will be better off using Safari View Controller, as it offers a consistent interface, speed, and a safer environment for users. At this point, I’m just annoyed whenever I come across an iOS 9 app that doesn’t use Safari View Controller to open links.\nEditorial’s browser has tabs and custom integrations that won’t be available in Safari View Controller.\nThough I’d like Apple to address them, the potential issues of Safari View Controller haven’t been showstoppers in daily usage. At the cost of losing custom branding and custom menus in web views, I now prefer following links that open in Safari View Controller because I feel safe knowing that it’s Safari, and because I know the webpage will be displayed exactly how I want it to be.\nIf anything, the biggest problem of Safari View Controller is that I doubt companies with a vested interest in custom features and user tracking will use it. iOS 8 extensions didn’t provide developers with data about user behavior, which resulted in the likes of Facebook, Twitter, Google, and Pinterest not using them (or taking a long time to support them – with a different tactic); I wonder if Safari View Controller adoption will be similarly slowed down by an unwillingness to support a sandboxed web view.\nWe’re already seeing the kind of argument these companies have against Safari View Controller, and I’m afraid it could remain a niche feature of iOS 9 for millions of users. I’ll be happy to be proven wrong here.\nSafari View Controller solves two problems. 
Apple wants to give developers a way to focus on other parts of their apps’ code, leaving the job of displaying web content to a browser that has been refined and improved over the years. At the same time, their goal is to give users a consistent, safe, and fast way to open webpages with the functionality they already know.\nContent Blockers and Safari View Controller draw a line in the sand between web companies and users. Apple has chosen to give users the power to fight back and enjoy a faster, safer web. The ball is in the developers’ court now.\n\nThe State of Share Sheets\nLast year’s most notable addition to iOS – action and share extensions – came from an unexpected place: the share sheet Apple had long used to let users share photos and text with preinstalled actions.\nIt wasn’t an optimal solution, though: the one-size-fits-all approach of the share sheet meant developers had to cope with issues related to how different inputs were passed to extensions, and the share sheet’s design caused discoverability concerns, as many users couldn’t understand how to enable extensions. Apple built a framework for action and share extensions, tied it to the share sheet, and left developers and users to figure out how it worked.\niOS 9 only partially addresses these issues. For developers, iOS 9 has a new dictionary API that lets extensions show up even if they only accept one of multiple input representations shared by an app. This aims to fix the paradox of developers who, to be good platform citizens, made their apps share data in multiple formats and ended up punishing users because iOS 8 extensions had to support every single input format to appear in the share sheet.\nImagine an app that can share text in multiple formats such as TXT, PDF, and HTML. 
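As a rough sketch of the host-app side of that scenario (the strings and type identifiers below are illustrative, and the PDF representation is omitted for brevity), such an app attaches several representations of the same content to one extension item:

```swift
import MobileCoreServices
import UIKit

// Hypothetical host-app sketch: attaching multiple representations of
// the same content to a single extension item, so that an extension
// which accepts only one of these formats can still match.
let plainText = "Hello from the host app"
let htmlText = "<p>Hello from the host app</p>"

let item = NSExtensionItem()
item.attachments = [
    NSItemProvider(item: plainText as NSString,
                   typeIdentifier: kUTTypePlainText as String),
    NSItemProvider(item: htmlText as NSString,
                   typeIdentifier: kUTTypeHTML as String),
]
```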
In iOS 8, an extension needed to support all formats to be an option in the share sheet triggered in the host app; in iOS 9, the new dictionary API enables extensions to be available in the share sheet even if they only support one of multiple representations shared by the host app.\nIf adopted by developers (the feature is opt-in), iOS 9 extensions should be available in more places. For users, this could mean fewer mysterious disappearances of icons from the share sheet depending on the app they use. Alas, I wasn’t able to test this during the summer as I couldn’t get my hands on enough betas of apps with support for the new extension dictionary API.\n\nAnother share sheet-related change in iOS 9 is the ability to open it from a new “Share” option of the copy & paste menu. This way, apps that chose to avoid share sheet integration in iOS 8 (such as Apple’s own Mail and Reminders) will feature extension support in iOS 9 as long as they allow users to select text. It also makes it more obvious that it’s possible to share selected text through extensions instead of having to reach for a separate share icon.\nThis is a welcome tweak, with some reservations. The new menu-triggered share sheet doesn’t expose all the information that is available from the regular share sheet. For instance, if you select some text on a webpage in Safari and hit the share icon in the top toolbar, extensions will be able to see a variety of data about text selected on the webpage, the URL, HTML code, and more.\nA share sheet in Mail for iOS 9.\nIf you select text in a webpage and share with the new menu button, only selected text will be passed to extensions as plain text, with no additional webpage information embedded in it. 
The same happens for any app that works with formatted text: you can try this yourself by sharing some rich text from an email message to the Mail extension itself (Mail inside Mail is now possible, too).\nApple isn’t backing away from the share sheet, nor are they making considerable changes to the way extensions can be discovered, managed, and activated. Developers still have no way to programmatically trigger specific extensions or take users directly to the share sheet’s configuration screen; users need to manually open the share sheet’s setting view to toggle action and share extensions on and off.\nExtensions remain one of the most powerful technologies of iOS. But as I’ve argued before, they’ll have to break out of the share sheet sooner or later to embrace a wider audience. I was hoping iOS 9 would mark the framework’s maturity with the ability to customize how extensions are activated while keeping everything safe and consistent, but I’ll have to postpone my hopes until next year.\nThe State of Reminders\nApple’s Reminders app hasn’t witnessed a meaningful alteration in iOS 9, carrying over the same design introduced with iOS 7 in 2013 (paper texture included) and keeping a structure based on lists. What Reminders does gain in iOS 9, though, is a new way to capture any kind of information from apps through a universal interaction layer – Siri.\n\nIn iOS 9, you can create smart reminders in any app by summoning Siri and saying “remind me about this”. Through that sentence, Siri will capture the state of your current app activity – a webpage in Safari, a message in Mail, a view in a third-party app – and it’ll save it as a reminder that includes the app icon and a link to reopen the view from the originating app. Smart reminders are supported across every iOS app, and a due date or time can be appended to the Siri query, too. 
After issuing the command, Siri will immediately display the newly created reminder, carrying the icon of the app you’ll want to go back to.\nThe way Apple has implemented this feature is clever, and it shows quite a bit of foresight on their part. Smart reminders via Siri use Handoff, introduced in iOS 8, to capture app state through the NSUserActivity API: the same technology employed to continue an activity from one device onto another can now be used to save that activity as a reminder that contains a deep link to an app.\nDevelopers who started supporting Handoff last year have already done most of the work required to expose their apps to Reminders via Siri: the app icon and relevant section are fetched by NSUserActivity; optionally, developers can add a title to any activity available to the API. The title will become a todo’s name in Reminders if the activity is captured via Siri.\nA note opened from Reminders.\nThe beauty of this system is that iOS 9 makes virtually any app interaction a potential source for Reminders. Any app view can (theoretically) be an indexable user activity. The latest message in an iMessage conversation can be saved as a reminder. A user profile in Twitterrific is an activity you can save. A point of interest in Maps is something you can be reminded about. An article from Pocket or Instapaper is a view that Siri understands because it’s a user activity. The list goes on and on, and developers don’t have to adopt any new technology or restructure their apps around this concept. As long as they use Handoff and set the right properties, apps can give Siri a way to create reminders off of them.\nAs a user, it’s nice to be able to open Siri, create a reminder for anything I’m doing in an app, and rest assured that it’ll be easy to get back to it later. In the Reminders app, todos created this way display an app icon on the right, which can be tapped to reopen the app in the view you asked to be reminded about. 
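For developers who already support Handoff, the work is mostly done. A hedged sketch of what exposing a view as an activity looks like, with a hypothetical activity type, userInfo key, and class name:

```swift
import UIKit

// Hypothetical sketch: a view controller publishing its current document
// as an NSUserActivity, which "remind me about this" can then capture.
final class DocumentViewController: UIViewController {
    func publishActivity(documentID: String, documentName: String) {
        let activity = NSUserActivity(activityType: "com.example.app.viewDocument")
        activity.title = documentName            // becomes the reminder's name
        activity.userInfo = ["documentID": documentID]
        userActivity = activity                  // attach to the responder chain
        activity.becomeCurrent()                 // mark as the current activity
    }
}
```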
Similarly, if a smart reminder has a due date, its alert will have a button to launch the app with one tap.\nSometimes, Siri can’t remind you about apps.\nSmart reminders won’t work everywhere7, but they’re as close to being a universal glue between Reminders and apps as they can possibly be. I don’t use Reminders as my primary task management system (I like Todoist, and I’ve been experimenting with 2Do lately), but I recognize the utility of capturing the context of my activity for a task.\nTreating app views as points of interest that can become todos and be restored at any time is a powerful idea, unavailable to other todo apps on iOS. It facilitates a new type of interaction: rather than saving some generic text as a task, you’ll be saving a reminder that takes you back to the rich experience of an app.\nI have two problems with smart reminders in iOS 9. The first one is that you can only create them with Siri: if you find yourself in a situation where you can’t talk to Siri but still want to create a smart reminder, you can’t. It’s an odd omission, especially given the new Reminders share extension available to any app that wants to use it.\n\nThe new extension can create normal reminders with the same details screen (for dates, location, etc.) as the Reminders app, with no concept of captured user activities and deep links whatsoever. Creating new reminders from Safari through the extension does put a Safari icon in the reminder, but it’s an exception.\nThis leads me to my second problem. The deeper Safari integration of the Reminders extension and a lack of public APIs for smart reminders suggest that the feature is exclusive to Apple’s app.\nSmart reminders in Reminders (left) and Fantastical, without deep link metadata.\nDevelopers of Calendar and Reminders clients such as Fantastical or Sunrise won’t be able to read the deep link information (app icon and linked activity) stored in smart reminders; instead, they’ll display them as text-only items. 
There might be workarounds8, but they’re not official solutions – which makes the entire system a little less useful if you want to use the Reminders backend without the Reminders app.\nThe other addition to Reminders in iOS 9 is the ability to create reminders for when you’re getting in or out of your car. These reminders use a Bluetooth connection in your car to determine when your device has entered or left the vehicle, which can be useful for remembering to do something before driving or immediately after stopping. To this day, Apple hasn’t clarified whether car reminders in iOS 9 require CarPlay or can work with generic car Bluetooth and third-party accessories.\n\nIn my tests and informal polls on Twitter, iOS 9 car reminders didn’t work with Bluetooth dongles used to add Bluetooth support in a car (such as my Aukey one), but they work with built-in Bluetooth configurations (by the car manufacturer) as well as CarPlay ones (of course).\nBeyond those informal checks, I wasn’t able to see the feature in action myself, but I’m intrigued by the idea of Reminders getting more savvy about user context and location. I wouldn’t be surprised to see HomeKit devices, iBeacons, and Apple Watches from friends nearby becoming future “reminder points” for the app.\nReminders continues to be a basic todo app that works for a lot of people but falls short of the advanced options users like me want. That’s okay: Apple doesn’t need to provide a full-fledged task manager for the majority of iOS users. The idea of turning app activities into reminders is interesting, and I guess that it’ll become a fixture of many people’s habits going forward. It hasn’t been enough to lure me back into Reminders, but, between Siri reminders and car alerts, its smart simplicity is starting to look a lot more attractive.\niCloud Drive\nFor years, certain circles of tech observers argued that Apple should have added a visible filesystem on iOS to make it more like OS X. 
And for years, Apple went in the opposite direction, doubling down on sandboxing and secure communications between apps.\nLast year, we saw the culmination of those efforts with document provider extensions and a revamped document picker that enabled users to pick (via an extension) any app as a storage service. Even so, don’t be fooled by the new iCloud Drive app in iOS 9: this is not Apple relenting and bringing a Finder to iOS. It’s an app with a document provider extension – and a mediocre one at that.\n\nThe iCloud Drive app in iOS 9 is a wrapper for the iCloud Drive documents that users have been able to view in the iOS document picker since last year. There is nothing new or surprising about it: it’s the same interface you know from iOS 8 document pickers and OS X, with app folders, various sorting options, and buttons to create folders and move files around. The app is installed by going to Settings > iCloud > iCloud Drive > Show on Home Screen, but in testing I also received a prompt to install it as soon as I made a modification to a file on my Mac and the change propagated to iOS.\n\nI guess the reason the app exists is to give people a simple way to view and manage iCloud Drive documents – and that’s good, given Apple’s baffling decision not to offer an iCloud Drive app last year. But everything else in the app is mostly unsurprising, confusing, or frustrating.\niCloud Drive versions in MindNode.\nThe app is extremely basic: it doesn’t let you search from the root view into subfolders, there’s no way to view and restore versions of files9 (or recover deleted files, for which you’ll need the iCloud website), and you can view tags, but you can’t create them because they’re still exclusive to OS X.\nTo my knowledge – and I’ve looked around – there is no way to add photos from the Photos app to iCloud Drive. If you take a screenshot and want to organize it in a folder outside of Photos, I don’t know how you would do it. 
This stems from the fact that there is no system-wide iCloud Drive extension to save files from anywhere. Well, there is one, but it’s only available for some types of attachments in Mail, and you can’t use it anywhere else.\nIn iOS 9, document-based apps can support a new LSSupportsOpeningDocumentsInPlace property that adds the ability to open files from other apps “in place” to edit them without creating a copy. For the iCloud Drive app, this means that documents from iCloud-enabled apps can be tapped in iCloud Drive and they’ll automatically open in the associated app without generating a duplicate file.\nI ran some tests with an iOS 9 version of MindNode, a popular mind-mapping app for iPhone and iPad. When tapped in iCloud Drive, MindNode documents opened in the MindNode app in-place, taking me straight to editing. This can also be done by long-pressing a file and tapping the ‘Info’ button to open a file in an associated app.\nFrom iCloud Drive to editing in MindNode. Will this work with other apps?\nHowever, soon after installing MindNode, iCloud Drive started automatically opening Numbers files in MindNode. My understanding is that, given the lack of an iOS 9 version of Numbers, iCloud Drive sent Numbers documents to an app that supported the file format and the open-in-place feature. What’s going to happen once multiple apps support open-in-place and the same file type? Will iCloud Drive display a prompt to pick an app for each file? I had no way to test this.\nIn principle, this is a good user experience. The iCloud Drive app is a container of files that lets users jump to editing documents in other apps, avoiding copies and confusion with duplicates. Will the system work once (and if) more developers adopt the open-in-place functionality? 
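For reference, the opt-in itself is a single Info.plist entry; a minimal fragment might look like this:

```xml
<!-- Minimal Info.plist fragment opting a document-based app into
     opening documents in place (no duplicate copy is created). -->
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```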
It’s a good addition, but it’s too early to tell.\nCan you guess what this means?\nThe user experience of document provider extensions is still problematic, with too many steps required to install and navigate external services and a confusing mix of modes for copying and opening files. In addition to open-in-place, another change in this area of iOS is a new label for document providers that don’t let other apps open and edit their documents (such as Dropbox and Google Drive). I’m not sure, though, that users will understand what this means, and, like open-in-place, I haven’t been able to properly test this functionality with different apps.\nThe iCloud Drive app joins a complicated array of features, and it does little to improve iOS file management workflows except being an icon on the Home screen and supporting new search APIs in iOS 9.\nApple has a lot of work to do with iCloud Drive on iOS – for both the app and the extension. iOS 9’s iCloud Drive doesn’t offer the same document management and search features of the Finder of OS X or the clarity and reliability of competing services such as Dropbox, Google Drive, and OneDrive. I know I’m not going to put critical work documents – such as this review – in iCloud Drive any time soon. The tools it offers aren’t enough to make me feel safe about it.\nDon’t believe anyone who says the iCloud Drive app for iOS 9 marks the arrival of proper file management on iOS: the road ahead is long and winding.\nPodcasts\nSince moving past tape reels and realistic buttons in 2013, Apple’s Podcasts app has received a variety of incremental updates that have made it a decent solution for people who are not seeking the advanced options of third-party clients such as Overcast and Pocket Casts. Apple’s work on Podcasts culminated in September 2014 with the app being pre-installed on iOS 8, further cementing it as the default option for millions of people looking for a way to listen to their favorite shows. 
Apple’s Podcasts didn’t destroy the market for third-party alternatives – many of which are still thriving – but it brought a good-enough solution for everyone.\nWith iOS 9, Apple seems to acknowledge that the renaissance of podcasting has to be sustained by a more capable listening experience, and they’re enhancing the Podcasts app with welcome improvements on both the iPhone and iPad, particularly for viewing new episodes and controlling playback.\n\nThe main addition to the app is a Music-inspired mini player that is available at the bottom of the screen and that can be used to pause and play episodes, see what’s playing, and access options without going into the full-screen Now Playing view. The mini player has worked well for iOS 8.4’s Music app in that it provides a convenient shortcut for playback controls; on big iPhones, it’s easier to reach the mini player than to stretch your thumb to tap a Now Playing button at the top.\nThe mini player works well for Podcasts, too: it can be tapped or dragged to reveal a modal Now Playing view, which, just like Music, comes with a semi-translucent bottom half and a tappable artwork in the upper part that contains episode descriptions and show notes (what would be lyrics in Music). Like Music, Podcasts for iOS 9 offers queue management with Up Next/Play Next to build a list of episodes that can be controlled separately.\nThe sleep timer has been moved to the center of the Now Playing screen, sitting between a share icon and a contextual menu to save an episode or remove it from downloads, view its description, and share it.\n\nUnfortunately, Podcasts also borrows the bad parts of Music’s Now Playing view: there are two ways to share an episode’s link (with an obnoxious “Check out this cool episode” prepended message), the progress bar is too thin to be dragged comfortably on smaller screens, and the view doesn’t take advantage of the iPad’s big screen. 
Episode descriptions aren’t displayed next to an episode’s artwork or below it on the iPad, so you’ll still be tapping a large artwork in the middle of the display to show a small box containing text and links – not exactly a good use of the device.\n\nMusic-based changes aside, the second major addition to Podcasts is in the show organization department with a cleanup of the My Podcasts and My Stations views. In iOS 9, Apple has removed the ability to show all podcasts in a grid view and has rolled stations into My Podcasts. List view will be the only option in the app now, and any station will be available at the top of the list with a play button to start listening. You can still tap a station to view episodes inside it and tweak settings, which offer the same options as iOS 8.\nWith stations becoming part of My Podcasts, Apple freed up a slot in the app’s tab bar, which is now used by an Unplayed view that is my favorite change in this release. Like other podcast clients before it, Podcasts now defaults to showing a reverse chronological list (newest to oldest) of all episodes from all podcasts, grouped by day, regardless of their download status.\nSupertop’s Castro (left) and Podcasts on iOS 9.\nHeavily inspired by Supertop’s Castro, the Unplayed view of iOS 9 provides an easy way to view what’s new and unplayed without dabbling in stations and search. As a result, finding episodes to play is faster and their presentation is better, as episodes indicate what’s already been downloaded with an icon.\nA nice touch of this view is that, as you keep scrolling into past episodes, they will be grouped by extended periods of time such as This Month, Last 3 Months, Last 6 Months, and This Year instead of standalone weeks and days. Also, I like how you can search for text in descriptions – a great option for those who often remember specific episodes by links mentioned in them.\nTo cap it all off, Apple has refreshed individual show pages with more options as well. 
When a podcast has downloaded episodes on the device, a Saved tab will sit next to Unplayed and Feed to group all episodes stored locally. Podcast and storage settings can be accessed with a new gear icon, and you can choose to remain subscribed to a podcast without receiving notifications of new episodes from it.\nI love Marco Arment’s Overcast, and I find its many thoughtful details and listening features unparalleled in any podcast app for iOS. After using Podcasts extensively, though, I admit that I like its mix of simplicity and moderately advanced options. The Unplayed view is a terrific addition, and the Music-inspired design and interactions create a cohesive audio experience on iOS 9 that makes the two apps feel like part of the same ecosystem. Even when searching on the iTunes Store, iOS 9’s Podcasts displays a new tab to get individual podcast episodes, which is fairly similar to Apple Music in that you can start streaming or download right there from search.\nI’ve been listening to my favorite shows with the Podcasts app since June, and I’ve been fine, but I’ll probably go back to Overcast because of its audio effects and upcoming features. The temptation to keep using Podcasts and its system integrations – like Siri, or controls on Apple Watch – is strong, and I can safely recommend the app to anyone looking for a default podcast client. Apple’s work on Podcasts for iOS 9 is solid, and the app is a good option for everyone at this point.\nMail\nMail updates in iOS 9 are far from what I wished for, but there are two changes worth mentioning.\nSaving an attachment from Mail in iOS 9.\nYou can now add attachments to new messages from any document provider extension on your device, including iCloud Drive. 
To do so, you need to tap & hold in the message body to show the copy & paste menu and choose the new ‘Add Attachment’ option.10 This will show the document picker to select files from installed apps such as Dropbox and Google Drive, which will be inserted as inline attachments in a message.11 It’s another example of the idea that, in lieu of a traditional filesystem, extensible apps are the modern filesystem of iOS.\n\nThe second addition is Markup, a feature borrowed from OS X that brings the power of Preview annotations to iOS. Available for PDF files and images and exclusive to Mail, Markup lets you annotate a file when sending a message or reply with Markup to a file sent by someone else. To annotate a file, you can select it or place the cursor next to it and choose Markup from the copy & paste menu; for replies, you can use an action extension or hit the toolbox icon in the lower right corner of a Quick Look preview in Mail.\n\nMarkup works like the eponymous feature introduced in Yosemite last year: you can add text and shapes, choose between various colors and stroke sizes, add a magnification loupe with adjustable size and zoom, and add a signature. Two nice details that I appreciate: iOS detects freehand shapes and offers to replace them with a precise version; and multiple signatures can be added to Markup and they’ll be previewed in a menu so you can choose one. Markup signatures will also be synced with iCloud across devices.\n\nMarkup on iOS 9 is good and it works well for annotating images and documents, pointing out ideas and issues to someone over email. The feature has been built with collaboration in mind: annotations won’t be flattened onto the document after sending it, so you’ll be able to remove Markup annotations from another person, add your own, and send the document back with your notes.\nThe only problem with Markup, in fact, is that it’s too good to be limited to Mail. 
Apple needs to make Markup compatible with Photos; better yet, it should be an option in native Quick Look previews or an action extension for system-wide annotations. I wish I could use Markup anywhere, but I can’t.\n\nIn terms of minor enhancements, Mail’s swipe gestures have received new icons to more easily distinguish actions. In-app search has been slightly updated as well, with new search tokens that can be tapped to reveal additional filters. There’s also a progress bar that indicates when search is loading (unfortunately, it’s still slow for Gmail accounts).\nTap search tokens to reveal options.\nMail is a fine default client, affected by the same problems I’ve been covering for the past two years. Search inside the app is slow, there’s no way to make messages actionable with extensions, and the inbox lacks the smart organizational tools found in popular third-party clients such as Outlook, Spark, Inbox, and CloudMagic.\nMail for iOS is a desktop-class app, but that’s starting to become a liability. I’m hoping to see more iOS-first features next year.\nApple News\nApple News, the company’s vision for the future of Newsstand mixed with a response to Flipboard and Instant Articles, is launching this week in the U.S., U.K., and Australia. While I was able to try the service in Italy by changing my device’s region format to American, that’s proven to be utterly inconvenient12, and I’ve chosen to leave the job of reviewing Apple News to my Australian colleague Graham Spencer. I still have a brief thought on the app, though.\nAn example of the content recommended to me in Apple News.\nApple News lets you follow individual sources (websites) and topics; the latter is reminiscent of Zite (acquired by Flipboard). Supposedly, the app learns from your reading habits by monitoring what you read on-device, but it also syncs your data via iCloud for convenience. 
Thanks to machine learning, Apple News should, in theory, understand what an article is about, leading to further exploration of topics via tags, and it should also give you more articles to read in a Music-inspired For You section. The majority of websites in News work by sharing an RSS feed with the service (like MacStories does), which is reformatted to look nice in the app; others are composing special articles for Apple News with a native format that offers more control and customization.\n“Discover Card”?\nIn my experience, Apple News has been comically bad at recognizing topics from articles and it has provided me with recommendations that rarely piqued my interest. At one point I thought it was getting better, but that was just the luck of two interesting articles that appeared in For You. Unlike in Apple Music, you can only like stories in Apple News, with no way to say “I don’t like this”, and that’s proven to be a major issue for me. I can easily ignore 80% of the content recommended to me in the For You section, but there’s no way to tell the app about it.\nIt’s not like I haven’t tried to like News or use it every day. I have added dozens of blogs and topics to my favorites. I have read articles on both the iPhone and iPad, liked them, and shared them with people. I have explored topics and checked out native articles, which are pretty cool (but the examples were limited). In spite of my dedication, Apple News just didn’t get any better as a recommendation engine. And if I have to use it as an RSS reader, then I’m just going to keep using NewsBlur, which gives me more control over my sources.\nAh, yes, interactive media.\nI am an RSS power user, but the problems I noticed in Apple News aren’t minor annoyances that only people who subscribe to 200 feeds will notice. You can’t reorder items in the Favorites view. You cannot teach the app what is good and what’s not interesting. 
You come across awkwardly computer-generated topics such as Central Processing Unit, Mobile App, Interactive Media, and the all-encompassing Business, which is often complemented by a cover photo of an American businessman with a peculiar hairstyle. The app doesn’t use Safari View Controller for viewing articles on the web, which means that Content Blockers aren’t supported either.\nThere may be a market for Apple News, but this first version feels too unfinished, slow, cluttered, and computer-y for me to take it seriously when it comes to my daily news workflow. I suppose you could appreciate Apple News as a way to browse a few favorite sites and topics in a simple, visual fashion, and I continue to be intrigued by the Apple News Format, which I’ll experiment with for MacStories. But for now, I’m back to my trusted NewsBlur.\n\nNotes\nWhenever Apple announces new features and improvements for iOS’ built-in Messages app, they like to brag about its status as the “most used” app on the iPhone and iPad. While that’s probably true given the popularity of iMessage and the importance of mobile messaging in our lives, I wouldn’t be surprised to see another Apple app as the runner-up in a daily usage chart: Notes.\nEverybody takes notes, but the concept of a “note” varies widely from user to user. A note can be exactly what the name says – a short text annotation jotted down for later – but it can also be a list of things to remember, a collection of products to buy, reference material, and more. The versatility of apps and data types supported by iOS has spurred the creation of an entire ecosystem of note-taking apps that can serve different purposes. There are apps to save text as notes, locations as notes, images as notes, and even create notes you can share with others or automate for yourself. 
In the age of iOS, a note is more than text.\nFor all the third-party apps that promise superior management of notes, though, I’m willing to bet that Apple’s pre-installed Notes app takes the crown as the note-taking app used by millions of users on a daily basis. And unsurprisingly so: the Notes app offers basic formatting and note creation functionalities that most people are okay with, and the integration with the system (namely through Siri) and cross-platform availability via iCloud make it a good-enough choice for the average iOS user. I couldn’t use Apple’s Notes app for what I needed to do in the past year with MacStories and Relay FM, but I understood why most of my friends were perfectly content with it. In spite of its awkwardly retro interface, Notes is dependable.\nWith iOS 9, Apple has taken aim at note-taking apps that allow users to expand notes beyond text and is supercharging its Notes app with brand new features that make it a serious player in the game and a better option for all users. While it was Messages’ turn to receive an overhaul with iOS 8, Notes is getting much-deserved attention this year, with some surprising and unexpected results. 
I switched to Notes full-time with the first beta of iOS 9, and I don’t see myself having to use another note-taking app any time soon.\n\nThe first visible change in the new Notes app is the ability to organize notes in folders13 and, like the Photos app, access recently deleted notes for 30 days with an option to restore them.\nThe organizational revamp is made possible by Apple migrating from the old, IMAP-based backend of Notes (which relied on an email protocol to sync notes across devices) to a modern, faster CloudKit-enabled infrastructure that gives the company more control and flexibility.\nWhile it’s still possible to sync Notes with an email account configured in the iOS Settings, the ability to organize notes in folders is only exposed to users who set up Notes in local mode (no sync) or iCloud. It’s safe to assume the latter will turn out to be the most popular option: once migrated to the new Notes app, iCloud accounts will be able to create folders and keep them in sync between devices – which is obviously not available with local mode.\nThe ability to create folders for notes is hinted at in the main screen of the app: named ‘Folders’ and featuring a ‘New Folder’ button at the bottom of the account list, this is where you’ll be able to create a new folder, give it a title, rename it, and delete it when it’s no longer needed. In the Folders page, accounts are grouped by name and they list each folder contained inside them. In this review, I’m going to use iCloud as the main example as it’s what I’ve been using for the past three months and what I believe the majority of iOS users will upgrade to after iOS 9.\niCloud puts every note in the ‘Notes’ folder – the default destination for new notes created via Siri (this can’t be changed as there’s no setting for a different default folder) and a top-level folder that can’t be deleted. Notes can be moved across folders by swiping on an individual note and revealing a Move menu. 
The interface for this is simple enough and gets the job done, but it lacks the polish of Mail’s swipe menu.\nFrom an organizational perspective, folders in Notes are likely to serve most users sufficiently well. I’ve created folders for Home and MacStories, and I found myself being okay with the ability to have notes in distinct places and access them with one tap from the main screen of the app. For the average iOS user who relies on Notes for short bits of text, folders will be a small revolution – and yet another case of iOS users deriving the greatest joy from the simplest features adopted from OS X.\nThis doesn’t change the fact that folders in Notes will be too limiting for advanced users who are accustomed to deeper management in alternative note-taking apps. Most notably, Notes’ search feature (available by swiping down on a folder to reveal a search bar) can’t restrict search to a single folder, and even when looking for a string of text that belongs to a note in a folder, the app will report the match as belonging to the top-level ‘Notes’ folder.\nPerhaps more perplexingly, Notes’ search always searches every note across every folder and every account. Typing into the search bar of the general All iCloud “folder” (a filter that is created by default and that collects all notes from all folders in iCloud) has the same effect as searching in a folder that only contains a subset of notes.\n\nWhile I understand why Apple may not want to put advanced search options in the search bar, starting a new search from inside a folder should at least attempt to limit the scope to that folder. The ability to view recent searches and limit search to the current account somewhat helps in retrieving specific notes more quickly, but the aforementioned misreporting of the source folder in search only adds insult to injury for users who are going to keep several dozen notes. 
This is only one of the many issues with Notes for users who would like to do more with the app.\nIn the grand scheme of things, where you can move notes is, after all, one of the less significant changes in the Notes app. Apple has put great emphasis on what you can do with Notes in iOS 9, and that’s where the update feels most impressive – in some cases, even when compared to third-party alternatives.\nA note in iOS 9 can contain images, lists and checklists, sketches, link snippets, files, and more. The new Notes app wants to be more than a single-purpose container of text – it aims to become an everything bucket for the iOS user who doesn’t want to forget anything. This is accomplished with a refreshed set of controls and system integrations, with a few missteps along the way but, overall, with a new direction for the app that feels like the right move at the right time.\nFormatting\nSince version 5.0, iOS has provided system-wide formatting controls for text in the copy & paste menu to make text bold, italic, or underlined in any app that supported rich text, including Mail and Notes. In iOS 9, Apple is taking a page from its iWork suite (specifically, Pages – no pun intended) to offer a brand new formatting view on the iPhone and iPad that considerably extends the text style options available in Notes.\n\nOn the iPhone, new formatting controls are revealed by tapping a “+” button above the keyboard that turns into a special row with a cute animation reminiscent of Dashboard’s +/x transition of yore.14 This bar contains additional controls for checklists, photos, and sketches (more on this in a bit); on the iPad, it’s integrated as part of the new Shortcut Bar that features customizable shortcuts on both sides of the keyboard’s QuickType bar. 
On both devices, the Formatting view is accessed by tapping the “Aa” button next to checklists.\nFormatting options in Notes include Title, Heading, Body, three types of lists (bulleted, dashed, and numbered, which can be indented), plus shortcuts for bold, italic, and underlined text. On the iPhone, these controls are displayed in the lower half of the screen as in Pages, with a scrollable list of styles and visible shortcuts for bold, italic, and underlined. While it’s fairly apparent that Apple modeled this screen after Pages and the old Google Docs app, there are some key differences.\nFormatting controls in Pages and Notes.\nFirst, Notes doesn’t have the same wealth of controls available in Pages and Docs – Apple doesn’t believe users should be able to tweak the font size of body text or the line spacing in Notes, which is meant to be a one-size-fits-all note-taking app for quick interactions, with no concept of custom layout whatsoever. Second, Apple seems to have learned from its mistakes and put the formatting button towards the bottom of the UI (just above the keyboard) instead of opting for the title bar as they did with Pages. It’s clear that Notes has been rethought for the age of big screens, while Pages shows the last vestiges of a pre-iPhone 6 era. The result is that accessing formatting controls in Notes on an iPhone 6 Plus is easier than trying to do the same in Pages.\nOn the iPad, formatting controls are displayed with a popover floating on top of the Shortcut Bar.\n\nWhile bold, italic, and underline still require text selection to change the appearance of a text string, making some text a heading or a list can be done by tapping next to it and picking an option in the formatting list without selecting it first. In the months I’ve spent using Notes for research and personal notes, this has dramatically sped up the process of going from a list of plain text lines to a nicely formatted note with a clear structure. 
In the new Notes, I can start typing to get thoughts out of my head, then open formatting, tap to place the cursor next to lines I want to make different, and tap away on titles, headings, and lists to make something more relevant or structured.\nNotes may not have all the formatting options of Pages, but that’s the point. By not being too complex, Notes can appeal to users who don’t want or need Pages but who would also like the ability to mark up notes easily.\nChecklists\nSomething I’ve always noticed when taking a look at how people I know in real life use iOS is that a vast number of them use the Notes app as a todo system. In spite of iOS having its own Reminders app with support for alerts and geofences, a lot of people jot down things they have to do or remember in Notes. With iOS 9, Apple is catering to this use case with the ability to create checklists.\nThe implementation of checklists is straightforward: a new button next to formatting controls allows you to start a checklist or convert selected lines of text into one. As you’re adding items to a checklist, Notes offers automatic list continuation and it can also convert other types of lists into a checklist. You can check off items in a checklist, which gives you an indication of things that have been completed.\nThis list may or may not be real.\nNotes doesn’t treat checklists the way a todo app treats lists and tasks. There’s no concept of dates or reminders in Notes. It’s not a smart database that remembers completed items when you convert back and forth between formats and styles. Checklists are just another formatting option in Notes with a stronger visual cue that makes text lines look like a todo list even if, after all, it’s just text with a checkbox.\nThe key difference to keep in mind is that Apple isn’t seeking to replace Reminders with Notes in iOS 9. 
Checklists in Notes can’t be given dates or any type of task-related metadata – if you want to organize your todos in proper lists with alerts and sharing settings, you’ll still need Reminders.\nThat is precisely why I believe checklists are such a clever, cunning idea. Apple may not be looking to replace Reminders directly, but a lot of people are going to be ecstatic about the addition of checklists in Notes. Those people already use the app to save things they need to act on, but until iOS 9 they’ve saved them as lines of text that later needed to be manually selected and deleted. This may sound absurd to tech-inclined folks (myself included) who often use multiple dedicated reminder and task management apps, but the reality is that millions of people don’t need the overhead of our systems.\nI’d be willing to bet that a lot of folks don’t want or have to attach metadata like priorities, locations, dates, and notes to tasks; they just want to type them out and get to them eventually. What better system than an app where there are no strict format requirements and where a note can be an image, a list of rich text, a drawing, or an interactive checklist?\nFor those people, I’d argue that the richness of Notes in iOS 9 will be superior to Reminders, with checklists being the epitome of Apple adjusting to unexpected use cases and the way people use their apps. By definition, something that needs to get done should be saved in Reminders, even without a date. 
But if millions of customers prefer to mix and match notes with text that loosely represents a todo and if that system can scale to incorporate nicer formatting for todos alongside other media, then it’s only fair to make Notes more versatile and yet easier to use than Reminders.\nImages and Sketches\nNotes’ improved text abilities are complemented by a set of image and video-related features aimed at letting users capture more types of information.\nPhotos and videos can be interspersed between text and lists in Notes for iOS 9: at any point during editing, you can tap the camera button to choose an item from your photo library or take a new photo or video. Whatever you pick will be displayed inline between text and other media in a note, so you can tap on a video to play it back inside Notes or tap an image to view it in full-screen.\n\nNotes is smart enough to reformat text when an image is inserted (for instance, a checklist is discontinued if you insert an image after some text) and you can also paste images copied from other apps – thus making Notes an ideal companion for Safari when you want to reference images from the web without saving them to Photos.\nTo keep things simple, Notes doesn’t have any sort of image resizing or text reflowing options to build more complex layouts as you can in Pages and other word processors. Again, Apple’s goal is to offer a more powerful note-taking app and not necessarily a slimmer Pages, so this makes sense to me. 
I didn’t miss such options in my tests, and I like that I can’t go wrong by inserting a bunch of pictures alongside text in a note.\nMessages’ photo picker (right) is a superior implementation.\nAlas, the image picker itself leaves a lot to be desired: rather than adopting the useful picker found in Messages (which displays recent media in a swipeable tray), Notes comes with a dull menu to open a picker or take a new picture.\nMore interestingly, Apple has built a sketching mode into Notes, enabling users to mix text, media, and other content with interactive sketches that can be updated at any time and exported as images.\nSketching is accessed by tapping the drawing icon in the bottom toolbar, which will kick Notes into a drawing screen that offers a pen, a sharpie, a pencil, a ruler, an eraser, and a color picker. A drawing’s background defaults to the same paper texture used throughout the app, and pages can be flipped horizontally or vertically. Sketches can be shared with other apps by tapping the share icon in the top right, which will export a static image.\nYes, I am an artist.\nWhen playing around with sketches in Notes, you’ll likely notice two things: the limited tools available when compared to standalone apps like Paper, and the solid performance of drawing on screen in Apple’s app.\nSimplicity shouldn’t come as a surprise: the entire app hinges on the idea of enriching a traditional note-taking environment with just enough more stuff, and, overall, Apple is doing a pretty good job at covering the basics. Performance, though, is a whole other topic.\nIn any form of interactive video output, latency is an issue to consider. Whether it’s controlling videogames with buttons or touching an iPad’s screen to tap something, the relationship between humans and software depends on latency – the delay in how long it takes for input to translate to output. 
And because our fingers are better input methods than we tend to believe, even a slight increase in latency can disrupt the user experience once the human eye is able to discern a visible delay.\nEver swiped quickly on a multitouch display and noticed virtual ink struggling to catch up with your finger sliding across the screen? That’s latency.15\nVisual latency can be ascribed to various factors, but most notably in modern software, CPUs are to blame for the delay we observe between our actions and the expected result. With iOS 9, Apple has set out to drastically reduce latency to make apps more responsive, cutting down the amount of time required to compute user touches and render them on screen. This initiative – which comes with new APIs for developers – will have the biggest impact on games (which are heavily reliant on fast multitouch gestures) and drawing apps – not to mention the upcoming Pencil for iPad Pro.\nOne of the downsides of drawing apps on iOS (and particularly on iPad) is the noticeable delay between swiping and seeing ink come up on screen. While developers have written entire custom engines dedicated to making ink appear as naturally as possible, the laws of physics and intrinsic iOS limitations have made it nearly impossible to replicate the feeling of a real pen on a multitouch display.\nApple’s goal with iOS 9 isn’t to make drawing on iOS exactly like using a physical pen, but to get very close to it.\nApple is introducing Touch Coalescing, an API that leverages the iPad Air 2’s twice-as-fast 120Hz touch scan update rate to double the number of touches registered by the OS and therefore the touch information exposed to apps. Thanks to the higher touch scan rate of the Air 2 (other iOS devices can scan for touches at 60Hz), iOS 9 can accumulate twice the number of touches per second, but also coalesce those updates without wasting computational work in an app. 
Coalesced touches enable a single frame on the iPad Air 2 to scan for two touches, which are available to developers on demand via an API.\nBut that’s not all. On top of doubling the number of touches iOS can recognize on each refresh, Apple has built a predictive touch engine that can look into the future of user touches and guess where a user’s finger may be going next. Using some highly tuned algorithms, iOS 9 can provide developers with a fresh set of predicted touches all the time, which can be used to further decrease latency as they’re added to the iOS graphics pipeline, preceding the work needed to scan for touches, animate them, and pass them to an app. Built into UIKit, predicted touches are independent from coalesced touches, but together they can be used to make latency on iOS even lower to get closer to the idea of direct manipulation and fast performance.\nBy Apple’s estimates, the work done on iOS 9 has allowed input recognition to go from four frames to a frame and a half. If the above paragraphs are too technical: this is a massive performance improvement for drawing apps and games on the iPad Air 2, and it bodes well for the iPad Pro and Pencil accessory.\nDrawing in Notes for iOS 9 is fast, smooth, and natural. As you swipe across the screen using the pen tool or the pencil, ink renders smoothly (this is the only area where the app’s paper texture is justified) and, more importantly, animates quickly and follows the tip of your finger unlike any other drawing app. In testing Notes drawing on my iPad Air 2 and iPhone 6 Plus, I noticed no visible difference between the two, with solid performance on both devices in terms of rendering speed and animations. What Apple has done for advanced touch input in iOS 9 can be noticed when drawing in the Notes app, even if saying “it’s fast” doesn’t do justice to the fascinating complexity behind it.\nAs a user, it seems clear that drawing in Notes isn’t aimed at artists. 
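To put those figures in context, here is a quick back-of-the-envelope sketch. It only restates numbers already cited in the text – a 60Hz display refresh, the Air 2’s 120Hz touch scan, and input recognition going from four frames to a frame and a half – and converts them into milliseconds; none of this is Apple’s code, just arithmetic.

```python
# Illustrative arithmetic for the latency figures cited above, assuming a
# 60 Hz display refresh rate and the iPad Air 2's 120 Hz touch scan rate.

DISPLAY_HZ = 60       # display refresh rate (frames per second)
TOUCH_SCAN_HZ = 120   # iPad Air 2 touch scan rate (other devices: 60 Hz)

def frames_to_ms(frames, hz=DISPLAY_HZ):
    """Convert a number of display frames into milliseconds."""
    return frames * 1000.0 / hz

# Touch samples accumulated per display frame: two on the Air 2,
# which is why coalesced touches arrive in pairs per frame.
samples_per_frame = TOUCH_SCAN_HZ // DISPLAY_HZ  # -> 2

# Input recognition latency, before and after the iOS 9 work.
before = frames_to_ms(4)    # ~66.7 ms
after = frames_to_ms(1.5)   # 25.0 ms

print(f"{samples_per_frame} touch samples per display frame")
print(f"input recognition: {before:.1f} ms -> {after:.1f} ms")
```

In other words, the four-frames-to-one-and-a-half improvement shaves roughly 40 milliseconds off every stroke – more than enough for the eye to notice.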
Notes doesn’t want to replace Paper: rather, sketches are used as complements to text and images, useful on those occasions when the human finger can express shapes and ideas that would take too long with apps and images.\n\nThe ruler is an obvious candidate for education and for anyone seeking to use an iPad to plan future house redecorations: once placed on the screen, the ruler can be rotated with two fingers to tilt its angle, then you can pick a tool and swipe across it to draw a straight, precise line that wouldn’t be possible without it. It’s an incredibly fun and reliable implementation and one of Apple’s finest details in iOS 9.\nSketches in Notes are a good example of why Apple likes to control the entire stack. If the company weren’t in charge of every single aspect of its hardware and software, it wouldn’t have been able to optimize iOS to take advantage of the display of the iPad Air 2 and build a predictive touch engine aimed at reducing latency.\nWhen you control every facet of the experience, you can focus on seemingly unimportant aspects of software such as going from four frames to one and a half for input recognition, even if most users only want to produce crummy drawings in Notes. And that’s okay, because knowing how that works and why it performs so well is part of the fun in these articles, and I’m excited to see what developers do with it.\nShare Extension and Links\nApple’s willingness to turn Notes into iOS’ everything bucket is perhaps best exemplified by its new share extension. From anywhere on iOS, you can now capture text, links, and files and save them into a new note or, even better, append them to an existing note. This is, alongside iPad multitasking, one of my favorite features of iOS 9, and it has allowed me to drop several workflows I built for iOS 8.\n\niOS 9’s Notes extension lives in the share sheet, and it lets you capture anything you come across that can be shared by an app. 
The extension is a floating popup that carries the same paper texture as the Notes app, with the same yellow UI accents and letterpress effect on text. The Notes extension defaults to saving shared items in a new note, but you can tap the “Choose Note” button at the bottom to pick the existing note you’d like to save something into. The extension will also remember the last note you saved an item into if you bring it up again a few seconds later.\nIf you’ve used extensions such as Evernote’s or 2Do’s since their debut last year, you’ll be familiar with the thinking behind the Notes extension. Any text, URL, or file you can share on iOS through the native share sheet can be passed to the Notes extension, which will preview it inline whenever possible. If you’re sending text to the extension, it will prefill the Notes sheet; an image will get a thumbnail preview on the right; a web link will get a nice snippet preview with a title and the first image found on the webpage.\nThere are two ways the Notes extension will attempt to save content in a note. For data that can be rendered inline such as videos, images, and text, the extension will either show an editable text field (text can always be edited manually upon saving through the extension) or a thumbnail preview. You can try this out by saving a photo or a video from Photos, or some text selected from Mail via the new Share button in the copy & paste menu: both media and text will be tappable or editable in a note – as if you created them from Notes in the first place.\nAlternatively, Notes will save links or files it can’t render inline as small units of content that appear as standalone, tappable items in between body text. In some cases, tapping these note attachments will show a Quick Look preview, or it’ll open Safari, or – and this is where the extension and the app get more creative – it’ll adjust the UI to preview the attached content with native media controls.\nLinks\nLet’s start with web links. 
For me, saving links from apps like Twitter clients or news readers accounts for one of the activities I perform the most on a daily basis on my devices. I save links because I want to cover them on MacStories, or because I need to share them with someone, or perhaps they’re reference material I’ll have to find again in the future. The apps I use to manage this daily avalanche of links tend to treat them for what they are: hyperlinks that open Safari. This is the case for apps like 2Do, Messages, Mail, NewsBlur, and countless others. The richness of the web doesn’t apply to its resource locator, which is just a link.\nNotes’ extension doesn’t work like that. In the new app, Apple has devised a way to offer a basic preview of the information available at the source URL with rich link snippets that display the associated webpage’s title, description, and lead image. Web link previews in Notes are further proof of Apple’s commitment to web metadata technologies such as Open Graph16, and they provide a fantastic way to give meaning to a URL.\nHey, Underscore!\nWhen saving a link with the Notes extension from any iOS app capable of sharing URLs (it doesn’t have to be Safari), the extension will fetch the link’s metadata and display it in the compose sheet. This is a good way to preview URLs without opening a web view: if you’re in, say, a Twitter client and want to know what a link is about without giving the website a page view, you can send the link to the Notes extension and it’ll fetch the webpage’s title and description for you.17\nThe extension’s parser is capable of following multiple domain redirects, too: if you give the extension a shortened URL, it won’t simply display the shortened URL’s domain – it’ll follow every redirect until it can preview the final webpage’s information. 
In my tests, the extension took less than a second to present a link snippet for unfurled links, but it could take a couple of seconds for shortened links with multiple redirects (such as Bitly or Buffer links).\nAs far as I know, no other note-taking app on iOS offers a smart web capture feature that can parse a link’s metadata to give more context to a link. Apple’s implementation could have used an indicator to show when Notes is trying to parse a link’s title and thumbnail preview18, but, overall, it’s an invisible, it-just-works kind of feature that performs admirably.\nI’m in love with the way the Notes extension saves links. As a writer on the web, the link is my currency, but sometimes its value can’t be easily assessed because URLs are fundamentally meaningless. With the Notes extension, I can assemble a note with a bunch of links and be presented with a series of small previews that have titles, two-line descriptions, and image thumbnails. And because the Notes extension can create new notes or append content to the bottom of an existing note, I can keep my lists of links as separate notes where web previews are saved in chronological order, without formatting issues, without having to create complex workflows to manage it all.\nThis seemingly minor addition fixes a serious annoyance of mine. Every week, I collect links in separate lists for this site, our MacStories Weekly newsletter for Club MacStories members, Connected, and Virtual. Until a few months ago, I used to go through each list to evaluate the importance of each link, which usually meant reopening it in Safari to recall what it was about. This was a time-consuming process, especially because apps like Evernote would sometimes fail at opening a link in Safari when tapping it.\nWith Notes, I now keep lists of links in the main Notes folder and going through the list simply involves taking a look at a link preview, then long-tapping to delete or copy. 
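Apple’s parser is private, but the Open Graph technique mentioned above is easy to sketch. The following example – a hypothetical illustration using only Python’s standard library, not Apple’s implementation – extracts `og:title`, `og:description`, and `og:image` from a page’s HTML, the three pieces a rich link snippet needs. The sample page and its values are made up.

```python
# A minimal sketch of Open Graph metadata extraction -- the kind of
# technique a rich link preview can be built on. Illustrative only;
# Apple's actual parser in Notes is private.
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects <meta property="og:..."> tags from an HTML document."""

    def __init__(self):
        super().__init__()
        self.metadata = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property") or ""
        if prop.startswith("og:") and "content" in attrs:
            # Keep the first occurrence; repeated tags are usually noise.
            self.metadata.setdefault(prop, attrs["content"])

def link_preview(html):
    """Return (title, description, image) for a rich link snippet."""
    parser = OpenGraphParser()
    parser.feed(html)
    md = parser.metadata
    return md.get("og:title"), md.get("og:description"), md.get("og:image")

page = """
<html><head>
<meta property="og:title" content="iOS 9: The MacStories Review">
<meta property="og:description" content="A deep dive into iOS 9.">
<meta property="og:image" content="https://example.com/cover.png">
</head><body>...</body></html>
"""
title, description, image = link_preview(page)
print(title)  # iOS 9: The MacStories Review
```

A real implementation would, of course, first fetch the HTML over the network and chase any redirects – which is exactly where shortened URLs pick up the extra second or two noted above.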
Thanks to the built-in link previews, my workflow has been reduced to a nimble visual reassessment and a tap & hold, which is better than what I used to do to process links.\nI didn’t think having a preview of a link’s title and a thumbnail would do much, but, in practice, the way Notes presents links saved from the extension is superior to other note-taking apps for iOS in every way. I’ve been spoiled by web link previews in Notes, and now I want them everywhere.\nAnd that’s exactly the problem: this new behavior is exclusive to Notes, and specifically to the Notes extension. If you copy a link from, say, Messages (which, like other Apple apps, still doesn’t display the system share sheet for tap & hold) and paste it into a note, it won’t be expanded to become a rich snippet – it’ll be an old-fashioned tappable URL. Similarly, while Apple has built a way to better preview URLs by attaching visual metadata to them, this system isn’t used across other Apple apps on iOS 9, which are still limited to displaying links as URLs with no extra information.\nThere’s a clear winner here.\nAs services such as Twitter, Slack, and Facebook have shown, there’s value in presenting URLs as cards of content that push information from the web to the user. Apple is thinking along these lines with rich snippets in Notes and Spotlight for iOS 9, but while the results are commendable, such effort isn’t consistent throughout the OS. I’m hoping iOS 10 will offer new APIs to present URLs as rich snippets like Notes does today.\nFiles\nThe other benefit of a smarter, more versatile Notes app is a wider array of options for saving attachments into it. Images and videos aside, the Notes extension is capable of accepting any file that can be shared via the share sheet on iOS 9, making it an intriguing solution for folks who have relied on database-style apps such as OneNote and Evernote for similar workflows. 
I have some reservations about Notes’ attachment handling, which ranges from “very good” to “mysteriously half-baked”.\nAttaching files to Notes.\nIn theory, you should be able to send any file to the Notes extension and choose whether you want to create a new note for it or append it to an existing one – all while retaining the ability to add a text comment from the share sheet. Files saved to Notes via the share extension will appear as attachments: units displayed inline within a note, with filename, size, and a thumbnail preview whenever possible. In spite of Notes’ rich engine, some file types will be rendered as Quick Look attachments that need to be previewed in a separate modal window. A .txt file won’t be attached as text, but as a .txt icon you have to tap to view plain text in Quick Look; PDFs will be previewed in the body of a note, but you’ll have to tap them to swipe through pages.\nWhat’s even stranger is the inconsistency of the attachment/preview system, with fantastic little touches pitted against glaring omissions. You can search for text contained in PDFs saved in Notes, but you can’t use Markup from Quick Look previews. iOS has an underlying engine to render rich text consistently between apps, but selecting some formatted text in Mail to share it with the Notes extension will strip all formatting and save it as plain text in the app.\n\nThen, you come across voice memos, which can be saved into Notes as attachments and that have a Play button that transforms the app’s title bar into a media player with playback controls.\nWait, what?\nI understand why Apple may want to rely on extensions to extend Notes beyond its advertised capabilities. 
In avoiding buttons to attach voice recordings, documents, and other files from apps, Apple isn’t only ensuring the Notes UI isn’t too cluttered (unlike Evernote) – they’re also reducing the potential cognitive load of having to know what all those buttons do.\nThis argument doesn’t excuse Apple from ensuring some basic consistency for Quick Look previews in Notes, or from properly teaching users that they can save a variety of file types into the app. How should a college student know that a voice memo can be saved (and played back) inside Notes? Why do PDFs come with different preview features in Mail and Notes?\nApple has done good work with the Notes extension in iOS 9. It’s been a fantastic addition to my workflow for saving links and appending content to existing notes. But if Apple truly wants to make Notes a versatile repository for all kinds of user content without an overbearing UI, the extension and Quick Look frameworks need to be reworked to always maintain formatting and offer better previews for files inside Notes. Today, you can save files to Notes, but its previews are lacking in several ways.\nAttachments Browser\nWith Notes taking on new attachment-handling duties, Apple has chosen to give the app an additional view to browse all attachments saved across all folders. Accessed by tapping the grid icon in the bottom left corner of the notes sidebar, the Attachments Browser lets you view a cluster of photos and videos, sketches, websites, audio clips, and documents in a single screen.\n\nYou can tap each item to preview it and go to its associated note, or you can tap & hold it to go to the note directly. The main screen displays the most recent attachments for each category, with a ‘See All’ button on the right to view the full grid in a separate view.\nI have mixed feelings about the Attachments Browser. 
On one hand, it’s the perfect showcase for Notes’ attachment abilities, as it can cull non-text items from all notes and present them in a view that brings them front and center. On the other hand, that’s also its biggest downside: you can’t view attachments per folder – you can only view all attachments from all notes. If you’re the kind of user who adds a lot of images to notes and would like a way to filter them by folder, you won’t be able to do that in this version of Notes.\nThe Attachments Browser also shares the same limitations as the extension when it comes to file types it doesn’t understand: .zip archives saved from other apps will be categorized under ‘Documents’, for instance. On the other hand, audio clips are playable from the main view of the browser by tapping a large Play button (cool) and links can be opened in Safari with a single tap.\nIn spite of its shortcomings, the Attachments Browser is a clever addition to Notes. While the entire app relies on blurring the difference between text and non-text content for a seamless experience, the Attachments Browser lets you isolate everything that isn’t text – the content you can interact with.\nNotes as Rich Documents\nThe few issues I have with Notes’ search and the extension don’t change my overall take on this update. With iOS 9, Notes isn’t just a powerful alternative to third-party note-taking apps – it joins Safari on the podium of Apple’s best work on iOS, period.\nI’ve been an Evernote user for years. I’ve often talked about the service’s adaptability to rich text and file attachments. I’ve relied on integrations with external apps through the Evernote API. I’ve shared notes and entire notebooks with others. Notes doesn’t have any of this. There’s no API for third-party apps and services to plug into; no sharing of notes and folders, not even with family members; no sorting options, no tags, no subfolders. From a power user’s perspective, Notes is the wrong choice. 
So why all this enthusiasm?\nAt some point, consistency and reliability trump automation and feature richness. Should I use something that offers the potential benefit of dozens of features, or am I better served by an app that covers the basics elegantly, works as expected, but has very clear limitations for automation and advanced use cases? Do I like the thought of power user features in my notes more than their actual practicality?\nAfter years of Evernote changes, feature additions, and Work Chat prompts, the simplicity of Notes is refreshing. It doesn’t cover many aspects of what Evernote is capable of, and many will be perfectly happy to stick with Evernote because they truly need all of its features.\nBut I don’t. In using Notes for iOS 9, I realized that I’m okay with the ability to intermix rich text and images, file attachments and sketches, all while taking advantage of one of the best share extensions on iOS, Siri integration, Spotlight search, and multitasking on iPad. All the workflows I created to append links to a note pale in comparison to the effectiveness of Notes’ extension and link previews. Evernote’s sync can’t be as fast or frequent as Notes’ iCloud backend. The bloat that Evernote accumulated through the years has been replaced by a basic yet powerful note-taking app that does everything I need, and I feel relieved knowing I no longer have to fight Evernote’s tendency toward more. This is all there is, and it’s okay.\nThat’s not to say Notes can’t get better. Besides the aforementioned problems with search, the extension, and file attachments, I miss a way to pin specific notes at the top of my list, or to sort them alphabetically when I want to. Siri can’t delete notes, which I don’t understand. 
The entire app is still stuck on a paper texture and a letterpress text effect that make sense for sketches but, like Reminders, no longer have a reason to exist on iOS and sometimes make text harder to read.\nIt’s been three months since I started using Notes on iOS 9, and I can’t imagine going back to any other app for my needs, which involve rich text, images, links, and documents. As users increasingly rely on iPhones and iPads as their primary (and often only) computers, the decision to turn Notes into a central location for all kinds of content was a good one. Notes on iOS 9 is an extremely intelligent, focused, and useful update.\n\niPad\nAfter years of little attention paid to the user interface and features of iOS for iPad, Apple wants to correct its course with iOS 9. A combination of the OS’ maturity and a willingness to reignite interest in the platform amid declining sales, Apple’s initiative encompasses app interaction, multitasking, text input, and external keyboard integration.\nThe result is the most important iOS release for iPad to date, as well as a stepping stone for the future of the device as an everyday computer.\nGrowing Pains\nWhen the iPad launched in 2010, few in the tech press knew what to make of it. If it was a tablet, why did it run iPhone OS instead of a desktop OS? Was it a big iPod touch or a small Mac?\nAs it turned out, the preoccupations of tech bloggers were the very factors that contributed to the record-breaking first years of the iPad. It was a bigger iOS device that ran familiar software specifically designed to make you feel like you were holding and using a physical object. The iPad could be a book and a newspaper. A calculator on your desk and a portable typewriter. An agenda. A diary. By design, the full-screen nature of apps on the iPad had been engineered to convince you of one simple truth: This device can be anything. 
And because millions already knew how to use it thanks to iPhone OS, it did offer something for everyone.\nThe biggest problem that has affected the iPad in the past three years stems from Apple itself. After the launch of iOS 6, the company began a long and tortuous journey towards a new identity for iOS. During this period, the iPad got the short end of the redesign stick: while Apple was busy rethinking the core structure and visual appearance of the iPhone, the iPad got unimaginative adaptations and other UI leftovers.\nThree years after the iPad’s launch, Apple didn’t seize the opportunity to make iPad features and apps unique and tailored to the platform. They just scaled them up. The same consistency that was a smart move in 2010 didn’t make much sense in 2013 after iOS 7 and the chance of a fresh start.\niOS 7 wasn’t just a visual disappointment for iPad users who were craving attention. From a functional standpoint, the iPad had evolved into an appealing computer replacement for many, albeit with too many compromises. Tasks that were trivial on a PC were too difficult, if not downright impossible, on an iPad. iOS apps were unable to communicate with each other. Apple had ushered users into the post-PC era with the original iPad and then left them halfway there.\nOn the iPad, iOS 7 felt like a rushed conversion that had forgotten about the promise of a revolution.\nBig changes, however, often come in small doses. With last year’s iOS 8, we caught a glimpse of what Apple’s thought process might have been: if iOS 7 laid a new visual foundation, iOS 8 was going to spread a stronger technical layer on top of it. We witnessed how Apple was willing to modularize the concept of the app – the long-sacred silo – into multiple functionalities and screen sizes connected by a common, secure thread. 
iOS 8 came out as the yin to iOS 7’s yang: freed of their (sometimes forced, frequently derided) photorealistic appearance, apps were granted out-of-sandbox permissions, too.\nIt’s not uncommon to rely on hindsight to understand the iterative changes behind Apple’s products. iOS 7 brought a new, subdued look. iOS 8 introduced a framework to extend apps. These are not features designed in a vacuum.\nExtensions make more sense with a design language that focuses on color rather than heavy textures and 3D graphics. Imagine if all your apps still looked like distinct objects and you had to interact with panels of leather on top of wooden backgrounds, metal slates, and paper sheets. Similarly, consider the new iPhones and iPads: without a design that eschews pixel-perfect object recreations, many developers would have to target new screen sizes with bitmap graphics that take time away from actual app development.\niOS 7 and iOS 8 were deeply intertwined, two sides of the same coin that Apple revealed in the span of a year. In the iPad’s case, they still weren’t enough to complete the vision of what Apple had in store for the future of the device.\nBut as they say: third time’s the charm.\nThe Importance of Being iPad\nApple’s big bet on the iPad with iOS 9 involves deep changes in multitasking and productivity enhancements that are both obvious and unexpected. To understand the gravity and consequential paradigm shift of these new features, it’s important to observe the iPad’s role today and reflect on why Apple is turning its attention to the device now.\nThe iPad in 2015 is an incredible computer at the top of the line, powered by a more flexible OS that still struggles to accommodate some basic use cases and workflows. This is key to understanding the changes Apple is bringing to the iPad this year. 
Everything new in iOS 9 for iPad ultimately comes down to this idea:\nThe iPad is a computer in search of its own OS.\nAs I noted in my review, the iPad Air 2 is a dramatically faster and more capable iPad than older generations, to the point where it’s fair to wonder why such power is needed at all.\nIn the same product line, though, lies the ever-surviving iPad 2, a second-generation device released in 2011 that can still run the latest version of iOS. The longevity of iPad hardware and Apple’s policy of supporting old devices with software updates has created a curious dichotomy for the company: the latest iPad, more powerful than traditional computers in some instances; and the iPad 2, still receiving updates but far from the user experience of the Air 2.\nThe tension between new and old, modern and traditional, is also quite apparent in iOS itself. With iOS 8, Apple debuted user features and developer frameworks that allowed an iPad to handle tasks that wouldn’t be possible on a Mac. For some people, an iPad running iOS 8 is preferable to a Mac with OS X. This is exactly why I elected the iPad Air 2 as my primary computer: besides form factor advantages, I like iOS and its app ecosystem better.\nAt the same time, iOS 8 is still behind OS X when it comes to performing tasks that involve switching between apps, working with files, and editing text. These are the tentpoles of any personal computing experience from the past two decades, and the functionality added in iOS 8 has done little – if anything – to address the concerns expressed by iPad users about them. Action and share extensions have helped in exchanging data between apps, but they’re not the solution for looking at two things at the same time; custom keyboards have provided a novel way of input and data extraction from apps, but what the iPad needs is a faster way to select and edit text.\nThe problem that Apple needs to solve with iOS 9 for iPad is complex. 
How can Apple make good on the post-PC promise with features that are drastically different from what came before – without the overhead and inherent complexity of forty years of desktop computers – but also capable of addressing modern user needs and workflows?\nApple’s answer comes as a cornucopia of changes, with new Slide Over, Split View, and Picture in Picture features for multitasking, better support for external keyboard shortcuts, enhancements to the software keyboard, and even a gesture to navigate and select text using multitouch.\nThe recurring theme of contrast finds its zenith in the multitasking and productivity additions to the iPad in iOS 9: some of them are brand new ideas previously unseen on OS X; others borrow heavily from the company’s desktop OS. Some of them are exclusive to the powerful Air 2; others have made their way to older iPads as well.\nPrior to inspection, such a peculiar mix raises the question: has Apple found new ways to think about old problems, or is this too much for an iPad to handle?\nOne thing’s for sure: Apple is finally building the OS the iPad has been looking for.\n\nSlide Over\nApple’s first big change to iPad multitasking requires a single swipe from the right edge of the screen.19 Called Slide Over, this is what you’ll want to use to view and interact with another app without leaving the app you’re in.\nSafari and Notes in Slide Over.\nSlide Over works by putting a secondary app on top of the app you’re currently using, called the primary app. It’s based on compact and regular size classes, and it works in both portrait and landscape orientations. Slide Over is supported on the following iPad models:\niPad mini 2\niPad mini 3\niPad mini 4\niPad Air\niPad Air 2\niPad Pro\nSlide Over can be activated from any app, regardless of whether the app you’re using has been updated for iOS 9 or not. 
The fact that the app you’re in may not support Slide Over in iOS 9 doesn’t have any influence on the secondary app that you’ll be able to invoke. Slide Over is all about the secondary app and cycling through apps that support it.\nThere are two ways to activate Slide Over with a swipe from the right edge of the screen. You can swipe from the area around the middle of the screen (vertically centered) to open Slide Over directly; or, you can swipe from above or below the center of the screen to reveal a pulling indicator that you can then grab to fully reveal Slide Over.\nThe right side of the iPad’s screen, showing the Slide Over pulling indicator.\nThe pulling indicator is the same one used for Control Center and Notification Center when the app you’re in is running in full-screen mode (a common occurrence for games and other video apps). Slide Over joins Control Center and Notification Center in being a UI layer that is activated by swiping from the edge of the screen and that sits atop any running app.20\nSlide Over also comes with its own app switcher to cycle through apps. Slide Over’s app switcher is a dark overlay with app icons contained inside light gray boxes; only apps that support Slide Over will be shown in this view.\nSlide Over’s app switcher.\nThink of Slide Over as a subset of recently used apps, specifically (and exclusively) those updated for iOS 9 multitasking. You can’t quit apps in Slide Over: you can only tap to open an app and make it the secondary app running on top of an app you’re already in.\nThe cleverness of Slide Over lies in how its design dictates the experience of using it. When you pick a secondary app, it opens in what may be described as an iPhone app layout, stretched vertically to fit the iPad’s screen. 
In both landscape and portrait mode, a secondary app is resized to a compact size class that resembles an iPhone app: in Slide Over, Safari moves the top toolbar buttons to the bottom of the screen like it does on the iPhone; Messages, Mail, and other Apple apps look exactly like their iPhone counterparts, only taller.\n\n\nPictured above: Calendar, Mail, and Podcasts in Slide Over next to Safari.\nTo achieve this, iOS 9 uses size classes (a technology that developers have started supporting to make iOS apps responsive for multiple display sizes) to show a UI that’s appropriate for a narrow and elongated mode. This makes Slide Over easy to use and familiar (most apps feel and work like iPhone apps) and a great way to interact with another app without taking it full-screen.\nDesign serves the experience in Slide Over, and it works. If you swipe to reveal Mail in Slide Over, you’ll be presented with a familiar view of messages in your inbox, resized to fit the Slide Over panel. If you open Notes, you’ll see a list of your notes; if you tap one, Slide Over will move to the subview required to display the note. Mail, Messages, Calendar, and other Apple apps rely on adaptive UIs and compact size classes to split app sidebars and navigation points into layouts that can be displayed in a single column with Slide Over.\nTo truly appreciate Slide Over, we need to look back at Apple’s past iOS SDKs. Since 2012, the company has been advocating for APIs to create apps capable of responding to any screen size, orientation, or localization. 
With a greater matrix of iOS screen sizes available to customers in more countries, Apple felt it was appropriate to rethink the design and development process with a focus on adaptivity: Auto Layout, Dynamic Type, and Size Classes were seen as signs of smaller iPads and bigger iPhones back then; today, they provide the context necessary to understand iPad multitasking in iOS 9.\nDevelopers who have been paying attention to Apple’s announcements and advice have already done most of the work required to support iPad multitasking: Slide Over uses the same compact size class that developers have grown accustomed to using on the iPhone. It’s not just easier to support Slide Over this way: it’s the best option when you’re dealing with this type of layout.\nSlide Over rethinks the idea of looking up information or acting on something without leaving an app. Think of it as having an iPhone next to any app you’re using without the inconvenience of juggling multiple devices. Need to look up a word on Google while you’re reading a document? Open Safari in Slide Over, search, and return to what you were doing. Want to type out an email without closing Twitter? Slide Over, Mail, compose, send. Same for keeping a conversation going on iMessage, checking your schedule in Calendar, or glancing at how many emails are in your inbox.\nThanks to the iPad’s large screen, you no longer need to launch apps to interact with them. A swipe is all it takes to get things done and be more efficient. This is Apple’s pitch for Slide Over.\nAgile Tortoise’s Drafts and Terminology.\nAn important aspect to note about Slide Over is that while a secondary app doesn’t take over the primary app visually21, it does take over functionally. When Slide Over is open, you can’t interact with the primary app and the secondary app simultaneously: only the secondary app is active and able to receive touch input, with the primary one being dimmed in the background. 
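As a rough illustration of how size classes drive these layouts, here’s a minimal sketch in Swift. This is not Apple’s API – the enum, the function, and the 60% threshold are hypothetical stand-ins for the real trait collection machinery – but it captures the underlying logic: the share of the screen an app occupies determines whether it gets an iPhone-style compact layout or a full iPad layout.

```swift
// Hypothetical model (not UIKit) of how iOS 9 assigns a horizontal
// size class to an app based on the share of the screen it occupies.
enum HorizontalSizeClass {
    case compact // iPhone-style layout: Slide Over, 50/50 Split View
    case regular // full iPad layout: full screen, wide Split View pane
}

func horizontalSizeClass(appWidth: Double, screenWidth: Double) -> HorizontalSizeClass {
    // Assumption: panes narrower than ~60% of the screen go compact,
    // which matches Slide Over and the 50/50 Split View layout.
    return appWidth / screenWidth > 0.6 ? .regular : .compact
}

// A Slide Over panel (~320pt on a 1024pt landscape screen) is compact:
let slideOver = horizontalSizeClass(appWidth: 320, screenWidth: 1024)
// The wide primary pane next to it keeps its regular layout:
let primary = horizontalSizeClass(appWidth: 704, screenWidth: 1024)
```

Because the compact case is the same one apps already handle on the iPhone, a developer who has adopted size classes gets sensible Slide Over behavior with little extra work.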
A single tap outside the Slide Over area immediately dismisses the secondary app22. When Slide Over is shown, the software keyboard is exclusive to the secondary app. From a user’s perspective, the primary app is inactive underneath Slide Over.\nKeyboard input is another area where Apple had to settle on a design for Slide Over. When you tap into a text field that shows the keyboard in a Slide Over app, you’ll get the full-screen iPad keyboard, but it’ll only work with the secondary app. The layering makes sense – having a smaller, iPhone-sized keyboard just for Slide Over would be terrible on an iPad – but it introduces a new level of visual complexity that poses new challenges for Apple and developers.\nSlide Over’s app switcher can be activated by swiping down from this indicator (pictured: Twitterrific).\nSlide Over grants surprising freedom in terms of app switching. Slide Over uses a persistent app switcher indicator that can be dragged to move between the secondary app and the picker for other apps that support Slide Over. The indicator is aligned with the clock in the status bar and can be swiped to move from the app to the app switcher. Effectively, this is another status bar indicator: this portion of the screen behaves like a traditional status bar in that you can tap on it to scroll to the top of lists in apps. The similarities end here, as you’ll primarily interact with the Slide Over status bar by swiping it to switch between apps.\nThe animation to move across apps in Slide Over is some of Apple’s finest visual work in iOS 9. As you begin to pull down, the secondary app starts shrinking while following your gesture – first by adopting rounded corners, then by fitting the contents of the screen to a smaller box that sits below more app icons that come down from the top of the app switcher. 
It’s a smooth, rewarding animation that is fast and intuitive – exactly the kind of sloppy, comfortable gesture that can be performed in a second without looking. It feels right, and it doesn’t skip a single frame on the iPad Air 2.23\nAn inconsistency I’d point out is that some Apple apps haven’t been updated with support for Slide Over. I can accept the App Store and iTunes not having a compact mode (although it would be welcome), but why doesn’t Music support Slide Over? It’d be a good showcase of the feature’s raison d’être: swipe to open Music, pick a song, play, and you’re back in the primary app. This is an oversight that I expect Apple to rectify soon.\nSlide Over is a terrific addition to iPad multitasking. It’s easy to activate and it doesn’t compromise on the full functionality of a secondary app: when you open it, you’re not presented with a lite version of another app – you’re given the whole experience, with its full feature set, only in a compact layout.\nThis is a powerful idea, as it noticeably cuts down the time required to jump between apps on an iPad. It makes the iPad’s screen feel like a large canvas of opportunities rather than a wasteland of bright pixels.\nSlide Over is so good, I wish notifications could always open in it, and I wish I could have it on my 6 Plus as well. Double-clicking the Home button feels so passé when you can swipe to peek at apps.\nSlide Over is only a sliver of the iPad’s multitasking rebirth. What Slide Over enables is an even bigger change for iPad users, and a drastic new approach to app interaction on iOS.\n\nSplit View\nIf there was still any doubt about the iPad graduating from utility to computer with iOS 9, Split View clears it all. Split View is a fundamental re-imagination of the iPad’s interaction model five years after its launch. 
More than any other productivity enhancement in iOS 9, Split View is the iPad’s coming-of-age feature.\n\nAs the name suggests, Split View is Apple’s take on split-screen multitasking that lets the iPad display two apps simultaneously, enabling users to interact with both apps at the same time. Because of its toll on hardware and system resources, Split View is exclusive to the latest generation iPads.\nSplit View can be considered Slide Over’s offspring: it can only be activated by entering Slide Over first and tapping a vertical divider that will put both apps (primary and secondary) side by side, active at the same time. In Split View, the app switcher for the secondary app is the same one used for Slide Over, too.\nIn Slide Over, the divider can be tapped to enter Split View.\nSafari in Split View, and the Split View app picker.\nThere is no other way to activate Split View in iOS 9: the feature is entirely based on Slide Over, both in terms of design and user manipulation. If you want, you can move between Slide Over and Split View by tapping the divider and iOS 9 will cycle through the two modes.\nSplit View uses regular and compact size classes for three possible layouts. Before iOS 9, iPad apps always used regular size classes for both vertical and horizontal orientations as they ran in full-screen mode all the time. With Split View, the vertical size class is always regular, but the horizontal size class can change to compact. The diagram below shows how Split View affects size classes for iPad apps.\nSize classes.\nWhen in Split View, the user can control the size of the app window by dragging the divider to switch between layouts. 
This is best experienced with Split View in landscape, where the secondary app can be resized to use 25% or 50% of the screen.\n\nIn the 75/25 layout in landscape, Apple apps that are primary tend to keep roughly the same full-size layout they’d normally have, shrinking and putting some buttons closer together where necessary; in the 50/50 mode, though, apps tend to resize more and switch to iPhone-inspired hybrid layouts, usually by moving some top buttons to a bottom toolbar (Safari) or by turning sidebars into cards (Reminders).\nThe most important difference between Slide Over and Split View is that while Slide Over forces a compact app onto the one you’re in with no consequence for the primary app, which stays in the background, Split View requires both apps to support multitasking with compact and regular size classes. Split View needs two iOS 9 apps updated for multitasking; otherwise, you won’t be able to split the screen in two.\n\nIf Split View isn’t supported in an app you’re using, you’ll notice that Slide Over won’t have a vertical divider running along the left side of the secondary app. While writing this review, I used Ole Zorn’s Editorial: the app didn’t support iOS 9 multitasking, so I could use Slide Over to interact with a secondary app, but I couldn’t enter Split View.\n\nIf Split View is supported, you can tap the divider and iOS will bring the primary app into the foreground, prepare its layout, and present you with two apps on screen at the same time. 
The process takes less than a second on the iPad Air 2; after tapping and before entering Split View, the primary app (which was in the background) is blurred and its icon and name are shown on top of it to indicate which app you’re about to use Split View with.\n\nWhen resizing apps in Split View, both app screens will be blurred (iOS doesn’t want to show you the app-resizing process in real time), and you’ll get the Split View counterparts upon releasing the divider – again, it takes less than a second.\nThere’s a nice detail worth mentioning about resizing apps. When you’re dragging the divider to resize apps in Split View, it will turn white and both apps will slightly recede into the background to communicate that they’re being manipulated by the user.\nDragging the divider is also how apps are dismissed and how you can return to a full-screen app. To leave Split View manually, you have to grab the divider and swipe right to put the secondary app back into the app switcher, or swipe left to dismiss the primary app and make the secondary app full-screen (primary). The process works the same way in both directions, with an app undocking from Split View as you reach the edge of the screen through an animation that pulls it away from the adjacent app and dims it. It’s a delightful transition, smoothly animated on the iPad Air 2.\nIn my experience, Apple’s approach has been working well: when they have to adopt compact layouts in Split View, landscape apps are somewhat reminiscent of an iPad mini inside an iPad Air 2 – you can tell they’re using iPad layouts, only smaller.\nOther times, Apple uses a few tricks to mix and match elements of iPhone interfaces with iPad UIs to save on space, and the end result doesn’t feel jarring thanks to the conventions of the iOS platform. And when they’re using narrow layouts for Slide Over and Split View, iPad apps almost transform into smaller iPhone versions with longer layouts, which is fine for quick interactions. 
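The divider’s behavior can be modeled with a toy Swift sketch. This is hypothetical – the type and function names are mine, not Apple’s – but it shows the core constraint: a drag never lands at an arbitrary split, it always resolves to the nearest of a few discrete layouts.

```swift
// Hypothetical sketch of Split View's divider behavior in landscape:
// a drag always resolves to one of the discrete layouts, never an
// arbitrary position.
enum SplitLayout: Double {
    case fullScreen = 1.0 // primary app alone
    case wide = 0.7       // roughly 70/30, secondary app compact
    case half = 0.5       // 50/50, both apps compact
}

func snappedLayout(primaryFraction: Double) -> SplitLayout {
    // Pick whichever allowed layout is closest to where the drag ended.
    let layouts: [SplitLayout] = [.fullScreen, .wide, .half]
    return layouts.min {
        abs($0.rawValue - primaryFraction) < abs($1.rawValue - primaryFraction)
    }!
}

// Releasing the divider at 55% of the screen snaps to the 50/50 layout:
let layout = snappedLayout(primaryFraction: 0.55)
```

Constraining the divider this way is what lets each app know, ahead of time, exactly which size class it needs to render for.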
The constraints that Apple has put in place ensure you never end up with odd or uncomfortable app layouts, and that is the best design decision behind Split View and the new multitasking initiative as a whole.\nSplit View is an option. You’re not relinquishing control of the traditional iOS experience when switching to Split View, and you don’t have to use it if you don’t want to. Apps don’t launch in Split View mode by default: while on a Mac apps launch in windowed mode and full-screen is optional, the opposite is still true on iOS 9. Apps are, first and foremost, a full-screen affair on iOS, and the way Apple designed Split View doesn’t suggest that is changing any time soon.\nSplit View isn’t like window management on a desktop: Apple wanted to eschew the complexities of traditional PC multitasking and windowing systems, and by rethinking the entire concept around size classes and direct touch manipulation, they’ve largely achieved this goal. They have created new complexities specific to iOS and touch, but it’s undeniable that Slide Over and Split View are far from the annoyances inherent to window management on OS X. The iPad is fighting to be a computer and Split View epitomizes this desire, but it doesn’t want to inherit the worst parts of multitasking from desktop computers.\n“User control without unlimited freedom” would be a good way to describe Split View. 
While the user controls when Split View is activated and when it should be dismissed, Apple has (rightly) shied away from granting users the ability to resize app screens manually, have multiple overlapping windows on screen, or overcomplicate the divider with additional menus.\nThe competition (and some jailbreak tweaks) had set a precedent for this; instead, Apple has shown remarkable restraint in building a split-screen feature that requires minimal management.\nAs a result, apps in Split View don’t feel like windows at all: users never get to choose an app’s position beyond its primary or secondary state; they can’t drag an app into a corner and put three apps on screen with draggable resize controls. The screen splits to accommodate two apps – and that’s it. It’s easy to grasp, fast, and it feels natural on the Air 2.\nFor a long time, I thought that Apple wouldn’t bring this kind of multitasking to the iPad because of the complexity involved, but Split View changed my mind. Unlike windows on OS X, I don’t have to worry about overlapping apps or managing their windows, which is liberating. I may not be able to look at five apps at the same time, but how often do I need that many apps at once anyway?\nThe fact that Split View is going to be available on El Capitan as well speaks for itself. Windows are great, but managing them usually isn’t. Starting fresh with Split View feels like the best option for the iPad. This isn’t a case of Stockholm Syndrome: limitations and an intuitive design truly open up new possibilities that aren’t weighed down by confusion or complexity.\nTake the iPad’s camera, for example. In theory, by putting two apps that are capable of taking pictures and videos in Split View, you should be able to look at an iOS camera in two places. But that’s not how it works in iOS 9. 
If you try to snap a photo in two apps when Split View is on, iOS will pause any existing instance of the camera, so you won’t be able to look at the same scene from two different apps.\n\nThis feels right given the relationship between hardware and software on iOS: there is only one camera that takes advantage of an iPad’s processor and volume controls to take pictures. Rather than bringing additional complexity to the camera APIs24, Apple designed Split View with clear boundaries that serve the user experience.\nAt the same time, they’ve also given developers the ability to opt out of Split View if they don’t want their apps to support it. While I suspect that most iPad apps updated for iOS 9 will work with Split View, it’s likely that games meant to be played in full-screen with deeper access to the iPad’s hardware and resources won’t feature Split View integration. This is fair when you consider the increased memory pressure of a complex 3D game that needs to share the screen with another advanced app or game, but I’d still love to see a Split View-enabled Minecraft running alongside a guide to the game or a live chat.\nSplit View breeds a new set of limitations and complexities, but it doesn’t fall into the trap of imitating PC multitasking. Taken at face value, Split View really is just a way to use two apps at the same time. As we’ll explore later on, that “just” is part of a bigger picture that goes beyond the idea of multiple app screens, with exciting possibilities worth addressing.\n\nPicture in Picture\nIf you ask any average tech blogger (or YouTuber), they’ll tell you that the iPad is all about “consumption”. While this simplistic reduction of the iPad’s role in millions of people’s lives has been proven inaccurate time and time again, it is true that people love to consume videos on their iPads. 
And for good reason: the iPad – even in the mini version – makes for a compelling portable player with a fantastic screen and a vast selection of video apps from YouTube and Netflix to VEVO, Plex, HBO, MLB, and thousands more. It wouldn’t be absurd to say that the iPad ushered the Internet into the era of modern, Flash-free video streaming and portable playback. But it also wouldn’t be too outlandish to argue that the iPad’s video player is a relic from five years ago. iOS 9’s Picture in Picture wants to address this problem.\nThese days, we check our Facebook feeds while we listen to songs on YouTube, and we tweet while we’re streaming the latest Game of Thrones or a live baseball game. Video is not a one-app experience anymore. This has been the case on the desktop for years since the advent of media players and YouTube, but today our primary sources of video entertainment are smartphones, tablets, and TVs. Due to the absence of a great app experience on the TV, we’ve come to rely on phones to multitask while watching video. Shouldn’t a better option be available on the iPad, a device that is big enough to host simultaneous video playback and other apps?\nEven if we move beyond the tweet-while-watching-TV use case, there is an argument to be made about the utility of a video player that can coexist with apps in a multitasking environment. How many times have you wished you could rewatch a technical session recorded on video while taking notes on your iPad at the same time? Follow an Apple live stream while tweeting, without having to use a Mac for that? How about watching a Minecraft tutorial while playing the game itself? For all the advances of the iPad and iOS platform in recent years, playing video on iOS is still a disruptive experience that requires complete attention. 
On iOS, video takes over everything else, and it’s easily interrupted as soon as you go back to the Home screen or tap a notification.\nWith Picture in Picture, Apple is taking a page from Google’s YouTube app and bringing a floating video player to the iPad.\n\nIntegrated with the native media player and FaceTime in iOS 9, Picture in Picture turns a video into a resizable box that floats on top of everything and follows you around everywhere. If you watch a lot of video on iPad, PiP (as it’s also affectionately called by Apple) is easily going to be one of your favorite features in this update.\nLike Slide Over, Picture in Picture is available on the following iPad models:\niPad mini 2\niPad mini 3\niPad mini 4\niPad Air\niPad Air 2\niPad Pro\nFor developers, Picture in Picture support can be enabled for apps that use the AVKit, AV Foundation, or WebKit frameworks for video playback.\n\nPicture in Picture works like this: when you’re playing a video in an app updated for iOS 9 (such as Videos) or from Safari, a Picture in Picture button will appear in the lower right corner of the standard media player. Tap it, and the video will shrink into a floating player that you can drag around and that docks to the edges of the screen (or on top of Control Center and the keyboard, if shown).\nPicture in Picture shows a progress bar and offers buttons to play/pause, close the video player, and take the video back into the original app, dismissing the floating player. You can’t scrub through a video in Picture in Picture – you’ll have to go back to the player app for that.\n\nPicture in Picture doesn’t only work by manual activation: whenever you leave an app that’s playing a video by clicking the Home button or tapping a notification, the video will shrink to Picture in Picture and follow you across apps.\n\nThe behavior is slightly different for FaceTime video calls. 
Depending on the orientation of the caller’s device, FaceTime’s PiP will be taller than regular video; it’ll also have buttons to mute, return to the FaceTime app, and end a call instead of controlling playback. Like normal Picture in Picture, FaceTime PiP is activated when leaving a video call in the FaceTime app by clicking the Home button or opening a notification.\nUnder the hood, Picture in Picture works by integrating with three of iOS’ media player frameworks (including WebKit, so Safari and web views should get PiP support out of the box) and pushing the video player into an upper layer of iOS’ structure, giving the user control over positioning, size, and playback (as you can see, a recurring theme in iPad multitasking). Because it’s meant to be treated as always-playing background media, Picture in Picture floats on top of every app and menu; it sits on the same layer as Control Center, and it can only be obscured by Notification Center.\nThe only places where PiP doesn’t show up are the system app switcher (although it does float over the Slide Over/Split View one) and the Lock screen. Picture in Picture has been designed to be available anywhere you go: it joins the likes of background audio, VoIP, navigation, and Phone/FaceTime calls as the only features capable of continuing until completion following a click of the Home button.\nLike Slide Over and Split View, one of Picture in Picture’s tenets is the lack of complex management required to operate it. Playback controls disappear in the video player after a few seconds (you can reveal them again by tapping the PiP). You can move the PiP around by dragging it, and you’ll notice that it’ll snap to a corner of the screen as soon as you release your finger. 
This is meant to remove the burden of precise positioning that affects desktop apps: there’s no concept of grid spacing or pixel-snapping for windows here; just loose gestures and automatic docking to the closest edge and available space on screen.25\nYou can pinch and rotate the PiP, but it’ll always reposition itself to the correct orientation; you can also pinch out to enlarge it and pinch in to make the player smaller. You cannot make the PiP as big as you want – just like you don’t have that kind of precise control over the layouts of Split View.\nHidden Picture in Picture, automatically placed above Control Center.\nIf you want to keep watching a video but it doesn’t require your undivided attention, or if Picture in Picture is getting in the way, you can swipe it out of view and it’ll attach to the edge of the screen. When hidden, video will continue playing in the background, and the Picture in Picture box will display a pulling indicator along the side of the screen that you can grab to bring the video into the foreground again. This is a handy addition for all those times when you’d want a video to keep playing just for the audio while retaining the ability to watch it if needed.\nWith a total of four screen regions where it can be shown26, four where it can be hidden, and five possible sizes, iOS 9’s Picture in Picture gives you the freedom to watch video anywhere while also ensuring this flexibility doesn’t make the iOS experience cumbersome and confusing.\nPicture in Picture embodies many of the post-PC principles: it’s uniquely built for touch and it’s not burdened by the expectations of traditional PC window management.\nPicture in Picture benefits from the clean slate of iOS and the direct interactions of multitouch. You can throw it around and it’ll gain momentum and stick to the closest corner in the direction of your swipe. 
It feels natural and credible.27 The combination of gestures, intelligent layering, drop shadows, believable physics, and, more importantly, great performance makes Picture in Picture feel joyfully material and, ultimately, practical.\nWith Picture in Picture, my iPad has gone from being considerably worse than a Mac for watching videos to drastically superior in one fell swoop. The simplicity and cohesiveness of Picture in Picture are remarkable: whenever I come across a video in Safari, I can click the Home button knowing that it’ll stay with me. I can even start watching another video, and Picture in Picture will automatically pause and resume later. If I tried to do the same on a Mac, I’d have to manually resize windows, perhaps install some third-party apps, and learn a combination of keyboard shortcuts. On the iPad, it’s just a button.\n\nIt didn’t take me long to realize that Picture in Picture was going to be a terrific reimagining of video playback on iPad. I was watching John Gruber’s interview with Phil Schiller for The Talk Show at WWDC, and I noticed that I could put the HTML5 video player in Picture in Picture. That led to a fantastic experience: as I was watching and listening, I could open Twitter and tweet a few comments about it without stopping the video, and I could put Safari and Notes in Split View while also playing the video to simultaneously watch, research, and take notes about Schiller’s comments. That was true multitasking, and it helped me be part of a conversation around a live event without regretting my use of an iPad to watch video. The same experience wouldn’t have been possible with iOS 8.\nI’ve been enjoying Picture in Picture to keep FaceTime video calls going while I do something else, too. Like other types of video, FaceTime used to require my complete attention: if I wanted to have a video call with someone on my iPad, I had to stop whatever I was doing. 
Now I’m always putting FaceTime video calls in Picture in Picture, which has turned out to be a fantastic lifehack to help my parents out with iOS issues (I can look at their computer screen and my iPad’s web browser at the same time).\nPicture in Picture doesn’t only level the playing field between iPad and other platforms for video playback – it makes the iPad substantially better thanks to its integration with other multitasking features.\nI still have some questions and concerns about Picture in Picture, though. Because of Apple’s implementation, developers can choose not to support it. Will YouTube support Picture in Picture if it helps users avoid ads and annotations on a video? Will Netflix and HBO?\nThere’s always a tension between new iOS features and the best interests of large companies: the incentives of a system integration aren’t always aligned with the companies’ business models. Ideally, Picture in Picture will become so popular that it’ll be impossible for YouTube to ignore it, but as we’ve seen with Twitter before, big companies can be exceptionally resilient and shortsighted when it comes to being good platform citizens. In the short term, I’m also curious to see if Apple itself will support Picture in Picture with its Music app: right now, videos available in Apple Music don’t support it at all.\nPicture in Picture is to video what Control Center is to audio. As we increasingly rely on our iPads to watch video, Apple realized it was time to make video playback its own layer, always available across the system, always under the user’s control but free of the complexities that such a design would entail on a desktop computer. Picture in Picture is uniquely suited for touch and iOS; I wonder if it’ll ever come to the iPhone as well.\n\nSoftware Keyboard\nWhen the iPad was introduced in 2010, Apple praised the full-screen, laptop-like keyboard that allowed for a comfortable and familiar typing experience. 
For years, the company stuck to that ideal, modeling the iPad’s keyboard after how a Mac’s keyboard would work, with some exceptions made possible by software.\nApple brought over popup keys (for accented letters and other symbols) from the iPhone; they added a dictation button; they even got creative by demonstrating the awkward efficiency of the almost-forgotten split keyboard for iPad, introduced in 2011. Last year, we began to see the first cracks in the software keyboard wall with custom keyboards for iOS 8, which let users install non-Apple keyboards on their devices.\nApple has always taken its software keyboards seriously, and that’s resulted in a slow evolution compared to keyboards on other platforms. As we’ve seen with San Francisco, though, 2015 Apple is more open to the idea of tweaking the keyboard they’ve long held in high regard.\n\nFirst up is the Shortcut Bar, an extension of the QuickType bar that enables apps to put actions and menus next to QuickType suggestions. Available as monochromatic glyphs, these shortcuts are used by Apple across the OS with Undo, Redo, and Paste actions, and they’re often customized for specific apps to offer access to features that aren’t as easily accessible in the UI – or that sometimes aren’t accessible at all.\nIn its simplest state, the Shortcut Bar acts as a way to undo/redo operations and paste (both text and any other data). Because shortcuts are programmable, they can change depending on context: when you select text, for instance, undo and redo become cut and copy so they can work alongside paste.\n\nBecause the Shortcut Bar can be disabled, copy & paste options are still available in the classic popup menu, so you may have the same options in two places.\n\nIt gets more interesting when Apple ties entire menus and iOS integrations to the Shortcut Bar. 
In Mail, the right side has buttons to bring up a popover for text styling, one to pick a photo or video, and another to show the iOS document picker to attach a file to a message. Both attachment options are also available in the copy & paste menu, but icons in the Shortcut Bar make them more visible and obvious: they’re always displayed even if you’re not selecting text.\nIn Notes, the Shortcut Bar is used to offer a unified undo/redo/copy shortcut (it uses a popover), a button to create a checklist, and another to show a text style popover on the left side of the keyboard. On the right side, there are shortcuts to add a photo or video to a note and create a sketch.\n\nI’m a fan of the Shortcut Bar. Fast access to a subset of app menus and actions trumps similar shortcuts available in the copy & paste menu, especially because they only require one tap and can present interfaces right above the keyboard. I wasn’t using QuickType suggestions before iOS 9, but the Shortcut Bar pushed me to enable the additional keyboard row.\nThird-party developers will be able to provide their own buttons for the Shortcut Bar, and I’m curious to see what they’ll do in their apps.\nIn iOS 9, any responder that accepts text input exposes a UITextInputAssistantItem object, which holds groups of UIBarButtonItems to be displayed in the Shortcut Bar. Developers can specify the placement of shortcuts, choosing between the left (leading) and right (trailing) side of the bar. Any action can be associated with a button in the Shortcut Bar, as each button is a UIBarButtonItem like those typically found in a toolbar or a navigation bar.\n\nA demonstration of custom buttons in the Shortcut Bar.\nThe share sheet can be tied to the Shortcut Bar, too.\nThe few betas of iOS 9 apps I could test sported shortcuts to navigate between text fields, add attachments to a note, or share content with the tap of a button. 
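Adopting the Shortcut Bar takes very little code. A minimal sketch in modern Swift – the “attach photo” action and the class names are hypothetical examples of mine, not Apple’s:

```swift
import UIKit

final class ComposeViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // A hypothetical "attach photo" shortcut, placed on the trailing
        // (right) side of the Shortcut Bar next to QuickType suggestions.
        let attach = UIBarButtonItem(barButtonSystemItem: .camera,
                                     target: self,
                                     action: #selector(attachPhoto))
        let group = UIBarButtonItemGroup(barButtonItems: [attach],
                                         representativeItem: nil)
        textView.inputAssistantItem.trailingBarButtonGroups = [group]
    }

    @objc private func attachPhoto() {
        // Present a photo picker here.
    }
}
```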
Developer Daniel Breslan showed me a demo of popups and share sheets triggered from the Shortcut Bar; in Drafts 4.5, users will be able to turn custom keys into Shortcut Bar items, which will be displayed alongside QuickType to perform actions on the current draft.\nThe Shortcut Bar is an ingenious way to use the extra space of the iPad keyboard to save a bit of time with preconfigured actions. Apple is providing some good examples in their default apps; hopefully, third-party developers will use their imagination for this aspect of iOS 9 as well.28\n\nI’ve been covering iOS for a few years now, and I’ve regularly lamented the lack of faster text selection and editing controls on iPad. Since its inception in 2007 and its update for the iPad in 2010, text selection on iOS has left much to be desired, particularly for those looking to compose and edit long pieces of text.\nWhen it was posted on YouTube in 2012, the Hooper Selection (according to the creator, the single most duped radar at Apple) wasn’t just clever, as most concept videos are: it felt inevitable, and it embodied the multitouch nature of iOS. As I wrote back then, it made a lot of sense.\nThree years after Hooper’s popular concept, Apple has listened to the community and used the Hooper Selection as the basis for the new trackpad mode on iOS 9 for iPad.\nSelecting text in trackpad mode with a two-finger tap & hold + swipe.\nThe core proposition of trackpad mode is that you can swipe with two fingers on the keyboard to freely move the cursor around and control its placement in a text field. It works like you’d expect from a trackpad on your Mac: characters disappear from keys as soon as you start swiping, indicating that you’re free to use the whole area as a trackpad.\nThere’s no sound effect accompanying cursor placement in trackpad mode, but there is one subtle visual cue that hints at the connection between the native iOS cursor and trackpad mode. 
When first placing two fingers on the keyboard, the cursor will animate to split in two: the main cursor (which can have a custom color set by the developer) will loosely follow your swiping direction even if it’s outside the bounds of a text field, and a smaller gray cursor will more precisely track your intention and be attached to characters. It sounds more complex than it actually is in practice – a testament to the fact that intuitive touch text selection is a tricky problem to solve.\nTrackpad mode isn’t limited to cursor placement, as it can also be used to control text selection. Tap with two fingers on the keyboard and iOS will select the word closest to the cursor; keep swiping left or right, and you’ll extend your selection. Or, hold two fingers on the keyboard, wait for the cursor to transform into the text selection controls (with another nice transition), and then start extending your selection by swiping.\nEasier text selection through swipes on the keyboard fixes a major annoyance of editing text on iPad. While the direct relationship between words, selection, and touch could be appreciated in the early days of iOS, it slowed down the entire process after people figured it out and just wanted to be more efficient when selecting and editing text.\nApple’s new trackpad mode works well with the Shortcut Bar: the iPad keyboard now encompasses selecting, editing, and performing actions with a unified interface. This shows how Apple is staying true to the multitouch promise of the original iOS keyboard: the unique advantage of touch keyboards is that you can always update them. Trackpad mode is a good example of that kind of mindset. It’s still the same keyboard, but now you can do more with it.\nIn my tests, trackpad mode performance has been solid and gesture recognition fairly accurate with some instances of accidental text selections and the cursor becoming stuck in a text field. 
Nothing that couldn’t be fixed by swiping again or tapping a letter to “reset” the keyboard.29\nI’ve been using trackpad mode to edit posts I publish to MacStories every day, and I believe it is superior to what we had before. Trackpad mode doesn’t make the iPad more like a Mac: it is only available for text editing, and rather than “putting a mouse on the iPad”, it uses multitouch to bring a new behavior to the software keyboard. I find trackpad mode to be smooth and natural: I’m particularly fond of its precise character control, which has been a boon for fixing typos in Editorial and managing text selections in Notes.\nApple may have Hooper to thank for inspiring trackpad mode years ago, but iOS 9’s implementation is all theirs. Trackpad mode is well suited for the large iPad display, and it’s good to see Apple trying new things with multitouch again.\n\nHardware Keyboards\nWith iOS 7, Apple introduced support for programmable shortcuts on external Bluetooth keyboards. While iOS supported system-wide commands for text formatting and undo back in the early days of the iPad, iOS 7 allowed developers to add custom shortcuts to their apps.\nAdoption of the feature didn’t work out as expected. Apple was inconsistent in their usage of keyboard shortcuts: some apps didn’t support them at all; others had full sets of shortcuts matching those available in the same app for OS X; others, like Messages30, only supported some of the shortcuts available on the Mac. Even worse, there was no API to inform the user about keyboard shortcuts: while Mac apps could use the menu bar as a place where users could click around and learn shortcuts, the same wasn’t the case on the iPad. Support for keyboard shortcuts in iPad apps was so sporadic and poorly documented that we even tried to create a webpage to showcase shortcuts supported by popular apps. 
It’s fair to say that the class used to register keyboard shortcuts, UIKeyCommand, failed to gain traction among users and developers in 2013.\nWith iOS 9, Apple has reworked the OS’ support for external keyboard shortcuts. In the process, they’ve given developers a unified way to teach users about shortcuts and they’ve also brought over some great time-saving commands from OS X.\n\nKeyboard shortcuts can be specific to a single view controller, and apps can now display a cheat sheet with a list of supported shortcuts. The cheat sheet, called Discoverability, is an overlay that appears in the middle of the screen upon holding the Command key on a keyboard. In Discoverability, each app will be able to list the shortcuts it supports with labels and required key combinations. Developers can choose to assign labels to keyboard shortcuts with the optional discoverabilityTitle property of UIKeyCommand; while an unlimited number of shortcuts can be set in an app, only those with an associated label will be displayed (according to the order set by the developer) in the Discoverability overlay.\nDiscoverability is a notable change as it helps expose keyboard shortcuts in a consistent way: instead of having to read through an app’s About page, you can press a single key on the keyboard to get a system-wide cheat sheet for each app that exposes keyboard shortcuts.\nAssigning shortcuts on a per-view controller basis is also a welcome change from iOS 7. In iOS 9, developers can program subsets of shortcuts that are enabled in specific views of their apps, which will then appear in Discoverability only when you’re in that section. 
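Declaring shortcuts is compact in practice. A minimal sketch in modern Swift – the view controller and its actions are hypothetical examples of mine:

```swift
import UIKit

final class NotesListViewController: UIViewController {
    // Shortcuts are declared per responder, so only the commands of the
    // currently active view controller apply (and appear in Discoverability).
    override var keyCommands: [UIKeyCommand]? {
        let find = UIKeyCommand(input: "f",
                                modifierFlags: .command,
                                action: #selector(findNote))
        // Only commands with a discoverabilityTitle show up in the
        // Command-key cheat sheet; unlabeled ones still work.
        find.discoverabilityTitle = "Find Note"
        let new = UIKeyCommand(input: "n",
                               modifierFlags: .command,
                               action: #selector(newNote))
        return [find, new]
    }

    @objc private func findNote() { /* focus the search field */ }
    @objc private func newNote() { /* create a note */ }
}
```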
In Safari, for instance, the list of shortcuts will be slightly different depending on whether you’re on a webpage or the browser’s Favorites view31; in Notes, the Find Note shortcut will only be supported in a list of notes and not in the main view of the app.\nDifferent shortcuts depending on an app’s view.\nThis is another instance of iOS 9 adopting the best features of OS X without their burden: because shortcuts are only listed in Discoverability when they can be used, there’s no concept of unavailable, grayed-out shortcuts in iOS 9. If you see a shortcut in Discoverability, you can try it and it’ll do something. In the months I’ve spent exploring the keyboard shortcuts supported by Apple in iOS 9, this has proven to be a better implementation than the often-unavailable, non-contextual shortcuts listed in the menu bar drop-downs of Mac apps.\nThe increased flexibility of UIKeyCommand and Discoverability should make for a quicker adoption of keyboard shortcuts among third-party apps. As far as Apple goes, they’ve done a good job of supporting shortcuts in their apps and being more in line with what they offer on OS X. Messages can now finally send a message when you press the Return key; Mail, Safari, and Calendar offer a good selection of shortcuts to navigate the UI, switch between views, and perform actions; some apps don’t have shortcut support (Podcasts, Photos, App Store), but it’s no big deal.\nWhere Apple also surprised me is in the Home screen shortcuts that are actually available as system-wide options. Go to the Home screen, press Command, and you’ll notice that iOS 9 supports Command-Tab and Command-Space. Those aren’t just commands you can use on the Home screen: you can Command-Tab through apps and open Search from anywhere on iOS 9.\n\nModeled after the equivalent shortcut for OS X, Command-Space allows you to launch iOS’ Spotlight. 
However, unlike OS X’s Spotlight, the search panel on iOS doesn’t come up as a system-wide popup; instead, pressing Command-Space will take you to the Spotlight section that is normally accessed by swiping down on the Home screen (more on this later). Essentially, Command-Space acts as a more traditional implementation of the dedicated Spotlight key that iPad keyboards from third-party companies have featured for the past few years. It would have been nice to see a modal Spotlight available across the entire OS; for now, this will do – if only in terms of muscle memory for those coming from OS X.\nThe Command-Tab app switcher is a real treat. If you’re used to Command-Tabbing on your Mac, you’ll feel right at home on iOS 9 for iPad. Pressing the shortcut will bring up an overlay with app icons (nine in landscape, six in portrait), which you can cycle through by pressing Tab again; like on OS X, Command-Shift-Tab cycles through apps in reverse order.\nThere is one key difference between OS X and iOS 9 for Command-Tab: while OS X employs the Command-Tab switcher to enable users to move across open apps, the concept of “open” app is more blurred on iOS. For this reason, Apple chose to bring Command-Tab to iOS as a way to jump between recently used apps.\nJust as the iOS app switcher puts the last app you used front and center in the UI, hitting Command-Tab on iOS 9 immediately highlights that app, so you can lift your fingers off the keyboard to switch back to it. Therefore, it’s best to think of iOS 9’s Command-Tab as a faster, keyboard-based version of the system app switcher, limited to the most recent apps and aimed at letting you jump between them, regardless of their state.\nThe improved support for external keyboards in iOS 9 is a step in the right direction. iOS 7 was a timid and inconsistent attempt at offering shortcuts for apps. 
iOS 9 feels like a more complete thought around external keyboards and their relationship with OS features and apps.\nHowever, for every two steps forward, there’s one step back for Apple here: new multitasking functionalities such as Picture in Picture, Slide Over, and Split View don’t have any sort of integration with external keyboards. I suppose that these new features have led to more questions for Apple engineers, but this is not an impossible problem to fix. Keyboard shortcuts to activate Slide Over, Split View, and Picture in Picture would help a lot.\nThere’s still a long way to go for iOS to gain full independence from touch when a physical keyboard is connected, and we’ll probably never get to that point. iOS is, ultimately, a multitouch platform, and external keyboards are an option – they’re not meant to be treated as the only input system of an iPad.\nFrom that standpoint, it’s easy to pardon Apple for not supporting the new multitasking features, Control Center, Notification Center, navigation in the search page, or actionable notifications with keyboard shortcuts. The deeper you get into iOS’ architecture, the harder it becomes to justify an input method that doesn’t involve touch.\n\nThe Spatiality of iPad Multitasking\nWith new multitasking features, Apple had to rethink parts of the core structure of iOS for iPad. This has introduced novel challenges and complexities, some of which haven’t been addressed in this release.\nWhile Slide Over reinforces the idea of primary and secondary app by dimming the primary app in the background, such a distinction isn’t available for Split View. There’s a reason for that: Split View is meant to let you use two apps at the same time, and downgrading one of them to a lesser state would diminish the idea that you’re able to interact with two apps on screen. 
But this poses a question when the keyboard is shown: where are you typing?\nHint: look at the ‘Done’ button.\nWith the current Split View design, there is no strong indicator of which app is receiving input from the iOS keyboard. There is an indicator – the blinking cursor – but it’s not persistent or clear enough if you need to discern which app the keyboard is going to type into.\nI don’t believe this will turn out to be a major problem in practice: in using iOS 9 on my iPad, occasionally typing into the wrong app in Split View hasn’t made me long for the simpler times of full-screen apps. Still, as a mere design critique, I believe Apple could figure out ways to better indicate the relationship between apps in Split View and the keyboard. In the meantime, I’d advise developers to strongly consider unique tint colors for the text cursor to help increase visual recognition.\nThe concept of active app state (or lack thereof) gets worse when you’re using an external keyboard. Let’s play this game again: which app is listening for keyboard shortcuts now?\n\nA problem that is somewhat eased by the blinking cursor turns into a bigger usability concern when the cursor isn’t displayed but input is still accepted through an external keyboard. With Discoverability and keyboard shortcuts, you’ll end up with a case of Schrödinger’s Split View: an app is both active and inactive at the same time, and your perspective is all that matters. Or, rather, a single touch matters: the app receiving keyboard shortcuts will be the one where the keyboard was last shown. If you put two apps with a text field side by side, you can tap one after the other to change what Discoverability thinks is the active app.\n\nThis is not obvious, and you can test it by putting two apps in Split View, connecting a Bluetooth keyboard, and holding Command while tapping both apps’ text fields. 
The Discoverability overlay will move between the two according to which app iOS interprets as last used and therefore actively receiving external input. Thankfully, there are plenty of directions Apple could take to improve the visual structure of Split View – for example, the shared status bar could be a good place for an active app indicator.\nYou can “fake” multiple instances of Safari by putting Safari View Controller next to Safari in Split View.\nOne of the key aspects of Slide Over and Split View is that they cannot show two sections of the same app at once. Only individual apps can be displayed concurrently on screen: you can’t split Safari into multiple views and display both views on screen at the same time. If you were hoping to manage multiple Safari tabs or Pages documents in Split View, you’re out of luck.\nSplitting apps into multiple atomic units for standalone views and documents seems like an obvious next step going forward. If Apple wants to do this, the redesigned system app switcher and its heavily card-influenced design32 could be used to group multiple instances of an app together in a stack.\n\nToday, Split View doesn’t support multiple views of the same app. I wouldn’t bet on that always being the case in the future.\nWhere Apple’s multitasking architecture gets more questionable is in the thread that runs through Slide Over, Split View, and the classic app switcher.\nIn iOS 9, there isn’t a 1:1 relationship between the app switcher and other multitasking features. The idea of “recent” app is muddled when the system has to account for multiple apps displayed on screen, but Apple could have handled some parts of this differently.\nWhen you put a secondary app in Slide Over or Split View, that app disappears from the system app switcher. Instead of being displayed as a smaller card to the right of the current app, it’s completely hidden from the switcher’s UI. 
When in Split View, the secondary app is an empty rectangle on the right side of the primary app.\nThat empty area next to Twitterrific is a secondary app in Split View.\nI assume that Apple didn’t want to bring distinct touch targets to each card in the switcher; my issue is that this looks like a bug, and users should have the ability to resume Split View through a preview of both apps. Hiding the secondary app in the switcher is also a mistake as it prevents users from retrieving an app the traditional way.\nIn an ideal state, iOS would honor the placement of primary and secondary apps. They would both be displayed in the app switcher when active; their position would always be the same as you move across the OS and all of its different ways to launch apps.\nOn some level, iOS does exactly this: when you’re in Split View and return to the Home screen, opening another app that supports it will launch that app in Split View automatically, bringing in the secondary app from the right side again. This is good design: clicking the Home button when in Split View (and Slide Over) pushes the secondary app into the right edge of the screen before going to the Home screen, a transition that highlights the spatiality of multitasking and the division between primary and secondary app. iOS even remembers Split View if you quit the primary app and open it again, or after a device restart.\nApple made a good call in making Slide Over and Split View as “sticky” as possible, and especially Split View feels like a feature that can be left on all the time. It could be debated whether iOS should offer settings to launch apps and notifications in the secondary app pane33, but this is a good start.\nAnother problem occurs when the order of recent apps is overridden by the app switcher used in Slide Over and Split View. 
It took me a long time to figure this out, but here goes: the app switcher used in these two modes doesn’t show the same list of recent apps you see in the system app switcher (the one that comes up with a double-click on the Home button). Instead:\nThe Slide Over app switcher only shows apps that support Slide Over (of course);\nOf these apps, the first three icons (from the bottom) are the apps you’ve recently used in Slide Over and Split View;\nFrom the fourth icon and upwards in the switcher, you’ll see the same order of recent apps as they appear in the system app switcher, regardless of whether you used them in Slide Over and Split View recently.\nI couldn’t make out the reason behind this choice initially. Now, I see Apple’s motivation and the kind of experience they’re going for – but it could be a confusing one for those accustomed to a certain spatiality in the app switcher.\nOne order here…\n…another here.\nApple thinks that most users will tend to frequently use the same apps in Slide Over and Split View. Whether it’s an iMessage next to a webpage in Safari or a mind map alongside a podcast episode, the new multitasking app switcher is built on the assumption that frequency of use is more convenient than hierarchy and spatiality at all costs. From a design perspective, it’s an interesting conundrum: would it be better to always display the same order of apps when double-clicking the Home button and swiping from the right edge of the screen, or does the new app switcher deserve its own layer of “recently used” apps that can override whatever you recently used outside of Slide Over and Split View?\nIt took me a while to warm to this idea, but after a few months I see Apple’s perspective and they have a point. When I’m doing something that requires interacting with two apps at the same time, I tend to go back to the same pair of apps over and over. 
Mail next to Safari to open links in a web browser (and it’s great for unsubscribing from unwanted newsletters) and Messages next to Calendar; iThoughts in Slide Over when I’m writing in Editorial. I have developed my own workflows and routines for multitasking on iOS 9, and, while in principle the spatiality of the main app switcher should be respected, in practice altering the first three apps to be the ones you last used in Slide Over and Split View leads to a faster, more efficient way to multitask. Moving between apps in Slide Over and Split View is faster because of this design.\nThe redesigned app switcher puts the last used app front and center.\nThe system app switcher itself has been redesigned with cards that display an app’s icon and name at the top of a preview of its last state; along with the new look, the order has been reversed, so you’ll now be swiping right to go back in the list of recently used apps instead of going left.\nI’ve been thinking about the redesigned app switcher a lot, and for a while I couldn’t understand why Apple would want to change the app switcher other than to spice it up. 
From a design standpoint, I find the new switcher to be nicer and smoother than the old one34, but, on a spatial level, it would make more sense to keep the old left-to-right order to have the Slide Over/Split View app picker fit with the rest of the UI and spatiality of the feature (everything stays to the right).\nI still can’t find a single, perfectly solid argument in favor of the redesigned app switcher – just a few possible explanations.\nBy moving app names and icons to the top of the switcher, Apple has freed up space at the bottom of the UI, where they can now insert proactive recommendations for apps and Handoff.\nIt could be that the company realized that swiping right is a superior metaphor for going back into your list of apps, which are displayed before (to the left of) the app you’re currently using and the Home screen, sort of like a timeline.\nThe revised order of the app switcher works better with iOS’ support for back links and the new app-launching animation: back links shown in the status bar point left, backwards in the list of recently used apps; launching apps (and tapping the Safari button for a Universal Link) moves forward now, pushing the current app to the left.\nThe new app switcher was made for 3D Touch on the iPhone 6s.\nWith Slide Over and Split View, Apple has allocated new UI layers to the limited space of an iPad. For the most part, the company has done a good job at keeping the resulting interactions as obvious as possible, but they’ve created peculiar edge cases and questions, too. 
As I’ve explored above, some of them haven’t been addressed in this first version of iOS 9.0; given their extremely specific nature – arguably a philosophical problem more than a practical inconvenience – I wouldn’t be surprised if they take a long time to be “fixed”.\nThe concept of direct manipulation in iOS makes questions about the spatiality and physical consistency of software intriguing – Slide Over and Split View exist in their own UI and UX space, but they also have to coexist with features and layers that predate iOS 9. Spatially speaking, Slide Over and Split View have managed to find their place on iOS. Can the same be said at a visual level?\nThe Visual Complexities of iPad Multitasking\nBack in the old iPad days, there was a certain romance about opening an app and knowing it would be the one and only app you could look at while it was active. It was a consistent, pure iteration of the iPhone’s app model, glorified for a 10-inch screen and embellished by baroque textures of antique leather materials, fine exotic woods, and futuristic robot servants. It was, in many ways, beautiful.\nToday, this is what you can achieve if you try really hard to push iOS and Split View to their limits:\n\nOver 10 layers of user interface displayed in a multitasking environment that involves two apps being used at the same time. Is this the same device that launched to reviews praising how fantastic it was to be forced to use one app at a time?\nNew features always bring new complexity, and true utility is found in the balance of increased functionality and added confusion. iOS 9 multitasking is no exception: it’s powerful, but it also dramatically increases the amount of information being displayed on the iPad’s screen.\nWe have to ask: is it too much?\nIf you consider Picture in Picture, Split View, Control Center, extensions, notifications, app menus, Command-Tab, and the system keyboard, an iPad can be as visually cluttered as a traditional desktop computer. 
Unlike a Mac, the iPad’s screen real estate is limited and the software keyboard is part of the UI, which leaves even less room for app content to be displayed. Because of its touch constraints, iOS on an iPad Air 2 can even seem busier than OS X.\nConsider Split View and Picture in Picture. Let’s say you’re working with Notes on the left and Safari on the right while watching a video via Picture in Picture. With both apps shown in landscape mode and the keyboard active, there isn’t much room left for the video overlay, but you can keep it on either side if you don’t always need to look at what you’re writing or researching. There’s already a lot going on.\n\nNow you’re in the middle of your note-taking session, Picture in Picture is playing, and you realize you need to work with an extension in Safari. Picture in Picture is displayed on top of Safari, and you hit the share icon to bring up the share sheet. What happens next?\n\nBecause Picture in Picture sits on top of anything that is displayed underneath, the share sheet you just activated is fully hidden by the video player, which you’ll need to move to the left – the only possible location at this point – if you want to use extensions. Fair? Yes, because Picture in Picture is purposefully designed to always follow you around. More complex than what we’re used to on iOS? Also a “yes”.\nAnd it’s not just Picture in Picture that raises questions about the visual clutter now possible on an iPad with iOS 9. What happens when you combine the split keyboard – it still exists! – with Picture in Picture and a bunch of share sheets? How about a custom keyboard, Split View, and Command-Tab? There are a lot of different styles and UI elements claiming their own space at once.\n\nAfter using iOS 9 on my iPad for the past three months, I’ve wondered whether Apple has gone too far in the name of productivity. The iPad used to be the friendly device that could be a calculator, a book, a newspaper, or the web in the palm of your hands. 
With iOS 9, we’re presented with a device that can show dozens of layers of content at the same time, often from three different apps, with a UI design that doesn’t bring back any feeling of familiarity from physical objects that you hold in your hands.\nAnd that’s okay.\nFor the iPad to grow up, new challenges and complexities ultimately have to be accepted. The entire point of this discussion – and the thread I’ve tried to develop in this chapter – isn’t to argue that Apple should have built multitasking features with no complexity, but to observe how they reacted to the complexities they were destined to introduce all along.\nThe iPad can’t be more without becoming more complex, but keeping that complexity from turning into complication is under Apple’s control. With no restraint, we might have ended up like this. And to truly appreciate the fine balance struck between functionality and confusion, we need to look back at iOS 7.\nIf Apple hadn’t redesigned iOS from the ground up with a focus on text, color, clarity, and content, I don’t think iOS 9 multitasking would be as usable as it is today. Modern apps may have lost some of the craft and skills required to recreate realistic visuals, but as a result designers and developers now have to pay attention to aspects of functionality, accessibility, and inter-app communication that were largely ignored or glossed over before.\nI’ve been writing about iOS apps for a while, and I’m not affected by any kind of nostalgia for the good old days of skeuomorphic trends. This isn’t about taste: from a mere functional standpoint, iOS today is more legible, clear, and better prepared to support apps that work together.\nBy hitting the reset button on a design pendulum that had swung too far in favor of photorealism, Apple has created a more uniform, scalable iOS app ecosystem that feels and works as a cohesive force instead of a bunch of fancy toys. 
It may be less “fun” for designers who don’t have to spend weeks setting up a particular leather texture in Photoshop. But iOS today is better equipped for users of all types and needs, and true care for design is found in how it serves people, not in how much it’s appreciated by fellow designers on Dribbble.\nThis is evident with new multitasking features in iOS 9 for iPad. When using two apps in Split View, you’re likely not going to end up with a wooden shelf on the left side and a metallic cabinet on the other, all while a small video player with rounded corners floats around the screen projecting heavy shadow effects on top of everything. Instead, using Apple’s apps (and third-party ones) in Split View feels like a unified, consistent experience: two pieces of software based on the same design language, not clashing with each other, distinct but homogeneous. The more subdued, unassuming nature of the modern iOS makes sheets of content as well as entire apps feel like part of a whole system – multiple components of the same machine and not competing protagonists of the same stage.\n\nVisual uniformity plays an essential role in making iPad multitasking appear simple and devoid of additional clutter, and it suggests that Apple has been preparing for the increased complexity of iOS 9 for a long time. I wouldn’t be surprised if bigger iPhones and the prospect of iPad multitasking ended up being key factors in the decision process behind iOS 7 back in late 2012. Today’s iPad multitasking wouldn’t be possible without an OS that can scale and adapt depending on a user’s task, screen size, language, or preferred device orientation. With the system that Apple has put in place – from San Francisco and the Retina display to the keyboard and extensions – every piece works together.\nThere is an added complexity to the overall iPad experience when using new multitasking features in iOS 9, but it’s always mitigated by understandable constraints. 
Whether it’s the two possible sizes for Split View or the magnetic behavior of Picture in Picture, the increased capabilities of iOS 9 on the iPad come with acceptable limitations. There are some instances of Apple trying to be too clever or redesigning features for reasons that aren’t completely clear yet; for the most part, though, Apple has been judicious in its implementation of simultaneous apps and system features.\nOn paper, iPad multitasking can be visually complex. In practice, you’re not always going to be dealing with overcrowded Split Views displaying multiple menus. iOS 9 multitasking strikes a great balance of optional power tools and the underlying simplicity that has characterized the iPad for the past five years.\nI bet it’s going to be even better on a 13-inch device.\n\nThe Utility of iPad Multitasking\nMy work for MacStories and Relay FM involves writing posts, doing research in Safari, taking notes, managing email, and communicating with others. I use a lot of apps every day, and while I try to automate tedious, repetitive tasks as much as possible, there’s still quite a lot of switching required to jump from app to app, even after the extensibility brought by iOS 8.\niOS 9 has profoundly changed the way I work from my iPad. This has been true for Apple apps, as well as third-party ones I’ve been testing this summer.\nI’ve come to use Apple’s Notes app every day. Thanks to its built-in support for Slide Over and Split View, I’m using Notes as a persistent scratchpad next to Safari, Mail, and Messages. While the use case is the same as before – I’m taking notes or reading text from a note – the speed and the simplicity granted by multitasking are completely new. 
I’ve been able to read Apple’s documentation for iOS 9 on one side of the screen and take notes at the same time with Notes in Split View; I can scroll my Twitter client, take a quick note with Slide Over, and later reopen that note while I’m talking to Myke so we can go through the thoughts I saved.\nNotes and multitasking have also been phenomenal additions when testing new apps and updates for MacStories reviews. When I’m trying a new app, I can use multitasking to save first impressions about the interface and user experience. Later, when I have to send feedback to the developer, I can put Mail and Notes in Split View, copy and paste text, and I’m done.\n\nThe savings granted by iPad enhancements in iOS 9 may seem small as individual entities, but they add up over time. If I’m listening to an episode of Mac Power Users and I want to open show notes in the browser, I can just put Safari next to Podcasts, tap all the links I want to check out, and Split View will open those links in Safari on the right side.\n\nIf I’m researching a topic and I realize I need to visualize it with a mind map, I can fire up the excellent iThoughts, split the screen to have Safari next to it, and turn my iPad into a more proficient research tool that is both a mind map and a browser at the same time.\nIn fact, this review has been entirely researched, composed, and edited on my iPad Air 2 thanks to iOS 9’s new iPad features.\nSoon after Apple seeded the first beta of iOS 9 to developers, I installed it on my iPad and started working on iOS 9 research. Since then, I only used a Mac twice for tasks related to the review that couldn’t be done on iOS. First, I had to install the beta OS on my iPad and transfer WWDC session videos to the Videos app with iTunes.35 Then, I had to use El Capitan’s Safari Web Inspector to measure performance of Content Blockers in iOS 9’s Safari. 
These two occasions aside, OS X was never involved in any part of the writing or editing process on my end.\nWith Videos and Picture in Picture, I was able to play a WWDC session while looking up information in Safari and taking notes at the same time. With the new Shortcut Bar and cursor control gesture, formatting and editing text in a note (to fix typos, delete lines, or create headings) was quick and painless. If Picture in Picture was getting in the way, I could dock it to the side and hide it from view. For about two weeks, I only took notes, watched videos, took more notes, and copied and pasted links and images from Safari36 into Notes in Split View. I saved tweets about interesting iOS 9 features from Twitterrific into Notes with the share extension, too.\nThen, when the time came to turn my notes into an outline with some kind of structure, I used a beta version of iThoughts to enter Split View so I could look at the notes and mind map side by side. This way, I was able to have my stream of notes on the right and paste them or rewrite them on the left in the mind map. iThoughts has fantastic shortcuts to create new nodes and sub-nodes with the Return and Space keys of the software keyboard, and that (combined with swiping) was a great way to create branches in seconds. If I needed to insert images in the mind map, I could use the built-in image picker or copy an image from Safari in Split View and paste it into the map. This went on for about a month.\n\nWhen I started writing the actual review in mid-July, I used Slide Over extensively because Editorial – my text editor of choice – didn’t support iOS 9 multitasking. This is the unsung benefit of Slide Over – it brings the convenience of iOS 9 multitasking to any app, regardless of its support for the more powerful Split View.\nI’ve used everything in Slide Over while writing in Editorial. 
I used it to reply to iMessages without leaving Editorial so Myke and my girlfriend didn’t think I was ignoring them (I was just very focused). Safari, iThoughts, Photos, Notes, and Mail were excellent for looking up my mind map and reference material while assembling chapters. Later on, as I got more betas with support for iOS 9 multitasking, I used apps such as PCalc for quick calculations in Slide Over, Terminology to look up words in the dictionary (not even Apple has something like that), and Dispatch to instantly turn new email messages into tasks.\nSafari is the perfect example for Slide Over.\nBecause of iOS 9 multitasking and enhancements to the iPad’s software, for the first time this year I was able to produce a full review on iOS and be happy about it. Picture in Picture, Slide Over, Split View, and keyboard changes were great additions to my writing process, and their effects touched every area of my iPad workflow.\niOS 9 is a watershed moment for iPad users, and a game changer for the iPad platform. I’ve been using the iPad every day for the past three years, and iOS 9 brings a radical new way to work with apps on the device.\nFor a long time, I thought that I didn’t want split screen multitasking on the iPad. I was afraid that such a feature would make the iPad less focused – that eschewing the principle of one app at a time would result in the iPad losing its way. After using iOS 9, I can say that I was abundantly wrong.\nWorking with Dispatch and Workflow in Split View.\nWith iOS 9 multitasking, I feel more focused when working on my iPad because I’m switching between apps less. It sounds absurd, but it’s not: because I no longer lose the context of what I’m doing by clicking the Home button, multiple apps aren’t a distraction. I’m not focused on one app at a time anymore. I’m focused on a task. And if that involves multiple apps, the iPad can handle it.\nWith iOS 9, an app on the iPad no longer necessarily commands the entire screen. 
This makes the iPad more comparable to a traditional computer, with the screen being used not to mimic a single utility but as a canvas for software, in multiple shapes and forms.\nThe complexities created by the ability to manage concurrent apps have been largely kept at bay by Apple’s design choices, which include limited compact sizes, selected corners for Picture in Picture, and, like extensions last year, a lack of tools to programmatically activate multitasking – meaning, you’ll always have to touch the screen to initiate Slide Over and Split View.\niOS 9 multitasking’s most evident pitfall is the lack of drag & drop between apps in Split View. You would think that, given two apps side by side and a platform based on touch, Apple would have built a system to move information from one app to another. This isn’t available yet, and it’s my biggest wish for future iterations.\nOther improvements I’d like to see in multitasking would be a faster way to swap the primary and secondary apps in Split View, as well as support for external keyboard shortcuts in multitasking. Right now, there’s no way to show Slide Over or manage Split View with an external keyboard, which slows me down when I don’t want to touch the screen.\nThe inconsistencies of the classic app switcher and what’s still missing don’t change the underlying premise. iPad multitasking on iOS 9 transforms how the device can be used as a computer every day. I run MacStories entirely from my iPad, and multitasking has dramatically sped up how I work. But what Apple has done with iOS 9 on the iPad goes beyond multitasking alone.\nFor the first time since its launch in 2010, the iPad is ready to have its own unique OS. 
Apple’s focus on familiarity and consistency with iPhone OS was a selling point of the iPad five years ago, but with time it became a liability.\nFor the past four years, iOS for iPad has mostly felt like a rushed adaptation of the main iOS for iPhone, with uninspired designs scaled up from the smaller screen. iOS 9 shows some progress on this front – such as a redesigned Notification Center with two widget columns – but it’s possible to come across UIs that have been enlarged from the iPhone without proper consideration. However, the sheer amount of what’s new and exclusive to the iPad in iOS 9 offsets what is left over from the previous era.\nToday, we’re seeing what iOS for iPad should be. A version of iOS that shares the same underlying technologies and design language as the iPhone, but optimized for the different hardware and interactions of the iPad. There’s still work to be done, but what we have today is an impressive step up from iOS 8. With multitasking, keyboard changes, bigger folders, and Picture in Picture, all past mistakes are forgiven.\niOS 9 is the first version of iOS that isn’t afraid to let the iPad be the iPad. Consistent with the iPhone, willing to take its own risks, and reminiscent of a Mac without the baggage of OS X.\nWith iOS 9, the iPad has entered adulthood.\n\nSearch and Deep Linking\niOS 9 introduces a supercharged Spotlight that, under the umbrella of Search, aims to lay a new foundation for finding app content and connecting apps.\niOS 9 Search: accessed by swiping down (center) or from a dedicated page (right).\niOS Search is accessed by swiping down on any Home screen (like the existing Spotlight) or as a standalone page to the left of the first Home screen – a return to form that now does more than just search. 
When swiping down from the Home screen, the cursor is immediately placed in the search box, ready to type; if opened from the dedicated page, you’ll have to tap the search box or swipe down to bring up the keyboard – an important difference aimed at showcasing other features available in this screen. As far as searching is concerned, both modes lead to the same results.\nOn the surface, iOS 9 Search augments the existing Spotlight by extending its capabilities beyond launching apps and searching for data from selected partners. With iOS 9, you’ll be able to look for content from installed apps, such as documents from iCloud Drive, events from a calendar app, or direct messages from a Twitter client. This can be done by typing any query that may be relevant to the content’s title and description; results display rich previews in iOS 9, optionally with buttons to interact with content directly from search.\n\niOS Search is more than a fancier Spotlight. Changes in this release include new APIs for local apps and the open web, highlighting Apple’s interest in web search as an aid to apps. Some aspects of it aren’t clear yet – and Apple has been tweaking quite a few things over the summer – but the nature of the change is deep and intriguing.\nA key distinction to note in Apple’s implementation of Search is that there are two different indexes powering results that appear in Spotlight and Safari. A local, on-device index of private user content and data that is never shared with anyone or synced between devices; and a server-side, cloud index that is under Apple’s control and fed by the company’s Applebot web crawler.\nLocal Search: CoreSpotlight and User Activities\nIn iOS 9.0, the focus is mostly on the local index, which will power the majority of queries on user devices.\niOS 9 can build an index of content, app features, and activities that users may want to get back to with a search query. 
It’s built on two APIs: CoreSpotlight, an index of user content built like a database that can be periodically updated in the background; and our friend NSUserActivity, this time employed to index user activities in an app as points of interest.\nResults from Maps and WhereTo have buttons to open directions.\nFrom a user’s perspective, it doesn’t matter how an app indexes content – using the new search feature to find it always works the same way. Search results from apps in iOS 9 are displayed with the name of the app they’re coming from, a title, description, an optional thumbnail image, and buttons for calling and getting directions if those results include phone numbers or addresses. Visually, there is no difference between results powered by CoreSpotlight and those based on user activities: both display rich previews in iOS 9 and can be tapped to open content directly in an app.\nOn a technical level, the difference between CoreSpotlight and NSUserActivity for developers is that while activities are intended to be added to the on-device index as the user views content in an app, CoreSpotlight entries can be updated and deleted in the background even if the user isn’t doing anything in the app at the moment. 
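To make the distinction concrete, here is a minimal Swift sketch of the two indexing paths – the activity type, identifiers, and strings are hypothetical, and the property spellings follow current Swift conventions rather than the iOS 9-era ones:

```swift
import CoreSpotlight
import MobileCoreServices

// CoreSpotlight: a database-like index the app can update in the
// background, independently of what the user is currently viewing.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Buy groceries"                     // hypothetical task
attributes.contentDescription = "Due tomorrow at 9 AM"
let item = CSSearchableItem(uniqueIdentifier: "task-42",
                            domainIdentifier: "tasks",
                            attributeSet: attributes)
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error { print("Indexing failed: \(error)") }
}

// NSUserActivity: indexed as the user actually views content in the app.
let activity = NSUserActivity(activityType: "com.example.app.viewTask")
activity.title = "Buy groceries"
activity.isEligibleForSearch = true
activity.becomeCurrent()    // adds this activity to the on-device index
```

The CoreSpotlight half could run from a background refresh handler with no UI involved; the NSUserActivity half only makes sense while the user is looking at that piece of content.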
For this reason, a todo app that uses sync between devices may want to adopt CoreSpotlight to maintain an index of up-to-date tasks: while entries in the index can’t be synced, developers can update CoreSpotlight in the background, therefore having a way to check for changes in an app – in this example, modified tasks – and update available results accordingly.\nApple is also giving developers tools to set expiration dates for the CoreSpotlight index (so entries in the database can be purged after some time to prevent the archive from growing too large) and they can rely on iOS’ existing background refresh APIs and combine them with CoreSpotlight background changes to keep the local index fresh and relevant.\nApple’s own apps make use of both CoreSpotlight and NSUserActivity to build an on-device index of user content. Mail, Notes, Podcasts, Messages, Health, and others allow you to look for things you’ve either created, edited, organized, or seen before, such as individual notes and messages, but also the Heart Rate pane of the Health app or an episode available in Podcasts.\nSearch results from Drafts and Dispatch.\nI’ve also been able to try apps with support for local indexing on iOS 9. Drafts 4.5 enables you to search for text from any draft stored in the app. Clean Shaven Apps has added CoreSpotlight indexing support to Dispatch, allowing you to look for messages already stored in the inbox and a rolling archive of the latest messages from another mailbox. On iOS 9, iThoughts lets you search for any node in a mind map and jump directly to it from search, bypassing the need to find a file in the app, open it, and find the section you’re looking for.\nWhereTo’s iOS 9 update was perhaps the most impressive search-related update I tested: the app lets you search for categories of businesses nearby (such as restaurants, coffee shops, supermarkets, etc.) 
as points of interest, but you can also get more detailed results for places you’ve marked as favorites, with a button to open directions in Maps from search with one tap.\nApple has given developers a fairly flexible system to index their app content, which, with a proper combination of multiple APIs, should allow users to find what they expect to find in their apps.\nHowever, this isn’t all that Apple is doing to make iOS Search richer and more app-aware. Apple is building a server-side index of crawled web content that has a connection to apps – and that’s where their plans get more confusing.\nApple’s Server-Side Index\nIn building iOS 9 Search, Apple realized that apps often have associated websites where content is either mirrored or shared. For the past several months, Apple has been crawling websites they deemed important to index their content with a crawler called Applebot; now, they’re ready to let every website expose its information to Applebot via web markup. The goal is the same: to provide iOS Search with rich results – in this case culled from a much larger source.\nThe server-side index is a database in the cloud of public content indexed on the web. Unlike a traditional search engine like Google, though, Apple’s primary motivation to keep a cloud index is to find web content that has an app counterpart, so users can easily view it in a native app.\nThink of all the services that have native apps for content that is also available on the web: from music services to online publications and websites like Apple’s online store or Foursquare, many of the apps we use every day are based on content that comes from the web and is experienced in an iOS app. From that perspective, Apple’s goal is simple: what about content that can be viewed in an app but that hasn’t been experienced yet by the user? 
What about a great burger joint near me listed in Foursquare that I still haven’t seen in the Foursquare app, or an article about Italian pasta that I haven’t read in my favorite site’s app yet?\nInstead of having to search Google or use each app’s search feature, Apple is hoping that the iOS Search page can become a universal starting point for finding popular content that can also be opened in native apps.\nBecause the web is a big place, to understand the relationship between websites and apps Apple has started from an unexpected but obvious place: iTunes Connect. When they submit an app to the App Store, developers can provide URLs for marketing and support websites of an app; Apple can match those websites with the app in their index and crawl them with Applebot for content that could enrich Search. These pieces of content – such as listings from Airbnb or places in Foursquare – will then be available in the Search page and Safari (the browser’s search feature can only search this type of content, as it doesn’t support local app search) and will open in a native app or on the indexed webpage if the app isn’t installed.\nHow Applebot “sees” a MacStories article.\nTo teach Applebot how to crawl webpages for iOS Search and give results some structure, Apple has rolled out support for various web markup technologies. Developers who own websites with content related to an app will be able to use Smart App Banners, App Links, and Twitter Cards to describe deep links to an app; the schema.org and Open Graph standards are used to provide metadata for additional result information.\nApple calls these “rich results”. With schema.org, for instance, Applebot is able to recognize tagged prices, ratings, and currencies for individual listings on a webpage, while the Open Graph image tag can be used as an image thumbnail in search results. The goal is to make web-based results as rich in presentation as their native counterparts. 
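As a sketch of what this markup looks like in practice – the app ID, URLs, and listing details below are made up – a webpage could describe its app counterpart and rich-result metadata like this:

```html
<head>
  <!-- Smart App Banner: tells iOS which app (and deep link) this page maps to -->
  <meta name="apple-itunes-app"
        content="app-id=123456789, app-argument=https://example.com/listing/42">

  <!-- Open Graph: title, description, and thumbnail for the rich result -->
  <meta property="og:title" content="Cozy Apartment in Prati, Rome">
  <meta property="og:description" content="Two bedrooms, a five-minute walk from the Vatican.">
  <meta property="og:image" content="https://example.com/img/listing-42.jpg">

  <!-- schema.org markup elsewhere on the page can tag structured data
       such as prices, currencies, and ratings for the listing -->
</head>
```

The Smart App Banner line is what lets a tapped result open in the native app when it’s installed; the Open Graph tags are what give the result its title, description, and thumbnail.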
When you see rich descriptions and previews for links shared on Twitter, Facebook, or Slack, Open Graph and schema.org are usually behind them. Apple wants the same to be true for iOS search results. They’ve even put together a search API testing tool for developers to see how Applebot crawls their webpages.\nIn practice, it’s been nearly impossible for me to test the server-side index this summer. In my tests from June to early September, I was never able to consistently find results from popular online services (Airbnb, Foursquare, eBay, or Apple’s own online store) that were relevant to my query or capable of enriching the native experience of an app on my device. In the majority of my tests, web results I managed to see in Search were either too generic, not relevant anymore, or simply not performing as advertised.\nFor example, when typing “Restaurants Rome Foursquare” without having the Foursquare app installed on my device, I got nothing in iOS Search. My assumption was that popular Foursquare results would be available as Applebot-crawled options in Search, but that wasn’t the case. Same for Airbnb, except that I occasionally managed to see listings for apartments fetched from the web, but they weren’t relevant to me (one time I typed “Airbnb Rome Prati”, and I somehow ended up with an apartment in France. The result was nicely displayed in Search though, with a directions button for Maps).\n\n\nI’ve started seeing some web results show up in iOS Search over the past couple of days under a ‘Suggested Website’ section. Starting Monday (September 14th), I began receiving results from the Apple online store, IMDb, and even MacStories. I searched for content such as “House of Cards”, “iPad Air 2”, and “MacStories iPad”, and web-based results appeared in Search with titles, descriptions, and thumbnails. In all cases, tapping a result either took me to Safari or to the website’s native app (such as the IMDb app for iOS 9 I had installed). 
Results in Search were relevant to my query, and they populated the list in a second when typing.\nThe issues I’ve had with web results in iOS 9 Search this summer and the late appearance of “suggested websites” earlier this week lead me to believe that server-side results are still rolling out.\nThe most notable example is that Apple’s own demonstration of web results in search, the Apple online store, isn’t working as advertised in the company’s technical documentation. The web results I saw didn’t appear under a website’s name in Search, but they were categorized under a general ‘Suggested Website’. In Apple’s example, Beats headphones should appear under an ‘Apple Store’ source as seen from the web, but they don’t. My interpretation is that proper server-side results with rich previews are running behind schedule, and they’ll be available soon.\nA change in how NSUserActivity was meant to enhance web results adds further credence to this theory. As I explored in my story from June, Apple announced the ability for developers to tag user activities in their apps as public to indicate public content that was engaged with by many users. According to Apple, they were going to build a crowdsourced database of public user activities, which could help Applebot better recognize popular webpages.\nHere’s how Apple updated its documentation in August:\n\n Activities marked as eligibleForPublicIndexing are kept on the private on-device index in iOS 9.0, however, they may be eligible for crowd-sourcing to Apple’s server-side index in a future release.\n\nDevelopers are still able to tag user activities as public. What is Apple doing with those entries, exactly? Another document explains:\n\n Identifying an activity as public confers an advantage when you also add web markup to the content on your related website. 
Specifically, when users engage with your app’s public activities in search results, it indicates to Apple that public information on your website is popular, which can help increase your ranking and potentially lead to expanded indexing of your website’s content.\n\nIf this sounds confusing, you’re not alone. To me, this points to one simple explanation: Apple has bigger plans for web results and the server-side index with a tighter integration between native apps (public activities) and webpages (Applebot), but something pushed them back to another release. The result today is an inconsistent mix of webpages populating Search, which, as far as web results alone are concerned, is far from offering the speed, precision, and dependability of Google or DuckDuckGo in a web browser.\nThere’s lots of potential for a search engine that uses web results linked to native apps without any middlemen. If working as promised, iOS Search could become the easiest way to find any popular content from the web and open it in a native app, which in turn could have huge consequences on app discoverability and traffic to traditional search engines – more than website suggestions in Safari have already done.\nHowever, this isn’t what iOS Search is today, and it’s not fair to judge the feature based on the merit of its future potential. The server-side index is clearly not ready yet.\n\nThe Flow of Search\nToday, iOS 9 Search is useful to find content from installed apps. The ability to look for specific pieces of content and quickly get to them has been a major addition to my workflow and, if anything, the biggest hurdle in using Search has been remembering that I can now look for everything on my devices. After years of being used to opening apps first and finding content second, it’s hard to kick a habit entrenched in muscle memory.\nFor their own apps, Apple has done a good job at ensuring content found in Search is properly displayed and highlighted once opened in the relevant app. 
When you open a message from Search, the message bubble in the Messages app is darkened to indicate it’s the selected result; a reminder opened from Search gets a bold title in the Reminders app; podcast episodes and Mail messages are also shown as selected items in the respective apps after you open them from Search. Part of this was already in place with iOS 8, and it’s been extended to more apps with iOS 9.\nAs for third-party developers, it’s up to them to figure out ways to restore their app’s state when selecting search results, but most of the apps I tried with Search support – Dispatch, Drafts, iThoughts, and others – used similar techniques to update their UIs and restore results.\n\nI’m still learning how to remember that I can now find information and documents more quickly thanks to Search. I’ve become a fan of the ability to look for songs and playlists in My Music and play them right away from Search – and I like how iOS adjusts the ranking of songs based on those I’ve been listening to recently. Searching for messages in Mail has been considerably faster and more accurate when done from Spotlight than the app’s own search feature. I love how the Podcasts app exposes show descriptions and notes to search, and I’ve grown accustomed to jumping to specific sections of the Health app and iThoughts via Search. I can’t wait to see what apps like Slack, Dropbox, Editorial, and Pocket will do with Search and how that will speed up the way I move across apps and tasks.\nMy main concern with new data sources available for Spotlight is that Apple hasn’t built more advanced controls to choose how app content ends up in there. In iOS 9, it’s possible to turn off apps that populate results in Settings > General > Spotlight Search, but there’s no way to reorder apps and make sure that, for instance, Mail results are always at the top. 
This, combined with the way iOS 9 dynamically ranks results based on engagement and puts some of them in a Top Hits section, has caused me some confusion from a spatial perspective, as results aren’t always in the same position or in the same order.\nAlso, because indexing local app content can be a CPU-intensive task, background updates to the database may not be immediate (though this has been sporadic in my tests), and the search functionality of NSUserActivity and CoreSpotlight is not supported on the iPhone 4s, iPad 2, iPad (3rd generation), iPad mini, and iPod touch (5th generation). Developers will have to carefully consider how to index their app content to avoid consuming too many resources, but I’m optimistic the system will scale gracefully on the latest hardware. Results on my iPad Air 2 come up almost instantly as I start typing a query, and they continue to update in real time as I add keywords.\nApple hasn’t built a traditional document-based search feature in iOS 9. For the past two years, the company has been enhancing its Spotlight search tool with external integrations such as Wikipedia results, movie showtimes, and snippets of web results from Bing. iOS 9 expands that to account for the richness of data inside apps.\nWhile users will be able to launch apps and look for documents in the traditional way, Search in iOS 9 is aware of the unique nature of apps, which may include activities, points of interest, sections of a document, and other subsets of content. In a post-PC world, it makes sense to have a new kind of search that focuses on what’s inside apps rather than filenames alone.\nModern apps aren’t static containers of files. They’re rich experiences, and iOS 9 can index the activity that takes place inside them. App search in iOS 9 has lived up to my expectations. 
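As a sketch of the two APIs involved, a hypothetical notes app might index a document with CoreSpotlight and advertise the currently viewed screen as a searchable activity. The identifiers and titles below are made up for illustration; the property and method names follow Apple's iOS 9-era Swift API:

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical notes app: index a document so it shows up in iOS 9 Search.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Meeting Notes"
attributes.contentDescription = "Planning notes from Monday."

let item = CSSearchableItem(uniqueIdentifier: "note-42",
                            domainIdentifier: "notes",
                            attributeSet: attributes)
CSSearchableIndex.defaultSearchableIndex().indexSearchableItems([item]) { error in
    // Indexing runs in the background; failures should be logged, not fatal.
}

// Advertise the note being viewed as an activity eligible for Search.
let activity = NSUserActivity(activityType: "com.example.notes.view")
activity.title = "Meeting Notes"
activity.eligibleForSearch = true
activity.eligibleForPublicIndexing = false // public items may feed the server-side index
activity.becomeCurrent()
```

Per the quoted documentation above, even activities flagged for public indexing stay in the on-device index in iOS 9.0.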
I’m waiting to see what Apple has in store for their server-side index.\n\nDeep Links, Back Links, Universal Links\nDeep linking is where the pieces of Apple’s app search and navigation puzzle come together. iOS 9 marks Apple’s long-awaited foray into native deep linking, and the company is betting heavily on deep links as a superior way to launch apps, navigate them, index content, and share results.\nDeep linking refers to the ability to link to a specific location within an app. Deep links are URIs that open apps into discrete navigation points, such as the Steps screen in the Health app, a profile view in Tweetbot, or an email message in Mail. With proper support from developers, any iOS app screen or activity can have its own deep link. Over the years, a number of third-party companies have attempted to establish cross-platform standards for deep links in apps; with iOS 9, Apple aims to provide deep linking support in every app natively.\nDeep links provide structure. With deep links, individual app sections and activities can be restored when a user opens a link to them from Search. They power smart reminders so Siri can create todos that contain a deep link to reopen an app’s view. Smart App Banners, used by developers to match their websites to native apps, enable Applebot to associate web links with deep links and prioritize certain webpages over others when indexing the web.\nUnder the hood, deep links lay a new foundation for launching and navigating apps on iOS. This starts from the app launching animation itself: alongside a revised multitasking switcher, iOS 9 features a new transition for going from one app to the other that pushes the current app off to the left and slides the new one in from the right.\nThanks to this, opening apps on iOS feels more like navigating pages of the OS – a metaphor that reinforces the idea of deep links capable of connecting specific sections of apps together. 
The animation feels faster than it did on iOS 8 and it makes sense within the spatiality of iOS 9.\nThe new app launching animation does more than offer visual eye candy – it showcases iOS 9’s deep linking capabilities in the status bar. New in iOS 9, every time you leave an app to open another one by following a link or a notification, you get a back button in the upper left corner of the status bar to go back to the “launcher” app.\n\nThe back button has long been a staple of Android devices; at least initially, it was surprising to see Apple follow a similar approach for iOS 9. The similarities with Android’s back navigation feature are only superficial: iOS’ back button is built into the status bar and it offers a shortcut to return to the app that launched the one you’re in; it’s overridden every time an app launches another one. If you open a link from Messages into Safari, the status bar will show a back button to return to Messages; if you tap a notification when in Safari and open Twitter, the back button will only return you to Safari.\niOS 9’s back button doesn’t go back into the entire navigation stack of recent apps. It sticks to the status bar even if you move around an app, but it’ll disappear after two minutes. It’s not a persistent back button – it’s a temporary shortcut.\nThe back button is useful when combined with deep links that open apps into specific views. When used from Search, the back button makes it easy to view a result, go back to Search with one tap, pick another, and so forth. It also makes tapping notifications less disruptive to the user experience, as returning to what you were doing is one tap away. With it, the role of the Home button is considerably diminished in iOS 9, as returning to the previous app is easier and more contextual.\nThe placement of the back button is problematic. 
When following a link or a notification into another app, the button will be displayed in the left corner of the status bar, hiding Wi-Fi and carrier information – an essential detail that tells us how our devices are connecting to the Internet. On more than one occasion, I found myself following links and wondering why an app wasn’t loading – I couldn’t tell if the app was having problems, or if my 4G network had poor reception because the back button was covering up everything. I wish Apple had thought of a gesture to manually dismiss the back button; on the iPad, they could have at least placed it next to Wi-Fi and carrier information given the bigger display.\nI sympathize with the struggle to find a placement for this button. Ultimately, an OS with superior deep linking features benefits from a system-wide shortcut that lets users navigate back and forth between apps as they would with webpages, and that corner of the status bar is the least intrusive option. It’s not perfect but it could have been worse; in practice, it’s useful and consistent.\nApple’s plans for deep linking extend beyond search, smart reminders, and the back button. With iOS 9 Apple is introducing Universal Links, a way to launch and link to apps with web links. With Universal Links, Apple is letting websites and apps communicate through the common thread of a URL, with verifiable ownership, graceful fallbacks, and cross-platform support.\nA Universal Link opened from Safari (back button) in MeisterTask.\nUniversal Links are meant to offer a superior option to custom URL schemes for launching apps and sharing links to them. A Universal Link is a web link that, with a file uploaded by developers on their app’s servers and integration in Xcode, iOS 9 can open into a native app instead of its website. 
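The server-side half of that setup is a small JSON file, named apple-app-site-association, served over HTTPS from the domain's root. A minimal sketch, with a made-up team ID, bundle ID, and paths (the exact format and signing requirements are spelled out in Apple's documentation):

```json
{
  "applinks": {
    "apps": [],
    "details": {
      "ABCDE12345.com.example.app": {
        "paths": [ "/movies/*", "/trailers/*" ]
      }
    }
  }
}
```

Only links matching the listed paths are routed to the app; everything else on the domain keeps opening in Safari.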
Upon first launching an app with Universal Links support in iOS 9, the app will check for the configuration file on its server; from that point on, whenever possible, HTTP links to that domain will open in the app, showing the deep-linked view.\nTo understand how Universal Links work, imagine that Twitter starts supporting them for twitter.com URLs and their iOS app. Every time you tap a link to a tweet on iOS 9, that link will open the tweet in the native app instead of Safari if you have it installed. If you don’t have the app or share the link with someone who doesn’t have the app, the link will open in the browser as a fallback because it’s a normal URL.\nImagine this for links to shared projects in a todo app, songs on a streaming service, Slack uploads, Overcast podcast episodes, or Google Docs files. With a regular HTTP link, iOS 9 will take you to the content you’re looking for inside a native app. Universal Links are platform neutral: if the app isn’t installed, they go straight to the web anyway.\nThe flow of a Universal Link from Google to IMDb’s app and webpage.\nWhen writing this review, I was able to test a version of IMDb with Universal Links. When opened from Safari, Google, Search, and any other app, imdb.com links automatically opened in the native IMDb app, showing me the content – such as trailers or movie pages – I would have seen on the web by default in iOS 8.\nUniversal Links are meant to provide a safe, cross-platform way to share links to content in apps without relying on custom URL schemes. Instead of linking to a user profile in Twitter with the custom twitter:// URL scheme, you’ll be using the same twitter.com links you see in the browser every day. With a custom URL scheme, if you don’t have the app installed and tap the URL, it does nothing. URL schemes are local; Universal Links are global and local at the same time.\nUniversal Links carry important benefits over the old way to link to specific areas or features of apps. 
Universal Links are always mapped to the right app: while different apps can claim the same URL scheme on iOS, Universal Links work by matching an app with a JSON file on the app’s server; this ensures that links from a certain domain can only open in its associated app. In Twitter’s case, this could mean that, if installed, twitter.com links will always launch the official Twitter app, and The Iconfactory and Tapbots won’t be able to do anything about it as they can’t control the twitter.com server.\nIt was obvious for Apple to elect web URLs as the best way to link to apps: web links are omnipresent in today’s communications, they work everywhere, and they are the common language of the web.\nUniversal Links are designed to not be noticed and to feel as seamless as possible. For the most part, that’s exactly what using them is like – you tap a link and, if it’s a Universal one, it’ll open in an app.\nThere are some aspects of the process that you can control. When opening a Universal Link, iOS 9 will display a forward button on the right side of the status bar (opposite to the back button) to give you the option to view the link in Safari instead. The same issues mentioned for the back button apply here as well, as the shortcut takes over battery and Bluetooth icons (and looks comically alone on the iPad). However, the ability to jump from native app to webpage with one tap is convenient, and I couldn’t imagine any other place for it.\n\nIf you choose to view a Universal Link in the browser, a banner will sit atop the webpage with an Open button to return to the native app if you change your mind. This is the equivalent of a small Smart App banner, but it’s not as obtrusive. It’s a nice idea, and it lets you cycle through native app and web view for a Universal Link with one tap.\nAll together, iOS 9’s new deep linking features make for a unified app experience built with speed, security, and consistency in mind. 
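On the receiving end, an app learns that it was opened from a Universal Link through an NSUserActivity of type NSUserActivityTypeBrowsingWeb delivered to its app delegate. A hedged sketch in iOS 9-era Swift, with a hypothetical AppRouter standing in for an app's own navigation code:

```swift
import UIKit

// In the UIApplicationDelegate: handle an incoming Universal Link.
func application(application: UIApplication,
                 continueUserActivity userActivity: NSUserActivity,
                 restorationHandler: ([AnyObject]?) -> Void) -> Bool {
    // Universal Links arrive as a "browsing web" activity carrying the tapped URL.
    if userActivity.activityType == NSUserActivityTypeBrowsingWeb {
        if let url = userActivity.webpageURL {
            // Hypothetical router: map the web URL to the matching in-app view.
            return AppRouter.openDeepLink(url)
        }
    }
    return false
}
```

Returning false tells iOS the app couldn't handle the link, so the system can fall back to opening the URL in Safari.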
They’re also signs of a mature OS and app ecosystem that are ready to talk to each other with links that connect app content to system features, eschewing the numerous hacks and workarounds of custom URL schemes.\nGoing back to iOS 8’s app switching design and limitations feels cumbersome after trying iOS 9. Deep links and Universal Links dramatically speed up moving between apps – and they reduce the Home button to a mere hardware option for going back to the Home screen. The back button, while not perfect and perhaps a bit inelegant at times (reaching it also requires a certain dexterity), is a great shortcut, and I can’t imagine using iOS without it now. I wasn’t able to try many apps with Universal Links support, and, while I suspect they won’t be suitable for apps that don’t rely on web content, I believe they offer a superior option to URL schemes in every way.\niOS apps are starting to feel less and less like silos. Aided by the back button, deep links and Universal Links are another step towards more interconnected apps.\n\nIntelligence\nFor many of us, iOS devices are the most important computers in our lives. With iOS 9, Apple is rolling out a series of features for proactive recommendations and intelligent suggestions aimed at making the devices we use every day smarter, more contextual, and personal. The results are mixed, but the beginning of something new is afoot.\nThe source of all of iOS 9’s proactive and intelligent features is our data and daily routine. By using information we store in system apps such as Mail and Calendar and by observing our habits, iOS 9 can discover patterns in the apps we use, when and where we tend to use them, and it can offer shortcuts to show us what we’re most likely going to need next. In broad strokes, this is at the core of Apple’s proactive initiative: by learning from our habits and data, iOS can be a more helpful assistant in everyday life. 
Unlike similar efforts by other companies (namely Google), Apple has prioritized user privacy when building these functionalities into iOS 9, a design choice with a deep impact on what the OS is capable of suggesting.\nIntelligent and proactive recommendations are scattered throughout iOS 9 with shortcuts in various places. Some of them are labeled as Siri features, while others are new options in existing apps that use content from other apps to save time with suggestions.\nAs a starting point, the new Search page features Siri suggestions for contacts and apps. Displayed at the top of the page and replacing the old contact shortcuts of the iOS 8 app switcher, these shortcuts aren’t indicative of contacts explicitly marked as favorites or apps you’ve recently used. Rather, these suggestions are based on what iOS thinks you’re going to need.\n\nRecommendations are informed by different variables, such as frequency of use, time of the day, day of the week, current location, and other patterns the OS spots and that are used to build up suggestions for apps and people. There’s a chance you’ll see an app you only use on Thursdays and friends you only contact during the weekend. One time, I got a suggestion for my Shopping list in Reminders, and I later realized it was because I was at the grocery store and iOS had memorized the list I frequently used when shopping there.\n\nI also saw a recommendation for this review’s EPUB in iBooks (with a read completion status) when I was assembling the eBook and constantly checking it out in the app.\nWhen working as advertised, Siri suggestions are handy because they bring serendipitous discovery to the Search page. 
And because they’re not a static set of favorites but a dynamic, continuously updating list of shortcuts that learn from you, they adjust alongside your routine and what you’re likely to do next.\nFor example, I use the Do Button app every night before sleep to tell my girlfriend the exact time I went to bed with an email. Now, iOS 9’s Search page shows me a shortcut to that app every night between 3 and 5 AM, when I typically go to bed. I’ve seen suggestions for Google Maps at specific places where iOS knew I was going to start navigation in the app, and a couple of friends pop up on Saturdays because I tend to text them and ask them to go out for dinner (when tapping a contact, iOS displays phone, message, and FaceTime actions for it). iOS 9 has picked up some of my habits, and when I come across a suggestion that is accurate, I’m glad iOS is helping me save time.\nThat’s not always the case, though. The patterns that I described above are fairly easy to spot as they’re repeatable, discrete routines that stand out from everything else. But I use my iOS devices all day every day, and I switch between apps and conversations a lot. The average result is that, for me, Siri suggestions in the Search page mostly are a random selection of shortcuts with the occasional gem that appears at regular intervals when needed. That’s the problem with a general purpose suggestion feature based on “patterns”: when you use your iOS device too much (as I do), app and contact suggestions tend to feel like a lottery. They can’t spot too many distinguishable patterns just by looking at which apps I launch.\nApps like Slack, Twitterrific, and Twitter are always listed in my Search page, but that’s because I always use them. When I don’t see Twitter and Slack, I see a repeat of apps I’ve recently used – the same entries from the app switcher. Do these make sense as persistent suggestions intermixed with shortcuts based on specific times of the day and locations? 
Wouldn’t it be better to identify the apps I’m constantly using and display them in some kind of separate Top Hits view? Of course I know I’m going to be reading Twitter and Slack. There’s no point in iOS telling me to do so.\nAnother issue, I believe, is that Siri suggestions in this screen don’t have any explanation attached to them. iOS 9 doesn’t say “Good morning, here’s what happened in Slack last night and I think you’re going to need CityMapper next because you take the subway on Tuesdays”; it just brings up a bunch of shortcuts, leaving it up to you to figure out if they’re useful or not. Sometimes, they are, and it feels nice. Most of the time, they are too generic to warrant a top spot in the Search page, and I think of turning them off entirely. I haven’t yet because I’ve seen how they can be useful at times, but I’d like them to be more than recently used apps.\nAlong the same lines, the Search page offers shortcuts for Nearby businesses in Maps. Besides the fact that, for my area in Rome, Apple’s business database continues to be outdated and lackluster, I don’t understand the kind of Maps suggestions iOS gives me.\nI would expect these recommendations to account for my habits (as tracked by iOS’ Frequent Locations feature) and likely needs to show me businesses relevant to my routine and time of the day. Instead, iOS’ Nearby section has shown me all sorts of business suggestions at the most disparate times: convenience stores at 9 PM alongside “Nightlife” POIs; coffee shops and restaurants at 3 AM; “Fun” and “Transport” suggestions in the afternoon, which usually don’t go well together.\nIn three months, I have never found the Nearby suggestions in the Search page to be useful. The categories are too broad for me to understand at a glance whether the place I’m looking for is “Fun” or “Nightlife”, and the time of the day when they are displayed is usually not in line with what I do on a daily basis. 
I would prefer iOS to provide me with practical advice for individual places I frequently visit, such as the current traffic to get to my neighborhood supermarket or weather conditions at my favorite beach. Alas, this kind of detail and personalization isn’t available for Nearby suggestions, and that’s disappointing.\nThe other system-wide proactive mechanism of iOS 9 is standalone app recommendations. In this case, iOS will suggest an app to launch in the same area where Handoff for apps is displayed in the Lock screen (bottom left) and app switcher (at the bottom in iOS 9).\nApp suggestions can be displayed in the Lock screen (left), or in Handoff.\nThese shortcuts, like those on the Search page, account for different variables to recommend an app you’re likely going to need. Because of their placement in the UI, they can be more easily noticed when your device enters a scenario that iOS identifies as a pattern.\nIn addition to time of the day and location, iOS 9 can monitor Bluetooth and audio connections and guess which app you may need when that happens. If you tend to open the Music app after plugging in your EarPods, iOS will bring up the Music icon and media controls in the Lock screen, or it’ll display a shortcut in the app switcher telling you that you can open Music because an audio connection has been detected (I like how these suggestions come with an explanation). Or, if you like to watch Netflix with your Beats Wireless on, iOS will also spot that pattern and recommend Netflix as soon as a Bluetooth connection is established.\nHandoff has a new location in iOS 9.\nThe same variable can lead to different recommendations at different times of the day. Listen to audiobooks on your way to work in the morning but to podcasts when going back home? 
iOS 9 will show different apps for those two scenarios, learning and adjusting over time.\nApple also added contextual awareness support for getting in and out of the car in iOS 9, combining that with proactive suggestions. This isn’t well documented by Apple, but as seen with Reminders, iOS 9 has the ability to recognize user presence in a car by looking at connections to generic car Bluetooth devices as well as CarPlay. The car becomes another dimension for smart suggestions in iOS 9, which can give you app shortcuts based on what you do – such as listening to podcasts or music when driving – but also traffic notifications for where you’re most likely going. The idea of using the car as another layer of user patterns is an intriguing one, and while I couldn’t test this because I don’t have Bluetooth in my car, impressions from hundreds of users I polled on Twitter were positive.\nI find individual app suggestions to be nice, and generally more timely and relevant than what I see on the Search page. While not revolutionary, it’s nice to be able to quickly open Music or Overcast when my Beats are connected via Bluetooth, and I’ve been surprised by how iOS picked up that I was going to need Google Maps or Nuzzel at specific times of the day, putting them on the Lock screen. As more and more sensors fill our homes and clothes going forward, I fully expect iOS to gain support for deeper context recognition – imagine suggestions powered by HomeKit devices, proximity to an Apple TV, or beacons.\nThat’s not to say that app suggestions in the Lock screen and app switcher are perfect: I’d still like to see more targeted suggestions for Music, as iOS hasn’t figured out that I like to listen to Death Cab for Cutie every night before bed. The granularity and timeliness granted by audio and Bluetooth connections have led to more useful app suggestions in my experience, but they can improve.\nNext up is Mail, which is used in iOS 9 as an information database in three ways. 
In Contacts, you can search for people found in Mail and add them as new contacts with some fields already filled in, or you can add new email addresses to an existing contact as iOS will match the same person between Contacts and Mail. This has been useful to update old contacts with new email addresses from my own correspondence.\n\nSecondly, when receiving a phone call from a number that’s not in your contacts (we all dread those phone calls), iOS 9 tries to discover who it is by looking into Mail messages. I only had that happen once, but it worked as expected.\nLast, iOS is more proactive in offering to create new calendar events or contacts from messages that contain such information. A new banner displayed at the top of a message provides a shortcut to create new events and address book entries with one tap, which is a more visible option than iOS’ existing support for smart data detectors in message bodies. iOS 9 can also detect events from messages that contain flight details or restaurant reservations and put them in Calendar for your consideration; you can choose to turn off event suggestions based on Mail in Settings.\nUsing Mail as a repository of information for other apps is an interesting idea: despite its somewhat archaic nature, a lot of our communications and notifications still come through email, and scanning Mail to bring up shortcuts and suggestions seems like a good idea to me (and I would like to see more of it). If anything, these features highlight the benefit of using Apple’s native Mail and Calendar apps over third-party clients, which will get none of these integrations. I wouldn’t be surprised to see users keeping Mail fetching messages in the background without using it just to take advantage of suggestions.\niOS 9’s proactive suggestions for Contacts and Calendars don’t stop at Mail. When I was on vacation with my girlfriend, we used Apple Maps in Positano to browse restaurants nearby and call them to make a reservation. 
I didn’t have their phone numbers in my address book, but I noticed that iOS 9 used the restaurant’s name (taken from Yelp, I’d guess) in the Recent Calls screen, which was a nice touch.\nIn Calendar, events that contain an address now offer the option to use a Time to Leave notification that will send an alert when it’s time to leave for an event depending on your current location and traffic. When receiving the notification, you can snooze it for later or tap it to see directions and get going.\nAt 2 AM, it takes me 5 minutes to get there, not 16.\nThis is another interesting idea, but it hasn’t worked well for me in practice. Events at locations I knew would take me 10 minutes to reach consistently triggered Time to Leave notifications 25-40 minutes ahead of time. Not even after “teaching” the system that my driving style and traffic weren’t as imagined by its intelligence did iOS learn that there was no need to send a notification 40 minutes early. I’ve wondered if the wiggle room between the notification and event time could be cultural: perhaps Americans like to arrive early at their events and don’t mind waiting 20 minutes while sipping on their ventis. But in Rome, there’s no such thing as arriving early or spending 20 precious minutes to wait for someone else. Time to Leave is a cool idea, but not suited for my habits and local traffic.\nThere are more intelligent and proactive features throughout the Search page and the OS, but I’ve found their realization to be dull or their impact to be minimal. You can ask Siri to bring up photos from specific time periods and albums. 
The Search page has a News section at the bottom, which, as a European using an iOS device with a US region format, I found to be an unappealing mix of news about presidential elections, football, and TV spoilers; in theory, iOS should be able to display news relevant to my location, but given that they’ve always been American-heavy topics, I imagine iOS is basing its news collection skills on the user’s region format. You can also ask for weather, calculations, and sports results in the Search page, but only calculations and unit conversions worked for me (I’d call them an expected utility more than an intelligent feature).\nOur devices are becoming smarter and more context-aware every year, but Apple’s foray into intelligence and proactive suggestions doesn’t substantially alter the user experience of an iOS device. Instead, what Apple has put together is a mix of sometimes-working, nice-to-have additions that feel unfinished or that are poorly realized. Suggestions in the Search page leave much to be desired, with useful patterns that are obfuscated by generic and unmotivated shortcuts. Standalone app recommendations based on location and audio connections and Mail’s intelligent scanning are where Apple’s vision feels clear and coherent, with delightful discoveries that can save some time every day.\nIf you’re accustomed to the level of automated intelligence in Google services such as Google Now and Inbox, iOS 9 won’t offer that kind of experience. The Search page is far from the uncanny precision of Google Now when plugged into all of your Google data, and Mail’s new shortcuts pale in comparison to the automated processing and organization tools found in Inbox.\nThis is by design: while Google can pull it off thanks to their expertise and investment in looking at patterns across all of your data (which happens to be their business model), Apple has decided to prioritize user privacy as much as possible. 
That’s why proactive suggestions are (mostly) processed directly on-device, with Apple never syncing any of your usage patterns between devices or matching data from one Apple service to another to build a more complete profile of you.\nIt comes down to personal preference and the level of potential creepiness you allow in your computing life. Google tends to deliver impressive intelligent features and shortcuts, at the expense of a wealth of data handed over in return. Apple’s efforts in iOS 9 are more modest in scale, but deeply integrated with the OS and built with privacy in mind.\nIt’s not about arguing who’s better; it’s about choosing what works better for you. Are you comfortable with Google’s impressive intelligence and suggestion tools, knowing that they need as much data about you as possible to power them, or do you prefer Apple’s more private but also less effective approach?\nMy stance on these issues has changed a lot over the past two years, especially after trying Inbox and Google Now, and after having to go back to Google Apps’ Gmail due to slow IMAP sync and search. While I conceptually don’t like the fact that my data is being used by an army of algorithms, the service I get in return is useful and lets me work faster every day. When it comes to faster work, measurable efficiency trumps ideological stances. As an Italian movie once said, you can’t buy groceries with ideals.\nIn Google’s Inbox, search is crazy fast, the app detects sentences that look like reminders, and it categorizes emails for me. When I was in San Francisco earlier this year, the Google app automatically pulled in my flight and hotel information from Gmail, displayed it with handy cards in the main view, figured out when I arrived at SFO, and showed me weather reports and currency exchange rates.\nThat was pretty amazing and useful, and it’s not something that iOS 9’s intelligence is able to provide just yet. 
And I have to wonder if it ever will, given that what Google does – the depth of its user tracking and cross-service integration – could only be possible with constant, cloud-based data collection that doesn’t fit with today’s Apple.\nImagine, though, if Apple was willing to look for patterns inside apps, understanding what we write in private communications and what we search for to spot more useful patterns than app launches and EarPods connections. Would they have the skills required to build such intelligence? Would they want to?\nUltimately, it’s not fair to compare iOS 9’s intelligence to Google services: Google will never have this kind of access to device hardware and daily user patterns. iOS 9 delivers on small, periodic proactive enhancements that are meant to save time and surprise users. Their impact is not dramatic: some of them are nice shortcuts, but by not deeply aggregating user data from multiple sources, most of them are generic and stale.\nCaught between the tension of respecting user privacy and deepening data collection for proactive features, will Apple be able to ship more useful suggestions in the future? And how will they build it all?\nThere’s a lot of work to do. It’s up to Apple to figure out what their ideals can allow.\n\nPerformance and Low Power Mode\nEveryone struggles with battery life. Talk to any iPhone owner, and you’ll never hear them wish for shorter battery life on their device. There’s no such thing as enough battery. With iOS 9, Apple is taking some steps toward improving battery life on all devices, with particular attention to the iPhone.\nAcross the entire OS, Apple claims to have optimized apps and key technologies to be more efficient and consume less energy. On top of this, iOS 9 can use the iPhone’s proximity and ambient light sensors to detect when it’s lying facedown on a surface, and it won’t turn on the screen when a notification comes in. 
By itself, this sounds like a small change, but if you receive a lot of notifications every day, every drop counts.\n\nApple has increased the information displayed for battery usage, too. In the new Settings > Battery page, the list of apps and system functions consuming energy on your device includes some new entries (such as energy consumed by recently deleted apps) as well as additional detail that can be displayed by tapping the list of apps or the clock icon in the top right. This will list the minutes spent by apps in the foreground and background for the selected time range, which makes it easier to assess an app’s consumption relative to actual usage. More importantly, it simplifies deciding whether keeping background app refresh turned on for a power-hungry app is worth it.\nThe protagonist of Apple’s battery life improvements is Low Power Mode. Available exclusively on the iPhone, Low Power Mode temporarily cuts power consumption by reducing or turning off Mail fetch, background app refresh, automatic downloads, and “some visual effects”. CPU and GPU performance may also be reduced, and a 30-second Auto-Lock is enforced. When Low Power Mode is active, the iPhone’s battery icon turns yellow, and the battery percentage is displayed even if you normally keep it off.\n\nLow Power Mode can be activated in two ways. If you want to activate it manually, you can go into Settings > Battery and toggle it. There’s (almost) nothing stopping you from running your device in Low Power Mode all the time. The most effective way to activate Low Power Mode, though, is to wait for your device to reach 20% or 10% of battery left: when that happens, an alert will allow you to turn on Low Power Mode and start saving energy until you can charge your iPhone.\nLow Power Mode isn’t some kind of placebo effect. 
During the iOS 9 beta period, I activated Low Power Mode every time my iPhone 6 Plus reached 20% of battery left, and it consistently bought me a few extra minutes (about 30 in my experience) I could use to get home via Maps navigation or to go back to my car and charge the phone.\nSurprisingly enough, Apple has thought about the possibility of users keeping Low Power Mode always on, and to discourage such usage – which would degrade the experience when there’s no need for it – iOS turns it off automatically once an iPhone reaches ~80% of charge (Apple calls it a “sufficient level”). This doesn’t make it impossible to use an iPhone in Low Power Mode even when it’s not running short of battery, but it should be a deterrent.\nIn practice, I didn’t really notice any major functional difference when Low Power Mode was on. The “visual effects” that iOS 9 turns off are the animations of Dynamic Wallpapers and the parallax effect in Perspective Zoom, neither of which I use, so disabling them doesn’t impact the visual appearance of my iPhone. If other effects are reduced, I can’t tell: translucencies and zoom animations are still available in Low Power Mode, so it’s not like iOS turns into a static, opaque mix of colors when the mode is enabled. As Apple states, background app refresh and Mail message fetch are also disabled in this mode – a fair trade-off when battery life is at stake. For the most part, Low Power Mode is unobtrusive, and it yields the results promised by Apple.\nWhat’s going to be interesting, I think, is how third-party apps will react to Low Power Mode to reduce their own energy consumption. iOS 9 adds a new Foundation API with a lowPowerModeEnabled property of NSProcessInfo that changes its state when Low Power Mode is toggled. Developers can also listen for a NSProcessInfoPowerStateDidChangeNotification notification that, as the name implies, informs an app of the state of Low Power Mode. 
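In practice, adopting this Foundation API amounts to reading the property and observing the notification. Here’s a minimal sketch in later Swift naming (where the property surfaces as `ProcessInfo.processInfo.isLowPowerModeEnabled` and the notification as `.NSProcessInfoPowerStateDidChange`); the `PowerAwareRefresher` type and the interval values are hypothetical illustrations, not Apple’s code:

```swift
import Foundation

/// Hypothetical helper that adapts an app's refresh cadence to Low Power Mode.
final class PowerAwareRefresher {
    /// Poll far less often when the user is trying to save energy.
    var refreshInterval: TimeInterval {
        ProcessInfo.processInfo.isLowPowerModeEnabled ? 300 : 60
    }

    private var observer: NSObjectProtocol?

    init(onChange: @escaping (Bool) -> Void) {
        // NSProcessInfoPowerStateDidChangeNotification, bridged to Swift.
        observer = NotificationCenter.default.addObserver(
            forName: .NSProcessInfoPowerStateDidChange,
            object: nil,
            queue: .main
        ) { _ in
            // The notification carries no payload; re-read the property.
            onChange(ProcessInfo.processInfo.isLowPowerModeEnabled)
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```

A train-schedule app like the Departure Board example below could consult `refreshInterval` before scheduling its next network request, pinging its server every five minutes instead of every minute while Low Power Mode is on.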
Apple is advising developers to check the state of Low Power Mode in their apps, reducing costly computations such as frequent network activity and high frame rates to save energy.\nIn Breslan’s case, his app Departure Board could support Low Power Mode by decreasing the usage of an API that checks for train departures every minute, allowing an iPhone to consume less energy by pinging a server less frequently. I tested a few apps with Low Power Mode integration in iOS 9: the email client Dispatch, for instance, is going to disable CoreSpotlight indexing and profile pictures in Low Power Mode to prevent an iPhone from indexing new messages in the inbox and fetching avatars from the Internet.\nMax Litteral’s Television Time, a TV show tracker, takes similar measures to reduce CPU consumption when the user enables Low Power Mode: the app will stop animations when loading images into table view cells, it’ll disable downloading and animating thumbnails in search results, and it’ll also disable face detection on show posters. Litteral is looking into disabling show sync in Low Power Mode as well, which would further reduce energy consumption by stopping network activity.\nThe work that Apple has done at a system level to prolong battery life with more efficient apps and Low Power Mode should already be enough to give most users a few minutes of extra battery. But in an ideal world, additional savings could be granted by third-party developers being good platform citizens and supporting Low Power Mode by adjusting their apps’ behavior to consume less energy whenever possible.\nPerplexingly, Low Power Mode isn’t available on the iPad – a missing feature in stark contrast with the iPad focus of iOS 9. I don’t understand what pushed Apple to make Low Power Mode an iPhone-only perk of iOS 9, and I’d like to see it coming to the iPad as soon as possible. 
Even if not as essential as on an iPhone in emergency scenarios, Low Power Mode could be useful for iPad users looking for some extra minutes of battery life every day.\nThe last two major versions of iOS have been riddled with bugs and performance issues in their first releases, and it’s good to see Apple prioritizing efficiency, storage, and stability this year. In addition to battery-themed enhancements, iOS 9 introduces smaller software updates for consumers (which I couldn’t measure with developer betas), an option to install iOS updates at a later stage, and App Thinning, a set of optimization tools (slicing, bitcode, and on-demand resources) to decrease the footprint of apps by delivering only the resources needed by a user’s device. I couldn’t test App Thinning either, but I expect it to have major implications for the future of local device storage, App Store downloads, and gaming content.\nStability-wise, iOS 9 is a step up from iOS 8. In my experience, I have seen only a couple of Home screen crashes and random reboots in three months of iOS 9 betas (down from at least twice a week, both on the iPhone and iPad), fewer interface glitches, and increased stability on the iPad Air 2 with Apple apps, the app switcher, and extensions. New multitasking features on the iPad Air 2 have been rock solid, with a smooth Slide Over, stable Split View, and flawless Picture in Picture.\nUnfortunately, memory pressure continues to be an issue on the iPhone 6 Plus, which can still lag in frame rate compared to the Air 2 and often purges apps from memory to free up resources for the current app. 
On the 6 Plus, I’m still seeing Safari frequently reload tabs after switching to another app, a slower than usual Camera, and slightly delayed touch responses when moving between apps.\nApple’s house cleaning in iOS 9 offers a superior experience to older versions of iOS – at least on recent hardware – and it removes much of the cruft that had accumulated over the past two years. The real effect of features such as App Thinning and Low Power Mode for apps will only be seen in the next few months if developers opt into them, and it’s intriguing to imagine a future where they’ll be the norm.\n\nEverything Else and What I Couldn’t Test\nThere are dozens of other features in iOS 9 that, either in Apple apps or developer APIs, make iOS devices faster and more efficient. A few highlights:\nUpdated setup process. By default, iOS 9 prompts users to create a complex (6-digit) passcode when setting up a device for the first time. Other passcode types (such as a 4-digit numeric code) are still available by tapping an options button. iOS 9’s setup also features a new option to move data from an Android device with a dedicated Android app made by Apple.\nYes, these are glorious.\nNotification quick replies for third-party apps. In iOS 9, apps can implement quick replies in actionable notifications. Introduced with Messages last year, the API has been opened up to developers this year, allowing them to mix custom buttons with a text field to type a reply inside the notification without opening the app. I tested this feature with beta versions of Twitter clients and other apps, and it’s incredibly convenient. I expect all messenger-type apps to adopt this.\nTrust developer certificates. Installing beta versions of third-party apps from outside TestFlight gets a little more convoluted (but also more secure) in iOS 9 with a new system that requires you to manually trust enterprise developer certificates. 
To do so, you’ll need to go into Settings > General > Profiles > Enterprise Apps and select an installed app you can trust; otherwise, it won’t launch.\nSafari can save PDFs to iBooks. In iOS 9, you no longer need a third-party app to save a webpage as a PDF. A new iBooks share extension lets you save any webpage as a PDF document into the app – perfect for reading articles later or sharing PDFs via email and annotating them with Markup.\nWi-Fi Assist. Available in Settings > Cellular Data, this new toggle allows an iOS device to use its mobile network when Wi-Fi connectivity is poor. A lot of people have welcomed this option, but I prefer to keep it turned off, as I like to know exactly which network I’m using, and the automatic switch between cellular and Wi-Fi is not clear in the UI.\nReplayKit for recording gameplay. For the first time, Apple is offering game developers an API to natively record gameplay without using a third-party SDK. ReplayKit can be initiated automatically by an app as the user is playing, or it can be started manually. Apps can ask users to record the screen only or to also access the microphone to include game commentary. A video file generated with ReplayKit is then passed to the system share sheet in-app, so users can save it to the Photos app, send it to other apps, or share it on social networks. With ReplayKit, creating Let’s Plays and video reviews for iOS games should be easier and better integrated with the system.\nShared Links extensions. Apps in iOS 9 can plug into Safari’s Shared Links section to give users the ability to view their links alongside the RSS and Twitter links already supported in the browser. I suspect that news readers, social apps, and RSS apps will consider this extension type to make their items available in Safari, but I haven’t been able to test any of them yet.\nMaps Nearby and route delays. 
In addition to public transit information, Maps for iOS 9 features Nearby categories (a more complete section of the same options in Search) to browse businesses nearby and explore places around you. Also, during navigation, Maps will show banners at the top of the screen for upcoming closed roads, roadwork, and faster routes.\nSwipe down to dismiss photo previews. In iOS 9, swiping down when viewing a photo in Photos or Messages lets you close the preview instead of reaching for the Done button at the top of the screen. Another example of a sloppy gesture that simplifies interaction and cuts down the number of taps required to operate the OS. I’m using this every day now.\n\niPad Today view gets bigger. The Today view of the iPad in landscape mode has received a new two-column layout split into a Today view and a Widgets view. Widgets can still be added as before, but now you can choose to display them in a bigger column on the left (Today) or in a narrow column on the right. This brings better visual organization of widgets without any functional difference between the two columns, and it takes advantage of the bigger screen to show more widgets at the same time. I’m looking forward to trying this improved layout on the iPad Pro.\n\nNew widgets. Speaking of the Today view, iOS 9 adds new widgets for Find my Friends and for glancing at the battery level of connected Bluetooth accessories as well as the device itself. The new ‘Batteries’ widget is automatically installed once you pair an Apple Watch with your iPhone – and it’s been a convenient way to check on my Watch battery level directly from my iPhone. Plus, iPhone, Apple Watch, iPad, and even Beats wireless headphones get nice icons to complement the widget’s appearance.\n\nSpotlight unit conversions and calculations. The improved Spotlight of iOS 9 goes beyond Search and borrows from OS X to let you perform quick calculations and unit conversions. 
You can convert currencies (type or dictate “10 USD to EUR”), perform operations (also with values such as “pi”), convert temperatures, and more. I’ve been using this regularly, and it reduces the number of apps I have to keep on my devices for basic conversions and operations.\n\nApple Music gets a new contextual menu. The Music app went through a major redesign for Apple Music and Beats 1 in June, and iOS 9 brings a revamped contextual menu that packs more options in a cleaner presentation. The menu now displays a larger album artwork at the top that clearly indicates you can tap it to go to the selected item. Underneath it, new icons allow you to love a song, start a station, and share. A good improvement as Apple continues to fix and clean up the issues that affected the new Music app since launch.\nLast, because of limitations in my country, unavailable hardware, or features that can only be tested after the public launch of iOS 9, here’s a recap of what I couldn’t test for this review:\nSmaller software updates for new iOS releases\nApp Thinning technologies for App Store apps\nCarPlay improvements\nCar Reminders\nPublic transit in Maps\nHomeKit changes\nNew HealthKit categories\nShared Links extensions\nWallet (née Passbook) and Apple Pay\n\nFive Years On\nIn some ways, iOS 9 feels like the third and final installment of the iOS 7 saga.\nWith San Francisco, a revised keyboard, and an interface that’s been polished across the OS, Apple’s new design language has moved past its awkward (and problematic) teenage years to accept its own style and voice. Interfaces aren’t diamonds: they’re not forever, and iOS will change again. For now, iOS 9 is, visually speaking, a culmination of the work started two years ago, ready for what comes next.\nChanges to default apps and new system features also are iterative improvements that complete Apple’s post-iOS 7 vision and get rid of problems accumulated so far. 
Podcasts, Mail, iCloud Drive, and new features in Safari don’t revolutionize those apps, but they make them substantially better for everyone. Deep links and Universal Links show that Apple has long been thinking about doing more than URL schemes to simplify opening and linking to apps. Safari View Controller extends Safari to web views in any app. User activities and app indexing for Search prove that Apple often plays the long game, showing their hand only when every technology is in place for an arguably basic utility (Spotlight) to become something new.\niOS 9 isn’t Apple’s new Snow Leopard. Just because some additions and changes may not be as massively popular or as instantly recognizable as a new design and custom keyboards were, it doesn’t mean they don’t exist. The company has put more resources into optimizing iOS, and early results are encouraging. Low Power Mode makes a difference when battery is running low, and developers will be able to support it in their apps; the entire OS feels snappier and more stable in daily usage.\nThis isn’t the year of No New Features: under-the-hood optimizations and app enhancements walk together in iOS 9, with Notes being a prime example of it. Apple isn’t taking its foot off the gas: they’re just being more careful behind the wheel.\nAlong the way, there are some missteps and disappointments that will have to be taken care of in the future. Mail is lagging behind the innovation we’re seeing in email apps from large companies and indie studios. iCloud Drive is far from the functionality offered by Apple’s OS X Finder and other companies’ iOS apps. The intelligence of Apple News and proactive suggestions leaves a lot to be desired, casting doubt on whether it can dramatically improve. 
And as others are reimagining mobile messaging with integrations and services that were unimaginable a few years ago, major changes are suspiciously absent from Apple’s Messages app this year.\nAs a result, iOS 9 for iPhone is a more efficient version of iOS that many will appreciate with time as they discover what’s new. Its improvements aren’t as easily marketable as those of iOS 7 and iOS 8, but they’re no less important. What Apple hasn’t done this year doesn’t make iOS 9 worse: it just adds to a list of low-hanging fruit for next year.\nIf you’re an iPhone user skeptical about whether to upgrade, my recommendation couldn’t be easier this time around: iOS 9 is better than iOS 8 in every way, and you should upgrade.\nAnd then there’s the iPad.\nThis year, the iPad is getting the first version of iOS truly made for it. After too many unimaginative releases, Apple has understood the capabilities of the iPad’s display and its nature as a modern portable computer. Free of dogmas and preconceptions about what an iPad ought to be, iOS 9 fundamentally reinvents what an iPad can become going forward.\nPicture in Picture rethinks watching video on the device. Slide Over cuts down the time required to jump between apps. New shortcuts make working with Bluetooth keyboards on the iPad a joy. And by letting you use two apps at once, Split View reimagines the role of the iPad in the iOS ecosystem, positioning it between an iPhone and a Mac for people like me who need exactly that.\nI’ve been working on the iPad for the past three years, and the changes in this year’s iOS release have done more to make me work faster than iOS 6, 7, and 8 combined. Apple has created a new beginning for the iPad’s software, and while it’s not perfect and the experience will have to be refined in places, the impact of iOS 9 for iPad users – especially iPad Air 2 users – will be felt for years to come.\nIt’s not easy to blend tradition with a fresh start, but that’s what iOS 9 does for the iPad. 
The iPad is still the device where you can immerse yourself in one app at a time. But now you can also do more, using the large screen to switch between multiple apps and tasks with multitouch. And in doing so, you’ll discover that iOS 9 multitasking for iPad doesn’t carry over the complexities and rules of traditional desktop OSes.\nApple has leveraged years of design and interaction constraints to give more freedom to iPad users, creating an experience that can be more complex but is still intuitive. The iPad is not a Mac, but the same argument works in reverse, too: a Mac is not an iPad – it doesn’t have the iPad’s portability or app ecosystem, and it’s not a screen you can hold in your hands. With iOS 9, there are even more reasons to consider an iPad as a new kind of computer: capable of true multitasking, and built with the strengths of iOS in mind.\nWith iOS 9, Apple is ready to admit that the iPad is a computer without the baggage of the PC. I know that I’m not going back to a Mac, and this review is meant to be tangible proof of it. Writing this wouldn’t have been possible without iOS 9, and I’ve never felt more focused.\niOS 9 shows where the future of the iPad lies. The leap has been taken.\nFive years later, it’s just like starting over.\n\n\nWell, technically five. ↩︎\n\n\n\nAlthough Apple didn't go as far as the popular Teehan+Lax concept, the result is close. ↩︎\n\n\n\nNot to mention that, for visually impaired users, being able to discern character status when the entire keyboard changes is a stronger visual cue than a single, confusing Shift key. ↩︎\n\n\n\nPersonally, I chose to keep it off, as having lowercase letters on the keyboard is enough for me now. ↩︎\n\n\n\nUnlike iOS 8, you can slide your finger across buttons without lifting it off the screen, and you'll be able to select different buttons when sliding. 
↩︎\n\n\n\nI'm curious about the ability for developers to initialize Safari Reader programmatically, perhaps with a custom button. I haven't seen this done anywhere yet, but it should be possible. ↩︎\n\n\n\nIf user activities aren't supported in an app, Siri will create a reminder called \"This\". ↩︎\n\n\n\nAccording to developers I spoke with, the deep link information can be accessed when reading smart reminders via CalDAV, as it's hidden metadata stored in a base64 archive. ↩︎\n\n\n\nThird-party apps like MindNode (which I tested for this review) can use an iCloud Drive version API to show and restore versions. However, Apple isn't using this API in their iCloud Drive app. If you want to browse versions of files in iCloud Drive, you'll need an app with support for that feature. ↩︎\n\n\n\nOr, you can tap a dedicated button in the Shortcut Bar on iPad. ↩︎\n\n\n\nFor most file types, you can also save an attachment elsewhere with a new action extension. ↩︎\n\n\n\nIf you convert this article to American units, it's only 10,000 words long. That's how it works, right? ↩︎\n\n\n\nPreviously only available on a Mac. ↩︎\n\n\n\nLongtime Mac users will remember this as one of Apple's finest touches on OS X, often brought up as an example of the company's attention to detail. ↩︎\n\n\n\nIn overly simplistic terms and broad strokes. ↩︎\n\n\n\nApple relies on Open Graph for its web crawler and search, and it's my assumption that Notes parses similar meta tags (including, possibly, Twitter Cards) to build its web link previews. ↩︎\n\n\n\nThis has been very useful to avoid rickrolls and other silly links on Twitter. ↩︎\n\n\n\nAt least a network spinner up in the status bar! ↩︎\n\n\n\nOr the left edge, if you're using iOS 9 with a right-to-left language. In iOS 9, Apple has introduced support for completely mirrored interfaces thanks to enhancements to UIKit; as a result, Slide Over will appear on the left side of the screen for right-to-left languages. 
↩︎\n\n\n\nThe main difference is that Notification/Control Center can be fully revealed with a swipe from anywhere at the top or bottom (not just the center), and they also work on the Home and Lock screens. It's not totally obvious that Slide Over can be activated by swiping just around where the vertical center of the screen may be; this decision makes sense when you consider the potential conflicts with apps that already rely on swipe gestures from the right edge. ↩︎\n\n\n\nWith one notable exception: the rightmost area of the iOS status bar. Slide Over's own title bar covers the space where iOS displays the battery, Bluetooth, and location icons. I believe this was a deliberate design choice, as it reinforces the idea that Slide Over is meant for temporary, brief interactions to get in and out of apps (because you wouldn't want your status bar to be always covered by something else). ↩︎\n\n\n\nBut both Control Center and Notification Center can be opened without dismissing Slide Over. ↩︎\n\n\n\nIf you want to have a good time, swipe up and down in the Slide Over app switcher to watch app icons grow and shrink again. ↩︎\n\n\n\nWhat happens if you press the volume button to take a picture in Split View with two camera apps? What about processor usage and battery life for two live camera views? ↩︎\n\n\n\nI suppose this is powered by UIFieldBehavior, new in iOS 9 as an enhancement to UIKit Dynamics. According to Apple, \"a field behavior defines an area in which forces such as gravity, magnetism, drag, velocity, turbulence, and others can be applied\". ↩︎\n\n\n\nProvided the keyboard isn't shown, otherwise it's down to two. ↩︎\n\n\n\nThis, again, likely made possible by UIFieldBehavior. ↩︎\n\n\n\nOn a related note, I don't think it's too crazy to imagine third-party shortcut extensions as system-wide options, similar to custom keyboards, but as accessories to Apple's software keyboard. 
↩︎\n\n\n\nIt's worth mentioning that, in some apps, you're not limited to swiping on the keyboard to control cursor and text selection: you can swipe anywhere on the text field (even if the software keyboard isn't shown because you're using an external keyboard) to move the cursor and select text. This is less intuitive than using the keyboard as a trackpad; you can try it with Notes if you're curious. ↩︎\n\n\n\nNotably, you couldn't send an iMessage by pressing a combination of keys on a keyboard. ↩︎\n\n\n\nFind in Page won't be available if you're not on a webpage, for example. ↩︎\n\n\n\nWe all miss you, webOS. ↩︎\n\n\n\nIf you're in Split View and tap a link to an app (or a notification), the primary app will switch to show another app, putting up a back link in the top left of the status bar. ↩︎\n\n\n\nThe way the icons subtly fade as you swipe between apps, or how an adjacent card snaps to the right as you quit an app is well done. ↩︎\n\n\n\nI don't know what's worse here – the fact that I had to use iTunes, or that the Videos app is still around in its gloriously retro feature set. ↩︎\n\n\n\nWhen in Split View, Safari also has a handy Open in Background button that lets you open new tabs in the background, despite the app's compact size. I wish this option was also available on the iPhone. ↩︎\n\n\n\nI don't think I've ever been at one of those, let alone at 9 PM. I wish my lifestyle allowed me to hit the club at 9 PM, though. Maybe iOS 9 is encouraging a secret wish of my subconscious. ↩︎\n\n\n\nMy espresso addiction may be bad, but it's not that bad. 
↩︎\n\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2015-09-16T07:40:24-04:00", "date_modified": "2018-03-20T13:15:30-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iOS 9", "iOS Reviews", "iPad", "stories" ] }, { "id": "http://www.macstories.net/?p=36455", "url": "https://www.macstories.net/stories/ios-8-changed-how-i-work-on-my-iphone-and-ipad/", "title": "iOS 8 Changed How I Work on My iPhone and iPad", "content_html": "\nWhen I reviewed iOS 7 last year, I took a different approach and tried to consider Apple’s redesigned OS from the perspective of someone who uses iPhones and iPads for work and personal tasks on a daily basis. I noted that a new structure enabled developers to make more powerful apps, and I concluded hoping that Apple would “consider revamping interoperability and communication between apps in the future”.
\nWith today’s release of iOS 8, Apple isn’t merely improving upon iOS 7 with minor app updates and feature additions. They’re also not backtracking on the design language launched last year, which has been refined and optimized with subtle tweaks, but not fundamentally changed since its debut in June 2013.
\nApple is reinventing iOS. The way apps communicate with each other and exchange functionality through extensions. How status awareness is being brought to iPhones, iPads, and Macs with Handoff and Continuity. Swift and TestFlight, giving developers new tools to build and test their apps. Custom keyboards and interactive notifications.
\nThere are hundreds of new features in iOS 8 and the ecosystem surrounding it that signal a far-reaching reimagination of what iOS apps should be capable of, the extent of user customization on an iPhone and iPad, or the amount of usage data that app developers can collect to craft better software.
\nSeven years into iOS, a new beginning is afoot for Apple’s mobile OS, and, months from now, there will still be plenty to discuss. But, today, I want to elaborate on my experience with iOS 8 in a story that can be summed up with:
\niOS 8 has completely changed how I work on my iPhone and iPad.
\n\nA year after iOS 7, I would say that it hasn’t been a perfectly smooth transition, but, at least for me, it’s hard to look back at what iOS used to be and miss it.
\nFollowing the launch of iOS 7, it became clear that Apple hadn’t had much time to optimize the OS for a bug-free experience that also needed to perform reasonably well on older hardware. For someone who relies on the iPad for work purposes, the first few months of iOS 7 were rough: in spite of Apple’s initial bug fix updates, I kept getting Home screen crashes, random reboots, hard resets, BSoDs, and, generally speaking, a bevy of graphical glitches that were new to iOS – traditionally, a highly polished and stable platform.
\niOS 7’s technical problems weren’t exclusive to Apple’s own apps and features: as a result of lingering bugs in the final OS and the developer SDK, third-party apps exhibited a variety of text-related issues, inconsistent animations, and crashes. I know of several developers who needed to work around Apple’s bugs to avoid crashes and glitches…which eventually led to other bugs after Apple began releasing iOS 7 updates.
\nThe launch of iOS 7 seemed to confirm that Apple wasn’t the kind of company that could pull off a complete redesign alongside major framework and feature additions and still ship a stable OS. My experience with the iPad as my primary computer was, from a technical perspective, worse than iOS 6. Until Apple released iOS 7.1, I had to cope with more bugs and crashes than I ever expected.
\nFrom a big picture perspective, however, I think that iOS 7 was necessary. Breaking with old design trends and longstanding UI conventions allowed Apple to modernize iOS and kickstart a process that would see the company and third-party developers rediscover the personalities of their software. Whether it’s Apple experimenting with different designs for music players or Evernote continuing to tweak its app to find the right balance of updated interface and functionality, the undeniable truth is that we’ve ended up with fantastic pieces of iOS 7 software such as Elevate, Skitch, Overcast, and thousands of other apps that I doubt would have been possible had Apple not drawn a line in the sand with iOS 7.
\niOS 7 changed the conversation from software that had to look somewhat realistic to apps that work well with a focus on content, clarity, and color. The redesign wasn’t the end goal – it was the motivation to start fresh and make better apps.
\niOS 7 was a bitter medicine – and I believe that the ecosystem is stronger because of it.
\n\n\nAt the peak of criticism last year, many thought that iOS 7’s redesign was a fashionable excuse – a facade – to cover the fact that Apple was running out of ideas. Instead, I now see many of Apple’s decisions with iOS 7 as functional and directly related to this year’s deep changes in iOS 8. Just to name a few: improved background refresh and a more consistent visual style will allow App Extensions to be more versatile and consistent than they would have been without iOS 7; the Today view – useful but limited – can now become an area for interactive widgets; Near Me, tested for over a year, will be integrated in a much more useful Explore section on the App Store.\n
On the eve of WWDC 2014, I was looking through our archive of app reviews on MacStories and I realized how much had changed from a visual perspective over the years and how little things had improved from a functionality standpoint. Despite Apple’s (strenuous, but ultimately rewarding) efforts to modernize iOS, antiquated paradigms had remained at its core: with no unified system to let apps collaborate on a common task or exchange documents across multiple apps without creating duplicates, iOS 7 was still fundamentally rooted in old limitations that had no reason to exist in 2014.
\nWith iOS 8, Apple is making good on their promise of entering the post-PC era with features that are unexpected and that will take time to digest, but that are still uniquely iOS.
\nOver the years, I developed a series of habits and built workflows to get work done on my iPad with the same degree of functionality as my Mac. That wasn’t an experiment to prove a point: it was a necessary consequence of not being able to sit at my desk every day. I needed the portability of the iPad, so I reinvented the way I worked with it.
\nThe limitations of iOS soon became clear to me, and I had to set up complex workarounds and scripts to overcome them. Without an open OS capable of exchanging files across apps through a filesystem, I had to rely on specialized utilities that would often generate their own copies of files and waste precious storage space on my device.
\nThe Open In menu was my savior and enemy: I could use it to send a document to any app, but that would create additional copies of the same document in silos unable to communicate with each other. When I needed to annotate a screenshot and use it in a blog post, I’d end up with three copies of the same file. If I wanted to proofread an article in a dedicated grammar-checking app, it would result in two versions of the file and lots of manual copying and pasting between apps.
\nAnd that’s just for files and documents. I’ve come up with all sorts of custom commands and hacks to achieve some basic inter-app communication, which resulted in a lot of hours spent fixing problems with Apple’s sandboxing and figuring out the easiest solution for a problem that wouldn’t exist on a Mac.
\nOn OS X, the power of Alfred, Keyboard Maestro, Dropbox, and the Finder is just one click away. On iOS? I tried hundreds of apps that could help me in my everyday work life, and none of them knew about each other. If I wanted to save time and effort in working from my iPad, I needed to find openings in Apple’s closed ecosystem and turn them into automated workflows to mimic my OS X setup.
\nThat’s what I did, and I loved every single minute spent hacking and testing what could be possible with “underpowered” iOS apps. I built scripts to automate image editing and combine that with Dropbox uploads. I connected apps with x-callback-url to let them collaborate on a single task with one tap. I set up shortcuts in Launch Center Pro and chained actions in Drafts. I let Editorial take care of everything else.
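\nFor the curious, x-callback-url is just a convention for building URLs: an app-specific scheme, an action, its parameters, and x-success/x-error callbacks that let the target app hand control back to the caller. A minimal sketch in Swift (Drafts’ create action follows the public x-callback-url convention; the launch:// callback is illustrative):

```swift
import Foundation

// Assembling an x-callback-url request (sketch).
// The x-success URL is where the target app returns when it's done.
var components = NSURLComponents()
components.scheme = "drafts"
components.host = "x-callback-url"
components.path = "/create"
components.queryItems = [
    NSURLQueryItem(name: "text", value: "Hello from iOS 8"),
    NSURLQueryItem(name: "x-success", value: "launch://")  // illustrative callback
]
// components.URL is what UIApplication.openURL() would receive, e.g.:
// drafts://x-callback-url/create?text=Hello%20from%20iOS%208&x-success=launch://
```

The entire “protocol” is string concatenation: there’s no shared state, no return values beyond another URL launch, which is exactly why chains like these were so fragile.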
\nI’m about to throw most of this stuff away with iOS 8.
\nThanks to Apple’s work on extensibility and new technologies available to third-party developers, apps are finally able to talk to each other, working in unison to offer their services when and where they make sense. With iOS 8 extensions, apps can become features available in other apps. And while that won’t mean the end of some of my automation workflows, dozens of workarounds I set up for basic inter-app communication won’t be needed anymore.
\nAs a recap of the Extensibility primer that I wrote in June, there are different types of extensions that apps can include in iOS 8: widgets for the Today view of Notification Center; share and action extensions for the share sheet; custom keyboards; photo editing extensions for the Photos app; and document provider extensions for exposing files to other apps.
\nExtensions aren’t sold as separate apps on the App Store – they are bundled into regular apps and they need to be activated in specific parts of the OS called “extension points”. Extensions are installed, but not enabled, when you download an app from the Store, and they are deactivated and removed when you delete an app. Instead of rehashing the explanation that I published after WWDC, I want to focus on the practical changes that extensions have brought to my daily iOS usage.
\nExtensions are such a big change for iOS productivity that I still tend to forget about them: it’ll take time to realize that iOS can now complete tasks that used to be exclusive to Macs.
\nFor people who want to work on the iPad, iOS 8 extensions make sense only when considered in the context of third-party apps and what developers will create for the new OS. This made it harder to test and reflect upon iOS’ changes this summer: last year, you could use an iPhone running iOS 7 and get a sense of the design differences between Apple apps and third-party software; this year, new third-party apps are required to understand the true potential of extensibility.1
\niOS 8 is, first and foremost, all about third-party apps and the possibilities they create for users and developers. For the past three months, I’ve been testing dozens of updated apps – from both big companies and smaller indie shops – and installed extensions in an effort to understand what iOS 8 would bring. In the process, iOS 8 apps reinvented the way I work from my iPhone and iPad.
\nWhen Apple introduced the Today view in Notification Center last year, I lauded the glanceable and contextual information that it offered through small blocks of content that could preview my upcoming calendar appointments, current traffic, and local weather conditions. I relied heavily on Reminders back then, and I welcomed the ability to mark tasks as completed directly from Notification Center without having to open an app. I noted that breaking notifications in two distinct layers was confusing, but, overall, I was positively impressed with the Today view and I imagined it’d have an interesting future.
\nIn iOS 8, Apple has simplified Notification Center and turned the Today area into an extension point that apps can use for glanceable and interactive widgets. When Craig Federighi unveiled widgets as part of the extensibility announcements at WWDC, I thought that they would turn out to be a modern take on the old OS X Dashboard – somewhat handy, but too detached from the main experience to make an impact. I was wrong.
\niOS 8 widgets are extensions that can preview content and have lightweight interactions with their host app. This seemingly obvious statement is the raison d’être of widgets: unlike Dashboard widgets on OS X, iOS 8 widgets are deeply integrated with the app they belong to; therefore, using them has far less friction than switching back and forth between apps and the Dashboard on OS X.
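\nUnder the hood, a Today widget is a small view controller in an app extension target that adopts the NotificationCenter framework’s NCWidgetProviding protocol. A bare-bones sketch (method names approximate the iOS 8 SDK; the label and step count are placeholders):

```swift
import UIKit
import NotificationCenter

// A minimal Today widget view controller (sketch).
// It ships inside the host app's bundle but runs as a separate extension process.
class TodayViewController: UIViewController, NCWidgetProviding {
    let stepsLabel = UILabel()  // placeholder UI

    // The system calls this when the widget should refresh its content.
    func widgetPerformUpdateWithCompletionHandler(completionHandler: ((NCUpdateResult) -> Void)!) {
        stepsLabel.text = "2,345 steps today"  // placeholder data
        completionHandler(.NewData)  // or .NoData / .Failed
    }
}
```

The completion handler is the key design choice: the widget tells the system whether anything changed, so Notification Center can cache the last rendered snapshot instead of re-running the extension on every swipe.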
\nApple’s built-in examples aren’t that great: like last year, there’s a widget to see your current calendar events and weather forecasts, a list of tasks, and a summary of your schedule. Apple’s widgets are extremely basic in that they preview content and allow you to tap them to jump to their respective apps. They fit well with the Today view’s underlying premise – these are widgets for stuff you need to do or see today. They’re useful, but they don’t show the full potential of the new Today view.
\nThird-party widgets (like other extensions) are bundled with apps from the App Store and they need to be manually activated by tapping Edit in Notification Center and adding them to the Today area. Widgets can be deactivated without uninstalling the host app, but they will be permanently deleted if the host app is removed from your device. If you downloaded apps that offer widgets you haven’t enabled yet, Notification Center will tell you with a message at the bottom of the screen.
\n\nSince June, I’ve seen developers making widgets for all kinds of apps and not necessarily for content that is relevant “today” or that is based on time-related components. Apple advised on creating lightweight widgets with minimal UIs and interactions, and that hasn’t stopped developers from coming up with ideas that are far more useful than a weather preview.
\nThe Evernote widget has buttons to quickly create a new text note, a note from the camera, or save a photo from the device’s library. These buttons don’t bring up a keyboard in the Today view – it’s impossible for apps to invoke text input in Notification Center – but they let me jump directly into Evernote and the section associated with the button I tapped. The “text” button opens the app in typing mode; the camera button opens the app and launches its camera feature; the photos button opens Evernote with a photo picker.
\nI’ve been using this widget several times a day: it saves me time I would spend launching the app and navigating its interface to create new notes, and it’s a much more native and integrated system than hacks like URL schemes because it’s always just a swipe away.
\nIn fact, I suspect that the whole “app section launcher” idea will take off as one of the most popular uses for Today widgets. A productivity app I was testing offered a grid of actions that, once configured in the app, could be launched from a widget that displayed custom icons for each action. A read-later app is adding a widget with a preview of articles saved on the current day; tapping an article’s title in the Today view opens the app directly into that article. A fitness app to track steps and runs called Runtime is coming out with a widget to view the amount of steps taken on the current day (as returned by the iPhone’s M7 chip) and a button to start a new run directly from Notification Center.
\nThe concept of action launchers that was popularized by Launch Center Pro will feel right at home in the Today view for iOS 8, and I believe it’ll be a better fit thanks to Notification Center’s system-wide presence. But it’s the interactivity allowed in the Today view that turned widgets into must-have extensions for my daily workflow.
\nA few weeks ago, I began testing Clips, an upcoming clipboard manager by the creators of Dispatch to save and access snippets of text previously copied on an iPhone or iPad (it’s launching soon).
\nClipboard management has always been one of the biggest advantages of using OS X over iOS for “real work” – on the Mac, you can use apps such as Alfred or ClipMenu to constantly monitor everything you copy, archive it, and paste it at any time, even if you copied a string of text two days ago. On iOS, developers were never able to create desktop-like clipboard managers to monitor what you copy and paste 24/7: iOS just doesn’t give developers access to that data. So if you wanted to use a clipboard manager, you’d have to cope with the limitations of utilities like EverClip, an app that could monitor your clipboard activity for 10 minutes at a time and that would then be killed in the background by the OS. The high friction required to use these apps was the reason I never truly got into them.
\niOS 8 is not going to allow apps to constantly monitor the clipboard in the background, and there isn’t going to be a mobile version of ClipMenu or LaunchBar’s clipboard menu on the App Store today. The developers of this new clipboard manager, though, came up with the idea of using a widget as a quick entry input option for manual clipboard archiving: after you’ve copied something, slide down Notification Center and hit the app’s “+” button in the widget. What you copied will be saved as clipped text into the app without actually opening the app, and it will be previewed inside the widget for future copying with one tap.
\nI’m using this widget every day and it has allowed me to do research for MacStories and Relay without wishing I had a Mac. It’s not as full-featured as ClipMenu or Alfred, but the ease of access of the widget makes the act of saving your clipboard effortless because Notification Center is always there. I’ll accept the trade-off of having to archive my clipboard manually if it means I can do it visually and quickly with a widget that I can always bring up with a swipe in any app I’m using.
\nIn my tests, I noticed some technical issues with widgets that I believe are related to third-party apps and bugs left in iOS 8.
\nIf you install several widgets on your device, you may notice a brief loading time for content that needs to be visualized when you swipe down to open Notification Center. This problem was more noticeable in the previous betas of iOS 8 and refresh speed has gotten considerably better on the public release, but it can still occur. I ran my tests with about a dozen third-party widgets on an iPhone 5s and a second-gen iPad mini.
\nThe second (and particularly minor) annoyance is that the iPad can’t load widgets from iPhone apps installed in compatibility mode. While I don’t generally like to keep iPhone apps on my iPad, I wanted to use a widget that’s available in an iPhone-only app and discovered that, despite emulation, the iPad couldn’t use it.
\nAs with other types of extensions, there will be an explosion of widgets on the App Store. If the apps I’m trying are any indication, most developers will want to offer some kind of shortcut in the Today view, but at that point you’ll have to wonder how much you’ll benefit from a widget that you need to locate like an app on a crowded Home screen.
\nI’ve tried to be selective with the widgets I want to keep on my devices. There’s no point in scrolling a page full of widgets, even if curiosity will push you to install many of them. That’s normal, but I’ve discovered that I prefer (and ultimately benefit from) keeping only a few widgets around.
\nWidgets validate the idea of app launchers and action shortcuts. In spite of the “Today” name, the best widgets that I tested weren’t about the next 24 hours or today’s forecast – they were interactive menus and app extensions that allowed me to save text, launch specific actions in a couple of seconds, and navigate less around my Home screen. In my daily workflow, widgets enabled me to complete tasks that were already possible (such as moving bits of text from Safari to a Pages document) more easily and quickly, giving me extra time to do something else on my iPad.
\nThe “Today” name is, at this point, non-descriptive of what widgets are bringing to Notification Center. I wouldn’t be surprised to see two tabs – “Widgets” and “Notifications” – in an Action Center/Dashboard rebranding next year.
\nIt’s in the action and share extensions, however, that iOS is finding its maturity as a platform and a new beginning for a rich ecosystem of apps. Action and share extensions have changed the way I get work done on iOS and they mark an important new chapter for third-party apps.
\nFirst, some history. iPhone OS was built from the technological roots of OS X, but it never gained the Services menu that, for decades, had allowed developers to abstract features from their apps and make them available as shortcuts in other apps. Mockups and concept videos didn’t take long to appear, but Apple never caved to the pressure of power users who wanted an easy way to let apps communicate and exchange functionality.
\nOver the years, “inter-app communication” became an umbrella term for a fairly primitive need: apps should be able to collaborate on text, files, and other types of data without requiring too much user interaction. Why would you need to manually copy and paste a Safari webpage that you want to turn into a todo? Wouldn’t it make sense for apps to offer their services to other apps and make users happier, more productive, and more satisfied with their software purchases?
\nIt did make sense, but without an Apple-sanctioned solution developers had to come up with their own fragmented, flaky, often unreliable technologies to enable some kind of communication between iOS apps.
\nA few examples come to mind: URL schemes and x-callback-url, the Open In menu, Dropbox used as a makeshift shared filesystem, and GoodReader’s file management features.
\nWith the exception of GoodReader, I relied on all these tools to get more done on iOS in less time and to a higher degree of what the platform could normally provide. It was all I could use, but I knew – as I often argued here at MacStories – that none of them was a solution.
\nThere are two ways to look at iOS’ old limitations. On one side were so-called power users like me, Alex, Eric, and hundreds of others who liked to tweak their devices and were willing to invest hours in creating workflows that could save them a few minutes each day. On the other are the millions of people who simply don’t care. The people who buy iPads and want to write a college essay or prepare academic research on them don’t want or need URL schemes.
\nBetween empowering the masses and pleasing a vocal circle, Apple will always choose the former. A cornucopia of app functionality was being wasted before extensions. Action and share extensions demonstrate how Apple has been thinking about these problems to create a system that’s far more powerful than old hacks and workarounds, secure by design, and user-friendly in a way that, like widgets, makes sense.
\nAction and share extensions are installed alongside their respective apps and can only be launched from the system share sheet. To make this clear: on iOS 8.0, you will never be able to launch an action or share extension without tapping on its icon in the share sheet.
\nFor this first version of iOS 8, Apple chose to confine extensions to specific areas of the OS called extension points: widgets are displayed in the Today view, custom keyboards can be loaded from the system keyboard, and actions can be activated from the share sheet. I don’t want to get into the technicalities of Apple’s system, which likely took years to develop as Apple wanted to build a secure inter-app communication system that wouldn’t put user data at risk while also remaining simple and easy to activate. Creating that kind of secure opening in the sandboxing model must have been a huge effort for Apple’s engineers, but I want to focus on the user experience.
\nIn the three months I spent with iOS 8 on the iPad and iPhone I use for work every day, action and share extensions have been amazing. They are app features available in other apps with their own custom interfaces and they’re compatible with any app that supports the system share sheet. Action and share extensions expose specific functionalities from the apps you already use and they feel like the next logical step for the Services menu.
\nAction and share extensions coexist in the share sheet and the differences between them can be blurry. While Apple has been adamant about the fact that share extensions should be intended for social services (see the company’s original share sheets) and actions for everything else, I have been trying share extensions that save the extension’s input locally without ever posting it to an online service. For this reason, I would say that the best way to think about them is this: share extensions are at the top of the share sheet and they’re used to save the extension’s input (a link, some text, etc) somewhere else; action extensions perform a more complex task in the app that’s currently being used.
\nLike widgets, I believe that action and share extensions are going to be extremely popular among developers of utilities and productivity apps. The important aspect of Apple’s decision to let extensions live inside the share sheet is that this limitation doesn’t create any confusion for users and developers: you’re not going to find different ways to activate extensions because iOS 8 will have to show a share sheet first. You may find a custom sharing icon and a share sheet filtered to show only some extensions, but the activation behavior will always be consistent. While Apple will probably end up giving up some control here in the future, there is a certain consistency and welcome simplicity that was nowhere to be found in the mess of URL schemes and inter-app communication hacks.
\n\nThis elegance carries over to what action and share extensions look and work like, too. Like the share sheets introduced back in iOS 5, these extensions display custom sheets or full-screen views on top of the app you’re using. I’ve tried extensions to quickly capture notes (with the upcoming Drafts 4, the current Safari page is saved into a note sheet), Pinboard sharing sheets (every Pinboard client is going to have a share extension), and read-later confirmation sheets.
\nThe share extension of the upcoming Drafts 4 will let you capture text from any app.
I also tested more advanced extensions, such as Linky, which allowed me to cross-post a link to various social networks. And, of course, there was the 1Password extension, which makes 1Password ubiquitous.
\nYes, this is real life.
Extensions don’t split the screen in two and they don’t chain multiple apps together – they let an app provide a subset of its features to any other app. You will see the extension carry the interface and branding of the app it comes from but it will be presented as a sheet out of its typical environment, keeping your context.
\nApps become features.
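\nTo make that concrete, here’s a sketch of what powers a simple text-capturing share sheet like the ones described above, using the Social framework’s SLComposeServiceViewController (the saveNote call is hypothetical):

```swift
import UIKit
import Social

// A minimal share extension sheet (sketch). The system presents this
// view controller on top of whatever app invoked the share sheet.
class ShareViewController: SLComposeServiceViewController {
    // Enable the Post button only when the user has typed something.
    override func isContentValid() -> Bool {
        return !contentText.isEmpty
    }

    override func didSelectPost() {
        // saveNote(contentText)  // hypothetical: hand the text to the host app
        // Tell the system we're done; the sheet dismisses and the user
        // lands right back in the app they were using.
        extensionContext?.completeRequestReturningItems([], completionHandler: nil)
    }
}
```

The extension never takes over the screen: it draws its sheet, does its one job, and calls back into the extension context to get out of the way.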
\nIn practical scenarios, action and share extensions have changed my iPad workflow and I know there’s much more coming, starting today. Several tasks that I used to launch with automated workflows, URL schemes, or bookmarklets have been replaced by visual, integrated, and more powerful extensions.
\nAnd there’s more. I could mention the visual note-taking app that accepts anything you throw at its extension, whether it’s a PDF, an image, or text. Or the app that lets you build personalized commands and run them on-demand from its extension inside other apps. Or what Readdle is launching with iOS 8.
\nThe beauty of Apple’s system is also that, in theory, apps are capable of determining which kind of input should be passed to an extension, enabling or disabling extensions accordingly. In Safari, for example, action and share extensions can get the current webpage title, URL, or selected text; in a read-later app, an extension will likely get the same values, but for an article you’re reading; in a document management app, an extension may work with the file you’re viewing, its file name, or other pieces of data related to it.
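\nThis filtering is declared in an extension’s Info.plist through an NSExtensionActivationRule. A hypothetical share extension that only wants a single web URL or a single image would include something like:

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>NSExtensionActivationRule</key>
        <dict>
            <!-- Appear in the share sheet for exactly one web URL... -->
            <key>NSExtensionActivationSupportsWebURLWithMaxCount</key>
            <integer>1</integer>
            <!-- ...or for a single shared image -->
            <key>NSExtensionActivationSupportsImageWithMaxCount</key>
            <integer>1</integer>
        </dict>
    </dict>
</dict>
```

iOS consults these keys to decide whether an extension shows up for a given app’s input, without ever launching the extension itself.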
\nThe system was intended to scale elegantly and quietly in the background, but, in the early days of iOS 8, there will be confusion in regard to which extensions can be used where and, unfortunately, bugs.
\nOn several occasions, I launched share extensions that couldn’t work with the input shared by the app I was using; part of this, I imagine, was related to bugs and design problems of the apps I was testing, but clearly developers have been facing problems with Apple’s SDK. I assume that developers will need more time and better tools to understand how their share extensions work with thousands of other apps on the App Store, so don’t be surprised if an extension is failing or misinterpreting an app’s input right now.
\nI’m disappointed to see a lack of extension support in Apple’s own apps, and particularly in Mail. It just makes sense, in my opinion, to be able to turn messages into tasks or archived documents, but Apple hasn’t integrated extensions with Mail yet.
\nWhat I realized in using extensions is how necessary last year’s redesign of iOS was. Imagine if Apple didn’t ship a new design with iOS 7: today, we’d have sheets of stitched leather or shiny metal on top of apps that look like agendas or little robots. The cohesiveness and subdued style that iOS 7 brought with its precise structure and hierarchy allows extensions to integrate nicely with apps, feeling like extra actions rather than eerily realistic objects.
\nThe impact of action and share extensions on my workflow has been massive even with only a few apps and bugs left in iOS 8. I use my iPad more because I spend less time switching between apps, copying text around, or moving files between different containers. I’m more efficient thanks to extensions because the quality of the software I use every day has increased considerably. I can’t imagine what we’ll start seeing today with action and share extensions on the App Store.
\nAfter Editorial, Safari is the app I use the most on my iPhone and iPad. Last year, I decided to switch to Safari as my primary browser (I used to be committed to Chrome’s cause) and I’ve never regretted leaving Google’s platform. I found Safari for iOS 7 to be noticeably faster, cleaner, and more integrated with the system than Chrome, and I liked the improvements that went into tab management and iCloud sync.
\nI’m even more impressed by this year’s updates to Safari as they tighten the browser’s relationship with other iOS apps through iCloud Keychain, make the iPad version truly desktop-class, and allow third-party apps to use the same (faster) rendering engine.
\niCloud Keychain has been one of the most pleasant surprises of the past year. While I’m a heavy user of 1Password and I continue to use it for all my logins, credit card information, secure notes, and other private data, I’ve enjoyed the ability to automatically fill logins in Safari with iCloud and have those changes sync across devices with no management required. iCloud Keychain has a long way to go to replace a full-featured app like 1Password (you can’t even search for logins in iCloud Keychain), but its integration with Safari is top notch.
\niCloud Keychain in Safari (left) and in a third-party app (right).
In iOS 8, third-party apps that work with a web service that you’ve already logged in with Safari can request to access your credentials stored in iCloud Keychain, automatically filling their login fields for you. The new version of Screens (out today on the App Store) supports this new feature: if you have a screensconnect.com account saved in iCloud Keychain, the app can ask to use a “Safari saved password” so you won’t have to type it. It’s a minor addition (Screens also supports the new 1Password extension), but it really makes a difference thanks to Safari’s deep integration with the rest of the OS.2
\nThe new Safari for iPad is, by far, the best version of iOS’ Safari yet. The main interface hasn’t changed substantially: there’s still a bookmarks bar, a tab bar, and Safari Reader is accessible by tapping on an icon in the address bar.
\nGestures and tabs have been redesigned and rewritten to allow for a much faster and smoother interaction that is an absolute pleasure to use on the iPad’s larger screen.
\nAt any point during navigation, you can pinch-to-close with two fingers and the current webpage will zoom out and shrink to reveal a new birds-eye view of all your open tabs. This new tab view, inspired by Safari on the upcoming OS X Yosemite, takes advantage of the iPad’s screen with visual previews for tabs that offer more information and context than a tab bar.
\nIn this view, the tap targets of the pages’ titles are large enough that you can tap to open them and pinch-to-zoom to get closer to a stack of pages and get a bigger preview. You can tap & hold to reorder pages, swipe them to the left to close them, and press the “+” button to see recently closed tabs. The experience is extremely tactile and, combined with a beautiful and elegant design, it represents some of Apple’s best work on iOS.
\nI appreciate a lot of small tweaks and changes that may appear irrelevant taken individually, but that add up over time and make Safari for iOS 8 friendlier and faster to operate.
\nI’d like to give a special mention to Shared Links. When Apple added this feature to Safari last year, I thought it was a cool way to read links from my Twitter timeline; it’s a simple filter for links on Twitter, built into Safari, and I use it quite a bit when I don’t have time for tweets and I just want some links. I’ve discovered many interesting articles thanks to Shared Links – I almost wish that Apple made it a separate News app.
\nThis year, Apple added RSS support to Shared Links in iOS 8 – an interesting turn of events, considering that people regularly like to announce the end of RSS and that Apple removed support for RSS feeds from Safari two years ago with Mountain Lion.
\nUnifying Shared Links with tweets and RSS feeds sounds strange, but it works. It’s nice to be able to see a list of links from your subscriptions in the browser, which you can tap to open in the current tab immediately. Shared Links will likely never become my preferred destination for news, but I’m glad it’s there.
\nI wish Apple had improved the way downloads are managed in Safari, but it looks like we may have to wait until next year for that. There is still no download manager in Safari, a surprising omission given the addition of iCloud Drive’s filesystem layer in iOS 8: Apple could have given Safari its own file container and let the browser save downloads in there and sync them with iCloud. Instead, tapping on a file that Safari can’t render still shows no progress for the download, but when the file is finally previewed you get the old menu that offers to open the file in another app. At least you can run extensions on “downloaded” files.
\nThe addition of frequently visited websites is a nice one. Available in the Favorites menu by tapping the address bar, Frequently Visited shows large icons for websites you often go to – it’s like Top Sites, but it takes up less space.
\nOne of my other favorite features of Safari is related to its engine, which is now available to third-party apps. In the old days of iOS and until last year, Apple had a faster JavaScript engine for Safari called Nitro that, due to security concerns, was exclusive to Safari. Third-party apps with web views (such as Twitter clients and news readers) couldn’t use the same JavaScript engine and were thus slower to render webpages. You may have noticed that Safari got faster over the years while third-party apps remained slow. That’s changing with iOS 8.
\nThanks to a new developer API called WKWebView, any iOS 8 app can now use the same rendering engine as Safari. This sounds trivial – just use the same technology as the system browser – but it’s actually quite a technical achievement on Apple’s part. The difference between apps that use the old web views and iOS 8 apps that use the new API is noticeable, and it makes me open fewer tabs in Safari because I can read webpages in apps with the same speed and performance. I like this democratization of web views, as it puts an end to third-party apps being second-class citizens when it comes to web performance.
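\nAs a sketch of what adoption looks like: WKWebView is mostly a drop-in replacement for the old UIWebView, with web content running in a separate process using Safari’s JIT-compiled JavaScript engine. The normalizedURL helper for user-typed addresses is my own illustration, not part of the API:

```swift
import Foundation

// Hypothetical helper: turn a user-typed address into a loadable URL.
func normalizedURL(from text: String) -> URL? {
    let trimmed = text.trimmingCharacters(in: .whitespaces)
    guard !trimmed.isEmpty else { return nil }
    if trimmed.contains("://") { return URL(string: trimmed) }
    return URL(string: "https://" + trimmed)
}

#if canImport(WebKit)
import WebKit

// WKWebView runs pages in a separate content process with the same
// JIT-compiled JavaScript engine as Safari; UIWebView ran in-process
// with the slower, non-JIT interpreter.
func makeBrowserView(for text: String) -> WKWebView {
    let webView = WKWebView(frame: .zero, configuration: WKWebViewConfiguration())
    if let url = normalizedURL(from: text) {
        webView.load(URLRequest(url: url))
    }
    return webView
}
#endif
```

Because the page runs out of process, a crashing or hung site no longer takes the host app down with it, which is part of why Apple can now hand the fast engine to everyone.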
\nI continue to be impressed by Safari’s speed, elegant interface, and fluid scrolling and zooming. In iOS 8, Apple brought changes to tab management that I found particularly noteworthy and useful on the iPad; these changes, combined with iCloud Keychain and higher performance, have made my browsing faster and more efficient.
\nWith iOS 8, Apple overhauled Notification Center and the way notifications are managed: notifications are now interactive, and you can act on them directly from banners and Notification Center without opening the apps that sent them.
\nThese are solid changes, and they share the common theme of saving you time when handling the incoming stream of notifications, which can be daunting these days. I’ve only been able to test interactive notifications with a handful of third-party apps, but what I saw was enough to confirm that these improvements were long overdue on iOS.
\nNotifications can be swiped to the left in Notification Center to show a delete button and, if available, custom buttons to act on a notification without opening the app that sent it. In email messages, for example, you can swipe to show Mark as Read and Delete, which will perform the respective action directly from Notification Center without launching Mail.3
\n\nNotifications are also interactive when they come in through banners. Swipe down, and you’ll get action buttons if the developers added them to the app. I haven’t used interactive notifications from banners much: I don’t usually want to act on email right away, and I couldn’t test many third-party apps that supported them.
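\nFor developers, these buttons come from notification actions grouped into categories that are registered with the system; a notification’s payload then names its category. A sketch using the iOS 8 user notification API – all identifiers here are made up for illustration:

```swift
import Foundation

// Identifiers shared between registration and the app delegate callback
// (names are illustrative, not Apple's).
enum MessageAction: String {
    case reply = "REPLY_ACTION"
    case markRead = "MARK_READ_ACTION"
}

#if canImport(UIKit)
import UIKit

// Register a "message" category whose buttons appear when the user
// pulls a banner down or swipes a notification left in Notification Center.
func registerNotificationActions(with application: UIApplication) {
    let reply = UIMutableUserNotificationAction()
    reply.identifier = MessageAction.reply.rawValue
    reply.title = "Reply"
    reply.activationMode = .foreground    // opens the app to type a reply

    let markRead = UIMutableUserNotificationAction()
    markRead.identifier = MessageAction.markRead.rawValue
    markRead.title = "Mark as Read"
    markRead.activationMode = .background // handled without launching the UI

    let category = UIMutableUserNotificationCategory()
    category.identifier = "MESSAGE_CATEGORY"
    category.setActions([reply, markRead], for: .default)

    let settings = UIUserNotificationSettings(types: [.alert, .badge, .sound],
                                              categories: [category])
    application.registerUserNotificationSettings(settings)
}
#endif
```

Background actions like Mark as Read are what let Mail act on a message straight from Notification Center without ever launching.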
\nI’ve used the quick reply feature for messages a lot, and I cherish the time it’s saving me on a daily basis. Quick replies are one of those features that just make sense: of course you should be able to respond to messages on the fly without having to switch apps. Android has had this for years, and I’m glad that Apple began opening up with inline replies in iOS 8, even though they’re limited to the Messages app for now (you won’t be able to quickly reply to a Twitter DM or Slack message).
\nQuick replies allow me to be polite and send a short reply immediately without losing the context of what I’m doing. Prior to iOS 8, I often ignored notifications and upset people because I forgot to open the Messages app and respond to them; now, the distraction is still there, but the decreased friction incentivizes me to reply and deal with it immediately because it’s easier.
\nMy only gripe is that the quick reply banner doesn’t update if a new message comes in while you’re typing; thankfully, you can tap the banner and you’ll be taken to the Messages app with your text already filled in.
\nI can’t wait for quick replies to become available to third-party apps.
\nMail is always one of the first apps I set up with a fresh install of iOS, and I was curious to check out Apple’s additions in iOS 8. Mail has largely stayed the same, with the exception of new gestures to act on messages or dismiss them to take a quick look at your inbox.
\nThe swipe-to-close gesture was demoed at WWDC and it works on both the iPhone and iPad. When you’re writing an email, swipe down from the title bar to “dock” your message so you can go back to your inbox and see other messages that you may need to reference before continuing to write. This gesture works beautifully, and I’ve been using it a lot to see other messages or double-check information without having to save a message as a draft.
\nThere are also new gestures to quickly archive, flag, or delete messages, which are clearly inspired by the short/long-swipe mechanism popularized by Mailbox. These gestures can be configured in the Settings and they make processing email faster, but unfortunately they’re only partially available to third-party developers.
\nWhat I don’t understand is the lack of extensions and the document picker in Mail. Apple has built a new secure system to integrate with apps and bring better document management to iOS, but both features are completely missing from Mail. It would only make sense, in my opinion, to be able to turn emails into todos using a share extension or to print to PDF with an action directly in Mail without installing a third-party client.
\nSimilarly, Apple has long featured a button to insert a photo or video in a message in the copy & paste menu, but they haven’t done the same for the iCloud Drive document picker. I want to believe that both of these features were cut due to time constraints and that they will be available in the future.
\nLastly, I would like to see an email button in the new Recents/Favorites menu of the multitasking view. I’ve been using those shortcuts quite a bit, but they’re limited to starting calls or messages.
\nAs someone who writes on the iPad every day, I was looking forward to improvements in the keyboard area. With iOS 8, Apple decided to add a new predictive engine to the system keyboard called QuickType, and then give developers the tools to create custom keyboards that can (almost) replace the default keyboard.
\nAs far as Apple’s keyboard goes, QuickType hasn’t had a meaningful impact on my typing habits yet. Available as an extra row on top of the regular keyboard, QuickType displays three buttons for words you likely want to type; Apple says that QuickType is able to understand the context of the current conversation in Mail and Messages and that it should adapt to your writing style.
\nI have mixed feelings about QuickType. Occasionally, suggestions provided by QuickType are accurate and match what I want to write – it’s usually expressions like “cool, see you later” or “Talk tomorrow?” that I want to send as short replies. On the other hand, QuickType is often too formal or generic to be used in real conversations with my friends, who would easily realize I’m sending impersonal canned messages.
\nQuickType lacks personality in many of its predictions, which explains why I’ve been using it more as a fancy autocorrection tool than a predictive engine. I believe QuickType has a bright future ahead of it though, especially because of the deep system integration that Apple can build without being creepy about it. QuickType already works with multiple languages, and I look forward to seeing how it’ll grow.
\nWhat surprised me and continues to intrigue me is the amount of innovation I’m seeing in custom keyboards.
\nPerhaps it’s because iOS users couldn’t enjoy the versatility of custom keyboards before but, beyond keyboards that merely bring themes and other questionable interfaces to the standard iOS keyboard, I’ve seen developers come up with fantastic ideas to enhance the typing and data input experience on an iPhone and iPad.
\n\nSwiftKey delivers accurate and personalized predictions based on your writing style (the keyboard can plug into your Twitter and Facebook accounts to learn how you write) with a swipe input method on the iPhone. The TextExpander keyboard expands your existing snippets as you type. There are going to be hundreds of custom keyboards on the App Store starting today and, like extensions, Apple opening up this area of iOS to third-party developers means we’ll see an explosion of new ideas – some of them useless and ugly, others elegant and powerful.
\nGenerally speaking, I’ve been testing a few custom keyboards for iOS 8, and I have yet to find a balance between the keyboard I’ve always used and the new experiences offered by developers. Keyboards that provide a specific feature such as clipboard management or emoji insertion were easier to fit into my workflow because I know I use them when I need them. They are utilities, available in a new area of the OS. Single-purpose custom keyboards that deal with data and text in new ways will be huge on iOS 8.
\nFull-featured alternative keyboards have seen slower adoption in my personal experience because I keep thinking that maybe I’m just fine with Apple’s keyboard. I assume that it’s a matter of time – for better or worse, I’ve been using the default keyboard for seven years now – and I’m curious to see whether a keyboard will slowly gain an edge over Apple’s default solution over the coming months.
\nOne of the big reasons behind my doubts about general-purpose custom keyboards is the set of limitations that Apple put around them, and I suspect that a lot of users will go through this same phase.
\nCustom keyboards can’t:
\nbe used in secure password fields;
\nprovide dictation input;
\naccess the network unless you manually grant them Full Access.
\nFurthermore, app developers can choose to reject custom keyboards in their apps; and, while keyboards must display the keyboard switcher (globe) icon to return to the system keyboard, they can’t show a quick switch menu on tap & hold, which considerably slows down the process of moving across keyboards.
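\nThe globe requirement is baked into the API: a custom keyboard is a UIInputViewController subclass, and the only way back to another keyboard it can offer is advanceToNextInputMode(). A bare-bones sketch – the keyTitles helper is my own illustration, not an Apple API:

```swift
import Foundation

// Illustrative helper (not an Apple API): every keyboard must offer the
// globe switcher key, so append it to whatever letter keys we show.
func keyTitles(letters: [String]) -> [String] {
    return letters + ["🌐"]
}

#if canImport(UIKit)
import UIKit

// Skeleton of a custom keyboard: a UIInputViewController subclass.
final class MinimalKeyboardViewController: UIInputViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let buttons = keyTitles(letters: ["A"]).map { title -> UIButton in
            let button = UIButton(type: .system)
            button.setTitle(title, for: .normal)
            return button
        }

        // First key types a letter; the last is the mandatory globe key.
        // There's no API for a tap & hold quick-switch menu, only this
        // full advance to the next keyboard.
        buttons.first?.addTarget(self, action: #selector(insertLetter), for: .touchUpInside)
        buttons.last?.addTarget(self, action: #selector(advanceToNextInputMode), for: .touchUpInside)

        let width = view.bounds.width / CGFloat(buttons.count)
        for (index, button) in buttons.enumerated() {
            button.frame = CGRect(x: CGFloat(index) * width, y: 0,
                                  width: width, height: view.bounds.height)
            button.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(button)
        }
    }

    // All typing goes through the proxy; the keyboard never reads the
    // host app's document directly.
    @objc private func insertLetter() {
        textDocumentProxy.insertText("A")
    }
}
#endif
```

The textDocumentProxy indirection is the privacy model in code form: the keyboard inserts and deletes text through a narrow channel instead of touching the host app’s content.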
\nYou can’t use Apple’s keyboard switcher in custom keyboards.
All these limitations can be overcome by setting preferences in each custom keyboard and accepting the fact that iOS will often default to the system keyboard due to privacy concerns. And that is fine: I appreciate the secure model that Apple built to protect customer data as much as possible, and I like that Internet access needs to be granted manually to custom keyboards.
\nMultiple keyboards, multiple settings.
But from a user’s perspective, this lack of deep system integration, the little trade-offs, and the increased friction add up over time when you try to use a general-purpose custom keyboard as your only keyboard. At least in my experience, I’ve found it easier to switch to simpler custom keyboards when I need them than to adopt keyboards meant to be used all the time.
\nThere are some great ideas coming out of developers of custom keyboards for iOS 8 and it’ll be interesting to see this market shape itself in the next year.
\nUntil then, any developer who fixes Apple’s Shift key design is welcome.
\niOS 8 includes detailed battery usage statistics by app (also known as “battery shaming”) meant to expose apps that consume a lot of energy. These stats are available in Settings > General > Usage > Battery Usage, and they can be viewed for the last 24 hours or for a number of past days (usually 3, 5, or 7).
\nIt’s a simple addition taken from Mavericks, but it made an important difference in my everyday usage of the iPhone and iPad. I’ve already deleted a bunch of apps that were consuming too much battery in the background and that I didn’t know about, which has made my devices’ battery last longer. I can see which apps or system features are using my battery and how (Beats Music, for instance, says “Audio”) and there are handy suggestions to increase battery life such as enabling auto-lock or decreasing screen brightness.
\nThis is a welcome addition, especially for people who spend a lot of hours on an iOS device each day. I’m hoping that next year we’ll get even more detailed stats with graphs and battery usage by app version.
\nI’ve only briefly used Spotlight Suggestions and iCloud Drive on my devices and, while I believe they have potentially huge implications for search and file management, I’m not ready to form a complete opinion on them yet.
\niCloud Drive is, in theory, a good idea: Apple realized that users like to organize their documents in folders after all, so they came up with a unified location that combines the old iCloud (files are inside their respective apps) with a traditional filesystem. In iCloud Drive, you’ll see folders with app icons on them, but you can also move files around, create new folders, and drop files anywhere you want.
\niCloud Drive is shared across apps, meaning that any app can access files from other apps and save them without creating a duplicate copy. iCloud Drive is meant to replace the old Open In system and it shows Apple giving up on some of its vision for the sake of user convenience and simplicity, but it’s too early for me to tell whether it’ll have an impact on how I work on iOS.
\nThe reason is twofold: I don’t use document-based iCloud apps, and I was more interested in trying document provider extensions, also new in iOS 8. In the past three months, none of the Dropbox-enabled apps I use switched to iCloud Drive, and I’ve only been able to try one document provider (Transmit). I could elaborate on the principles behind these features, but I’d much rather wait until I can comment on their practical use (plus, I already wrote about this topic in June).
\nThe issue with Spotlight Suggestions is different: I only managed to activate them a couple of weeks ago, as they weren’t available in Italy during the iOS 8 beta. Spotlight Suggestions are a great idea: they put web content such as Wikipedia pages, iTunes and App Store results, Maps locations, and more directly into Spotlight without having to go through Google or use the search field in an app. Apple wants to push Spotlight as the universal way to search and land on relevant results, rather than an application launcher or search utility for your local data.
\nThis is, potentially, a big problem for Google. In my two weeks of testing Spotlight Suggestions, I’ve already avoided Google a few times and arrived directly on Wikipedia or suggested websites thanks to Spotlight Suggestions (they are available in both Spotlight and Safari). Suggestions show a small snippet of text and a photo before you tap the result and they take you directly to the relevant page without seeing ads or small links to choose from. Not only is it faster, it’s a better user experience.
\nApple is, however, facing years of habits and expectations when it comes to search. People associate search with Google and they’re used to opening google.com even just to launch websites. Spotlight Suggestions display one or two top results based on what Apple thinks is appropriate, and they obviously don’t come with all the features of Google Search (namely faster suggestions and Google Images). For simple search queries, though, I think that Spotlight Suggestions are fine, and they’re growing on me.
\nIt’s also interesting to think about Spotlight as a search engine for iOS apps. Spotlight Suggestions can look for results embedded in specific apps (iTunes Store, Mail, etc) and launch them without forcing you to open the app first. I would love to see Spotlight become a search feature that gets results from any app on my devices and that can be opened from anywhere, not just the Home screen.
\nContinuity is Apple’s initiative to let iOS devices and Macs seamlessly exchange data and activities either automatically or manually with Handoff, and I’ve come to rely on some of its features quite a bit.
\nIt seems obvious now that iPhones and iPads should relay messages and calls to each other and let you reply or respond on whichever device you prefer, and it’s already difficult for me to remember that this is a new functionality of iOS 8.
\nI love being able to pick up a phone call and read SMS on my iPad without going into another room to find my iPhone. I’m often working with my iPad around the house, and I get the occasional phone call that I need to take but I don’t want to get up. Same with SMS – I have a couple of Android friends who still send regular messages and, of course, two-factor authentication systems rely on standard SMS.
\nBefore iOS 8, I would get a confirmation code on my iPhone and I’d have to quickly find it and type the code manually into a login field of an app. With Continuity, I get the SMS relayed to my iPad so I can just copy and paste it.
\nMy parents still like to call me with regular old phone calls (as does my doctor), and thanks to Continuity the phone part of the iPhone is turning into “just an antenna”. I rarely used my iPhone as a phone before, but now that I can receive and start phone calls on my iPad – which I’m always using for work all day – I’ve noticed that the Phone app is barely launched anymore.
\nUnfortunately, Apple decided to delay the SMS Relay feature until October, likely to fix remaining bugs in anticipation of OS X Yosemite. During the iOS 8 beta, SMS Relay worked very well on my iPhone 5s and iPad mini, and I’m surprised that Apple decided not to ship it for iOS 8 devices today.
\nI was impressed by Handoff when Apple announced the feature at WWDC and I still think that it’s a powerful concept executed elegantly but, for practical purposes, Handoff hasn’t clicked for me yet.
\nHandoff lets you continue an activity on another device without losing your progress or data. For instance, you can start writing an email on your iPhone and decide to finish it on the iPad; by handing off that message, you’ll pick up the draft right where you left off – text and everything. You can do the same with webpages, iWork documents, notes, and third-party apps can implement Handoff as well, letting you seamlessly continue tasks across devices.
\nOn iOS 8, Handoff is activated by swiping up an icon in the bottom left corner of the Home screen. That icon indicates the app you’re sending to another device, but it doesn’t hint at what you were doing inside that app; once swiped up, the app will simply resume and load your previous content and context as accurately as possible. Safari will open the webpage you were reading on another device, but it won’t resume your scroll position or text selection.
\nI tested Handoff with Apple’s apps and a couple of third-party apps that used the feature to switch between devices. Handoff makes for a good demo: activities are “beamed” instantly between my devices (you don’t need to be on the same local WiFi network) and apps can suggest fallback options if handed-off content can’t be opened in the same app. A Pinboard client that supported Handoff, for instance, could open a webpage I was reading in the app on my iPhone into the same app on the iPad; if I didn’t have the same app installed on the other device, Handoff could open the webpage in Safari instead.
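\nFor third-party developers, adopting Handoff means publishing an NSUserActivity that describes what the user is doing; the receiving device restores it. A sketch of the Pinboard-client scenario above – the activity type and userInfo keys are hypothetical, and webpageURL is what enables the Safari fallback:

```swift
import Foundation

// Hypothetical activity type for "reading a bookmarked page" in a
// Pinboard-style client.
let readingActivityType = "com.example.pinclient.reading"

// Pure helper: the userInfo payload that travels to the other device.
func readingActivityInfo(bookmarkID: String, scrollOffset: Double) -> [String: Any] {
    return ["bookmarkID": bookmarkID, "scrollOffset": scrollOffset]
}

#if os(iOS) || os(macOS)
// Publish the activity. Setting webpageURL is what lets Handoff fall
// back to Safari when the same app isn't installed on the other device.
func startReadingActivity(bookmarkID: String, url: URL) -> NSUserActivity {
    let activity = NSUserActivity(activityType: readingActivityType)
    activity.title = "Reading a bookmark"
    activity.userInfo = readingActivityInfo(bookmarkID: bookmarkID, scrollOffset: 0)
    activity.webpageURL = url
    activity.becomeCurrent() // advertises the activity to nearby devices
    return activity
}
#endif
```

How much context survives the hop – scroll position, text selection – depends entirely on what the developer packs into userInfo, which explains why Safari resumes the page but not your place in it.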
\nMy problem is that, besides technical tests, I never actively sought out Handoff when I was working on my iPad or iPhone. Usually, if I begin writing an email message on my iPhone it’s because I know I can finish it there; iCloud features like iCloud Photo Library and iCloud Tabs remove the need to get another device to find photos or webpages, and because both my iPhone and iPad are always with me, Handoff is mostly unnecessary.
\nI suppose that the Continuity picture will only be complete – and better judged – once Yosemite is out. I love the automatic and passive propagation of my phone and FaceTime calls across devices; Handoff is neat, but I didn’t find a good use for it on my iOS devices. I’ll check back in October.
\nThe reinvention of iOS starts from the fundamentals of user experience: how apps communicate with each other, how text is entered on the screen through a digital keyboard, or how documents are managed and stored in the cloud. iOS 8 shows an Apple that’s been listening and identifying the weaknesses of its mobile OS, paying attention to what users and developers wanted from their apps.
\niOS used to be, for many, a limited platform where work couldn’t get done and apps couldn’t collaborate or exchange information. With iOS 8, the conversation has changed to what can be done next. With a more versatile OS that is extensible yet secure, Apple is creating a huge opportunity for every party involved: an opportunity for users to discover how iOS can help them in their daily lives; an opportunity for developers to explore new app markets and craft powerful software that wasn’t possible before.
\nIn typical Apple fashion, new features introduced with iOS 8 will likely be updated with fixes and more options over the next releases and in iOS 9. Apple hasn’t shown many examples of action extensions and widgets in their apps, and the OS isn’t without its fair share of bugs (but they’re far less disruptive than last year’s).
\nMore extension points will probably be added in the future (Copy & Paste and Control Center extensions in iOS 9 would be great), going beyond the share sheet and allowing developers to attach extensions to custom interfaces. The current implementation of action and share extensions is already an important step towards the elimination of proprietary sharing menus and the adoption of a more consistent, versatile extension system.
\nThere are hundreds of new features in iOS 8 that I haven’t covered in this article because I wanted to focus on my experience with using iOS 8 for work every day. The apps that I’ve been trying over the past three months have profoundly changed the way I can work from my iPad all day: I could run MacStories on an iPad before, but iOS 8 has allowed me to stop finding workarounds to make the iPad do something it wasn’t built for.
\nAnd I’ve only tried a few apps for iOS 8. Starting today, we’re going to see new categories of apps on the App Store, built by developers who can work with brand new technologies for document management, inter-app communication, keyboard input, notification management, glanceable information, and more. If I was able to reconsider every aspect of my iOS workflow in three months with just a few iOS 8 apps, I can’t imagine the amount of innovation we’ll see over the next year.
\niOS 8 is an incredibly democratic OS. Apple realized that a single company couldn’t provide all the tools for modern consumer needs, and they’ve decided to put the ball in the developers’ court. More than ever (and bugs aside), it feels like developers are building the iOS platform alongside Apple rather than competing with them. Three months into using iOS 8 every day, I sometimes forget that all these new features are now available to iOS users: many existing apps are having new beginnings, and adjusting expectations to new functionalities will take time.
\nThe scope of iOS 8’s changes will truly make sense as developers keep building brand new experiences over the coming months. iOS has begun to open up, and there’s no stopping at this point.
\niOS 8 has changed how I work from my iPhone and iPad. I’ve never been more excited about its future.
\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.
\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.
\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;
\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;
\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.
\nLearn more here and from our Club FAQs.
\nJoin Now", "content_text": "When I reviewed iOS 7 last year, I took a different approach and tried to consider Apple’s redesigned OS from the perspective of someone who uses iPhones and iPads for work and personal tasks on a daily basis. I noted that a new structure enabled developers to make more powerful apps, and I concluded hoping that Apple would “consider revamping interoperability and communication between apps in the future”.\nWith today’s release of iOS 8, Apple isn’t merely improving upon iOS 7 with minor app updates and feature additions. They’re also not backtracking on the design language launched last year, which has been refined and optimized with subtle tweaks, but not fundamentally changed since its debut in June 2013.\nApple is reinventing iOS. The way apps communicate with each other and exchange functionality through extensions. How status awareness is being brought to iPhones, iPads, and Macs with Handoff and Continuity. Swift and TestFlight, giving developers new tools to build and test their apps. Custom keyboards and interactive notifications.\nThere are hundreds of new features in iOS 8 and the ecosystem surrounding it that signal a far-reaching reimagination of what iOS apps should be capable of, the extent of user customization on an iPhone and iPad, or the amount of usage data that app developers can collect to craft better software.\nSeven years into iOS, a new beginning is afoot for Apple’s mobile OS, and, months from now, there will still be plenty to discuss. 
But, today, I want to elaborate on my experience with iOS 8 in a story that can be summed up with:\niOS 8 has completely changed how I work on my iPhone and iPad.\n\nThe Future’s Legacy\nA year after iOS 7, I would say that it hasn’t been a perfectly smooth transition, but, at least for me, it’s hard to look back at what iOS used to be and miss it.\nFollowing the launch of iOS 7, it became clear that Apple hadn’t had much time to optimize the OS for a bug-free experience that also needed to perform reasonably well on older hardware. For someone who relies on the iPad for work purposes, the first few months of iOS 7 were rough: in spite of Apple’s initial bug fix updates, I kept getting Home screen crashes, random reboots, hard resets, BSoDs, and, generally speaking, a bevy of graphical glitches that were new to iOS – traditionally, a highly polished and stable platform.\niOS 7’s technical problems weren’t exclusive to Apple’s own apps and features: as a result of lingering bugs in the final OS and the developer SDK, third-party apps exhibited a variety of text-related issues, inconsistent animations, and crashes. I know of several developers who needed to work around Apple’s bugs to avoid crashes and glitches…which eventually led to other bugs after Apple began releasing iOS 7 updates.\nThe launch of iOS 7 seemed to confirm that Apple wasn’t the kind of company that could handle a complete redesign while adding major framework and feature additions and hope to release a stable OS. My experience with the iPad as my primary computer was, from a technical perspective, worse than iOS 6. Until Apple released iOS 7.1, I had to cope with more bugs and crashes than I ever expected.\nFrom a big picture perspective, however, I think that iOS 7 was necessary. Breaking with old design trends and longstanding UI conventions allowed Apple to modernize iOS and kickstart a process that would see the company and third-party developers rediscover the personalities of their software. 
Whether it’s Apple experimenting with different designs for music players or Evernote continuing to tweak its app to find the right balance of updated interface and functionalities, the undeniable truth is that we’ve ended up with fantastic pieces of iOS 7 software such as Elevate, Skitch, Overcast, and thousands of other apps that I doubt would have been possible hadn’t Apple drawn a line in the sand with iOS 7.\niOS 7 changed the conversation from software that had to look somewhat realistic to apps that work well with a focus on content, clarity, and color. The redesign wasn’t the end goal – it was the motivation to start fresh and make better apps.\niOS 7 was a bitter medicine – and I believe that the ecosystem is stronger because of it.\n\nAt the peak of criticism last year, many thought that iOS 7’s redesign was a fashionable excuse – a facade – to cover the fact that Apple was running out of ideas. Instead, I now see many of Apple’s decisions with iOS 7 as functional and directly related to this year’s deep changes in iOS 8. Just to name a few: improved background refresh and a more consistent visual style will allow App Extensions to be more versatile and consistent than they would have been without iOS 7; the Today view – useful but limited – can now become an area for interactive widgets; Near Me, tested for over a year, will be integrated in a much more useful Explore section on the App Store.\n\nOn the eve of WWDC 2014, I was looking through our archive of app reviews on MacStories and I realized how much had changed from a visual perspective over the years and how little things had improved from a functionality standpoint. 
Despite Apple’s (strenuous, but ultimately rewarding) efforts to modernize iOS, antiquated paradigms had remained at its core: with no unified system to let apps collaborate on a common task or exchange documents across multiple apps without creating duplicates, iOS 7 was still fundamentally rooted in old limitations that had no reason to exist in 2014.\nWith iOS 8, Apple is making good on their promise of entering the post-PC era with features that are unexpected and that will take time to digest, but that are still uniquely iOS.\nExtensible\nOver the years, I developed a series of habits and built workflows to get work done on my iPad with the same degree of functionality of my Mac. That wasn’t an experiment to prove a point: it was a necessary consequence of not being able to sit at my desk every day. I needed the portability of the iPad, so I reinvented the way I worked with it.\n\nThe limitations of iOS soon became clear to me, and I had to set up complex workarounds and scripts to overcome them. Without an open OS capable of exchanging files across apps through a filesystem, I had to rely on specialized utilities that would often generate their own copies of files and waste precious storage space on my device.\nThe Open In menu was my savior and enemy: I could use it to send a document to any app, but that would create additional copies of the same document in silos unable to communicate with each other. When I needed to annotate a screenshot and use it in a blog post, I’d end up with three copies of same file. If I wanted to proofread an article in a dedicated grammar-checking app, it would result in two versions of the file and lots of manual copying and pasting between apps.\nAnd that’s just for files and documents. 
I’ve come up with all sorts of custom commands and hacks to achieve some basic inter-app communication, which resulted in a lot of hours spent fixing problems with Apple’s sandboxing and figuring out the easiest solution for a problem that wouldn’t exist on a Mac.\nOn OS X, the power of Alfred, Keyboard Maestro, Dropbox, and the Finder is just one click away. On iOS? I tried hundreds of apps that could help me in my everyday work life, and none of them knew about each other. If I wanted to save time and effort in working from my iPad, I needed to find openings in Apple’s closed ecosystem and turn them into automated workflows to mimic my OS X setup.\nThat’s what I did, and I loved every single minute spent hacking and testing what could be possible with “underpowered” iOS apps. I built scripts to automate image editing and combine that with Dropbox uploads. I connected apps with x-callback-url to let them collaborate on a single task with one tap. I set up shortcuts in Launch Center Pro and chained actions in Drafts. I let Editorial take care of everything else.\nI’m about to throw most of this stuff away with iOS 8.\nThanks to Apple’s work on extensibility and new technologies available to third-party developers, apps are finally able to talk to each other, working in unison to offer their services when and where they make sense. With iOS 8 extensions, apps can become features available in other apps. 
And while that won’t mean the end of some of my automation workflows, dozens of workarounds I set up for basic inter-app communication won’t be needed anymore.\nAs a recap of the Extensibility primer that I wrote in June, there are different types of extensions that apps can include in iOS 8:\nToday: widgets for the Today view of Notification Center\nShare: post content to web services or share content with others\nActions: app extensions to view or manipulate content inside another app\nPhoto Editing: edit a photo or video in Apple’s Photos app with extensions from a third-party app\nStorage Providers: an interface between files inside an app and other apps on a user’s device\nCustom Keyboards: system-wide alternative keyboards\nExtensions aren’t sold as separate apps on the App Store – they are bundled into regular apps and they need to be activated in specific parts of the OS called “extension points”. Extensions are installed, but not enabled, when you download an app from the Store, and they are deactivated and removed when you delete an app. Instead of rehashing the explanation that I published after WWDC, I want to focus on the practical changes that extensions have brought to my daily iOS usage.\nExtensions are such a big change for iOS productivity that I still tend to forget about them: it’ll take time to realize that iOS can now complete tasks that used to be exclusive to Macs.\nFor people who want to work on the iPad, iOS 8 extensions make sense only when considered in the context of third-party apps and what developers will create for the new OS. This made it harder to test and reflect upon iOS’ changes this summer: last year, you could use an iPhone running iOS 7 and get a sense of the design differences between Apple apps and third-party software; this year, new third-party apps are required to understand the true potential of extensibility.1\niOS 8 is, first and foremost, all about third-party apps and the possibilities they create for users and developers. 
For the past three months, I’ve been testing dozens of updated apps – from both big companies and smaller indie shops – and installed extensions in an effort to understand what iOS 8 would bring. In the process, iOS 8 apps reinvented the way I work from my iPhone and iPad.\nWidgets\nWhen Apple introduced the Today view in Notification Center last year, I lauded the glanceable and contextual information that it offered through small blocks of content that could preview my upcoming calendar appointments, current traffic, and local weather conditions. I relied heavily on Reminders back then, and I welcomed the ability to mark tasks as completed directly from Notification Center without having to open an app. I noted that breaking notifications into two distinct layers was confusing, but, overall, I was positively impressed with the Today view and I imagined it’d have an interesting future.\nIn iOS 8, Apple has simplified Notification Center and turned the Today area into an extension point that apps can use for glanceable and interactive widgets. When Craig Federighi unveiled widgets as part of the extensibility announcements at WWDC, I thought that they would turn out to be a modern take on the old OS X Dashboard – somewhat handy, but too detached from the main experience to make an impact. I was wrong.\n\niOS 8 widgets are extensions that can preview content and have lightweight interactions with their host app. This seemingly obvious statement is the raison d’être of widgets: unlike Dashboard widgets on OS X, iOS 8 widgets are deeply integrated with the app they belong to; therefore, using them has far less friction than switching back and forth between apps and the Dashboard on OS X.\nApple’s built-in examples aren’t that great: like last year, there’s a widget to see your current calendar events and weather forecasts, a list of tasks, and a summary of your schedule. 
Apple’s widgets are extremely basic in that they preview content and allow you to tap them to jump to their respective apps. They fit well with the Today view’s underlying premise – these are widgets for stuff you need to do or see today. They’re useful, but they don’t show the full potential of the new Today view.\nThird-party widgets (like other extensions) are bundled with apps from the App Store and they need to be manually activated by tapping Edit in Notification Center and adding them to the Today area. Widgets can be deactivated without uninstalling the host app, but they will be permanently deleted if the host app is removed from your device. If you downloaded apps that offer widgets you haven’t enabled yet, Notification Center will tell you with a message at the bottom of the screen.\nLeft to right: Workflow, Day One, Fragment, and Hours widgets.\nSince June, I’ve seen developers making widgets for all kinds of apps and not necessarily for content that is relevant “today” or that is based on time-related components. Apple advised developers to create lightweight widgets with minimal UIs and interactions, but that hasn’t stopped developers from coming up with ideas that are far more useful than a weather preview.\n\nThe Evernote widget has buttons to quickly create a new text note, a note from the camera, or save a photo from the device’s library. These buttons don’t bring up a keyboard in the Today view – it’s impossible for apps to invoke text input in Notification Center – but they let me jump directly into Evernote and the section associated with the button I tapped. 
The “text” button opens the app in typing mode; the camera button opens the app and launches its camera feature; the photos button opens Evernote with a photo picker.\nI’ve been using this widget several times a day: it saves me time I would spend launching the app and navigating its interface to create new notes, and it’s a much more native and integrated system than hacks like URL schemes because it’s always just a swipe away.\nIn fact, I suspect that the whole “app section launcher” idea will take off as one of the most popular uses for Today widgets. A productivity app I was testing offered a grid of actions that, once configured in the app, could be launched from a widget that displayed custom icons for each action. A read-later app is adding a widget with a preview of articles saved on the current day; tapping an article’s title in the Today view opens the app directly into that article. A fitness app to track steps and runs called Runtime is coming out with a widget to view the number of steps taken on the current day (as returned by the iPhone’s M7 chip) and a button to start a new run directly from Notification Center.\nThe concept of action launchers that was popularized by Launch Center Pro will feel right at home in the Today view for iOS 8, and I believe it’ll be a better fit thanks to Notification Center’s system-wide presence. But it’s the interactivity allowed in the Today view that turned widgets into must-have extensions for my daily workflow.\nA few weeks ago, I began testing Clips, an upcoming clipboard manager by the creators of Dispatch to save and access snippets of text previously copied on an iPhone or iPad (it’s launching soon).\nClipboard management has always been one of the biggest advantages of using OS X over iOS for “real work” – on the Mac, you can use apps such as Alfred or ClipMenu to constantly monitor everything you copy, archive it, and paste it at any time, even if you copied a string of text two days ago. 
On iOS, developers were never able to create desktop-like clipboard managers to monitor what you copy and paste 24/7: iOS just doesn’t give developers access to that data. So if you wanted to use a clipboard manager, you’d have to cope with the limitations of utilities like EverClip, an app that could monitor your clipboard activity for 10 minutes at a time and that would then be killed in the background by the OS. The high friction required to use these apps was the reason I never truly got into them.\n\niOS 8 is not going to allow apps to constantly monitor the clipboard in the background, and there isn’t going to be a mobile version of ClipMenu or LaunchBar’s clipboard menu on the App Store today. The developers of this new clipboard manager, though, came up with the idea of using a widget as a quick entry input option for manual clipboard archiving: after you’ve copied something, slide down Notification Center and hit the app’s “+” button in the widget. What you copied will be saved as clipped text into the app without actually opening the app, and it will be previewed inside the widget for future copying with one tap.\nI’m using this widget every day and it has allowed me to do research for MacStories and Relay without wishing I had a Mac. It’s not as full-featured as ClipMenu or Alfred, but the ease of access of the widget makes the act of saving your clipboard effortless because Notification Center is always there. I’ll accept the trade-off of having to archive my clipboard manually if it means I can do it visually and quickly with a widget that I can always bring up with a swipe in any app I’m using.\nIn my tests, I noticed some technical issues with widgets that I believe are related to third-party apps and bugs left in iOS 8.\nIf you install several widgets on your device, you may notice a brief loading time for content that needs to be visualized when you swipe down to open Notification Center. 
This problem was more noticeable in the previous betas of iOS 8 and refresh speed has gotten considerably better on the public release, but it can still occur. I ran my tests with about a dozen third-party widgets on an iPhone 5s and a second-gen iPad mini.\nThe second (and admittedly minor) annoyance is that the iPad can’t load widgets from iPhone apps installed in compatibility mode. While I don’t generally like to keep iPhone apps on my iPad, I wanted to use a widget that’s available in an iPhone-only app and discovered that, despite emulation, the iPad couldn’t use it.\nLike other types of extensions, there will be an explosion of widgets on the App Store. If the apps I’m trying are any indication, most developers will want to offer some kind of shortcut in the Today view, but at that point you’ll have to wonder how much you’ll benefit from a widget that you need to locate like an app on a crowded Home screen.\nI’ve tried to be selective with the widgets I want to keep on my devices. There’s no point in scrolling a page full of widgets, even if curiosity will push you to install many of them. That’s normal, but I’ve discovered that I prefer (and ultimately benefit from) keeping only a few widgets around.\nWidgets validate the idea of app launchers and action shortcuts. In spite of the “Today” name, the best widgets that I tested weren’t about the next 24 hours or today’s forecast – they were interactive menus and app extensions that allowed me to save text, launch specific actions in a couple of seconds, and navigate less around my Home screen. In my daily workflow, widgets enabled me to complete tasks that were already possible (such as moving bits of text from Safari to a Pages document) more easily and quickly, giving me extra time to do something else on my iPad.\nThe “Today” name is, at this point, non-descriptive of what widgets are bringing to Notification Center. 
I wouldn’t be surprised to see two tabs – “Widgets” and “Notifications” – in an Action Center/Dashboard rebranding next year.\nTake Action\nIt’s in the action and share extensions, however, that iOS is finding its maturity as a platform and a new beginning for a rich ecosystem of apps. Action and share extensions have changed the way I get work done on iOS and they mark an important new chapter for third-party apps.\nFirst, some history. iPhone OS was built from the technological roots of OS X, but it never gained the Services menu that, for decades, had allowed developers to abstract features from their apps and make them available as shortcuts in other apps. Mockups and concept videos didn’t take long to appear, but Apple never caved to the pressure of power users who wanted an easy way to let apps communicate and exchange functionality.\nOver the years, “inter-app communication” became an umbrella term for a fairly primitive need: apps should be able to collaborate on text, files, and other types of data without requiring too much user interaction. Why would you need to manually copy and paste a Safari webpage that you want to turn into a todo? Wouldn’t it make sense for apps to offer their services to other apps and make users happier, more productive, and more satisfied with their software purchases?\nIt did make sense, but without an Apple-sanctioned solution developers had to come up with their own fragmented, flaky, often unreliable technologies to enable some kind of communication between iOS apps.\nA few examples come to mind:\nThe TextExpander SDK, which developers needed to manually add to their apps. Notably, Smile relied on hacks that Apple eventually noticed and asked to shut down.\nURL schemes and x-callback-url. 
Primarily aimed at text, this widely popular workaround allowed apps to exchange data and automatically open other apps.\nThe GoodReader SDK, built for document management and effectively MIA.\nJavaScript bookmarklets, officially supported by Safari but limited, difficult to configure on iOS, and generally user-hostile.\nPython scripting with Editorial and Pythonista, developed by Ole Zorn and compatible with various iOS frameworks.\nLaunch Center Pro and Drafts, which both started out as simple utilities and evolved into full-featured apps capable of making other apps communicate.\nWith the exception of GoodReader, I relied on all these tools to get more done on iOS in less time, going beyond what the platform could normally provide. It was all I could use, but I knew – as I often argued here at MacStories – that none of them was a solution.\nThere are two ways to look at iOS’ old limitations. There are so-called power users like me, Alex, Eric, and hundreds of others who like to tweak their devices and are willing to invest hours in creating workflows that could save them a few minutes each day. And then there are the millions of people who simply don’t care. The people who buy iPads and want to write a college essay or prepare academic research on it don’t want or need URL schemes.\nBetween empowering the masses and pleasing a vocal circle, Apple will always choose the former. A cornucopia of app functionality was being wasted before extensions. Action and share extensions demonstrate how Apple has been thinking about these problems to create a system that’s far more powerful than old hacks and workarounds, secure by design, and user-friendly in a way that, like widgets, makes sense.\n\nAction and share extensions are installed alongside their respective apps and can only be launched from the system share sheet. 
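From a developer’s point of view, surfacing those extensions amounts to presenting the standard activity sheet. A minimal sketch in iOS 8-era Swift – the function name and URL are illustrative, not from the review – showing how any installed share or action extension that accepts a URL would appear alongside Apple’s built-in activities:

```swift
import UIKit

// Presenting the system share sheet: installed share and action
// extensions that can handle a URL show up here automatically.
func shareCurrentPage(fromViewController controller: UIViewController) {
    let pageURL = NSURL(string: "https://www.macstories.net")!  // illustrative
    let sheet = UIActivityViewController(activityItems: [pageURL], applicationActivities: nil)
    controller.presentViewController(sheet, animated: true, completion: nil)
}
```

The app offering the sheet doesn’t need to know which extensions exist; the system matches the activity items against what each extension declares it can accept.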
To make this clear: on iOS 8.0, you will never be able to launch an action or share extension without tapping on its icon in the share sheet.\nFor this first version of iOS 8, Apple chose to confine extensions to specific areas of the OS called extension points: widgets are displayed in the Today view, custom keyboards can be loaded from the system keyboard, and actions can be activated from the share sheet. I don’t want to get into the technicalities of Apple’s system, which likely took years to develop as Apple wanted to build a secure inter-app communication system that wouldn’t put user data at risk while also remaining simple and easy to activate. Creating that kind of secure opening in the sandboxing model must have been a huge effort among Apple engineers, but I want to focus on the user experience.\nIn the three months I spent with iOS 8 on the iPad and iPhone I use for work every day, action and share extensions have been amazing. They are app features available in other apps with their own custom interfaces and they’re compatible with any app that supports the system share sheet. Action and share extensions expose specific functionalities from the apps you already use and they feel like the next logical step for the Services menu.\nAction and share extensions coexist in the share sheet and the differences between them can be blurry. While Apple has been adamant about the fact that share extensions should be intended for social services (see the company’s original share sheets) and actions for everything else, I have been trying share extensions that save the extension’s input locally without ever posting it to an online service. 
For this reason, I would say that the best way to think about them is this: share extensions are at the top of the share sheet and they’re used to save the extension’s input (a link, some text, etc) somewhere else; action extensions perform a more complex task in the app that’s currently being used.\nLike widgets, I believe that action and share extensions are going to be extremely popular among developers of utilities and productivity apps. The important aspect of Apple’s decision to let extensions live inside the share sheet is that this limitation doesn’t create any confusion for users and developers: you’re not going to find different ways to activate extensions because iOS 8 will have to show a share sheet first. You may find a custom sharing icon and a share sheet filtered to show only some extensions, but the activation behavior will always be consistent. While Apple will probably end up giving up some control here in the future, there is a certain consistency and welcome simplicity that was nowhere to be found in the mess of URL schemes and inter-app communication hacks.\nHow a share extension is activated, launched, and used (Linky for iOS 8).\nThis elegance carries over to how action and share extensions look and work, too. Like the share sheets introduced back in iOS 5, these extensions display custom sheets or full-screen views on top of the app you’re using. I’ve tried extensions to quickly capture notes (with the upcoming Drafts 4, the current Safari page is saved into a note sheet), Pinboard sharing sheets (every Pinboard client is going to have a share extension), and read-later confirmation sheets.\nThe share extension of the upcoming Drafts 4 will let you capture text from any app.\nI also tested more advanced extensions, such as Linky, which allowed me to cross-post a link to various social networks. 
And, of course, there was the 1Password extension, which makes 1Password ubiquitous.\nYes, this is real life.\nExtensions don’t split the screen in two and they don’t chain multiple apps together – they let an app provide a subset of its features to any other app. You will see the extension carry the interface and branding of the app it comes from, but it will be presented as a sheet out of its typical environment, keeping your context.\nApps become features.\nIn practical scenarios, action and share extensions have changed my iPad workflow and I know there’s much more coming, starting today. Several tasks that I used to launch with automated workflows, URL schemes, or bookmarklets have been replaced by visual, integrated, and more powerful extensions.\nAn upcoming note-taking app gives me an extension to capture whatever I’m looking at – whether it’s a webpage in Safari or a paragraph of text. I can select what I want, save it in the app without opening it, and I’m done.\nLinky lets me cross-post to multiple networks at once. Its extension can automatically fill in a webpage’s title and link if opened from Safari, or it can insert text passed in by other apps into its share sheet. It even understands whether I want to share selected text or not.\nI used to switch back and forth between 1Password and iOS apps several times a day. No more. With the 1Password extension, my logins are always available in a 1Password mini-vault that I can unlock with Touch ID from any app.\nBookmarks saved on Pinboard? No more bookmarklets and URL schemes. On iOS 8, your favorite Pinboard app will likely offer a share sheet that has a full-blown interface and that supports tag autocompletion, privacy settings, and more.\nAnd there’s more. I could mention the visual note-taking app that accepts anything you throw at its extension, whether it’s a PDF, an image, or text. Or the app that lets you build personalized commands and run them on-demand from its extension inside other apps. 
Or what Readdle is launching with iOS 8.\nThe beauty of Apple’s system is also that, in theory, apps are capable of determining which kind of input should be passed to an extension, enabling or disabling extensions accordingly. In Safari, for example, action and share extensions can get the current webpage title, URL, or selected text; in a read-later app, an extension will likely get the same values, but for an article you’re reading; in a document management app, an extension may work with the file you’re viewing, its file name, or other pieces of data related to it.\nThe system was intended to scale elegantly and quietly in the background, but, in the early days of iOS 8, there will be confusion in regard to which extensions can be used where and, unfortunately, bugs.\nOn several occasions, I launched share extensions that couldn’t work with the input shared by the app I was using; part of this, I imagine, was related to bugs and design problems of the apps I was testing, but clearly developers have been facing problems with Apple’s SDK. I assume that developers will need more time and better tools to understand how their share extensions work with thousands of other apps on the App Store, so don’t be surprised if an extension is failing or misinterpreting an app’s input right now.\nI’m disappointed to see a lack of extension support in Apple’s own apps, and particularly in Mail. It just makes sense, in my opinion, to be able to turn messages into tasks or archived documents, but Apple hasn’t integrated extensions with Mail yet.\nWhat I realized in using extensions is how necessary last year’s redesign of iOS was. Imagine if Apple hadn’t shipped a new design with iOS 7: today, we’d have sheets of stitched leather or shiny metal on top of apps that look like agendas or little robots. 
The cohesiveness and subdued style that iOS 7 brought with its precise structure and hierarchy allows extensions to integrate nicely with apps, feeling like extra actions rather than eerily realistic objects.\nThe impact of action and share extensions on my workflow has been massive even with only a few apps and bugs left in iOS 8. I use my iPad more because I spend less time switching between apps, copying text around, or moving files between different containers. I’m more efficient thanks to extensions because the quality of the software I use every day has increased considerably. I can’t imagine what we’ll start seeing today with action and share extensions on the App Store.\nSafari\nAfter Editorial, Safari is the app I use the most on my iPhone and iPad. Last year, I decided to switch to Safari as my primary browser (I used to be committed to Chrome’s cause) and I’ve never regretted leaving Google’s platform. I found Safari for iOS 7 to be noticeably faster, cleaner, and more integrated with the system than Chrome, and I liked the improvements that went into tab management and iCloud sync.\nI’m even more impressed by this year’s updates to Safari as they tighten the browser’s relationship with other iOS apps through iCloud Keychain, make the iPad version truly desktop-class, and allow third-party apps to use the same (faster) rendering engine.\n\niCloud Keychain has been one of the most pleasant surprises of the past year. While I’m a heavy user of 1Password and I continue to use it for all my logins, credit card information, secure notes, and other private data, I’ve enjoyed the ability to automatically fill logins in Safari with iCloud and have those changes sync across devices with no management required. 
iCloud Keychain has a long way to go to replace a full-featured app like 1Password (you can’t even search for logins in iCloud Keychain), but its integration with Safari is top notch.\niCloud Keychain in Safari (left) and in a third-party app (right).\nIn iOS 8, third-party apps that work with a web service that you’ve already logged in with Safari can request to access your credentials stored in iCloud Keychain, automatically filling their login fields for you. The new version of Screens (out today on the App Store) supports this new feature: if you have a screensconnect.com account saved in iCloud Keychain, the app can ask to use a “Safari saved password” so you won’t have to type it. It’s a minor addition (Screens also supports the new 1Password extension), but it really makes a difference thanks to Safari’s deep integration with the rest of the OS.2\nThe new Safari for iPad is, by far, the best version of iOS’ Safari yet. The main interface hasn’t changed substantially: there’s still a bookmarks bar, a tab bar, and Safari Reader is accessible by tapping on an icon in the address bar.\n\nGestures and tabs have been redesigned and rewritten to allow for a much faster and smoother interaction that is an absolute pleasure to use on the iPad’s larger screen.\nAt any point during navigation, you can pinch-to-close with two fingers and the current webpage will zoom out and shrink to reveal a new birds-eye view of all your open tabs. This new tab view, inspired by Safari on the upcoming OS X Yosemite, takes advantage of the iPad’s screen with visual previews for tabs that offer more information and context than a tab bar.\nIn this view, the tap targets of the pages’ titles are large enough that you can tap to open them and pinch-to-zoom to get closer to a stack of pages and get a bigger preview. You can tap & hold to reorder pages, swipe them to the left to close them, and press the “+” button to see recently closed tabs. 
The experience is extremely tactile and, combined with a beautiful and elegant design, some of Apple’s best work on iOS.\nI appreciate a lot of small tweaks and changes that may appear irrelevant taken individually, but that add up over time and make Safari for iOS 8 friendlier and faster to operate.\nWhen you switch to private browsing, Safari no longer asks you to close or keep all your tabs. It just keeps them all and switches to private mode.\nYou can close iCloud Tabs open on other devices. Swipe to delete a tab, and it’ll close on the device it belongs to. I like that I no longer launch Safari for iPhone to twenty open tabs every day.\nTap the address bar, swipe down, and you get options to add the current URL to your favorites or request the desktop version of the website you’re viewing. The latter is a great addition because it means I don’t have to see unusable mobile themes anymore, but I wish there was a preference to always request the desktop version for specific websites.\nI’d like to give a special mention to Shared Links. When Apple added this feature to Safari last year, I thought it was a cool way to read links from my Twitter timeline; it’s a simple filter for links on Twitter, built into Safari, and I use it quite a bit when I don’t have time for tweets and I just want some links. I’ve discovered many interesting articles thanks to Shared Links – I almost wish that Apple made it a separate News app.\nThis year, Apple added RSS support to Shared Links for iOS 8, which is an interesting turn of events: people regularly like to announce the end of RSS, and Apple removed support for RSS feeds from Safari two years ago in Mountain Lion.\nUnifying Shared Links with tweets and RSS feeds sounds strange, but it works. It’s nice to be able to see a list of links from your subscriptions in the browser, which you can tap to open in the current tab immediately. 
Shared Links will likely never become my preferred destination for news, but I’m glad it’s there.\n\nI wish that Apple improved the way downloads are managed in Safari, but it looks like we may have to wait until next year for that. There is still no download manager in Safari, a surprising omission given the addition of iCloud Drive’s filesystem layer in iOS 8: Apple could have given Safari its own file container and let the browser save downloads in there and sync them with iCloud. Instead, tapping on a file that Safari can’t render still shows no progress for the download, but when the file is finally previewed you get the old menu that offers to open the file in another app. At least you can run extensions on “downloaded” files.\n\nThe addition of frequently visited websites is a nice one. Available in the Favorites menu by tapping the address bar, Frequently Visited shows large icons for websites you often go to – it’s like Top Sites, but it takes up less space.\nOne of my other favorite features of Safari is related to its engine, which is now available to third-party apps. In the old days of iOS and until last year, Apple had a faster JavaScript engine for Safari, called Nitro, that, due to security concerns, was exclusive to Safari. Third-party apps that had web views (such as Twitter clients and news readers) couldn’t use the same JavaScript engine and were thus slower to render webpages. You may have noticed that Safari got faster over the years, but third-party apps remained slow. That’s changing with iOS 8.\nThanks to a new developer API called WKWebView, any iOS 8 app can now use the same rendering engine as Safari. This sounds trivial – just use the same technology as the system browser – but it’s actually quite a technical achievement on Apple’s part. 
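For developers, adopting the new engine is a remarkably small change. A hedged sketch in iOS 8-era Swift of a minimal WKWebView-based browser – the class name and URL are illustrative:

```swift
import UIKit
import WebKit

// A minimal browser view controller built on WKWebView, which runs
// on the same Nitro JavaScript engine as Safari.
class BrowserViewController: UIViewController {
    let webView = WKWebView()

    override func loadView() {
        view = webView  // the web view is the whole screen
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let request = NSURLRequest(URL: NSURL(string: "https://www.macstories.net")!)
        webView.loadRequest(request)
    }
}
```

Apps that previously embedded the older UIWebView can swap in WKWebView with largely the same loading code, which is why the speed boost should spread through the App Store quickly.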
The difference between apps that use the old web views and iOS 8 apps that use the new APIs is noticeable, and it makes me open fewer tabs in Safari because I can read webpages in apps with the same speed and performance. I like this democratization of web views as it puts an end to third-party apps being second-class citizens when it comes to web performance.\nI continue to be impressed by Safari’s speed, elegant interface, and fluid scrolling and zooming. In iOS 8, Apple brought changes to tab management that I found particularly noteworthy and useful on the iPad; these changes, combined with iCloud Keychain and higher performance, have made my browsing faster and more efficient.\nNotifications\nWith iOS 8, Apple overhauled Notification Center and the way notifications are managed by letting you:\nDismiss individual notifications in Notification Center;\nQuickly reply to iMessages without opening the app;\nTake action on notifications with buttons embedded inside the notification banner.\nThese are solid changes, and they share the common theme of saving you time when handling the incoming stream of notifications, which can be daunting these days. I’ve only been able to test interactive notifications with a handful of third-party apps, but what I saw was enough to confirm that these improvements were long overdue on iOS.\n\nNotifications can be swiped to the left in Notification Center to show a delete button and, if available, custom buttons to act on a notification without opening the app that sent it. In email messages, for example, you can swipe to show Mark as Read and Delete, which will perform the respective action directly from Notification Center without launching Mail.3\n\nNotifications are also interactive when they come in through banners. Swipe down, and you’ll get action buttons if the developers added them to the app. 
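Those buttons exist because the app registered them with the system ahead of time. A sketch of iOS 8’s user notification API in period Swift – the identifiers, titles, and category name are illustrative, not from any shipping app:

```swift
import UIKit

// Registering actionable notification buttons with iOS 8's
// UIUserNotificationSettings API. Identifiers and titles are illustrative.
let markRead = UIMutableUserNotificationAction()
markRead.identifier = "MARK_AS_READ"
markRead.title = "Mark as Read"
markRead.activationMode = .Background  // handled without opening the app

let trash = UIMutableUserNotificationAction()
trash.identifier = "DELETE"
trash.title = "Delete"
trash.activationMode = .Background
trash.destructive = true  // rendered as a red, destructive button

let category = UIMutableUserNotificationCategory()
category.identifier = "EMAIL_MESSAGE"
category.setActions([markRead, trash], forContext: .Default)

let settings = UIUserNotificationSettings(
    forTypes: .Alert | .Badge | .Sound,
    categories: NSSet(object: category))
UIApplication.sharedApplication().registerUserNotificationSettings(settings)
```

When a push or local notification arrives tagged with that category, the system shows the registered buttons in banners and in Notification Center, and calls back into the app when one is tapped.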
I haven’t used interactive notifications from banners much: I don’t usually want to act on email right away, and I couldn’t test many third-party apps that supported them.\n\nI’ve used the quick reply feature for messages a lot, and I cherish the time it’s saving me on a daily basis. Quick replies are one of those features that just make sense: of course you should be able to respond to messages on the fly without having to switch apps. Android has had this for years, and I’m glad that Apple began to open up with inline replies in iOS 8, even though they’re limited to the Messages app for now (you won’t be able to quickly reply to a Twitter DM or Slack message).\nQuick replies allow me to be polite and send a short reply immediately without losing the context of what I’m doing. Prior to iOS 8, I often ignored notifications and upset people because I forgot to open the Messages app and respond to them; now, the distraction is still there, but the decreased friction incentivizes me to reply and deal with it immediately because it’s easier.\nMy only gripe is that the quick reply banner doesn’t update if a new message comes in while you’re typing; thankfully, you can tap the banner and you’ll be taken to the Messages app with your text already filled in.\nI can’t wait for quick replies to become available to third-party apps.\nQuick Comments on Mail\nMail is always one of the first apps I set up with a fresh install of iOS, and I was curious to check out Apple’s additions in iOS 8. Mail has largely stayed the same, with the exception of new gestures to act on messages or dismiss them to take a quick look at your inbox.\n\nThe swipe-to-close gesture was demoed at WWDC and it works on both the iPhone and iPad. When you’re writing an email, swipe down from the title bar to “dock” your message so you can go back to your inbox and see other messages that you may need to reference before continuing to write. 
This gesture works beautifully and I’ve been using it a lot to see other messages or double-check information without having to save a message as a draft.\nThere are also new gestures to quickly archive, flag, or delete messages, which are clearly inspired by the short/long-swipe mechanism popularized by Mailbox. These gestures can be configured in Settings and they make processing email faster, but unfortunately they’re only partially available to third-party developers.\nWhat I don’t understand is the lack of extensions and a document picker in Mail. Apple has built a new secure system to integrate with apps and bring better document management to iOS, but both are completely missing from Mail. It would only make sense, in my opinion, to be able to turn emails into todos using a share extension or to print to PDF with an action directly in Mail without installing a third-party client.\nSimilarly, Apple has long featured a button to insert a photo or video in a message in the copy & paste menu, but they haven’t done the same for the iCloud Drive document picker. I want to believe that both of these features were cut due to time constraints and that they will be available in the future.\nLastly, I would like to see an email button in the new Recents/Favorites menu of the multitasking view. I’ve been using those shortcuts quite a bit, but they’re limited to starting calls or messages.\nQuickType and Custom Keyboards\nAs someone who writes on the iPad every day, I was looking forward to improvements in the keyboard area. With iOS 8, Apple decided to add a new predictive engine to the system keyboard called QuickType, and then give developers the tools to create custom keyboards that can (almost) replace the default keyboard.\nAs far as Apple’s keyboard goes, QuickType hasn’t had a meaningful impact on my typing habits yet. 
Available as an extra row on top of the regular keyboard, QuickType displays three buttons for words you likely want to type; Apple says that QuickType is able to understand the context of the current conversation in Mail and Messages and that it should adapt to your writing style.\nI have mixed feelings about QuickType. Occasionally, suggestions provided by QuickType are accurate and match what I want to write – it’s usually expressions like “cool, see you later” or “Talk tomorrow?” that I want to send as short replies. On the other hand, QuickType is often too formal or generic to be used in real conversations with my friends, who would easily realize I’m sending impersonal canned messages.\nQuickType lacks personality in many of its predictions, which explains why I’ve been using it more as a fancy autocorrection tool than a predictive engine. I believe QuickType has a bright future ahead of it though, especially because of the deep system integration that Apple can build without being creepy about it. QuickType already works with multiple languages, and I look forward to seeing how it’ll grow.\nWhat surprised me and continues to intrigue me is the amount of innovation I’m seeing in custom keyboards.\nPerhaps it’s the fact that iOS users couldn’t enjoy the versatility of custom keyboards before, but beyond keyboards that merely bring themes and other questionable interfaces to the standard iOS keyboard, I’ve seen developers come up with fantastic ideas to enhance the typing and data input experience on an iPhone and iPad.\n\nSwiftKey delivers accurate and personalized predictions based on your writing style (the keyboard can plug into your Twitter and Facebook accounts to learn how you write) with a swipe input method on the iPhone. The TextExpander keyboard expands your existing snippets as you type. 
There are going to be hundreds of custom keyboards on the App Store starting today and, like extensions, Apple opening up this area of iOS to third-party developers means we’ll see an explosion of new ideas – some of them useless and ugly, others elegant and powerful.\nGenerally speaking, I’ve been testing a few custom keyboards for iOS 8 and I have yet to find a balance between the keyboard I’ve always used and the new experiences offered by developers. Keyboards that provide a specific feature such as clipboard management or emoji insertion were easier to fit into my workflow because I know I use them when I need them. They are utilities, available in a new area of the OS. Single-purpose custom keyboards that deal with data and text in new ways will be huge on iOS 8.\nFull-featured alternative keyboards are facing a steeper adoption curve in my personal experience because I keep thinking that maybe I’m just fine with Apple’s keyboard. I assume that it’s a matter of time – for better or worse, I’ve been using the default keyboard for seven years now – and I’m curious to see if a keyboard will slowly gain an edge over Apple’s default solution over the coming months.\nOne of the big reasons behind my doubts about general-purpose custom keyboards is the set of limitations that Apple put around them, and I suspect that a lot of users will go through this phase.\nCustom keyboards can’t:\nAccess system settings such as Auto-Capitalization, Enable Caps Lock, and the dictionary reset feature\nType into secure text input objects (password fields)\nType into phone pad objects (phone dialer UIs)\nSelect text\nAccess the device microphone\nFurthermore, app developers can choose to reject custom keyboards in their apps; and, while keyboards must display the keyboard switcher (globe) icon to return to the system keyboard, they can’t show a quick switch menu on tap & hold, which considerably slows down the process of moving across keyboards.\nYou can’t use Apple’s keyboard switcher in custom keyboards.\nAll these limitations can be worked around by setting preferences in each custom keyboard and accepting the fact that iOS will often default to the system keyboard due to privacy concerns. And that is fine: I appreciate the secure model that Apple built to protect customer data as much as possible and I like that Internet access needs to be granted manually to custom keyboards.\nMultiple keyboards, multiple settings.\nBut from a user’s perspective, the lack of deep system integration, the little trade-offs, and the increased friction add up over time when trying to use a general-purpose custom keyboard as your only keyboard. At least in my experience, I’ve found it easier to switch to simpler custom keyboards when I needed them rather than keyboards meant to be used all the time.\nThere are some great ideas coming out of developers of custom keyboards for iOS 8 and it’ll be interesting to watch this market shape itself in the next year.\nUntil then, any developer who fixes Apple’s Shift key design is welcome.\nBattery Usage\niOS 8 includes detailed battery usage statistics by app (also known as “battery shaming”) meant to expose apps that consume a lot of energy. These stats are available in Settings > General > Usage > Battery Usage and they can be viewed for the last 24 hours or the last x number of days (usually 3, 5, or 7 days).\n\nIt’s a simple addition taken from Mavericks, but it made an important difference in my everyday usage of the iPhone and iPad. I’ve already deleted a bunch of apps that were consuming too much battery in the background and that I didn’t know about, which has made my devices’ battery last longer. I can see which apps or system features are using my battery and how (Beats Music, for instance, says “Audio”) and there are handy suggestions to increase battery life such as enabling auto-lock or decreasing screen brightness.\nThis is a welcome addition, especially for people who spend a lot of hours on an iOS device each day. 
I’m hoping that next year we’ll get even more detailed stats with graphs and battery usage by app version.\nA Note on iCloud Drive and Spotlight Suggestions\nI’ve only briefly used Spotlight Suggestions and iCloud Drive on my devices and, while I believe they have potentially huge implications for search and file management, I’m not ready to form a complete opinion on them yet.\niCloud Drive is, in theory, a good idea: Apple realized that users like to organize their documents in folders after all, so they came up with a unified location that combines the old iCloud (files are inside their respective apps) with a traditional filesystem. In iCloud Drive, you’ll see folders with app icons on them, but you can also move files around, create new folders, and drop files anywhere you want.\niCloud Drive is shared across apps, meaning that any app can access files from other apps and save them without creating a duplicate copy. iCloud Drive is meant to replace the old Open In system and it shows Apple giving up on some of its vision for the sake of user convenience and simplicity, but it’s too early for me to tell whether it’ll have an impact on how I work on iOS.\nThe reason is twofold: I don’t use document-based iCloud apps, and I was more interested in trying document provider extensions, also new in iOS 8. In the past three months, none of the Dropbox-enabled apps I use switched to iCloud Drive and I’ve only been able to try one document provider (Transmit). I could elaborate on the principles behind these features, but I’d much rather wait until I can comment from practical experience (plus, I already wrote about this topic in June).\nThe issue with Spotlight Suggestions is different: I only managed to activate them a couple of weeks ago, as they weren’t available in Italy during the iOS 8 beta. 
Spotlight Suggestions are a great idea: they put web content such as Wikipedia pages, iTunes and App Store results, Maps locations, and more directly into Spotlight without having to go through Google or use the search field in an app. Apple wants to push Spotlight as the universal way to search and land on relevant results, rather than as an application launcher or a search utility for your local data.\nSpotlight Suggestions.\nResults opened from Spotlight Suggestions.\nThis is, potentially, a big problem for Google. In my two weeks of testing Spotlight Suggestions, I’ve already avoided Google a few times and arrived directly on Wikipedia or suggested websites thanks to Spotlight Suggestions (they are available in both Spotlight and Safari). Suggestions show a small snippet of text and a photo before you tap the result, and they take you directly to the relevant page without seeing ads or small links to choose from. Not only is it faster, it’s a better user experience.\nApple is, however, facing years of habits and expectations when it comes to search. People associate search with Google and they’re used to opening google.com even just to launch websites. Spotlight Suggestions display one or two top results based on what Apple thinks is appropriate, and they obviously don’t come with all the features of Google Search (namely faster suggestions and Google Images). For simple search queries, though, I think that Spotlight Suggestions are fine, and they’re growing on me.\nIt’s also interesting to think about Spotlight as a search engine for iOS apps. Spotlight Suggestions can look for results embedded in specific apps (iTunes Store, Mail, etc) and launch them without forcing you to open the app first. 
I would love to see Spotlight become a search feature that gets results from any app on my devices and that can be opened from anywhere, not just the Home screen.\nContinuity\nContinuity is Apple’s initiative to let iOS devices and Macs seamlessly exchange data and activities, either automatically or manually with Handoff, and I’ve come to rely on some of its features quite a bit.\nIt seems obvious now that iPhones and iPads should relay messages and calls to each other and let you reply or respond on whichever device you prefer, and it’s already difficult for me to remember that this is a new functionality of iOS 8.\nI love being able to pick up a phone call and read SMS on my iPad without going into another room to find my iPhone. I’m often working with my iPad around the house, and I get the occasional phone call that I need to take but I don’t want to get up. Same with SMS – I have a couple of Android friends who still send regular messages and, of course, two-factor authentication systems rely on standard SMS.\nBefore iOS 8, I would get a confirmation code on my iPhone and I’d have to quickly find it and type the code manually into a login field of an app. With Continuity, I get the SMS relayed to my iPad so I can just copy and paste it.\nMy parents still like to call me with regular old phone calls (as does my doctor), and thanks to Continuity the phone part of the iPhone is turning into “just an antenna”. I rarely used my iPhone as a phone before, but now that I can receive and start phone calls on my iPad – which I’m always using for work all day – I’ve noticed that the Phone app is barely launched anymore.\nUnfortunately, Apple decided to delay the SMS Relay feature until October, likely to fix remaining bugs in anticipation of OS X Yosemite. 
During the iOS 8 beta, SMS Relay worked very well on my iPhone 5s and iPad mini, and I’m surprised that Apple decided not to ship it for iOS 8 devices today.\nI was impressed by Handoff when Apple announced the feature at WWDC and I still think that it’s a powerful concept executed elegantly, but, for practical purposes, Handoff hasn’t clicked for me yet.\nHandoff lets you continue an activity on another device without losing your progress or data. For instance, you can start writing an email on your iPhone and decide to finish it on the iPad; by handing off that message, you’ll pick up the draft right where you left off – text and everything. You can do the same with webpages, iWork documents, and notes, and third-party apps can implement Handoff as well, letting you seamlessly continue tasks across devices.\nOn iOS 8, Handoff is activated by swiping up an icon in the bottom left corner of the Lock screen. That icon indicates the app you’re sending to another device, but it doesn’t hint at what you were doing inside that app; once swiped up, the app will simply resume and load your previous content and context as accurately as possible. Safari will open the webpage you were reading on another device, but it won’t resume your scroll position or text selection.\nI tested Handoff with Apple’s apps and a couple of third-party apps that used the feature to switch between devices. Handoff makes for a good demo: activities are “beamed” instantly between my devices (you don’t need to be on the same local WiFi network) and apps can suggest fallback options if handed-off content can’t be opened in the same app. 
A Pinboard client that supported Handoff, for instance, could open a webpage I was reading in the app on my iPhone into the same app on the iPad; if I didn’t have the same app installed on the other device, Handoff could open the webpage in Safari instead.\nMy problem is that, besides technical tests, I never actively sought out Handoff when I was working on my iPad or iPhone. Usually, if I begin writing an email message on my iPhone it’s because I know I can finish it there; iCloud features like iCloud Photo Library and iCloud Tabs remove the need to get another device to find photos or webpages, and because both my iPhone and iPad are always with me, Handoff is mostly unnecessary.\nI suppose that the Continuity picture will be complete and better judged once Yosemite is out. I love the automatic and passive propagation of my phone and FaceTime calls across devices; Handoff is neat, but I didn’t find a good use for it on my iOS devices. I’ll check back in October.\nNew Horizons\nThe reinvention of iOS starts from the fundamentals of user experience: how apps communicate with each other, how text is entered on the screen through a digital keyboard, or how documents are managed and stored in the cloud. iOS 8 shows an Apple that’s been listening and identifying the weaknesses of its mobile OS, paying attention to what users and developers wanted from their apps.\niOS used to be, for many, a limited platform where work couldn’t get done and apps couldn’t collaborate or exchange information. With iOS 8, the conversation has been changed to what can be done next. 
With a more versatile OS capable of being extensible but secure, Apple is creating a huge opportunity for every party involved: an opportunity for users to discover how iOS can help them in their daily lives; an opportunity for developers to explore new app markets and craft powerful software that wasn’t possible before.\nIn typical Apple fashion, new features introduced with iOS 8 will likely be updated with fixes and more options over the next releases and in iOS 9. Apple hasn’t shown many examples of action extensions and widgets in their apps, and the OS isn’t without its fair share of bugs (but they’re far less disruptive than last year’s).\nMore extension points will probably be added in the future (Copy & Paste and Control Center extensions in iOS 9 would be great), going beyond the share sheet and allowing developers to attach extensions to custom interfaces. The current implementation of action and share extensions is already an important step towards the elimination of proprietary sharing menus and the adoption of a more consistent, versatile extension system.\nThere are hundreds of new features in iOS 8 that I haven’t covered in this article because I wanted to focus on my experience with using iOS 8 for work every day. The apps that I’ve been trying over the past three months have profoundly changed the way I can work from my iPad all day: I could run MacStories on an iPad before, but iOS 8 has allowed me to stop finding workarounds to make the iPad do something it wasn’t built for.\nAnd I’ve only tried a few apps for iOS 8. Starting today, we’re going to see new categories of apps on the App Store, built by developers who can work with brand new technologies for document management, inter-app communication, keyboard input, notification management, glanceable information, and more. 
If I was able to reconsider every aspect of my iOS workflow in three months with just a few iOS 8 apps, I can’t imagine the amount of innovation we’ll see over the next year.\niOS 8 is an incredibly democratic OS. Apple realized that a single company couldn’t provide all the tools for modern consumer needs, and they’ve decided to put the ball in the developers’ court. More than ever (and bugs aside), it feels like developers are building the iOS platform alongside Apple rather than competing with them. Three months into using iOS 8 every day, I sometimes forget that all these new features are now available to iOS users: many existing apps are having new beginnings, and adjusting expectations to new functionalities will take time. \nThe scope of iOS 8’s changes will truly make sense as developers keep building brand new experiences over the coming months. iOS has begun to open up, and there’s no stopping at this point.\niOS 8 has changed how I work from my iPhone and iPad. I’ve never been more excited about its future.\n\n\nApple didn’t even provide extensions to test in their own apps during the iOS 8 beta testing period. ↩︎\n\n\nThird-party apps can also create new accounts in iCloud Keychain. ↩︎\n\n\nTapping Delete on the Lock screen’s Notification Center will also ask for a passcode or Touch ID, which confused me initially (I thought it was a bug). It’s a good move. I tested a third-party email app with support for the same actionable notifications, and the buttons worked as expected. 
↩︎", "date_published": "2014-09-17T14:38:20-04:00", "date_modified": "2018-03-20T13:24:40-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iOS 8", "iOS Reviews", "iOS8review", "stories" ] }, { "id": "http://www.macstories.net/?p=33015", "url": "https://www.macstories.net/stories/living-with-ios-7/", "title": "Living with iOS 7", "content_html": "\niOS 7, released today, is a deep reimagination of Apple’s mobile platform: using familiarity and the need for a reset as catalysts, iOS 7 represents Apple’s attempt to make iOS ready for the future. iOS 7 is, effectively, the epitome of a large company that knows it’s time to get rid of cruft and inconsistencies to bring a new order to a platform that has grown exponentially in the past five years. For developers, iOS 7 brings powerful new tools that will allow for a new generation of more flexible, intelligent, and versatile apps. 
iOS 7 is not perfect: there are rough spots and some wrong assumptions, but it’s not flawed or, as many will argue in the next few weeks, a “mistake”. It would be extremely silly and shortsighted to judge iOS 7 by the look of its application icons or the gradients Apple has decided to use on some graphics. More than any other Apple product, iOS 7 isn’t just defined by how it looks: iOS 7’s new look is devoted to functionality – to how things work.
\nIt’s difficult for me to offer a comprehensive review of iOS 7 today, because I have only been able to test a fraction of the third-party apps I will use on a daily basis with my iPhone and iPad mini. Mirroring the concept of “design is how it works”, I would say that, for me, iOS isn’t just how Apple’s apps work on it – it’s increasingly become about how apps from third-party developers can take advantage of it.
\nI have been running iOS 7 on my iPhone 5 since Apple released the first beta in June. I later installed the OS on my iPad mini, and have been working with an iOS 7-only setup ever since. As MacStories readers know, I primarily work from my iOS devices, which helped me get a good idea of how iOS 7 will change the way I write, take photos, respond to emails, listen to music and podcasts, and all the other things that I use iOS for.[1] Fortunately, I had the chance to test a good amount of third-party apps that solidified my thoughts on iOS 7 and the way it impacts my digital life and workflow.
\nIt was also hard to get ahold of fellow iOS 7 users in my town. While I imagine that it would be easier to come across a nerd running an iOS 7 beta at a bar in San Francisco, I didn’t have much luck in Viterbo, Italy. I tested new features like AirDrop – which allows you to share files and information locally with other iOS 7 devices – with my iPhone and iPad, and, in the past week, managed to convince my girlfriend to install iOS 7 on her iPhone.
\nI needed to provide this context: my livelihood directly depends on iOS and how I can work from my iPhone and iPad without having to use my Mac. Therefore, if you’re looking for a list of new features and smaller details of iOS 7 (and there are many), bookmark this article. My “review” of iOS 7 will focus on my thoughts on the update, how it made my iPhone and iPad better devices, and what I believe iOS’ future will be going forward.
Until iOS 6, the visual appearance of the operating system was largely related to interface elements and ornamentations aimed at suggesting interaction through the use of real world metaphors. Buttons were capsule-shaped, linen drapes adorned the top and bottom of the device for Notification Center and Multitasking, and apps like Notes and Podcasts hinted at their primary functionality by recreating distant relatives of the physical world with pixels on a screen.
\nThe design of iOS 7 is based on what I call “The Four C Principles”:
\nThere are several other concepts, guidelines, and deviations from the main theme at play in iOS 7, but, overall, everything leads back to the four key principles above.
\nIn iOS 7, content is the star of the show. Content comes first; everything else is secondary, playing a supporting role. Content can be text, your photos, videos, gameplay, your music, tweets – content is what you want to produce or consume. Content is what you access when you use an iPhone or iPad, for leisure or work. iOS 7 wants to reclaim content from the interface and present it back to you, elegant and uncluttered. In iOS 7, the interface is deferential to your content.
\nFrom June:
\n\nThe terminology Apple has chosen is different: by Apple’s parlance, iOS 7 is precise, sharp, coherent, simple, and efficient with a focus on clarity, vitality, motion, structure, color, and context. Content is paramount; the interface defers control to the user, rather than claiming it. To quote Jony Ive, iOS finds “profound and enduring” beauty in simplicity, “a sense of purpose” in the way the interface takes a step back, reassesses its role, and comes forward again, subdued, neutral, more intimately “connected” to the hardware that lies underneath the OS.
The visual change of iOS 7 is immediately clear from the Lock screen: gone is the iconic arrow of “Slide to unlock”, leaving room for a chevron that points to a “slide to unlock” label that you can drag to move the Lock screen away and get to your Home screen or Passcode unlock screen. There’s a subtle animation that runs across the chevron and text, indicating that you can hold and move those elements; if you just tap them, they bounce, suggesting that something is waiting for you behind them. The keypad no longer resembles a physical keypad – it’s a grid of tappable numbers.
\n\nThe predominance of content in iOS 7 is well exemplified by the OS’ use of whitespace, clean and neutral typography, and use of available space on a device’s screen. Navigation bars and toolbars tend to be white or black, depending on the color theme of the application, and they blend with the system’s status bar, now an element of design that integrates with apps and doesn’t make them feel like objects inside another part of the UI. In the same vein, content extends from edge to edge, using every available pixel to make apps feel more spacious and less constrained. In email messages, inline images go from edge to edge; in Settings, white section headers extend to the full width of the display.
\nIn iOS 7, the edges of the display are even more relevant: in iOS 6, only the top edge could be used to access Notification Center, whereas now the bottom edge can be used to activate the new Control Center, and there is a system-wide gesture to swipe from the left edge of the display to navigate back to the previous screen of an app.
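Third-party apps can plug into the same edge vocabulary: iOS 7 introduced a screen-edge gesture recognizer for swipes that begin at the display’s edge. A small sketch (written in Swift for readability, though iOS 7-era apps shipped in Objective-C; the class name and handler are illustrative):

```swift
import UIKit

// Sketch: recognize a swipe that starts at the left edge of the screen,
// the same edge iOS 7 reserves for back navigation, and pop back.
class EdgeSwipeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let edgePan = UIScreenEdgePanGestureRecognizer(
            target: self, action: "handleEdgePan:")
        edgePan.edges = .Left  // only swipes beginning at the left edge
        view.addGestureRecognizer(edgePan)
    }

    func handleEdgePan(recognizer: UIScreenEdgePanGestureRecognizer) {
        if recognizer.state == .Ended {
            // Mirror the system behavior: go back one screen.
            navigationController?.popViewControllerAnimated(true)
        }
    }
}
```

Because the recognizer only fires for touches that begin at the chosen edge, it coexists with ordinary pans inside the content area.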
\nThis approach reveals the intent to ensure software can be true to the hardware it runs on: by bringing pixels to the edge of the screen and making those edges areas that can be swiped to access system features or navigate through content, the hardware and software act more as a whole – they complement (instead of limiting) each other. It’s an invisible interplay under the hood, but, to me, it’s a new cohesion that simply makes sense.
\nA side effect of freeing the fundamentals of the iOS interface from real world metaphors, photorealistic objects, and 3D elements like buttons and glossy bars is that iOS 7 comes with borderless buttons and relies on color and text to indicate interactivity and selection.
\nIn a typical iOS 7 app – such as Apple’s Mail – the blue color indicates interactivity and black text means “content”. The Back button, still available in the upper left corner, isn’t contained in a capsule: now, buttons are generally pieces of colored text that lose opacity when tapped and are desaturated when inactive. This choice may be confusing at first: in the iOS 6 era, glossy, “fat” buttons easily meant “tap me here”. In iOS 7, their simpler, textual look disorients at first, but, in my experience, with time the color/text association becomes more natural and obvious. Text labels as tappable items/buttons have some advantages in my opinion:
\nBesides interactivity, color also plays a big role in how it gives your device and your apps a unique identity.
\n\nApple apps tend to use a distinct color scheme that should help users better differentiate and remember them. Notes uses yellow buttons and navigation items with a subtle letterpress effect[2] while Calendar has a white/red theme that is applied throughout the entire UI; Messages’ blue and green tones and gradients for chat bubbles are obviously different from Mail’s black & white look, and the Music app’s new pink color scheme is one of my favorites. For third-party apps, Apple is advising developers to consider color as one of the key elements for their app’s branding; while I’m not sure that color will be enough to avoid apps that all look similar to each other in the long term, from what I’ve seen so far it certainly looks like developers are listening.
\nColor is more notable in how it completely changes the look of your iOS 7 device through wallpapers. As I detailed in my initial overview in June, iOS 7 relies heavily on translucencies and blurring to hint at portions of the user interface hidden behind what’s currently in the foreground. I wrote:
\n\nThen there is translucency. In iPhone OS 1.0 and up until iOS 6, Apple used solid and glossy bars and controls, with the occasional transparency effect that, in the end, tended to feel like a gimmick and not a core part of the experience. In iOS 7, essentially every element that is layered on top of others is translucent, so that you’ll be able to get a hint of what’s underneath. In the process of considering translucency as a catalyst for context and “giving a sense of place” to the user, Apple has taken apart the layers that iOS accumulated over the years and rebuilt them from the ground up while adding new user features.
Look at Apple’s exploded view of iOS 7: your device and your wallpaper share the same level of importance in the OS’ new hierarchy and structure. Apps and software features like Notification Center and Control Center exist on top of the wallpaper level, and as such they inherit its most basic quality: color. In practice, this means that the simple act of changing your wallpaper will also personalize (through translucency) your experience with a device. Control Center will look different depending on your wallpaper; the numeric keypad will gain outlines based on the wallpaper’s color; in apps like Reminders for iPad, the wallpaper will influence the look of the sidebar, making an app feel like an extension of your device, which is exactly the point.
\nAnd yet, in spite of Apple’s fancy exploded view, there is one element that can overlay the system wallpaper: your contacts’ photos. This is a simple, delightful touch that I deeply appreciate. iOS 7 has new circle-shaped profile pictures for your contacts, which show up in the Contacts and Phone apps, as well as Game Center. I’m particularly a fan of the fact that your favorite contacts can have profile pictures next to their name, and I also like the editing UI for adding and cropping a photo you want to assign to someone. But more than the tools, I like that when you’re calling someone, that contact’s photo becomes the blurred wallpaper behind the keypad and Phone UI. Effectively, you’ll see a blend of blurred colors behind the keypad, but, because you know what the real contact photo is supposed to be like, the feeling of communicating with that person will be reinforced. It’s a more personal, human experience than the static, plastic, glossy buttons of iOS 6, and it’s gorgeous to look at.
\n\nThe use of translucency in operating systems isn’t new, but iOS 7 takes it to another level by combining its reliance on color and cleaner interface elements with your content to increase the sense of context that using an app gives you.
\nHere’s an example: when I’m using my iPhone, I’m typically listening to music, either on Rdio or the new iTunes Radio, while reading or just browsing around my favorite websites and subreddits. When I follow a link and Safari/Chrome launches, I don’t like watching a blank page for a few seconds, so I bring up Control Center with a swipe from the bottom edge to double check the title of a song or maybe give it a “star” with iTunes Radio’s widget. Because Control Center is a semi-transparent panel, I don’t have to look at the progress bar in the browser’s address field to know if my webpage has finished loading – I’ll just see the webpage’s content peeking from below Control Center.
\nAnother good example of how much iOS 7 values context is the new translucent bars. Again, in an app like Safari (but also the App Store and Mail) the status bar and toolbar are translucent, showing a portion of blurred content with a subtle transparency effect. This isn’t just good-looking (I personally love the effect), it’s also useful in that, if there’s, say, a photo below the text you’re reading, you’ll know before scrolling to it. iOS 7’s graphics layout engine is smart in this regard as it treats photos and text differently: photos will be blurred with their primary colors, but, in order to avoid confusion with overlapping icons and letters, if text is behind a bar, iOS 7 will automatically increase the opacity of its UI so blurred text doesn’t show through. This is the reason why, in apps like Mail and Safari, photos that underlap navigation bars and toolbars will be blurred, but text won’t show through the UI, cluttering it.
\nIn February 2012, I wrote about the problem with the iOS Home screen:
\n\nThe problem Apple needs to overcome is that the Home screen tries to be a real object while providing access to the gates of the digital world. To reinvent it, Apple needs to tear apart the whole concept and rebuild it from the ground up.
When I first installed iOS 7 three months ago, I thought that translucencies were cool, but ultimately useless, if not detrimental to the user experience. Today, I think that, if used correctly and strictly where necessary, blurs and translucency can provide utility in how they hint at content available under the interface. Furthermore, I’ve seen developers (and Apple) using blurring effects to show the new spatiality of the OS: iOS 7 and iOS 7-ready apps have a clear structure based on layers of content and interface, and these effects can help make that structure more obvious.
\nThe simplification of the Home screen that I envisioned in 2012 happens in iOS 7 with a refined layout that uses blurs, layers, and transparencies to tell you how elements are arranged in the new structure. Apple has thought this through to the smallest detail, such as making the app icon that you held to activate wiggling mode be on top of the one next to it.
\n\nWhat I don’t get is the new parallax effect. In June, it was described as a way for your device to feel “alive” in your hands by tilting the interface according to your hand’s movements, making for cool implementations such as being able to see behind the icons on top of a wallpaper. I wrote:
\n\nParallax effects haven’t been enabled everywhere in the OS yet, and I’m looking forward to understanding whether developers will have APIs to implement them in their apps. In practice, parallax was described as being able to see behind the icons while using a photo of your family as wallpaper. Simple and human.
It is indeed a neat effect that makes for a great demo to friends on a Saturday night when the alcohol intake has reached the maximum allowed for the week and you want to show them “how much phones have progressed over the years, now they follow your eyes too”. I get that. But its novelty wears off fast, and I have yet to see a practical implementation that considerably augments the user experience in a way that goes beyond the “it’s cool” factor.
\nApple has applied parallax to various iOS elements like alert dialogs, icons, badges, and it has provided developers with APIs to add parallax and other motion effects to their apps. While parallax contributes to iOS 7’s feeling of depth and layering, I think it’s mostly a gimmick, and not as effective in communicating certain aspects of the user experience as translucency, color, precise typography, or animations.
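\nFor developers curious about what those motion-effect APIs look like, here is a minimal sketch (written in modern Swift for readability; `UIInterpolatingMotionEffect` and `UIMotionEffectGroup` are the actual UIKit classes, but the helper function and the `amount` parameter are my own illustration, not Apple’s sample code):

```swift
import UIKit

// A minimal sketch of iOS 7's motion-effect API. Device tilt is mapped
// onto a view property (here, the view's center), producing the parallax
// effect the system uses for icons and alert dialogs.
func addParallax(to view: UIView, amount: CGFloat = 10) {
    let x = UIInterpolatingMotionEffect(keyPath: "center.x",
                                        type: .tiltAlongHorizontalAxis)
    x.minimumRelativeValue = -amount
    x.maximumRelativeValue = amount

    let y = UIInterpolatingMotionEffect(keyPath: "center.y",
                                        type: .tiltAlongVerticalAxis)
    y.minimumRelativeValue = -amount
    y.maximumRelativeValue = amount

    // Group both axes so they respond together as the device tilts.
    let group = UIMotionEffectGroup()
    group.motionEffects = [x, y]
    view.addMotionEffect(group)
}
```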
\nApple has built a physics engine into iOS 7. I had a feeling that Apple had an interest in making interfaces feel more human when they first showed iBooks Author, which, back then, was highly reminiscent of Mike Matas’ work on Push Pop Press (a technology that was later acquired by Facebook). Like other past experimentations that are referenced in the new OS’ design[3], iOS 7’s take on physics is a full-on assault: Apple uses its new engine in several apps, and developers have been given an API to integrate it in their own apps. This new engine has helped Apple (and will help third-party developers) create animations and transitions that give the operating system a new kind of personality, one that is perhaps less immediately visible, but more tangible when using iOS 7 apps.
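\nThat engine is exposed to developers as UIKit Dynamics. As a rough idea of how it works (a sketch in modern Swift; the class names are UIKit’s real ones, while the view controller and the box view are hypothetical), an animator attaches physical behaviors like gravity and collisions to ordinary views:

```swift
import UIKit

// A minimal sketch of UIKit Dynamics, the physics engine behind many of
// iOS 7's animations. A dynamic animator drives behaviors attached to
// plain UIViews; here, gravity pulls a box down until it bounces off the
// bottom of the screen.
final class BouncyViewController: UIViewController {
    private var animator: UIDynamicAnimator!

    override func viewDidLoad() {
        super.viewDidLoad()
        let box = UIView(frame: CGRect(x: 100, y: 0, width: 60, height: 60))
        box.backgroundColor = .blue
        view.addSubview(box)

        animator = UIDynamicAnimator(referenceView: view)
        // Gravity pulls the box downward…
        animator.addBehavior(UIGravityBehavior(items: [box]))
        // …and a collision boundary makes it bounce off the screen edges.
        let collision = UICollisionBehavior(items: [box])
        collision.translatesReferenceBoundsIntoBoundary = true
        animator.addBehavior(collision)
    }
}
```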
\n\niOS 7’s guiding principle of clarity uses, among other elements, motion to convey information and express functionality to the user. It starts with minor details: when you press the Home or Power button to wake a device, the wallpaper subtly zooms in and the current time and date fade in before the rest of the Lock screen UI. Besides looking nice, it’s a useful touch as most people tend to wake their iPhones to look at the time.
\nOn the Lock screen, two flat lines at the top and bottom indicate activation areas for Notification Center and Control Center: swipe, and the indicator starts following the panel, becoming an up/down chevron when the panel is fully revealed to communicate that you need to swipe up or down again to dismiss the view.
\nWhen you unlock a device, app icons fly onto the Home screen, revealing their new, slightly tweaked shape that follows Apple’s new grid system to achieve a more harmonious, “tappable” feeling. Many have criticized the new icons’ shape and grid, but I think they are a massive improvement over the old iOS 6 icons, which now feel rigid to my eyes. Especially in apps like iTunes Store, App Store, and Messages, the combination of Apple’s color palette, squircle, and grid makes for more fun, approachable icons.
\n\nAs you start using an iPhone with iOS 7, you’ll begin seeing all sorts of other animations and transitions that leverage the previously mentioned color, context, and content principles to tell the user what’s happening on screen in a delightful new way. When you tap an app, the transition between the Home screen and the app’s launch screen zooms into the icon; when you close an app, iOS zooms out, and the Home screen comes up again. Or when you do something as trivial as tapping the folder button to move an email message into a folder, the current message shrinks and goes into a toolbar at the top of the screen.
\niOS 7 is full of animations that contextualize actions through the use of natural motion. With a mix of color, translucency, and physics the OS’ animation system seems more human. If Apple was trying to make iOS 7 feel “alive” on the screen, I think that animations and transitions do a much better job than parallax in this regard – they are always there, always part of the system, not necessarily making (some sort of) sense only if you tilt a device. They don’t have the over the top, look-at-this-real-moving-object aspect of iOS 6’s Passbook paper shredder, but because they aren’t limited by physical metaphors, they can respond to different, varying qualities such as gravity, mass, or spring.
\nWhen Apple does try to more heavily imitate the real world, such as in the new Weather app for iPhone, both the interface and animation system are done tastefully and thoughtfully. In Weather, conditions become part of the user interface with clouds, fog, sunrays, snowflakes, hail, or snow reflecting or bouncing off text and slowly moving in the background. Again, it comes down to providing context to the user and creating software that makes information more human, easily understandable, and obvious.
\niOS 7’s animations are digital, but, under the surface, they are more real than shredders and torn bits of paper. They don’t try to replicate our physical world through pixels and photorealism – Apple is simply recreating the physical rules that govern our world. That’s a profound, far-reaching vision.
\nMy only problem with iOS 7’s animations is that some of them take too long to execute and complete, and they can become boring after a while. I agree with Marco Arment when he says that some animations are too heavy-handed and patronizing after the first dozen times you’ve seen them.
\nGenerally speaking, I think it would be better if Apple sped up the Home screen and multitasking animations. But I also think that it’s important to not lose perspective and understand that the majority of iOS users (read: not geeks and developers) won’t likely notice the extra milliseconds that Apple has built into some of the animations. In fact, I’d tend to say that most users will actually like the obvious, in-your-face animations because they make iOS feel new and different. iOS 7 being the major departure from the past that it is, it’s important to make the transitions stand out to instill the new basics of the OS in the average user. The speed of some of the animations is a bitter medicine for geeks, but a necessary means to say “this is what’s happening on screen on the new iOS”. With time, as users grow more accustomed to iOS 7, my hope is that Apple will start cutting the execution time of animations, making the OS feel snappier.
\niPhone OS 1.0 was designed for a world that didn’t know smartphones and tablets would redefine our digital lives. It was all about making touch interactions obvious through visual cues and metaphors that could easily transition the user from the physical world onto the reality existing inside multitouch screens. Without a mouse or a physical keyboard, Apple’s best decision was to make iPhone OS 1.0 futuristic and advanced but at the same time reminiscent of its apps’ analog counterparts. The Notes app was a legal pad and the OS had buttons shaped like capsules. It had to be done, and it was amazing.
\nSix years have passed since iPhone OS 1.0. The tech world has changed, design trends have matured, and Apple has kept adding feature after feature to iOS, struggling to find physical metaphors to explain additions like Passbook and the player in the Podcasts app. At what point do fun designs become an exercise in Photoshop skills and, overall, just gaudy and tacky?
\nApple finds itself in a unique position now, trying to keep the familiarity of iOS 6 while imagining the next five or ten years of the platform. Last week, Craig Federighi stressed how iOS 7 will become the most popular mobile operating system in the world thanks to over 700 million iOS devices sold to date. Even if we assume that just half of those devices will upgrade to iOS 7, that would make for 350 million users who know how to operate iOS. How do you balance the need of explaining iOS to new generations of future customers with millions of existing users who want iOS to be more useful and more delightful?
\nFrom a design standpoint, part of iOS 7 comes down to taste, and the rest is based on principles that should allow a further growth of the operating system for the next several years. I don’t think that iOS 7’s design language is misguided or poorly managed: there are some false steps and rough spots, but I think that iOS’ new design will allow Apple to be more flexible and innovative with the features they’ll add in the next few years.
\niOS 7’s new design doesn’t tell the whole story, though. In the end, people use iPhones and iPads – they don’t spend hours looking at them and over-analyzing their UI designs. It’s important to remember that, after all the discussions about typography and animations, iPhones and iPads have to work for us. They have to be useful.
\niOS 7 has powerful new features and developers tools. Aside from my personal preference for the new look, functionality is my favorite part of iOS 7. But with some caveats.
\nAs I’ve previously shared on MacStories and The Prompt, when I installed the first beta of iOS 7 I decided to run an experiment with my workflow and see if I could move my task management system from OmniFocus to Apple’s Reminders. I was intrigued by iOS 7’s new Today view in Notification Center, which gives you an overview of your day by grouping weather information with events from your Calendar and todos from Reminders. It’s like a summarized assistant, in textual form, up in Notification Center, available anywhere. It sounded like a good deal, so I figured I could try it.
\n\nIt really worked for me. I’ve come to rely on the Today view as a way to quickly see all relevant information for a day, and also a brief glimpse of tomorrow thanks to the iCloud Calendar integration. The Today view works for me for various reasons:
\nIt would be great to have third-party apps in the Today view, but, to my surprise, I’ve been fine with Apple apps in Notification Center’s Today view for a personal overview of my day and the things I have to do. Notification Center has gone from a linen-themed panel to a full-screen translucent view on the iPad, and, even if it’s not Apple’s most original design ever, it is functional.
\nBack in June and July, I had to drive each morning to a local gym for my daily physical therapy session. After a few days of sessions, iOS 7 noticed a pattern in my behavior and, without configuring anything, it started telling me (through the Today view) how many minutes it would take me to drive to the gym at a specific time between 11 AM and 12 PM. Depending on traffic conditions, iOS 7 would change from 5 minutes (normal time) to 7 or 10 minutes. I don’t know how iOS 7 figured out the traffic information (my town, Viterbo, doesn’t have traffic information in Apple Maps), but it worked and it was accurate. My best guess is that iOS 7 used the new Frequent Locations feature (available in Settings > Privacy > Location Services > System Services > Frequent Locations) to understand my driving behavior, driving times, and daily patterns to improve the information it would feed to Notification Center and the Today view.
\nI don’t understand why Apple didn’t go the extra mile and enhance every calendar event that contains location information with the weather and directions. Apple knows how to triangulate and parse this data, because tapping on a location in a Calendar event takes you to the Maps app that displays directions (either driving or walking depending on your default preference, a new feature of iOS 7), and looking up a location in the Weather system should be trivial. Most of all, OS X Mavericks’ Calendar will come with exactly this functionality. Instead, iOS 7’s Calendar requires you to jump to the Maps app and displays no weather data, while the Today view can learn from your patterns but can’t display a summary of directions and weather forecasts for each event. It seems like a missed opportunity, especially because, again, Apple is doing it in Mavericks.
\nEven more surprising is the decision to completely hide all-day events from the Today view. I typically create all-day events for days when I know something important will happen and require my complete attention for several hours (a monthly check-up with my doctor, or a big app release), and they are hidden from the Today view because Apple said so. I don’t see any reason why all-day events – which, as the name suggests, are important! – shouldn’t appear in a view called Today. It doesn’t make any sense.
\nI like iOS 7’s Today view in Notification Center, but it would be much better and more useful if it handled all-day events and embedded weather and driving information next to each event. I could understand Apple’s willingness to restrict Today summaries to built-in apps only, and I could even see why adding weather and directions could clutter the UI[4], but the lack of all-day events is a silly choice.
\nAs a consequence of my increased usage of iCloud Calendar and Reminders, I’ve tried to live with Apple’s native Calendar and Reminders apps. Calendar doesn’t work for me, but I like some things about the Reminders app.
\nOn the iPhone, the Calendar app starts with an elegant year view that lets you drill down into single months and then days with transitions that are smooth and fast. The animations help contextualize the action of entering a different view by using a zoom in/out effect that is consistent with the in/out transition of the Home screen and app icons. There are delightful touches such as the way a month’s name follows you along from the year layout to month view or how a selected day shifts the entire week to the upper portion of the screen, revealing a day’s list of events in the bottom half. Days with events have gray dots, there is a Today shortcut, and you can swipe horizontally to switch between days. The app is extremely polished, focused, and precise.
\n\nThe problem is that, besides good looks, I don’t like the way the Calendar app works for me. First off, it doesn’t support natural language input like Fantastical and Calendars 5, and I can’t stand adding new events by having to tap menus and operate spinners to set locations and duration. Second, unlike Fantastical’s excellent DayTicker, Apple’s Calendar for iPhone shows a single day view for each day – if you have an empty day, you’ll see empty hours. The view that I like – a list of upcoming events in chronological order, with no empty days or hours – is available by tapping the search icon, which is inconvenient to do every time. Fantastical and Calendars 5 start with a list of all my upcoming events (and, in Calendars 5, reminders); in Apple’s Calendar, I have to remember to tap the search icon to see that list, and there is no natural language support. iOS 6 displayed the List view as a tab in the bottom toolbar, and I think Apple should add it back in that original position (possibly giving the option to set it as default launch view).
\nIt doesn’t get any better on the iPad. On the iPad mini, the app doesn’t come with the playful transitions of its iPhone counterpart, and it adds a Day view that, in a larger layout, lists a single day’s worth of events…with empty hours and empty days if there are no events. The list view is tucked away inside a popover. On iOS 7 for the iPad, popovers are generally white and borderless, which often makes it hard to distinguish them from the rest of the (mostly white) interface. Popovers are a fantastic piece of UX on the iPad, but I feel like iOS 6’s popovers, with borders and well-defined bounds, were easier to separate from the content underneath.
\nOverall, I’m much faster in adding and managing events with Fantastical, Agenda, and Readdle Calendars 5, which may not be as good-looking as Apple’s app, but at least they are more efficient for a power user like me. For people who usually add a couple of events per week I think that Apple’s app will be fine, but for everybody else I would recommend, like on iOS 6, looking for a third-party alternative.[5]
\n\nThe Reminders app is good, and, for me, it comes with a minor but annoying issue. On the iPhone, the app gets rid of leather but keeps the paper texture to display lists as cards that you can tap to open, close, or swipe to take a peek at the first item of a list when they’re closely grouped together. The lists’ names use Apple’s new API to have a nice letterpress effect, and the system wallpaper acts as background in the app, which is a nice touch for user personalization. Thanks to the cards layout, you can tap & hold a card and rearrange it vertically just by dragging it. On the iPad, Apple eschewed the stacked cards representation and opted for a more classic, but convenient split layout that shows lists in a sidebar on the left.
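\nAs a technical aside, the letterpress effect Apple uses here is exposed to developers through attributed strings. Here’s a minimal sketch (in modern Swift; the attribute names are UIKit’s real ones, while the string and label are purely illustrative):

```swift
import UIKit

// A minimal sketch of iOS 7's letterpress text effect. The effect is not
// a view property: it's applied via an attributed-string attribute, so
// it can be mixed with fonts and colors on a per-range basis.
let title = NSAttributedString(
    string: "Groceries",
    attributes: [
        .font: UIFont.preferredFont(forTextStyle: .headline),
        .textEffect: NSAttributedString.TextEffectStyle.letterpressStyle
    ]
)
let label = UILabel()
label.attributedText = title
```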
\n\nThe options for creating and managing reminders are unchanged, but there is a handy addition for location-based reminders: you can tweak the radius of a location. The “When I arrive” and “When I leave” settings are now displayed on top of a map view that lets you hold your finger on screen to change the radius of a location’s trigger from 100 meters up to kilometers. If you enlarge the radius far enough, you can create location reminders that will alert you when you leave the country.[6]
\nThe reason I truly like the Reminders app is the Scheduled view, available by tapping the alarm clock icon that resides in the top right corner on the iPhone, or at the bottom of the sidebar on the iPad. The Scheduled view gives you a simple, direct visualization of all your reminders that have due dates, sorted by chronological order from top to bottom. It gathers reminders from all lists, and it’s a great way to see all your reminders in one place.
\n\nAlso worthy of mention is iOS 7’s alert dialog for due reminders. When a reminder’s alert fires off (possibly using one of iOS 7’s new, futuristic, beautifully synth-based sounds), you get a dialog with two buttons: Options and Close. The Close one is displayed with bold text, making it the default choice for most users – close it, complete the reminder, then open the app or the Today view and check it off. But if you tap Options, you get an expanded dialog with shortcuts to be reminded again in 15 minutes, mark the reminder as completed right away, or open it. This is a superior design that speeds up the process of acting on alerts, and I wish Apple did it with more kinds of notifications.
\nMy complaint about the Reminders app is that it doesn’t support tappable URLs. Using Reminders as my main todo system and writing for the web, several of my reminders include URLs in the notes, and iOS 7 doesn’t let me tap them to open them in the browser. To add insult to injury, Apple knows how to do URL matching, because URLs are tappable in both the new Notes app and Calendar. For tappable URLs, I’d recommend using something like Agenda (which also lets you open links in alternative browsers like Chrome), or Due (which has its own sync system, but comes with many other nice touches).
\nSpeaking of the Notes app: it retains the essentiality of its iOS 6 ancestor, it gets rid of leather and yellow paper, and it murdered Marker Felt with the fury of a letterpress machine (the letterpress effect is the same as Reminders’). What I don’t like is that a note’s title isn’t repeated in the title bar, which makes it easy to lose context in longer notes. It supports iOS 7’s new Back gesture for navigation and AirDrop, but, in my tests, AirDrop led to duplicate notes on the receiver’s end, which wasn’t cool. I think that some of the Notes animations, especially on the iPad mini, are a bit rough and unfinished (such as swipe to delete).
\nI haven’t been able to use AirDrop much in real-life scenarios, but, from what I could test in my limited home environment, I think it’ll be a great addition for peer-to-peer sharing that will obviate the need for cumbersome solutions like Dropbox and Mail when you just need to share a document or piece of data with a friend or colleague next to you.
\nLike its OS X counterpart, AirDrop for iOS uses an encrypted, WiFi ad-hoc communication to share files between compatible devices nearby. AirDrop is supported only on Apple’s most recent devices like the iPhone 5 and later, iPad 4th gen, iPad mini, and iPod touch 5th gen. In iOS 7, AirDrop settings live in Control Center, where you can tell iOS 7 to make yourself visible through AirDrop to everyone nearby, just contacts, or nobody. If you choose the Contacts setting, only people who are in your contact list and using an iCloud account will show up.
\nUsing AirDrop is extremely easy and I believe it’ll supplant awkward web-based sharing solutions for things like photos and URLs. Available by default in the new share sheet, people visible through AirDrop show up with their profile pictures; tap one (or more) and they will get a request to accept what you’re sharing; when done, the receiver gets the file or information opened in the default iOS system app, and you’ll see a progress bar filling the outline of that person’s avatar, followed by a “Sent” label in the share sheet. Everything is familiar if you’re coming from OS X, and even if you’re not, it’s easy and intuitive once you’ve tried it a couple of times.
\n\nI found some interesting touches in iOS 7’s AirDrop implementation worth noting. AirDrop’s alert dialogs can contain inline pictures, which is a neat way to see a preview of the file you’re receiving (such as a photo) directly in the AirDrop confirmation dialog. So, say you’ve left AirDrop visible to everyone in a public place and someone tries to send you an inappropriate photo: you can see a preview of the photo before accepting, so you can decline the request and turn off AirDrop. Apple’s inline preview system is well done, as it supports snapshots for web pages shared from Safari, icons for App Store apps, and a screenshot of a location’s view for Maps sharing. It’s a nice addition on a technical level, which ends up being a practical implementation to enhance AirDrop’s security and user experience.
\n\nBy default, AirDrop tries to open a received file or bit of data in the system app that is associated with it. So, for instance, a photo will be received and added to the Photos app, an app’s link will open in the App Store, a map in Maps, and so forth. For files and data that AirDrop can’t launch in a default app, however, Apple added an instance of its Open In menu to the AirDrop confirmation dialog. In iOS 7, the Open In system hasn’t been redesigned, and it lives in the share sheet in the form of application icons. When sharing a document like a .txt file through AirDrop, iOS 7 will ask you to accept the file, and then choose an app to open it with. I would have preferred an Android-like option to always default certain file types to a specific app, but I’ve learned a long time ago not to expect this sort of feature from Apple (it’d always be welcome, though).
\nI’ve always advocated for a version of AirDrop for iOS devices, and its implementation on iOS 7 doesn’t disappoint. Within the existing limitations of iOS that haven’t been addressed in 7.0 (Open In system, lack of user-configurable default apps) AirDrop “just works” thanks to peer-to-peer sharing that is fast and doesn’t require passwords or uploading to cloud services. I would like to see simplified management of the “contacts-only” setting, but, otherwise, I think that AirDrop sharing will immensely improve things like local photo and video sharing for everyone.
\nThey’re not actionable like some of their Mavericks counterparts, but I’ve found myself liking iOS 7’s new banner notifications. They haven’t received any sort of new functionality that iOS 6 didn’t have before, but they’re now translucent and, for apps that have been designed following Apple’s guidelines, they’ll cover up the exact upper portion of an app where the status bar and navigation controls should be. You can really tell when an app hasn’t been designed using Apple’s advised size for the navigation bar because of how banner notifications will cover parts of the interface they shouldn’t cover.
\nOne minor addition that I do appreciate is that you can pull down a banner notification to reveal it in the full-size Notification Center. It makes for a neat way to see a single notification with more context in regard to others that were sent by the same app; if you want to immediately dismiss a banner notification, you can swipe it up and it’ll quickly go away.
\nI haven’t been using the Missed view of Notification Center at all. According to Apple, the new view is supposed to show you only alerts that you haven’t addressed in the past 24 hours, but, in practice, I always ended up opening the default All view and cleaning notifications from there.
\nWhich leads me to my two personal favorite features of iOS 7’s Notification Center: sync and Lock screen access. The latter is obvious and convenient: you can now pull down from the status bar in your device’s Lock screen to access the real Notification Center, so if you want to manage your missed alerts from the Lock screen while waiting in line at the grocery store, you can now do that. But that’s not the best part of Notification Center.
\nWhat I found truly great is notification sync through (what I assume is) iCloud. Imagine this: you receive a text message on your iPhone, but because you also have an iPad, you get it on that device as well. Now you have the same notification on two devices. Prior to iOS 7, you’d have to manually address the notification both on the iPhone and the iPad. No more. On iOS 7, once you’ve addressed a notification on one device, it syncs to the other device automatically, removing the notification from Notification Center. Try it with the Messages app: get a notification, read it on one device, watch what happens to that notification on the other device’s Notification Center. You’ll see it disappear with no manual intervention.
\nNotification sync is amazing if you, like me, rely heavily on apps that can send a lot of notifications on a daily basis (like Messages) for daily communication needs. It’s the way Notification Center should have worked from the start, and I can’t go back to a system that doesn’t sync notifications across devices. I don’t know if only apps built with the iOS 7 SDK will be able to take advantage of this feature, but I wasn’t able to sync notifications for iOS 6 apps like Mailbox and Dropbox on my iOS 7 devices.
\nI’ve also found myself interacting with my iPhone using gestures rather than buttons – more than I used to with iOS 6. Control Center has been a fantastic addition for me, if only to access the Flashlight, the music playback controls, and Bluetooth/WiFi shortcuts. I wish that iOS 7 let me activate the Flashlight during a FaceTime call (it would be useful for showing things to people in a dark room), but, in general, I’ve come to rely on Control Center so much that it seems crazy to think iOS didn’t have this functionality before. The only gripe I have with Control Center is that it’s hard to activate when the keyboard is shown on screen, because you’ll end up inadvertently hitting some keys before the panel comes up. I think gesture recognition should be improved for when the keyboard is visible.
\n\nAside from Notification Center, the other feature activated with a pull-down gesture is Spotlight. Previously confined to a separate page of the SpringBoard, Spotlight is now available by pulling down on any Home screen page. If you’re on screen #2 and want to search for something, you can do it. I believe this is a better design than iOS 6’s, as it makes Spotlight more accessible without a trip to a separate area, albeit certainly more hidden for first-time users, because there is no indicator that Spotlight is available by swiping down.
\nThanks to Control Center’s Camera shortcut, I’ve taken a lot more photos and selfies. The Camera app has been redesigned in iOS 7 with a black interface built around swiping between four camera modes: Video, Photo, Square, and Pano. HDR, Flash, and the switch button are still available at the top, the Camera Roll is at the bottom left, and a new button in the bottom right lets you access live photo filters.
\nFilters and Square are really meant to complement each other for Instagram users and people who like to apply filters to photos without Instagram.[7] I am not an expert in photography or filters, and I never use them in my photos, but I guess it’s nice that the grid displays them in real time, and that the Photo and Square modes can have separate filters: set a filter for Photo, another for Square, quit the Camera, launch it again, and each mode will still have the filter you chose. When a filter is active, the button is colored (as opposed to desaturated).
\nI’m conflicted about the photo-taking experience. Swiping between modes is a better solution than iOS 6’s various buttons, but the app needs more visual feedback when pressing the shutter. In iOS 6, you’d get a sound and an animation mimicking a real camera’s shutter; in iOS 7, the eschewal of real-life objects has led Apple to replace the shutter animation with a brief flash of the screen accompanied by the same sound. The problem is that the new flash animation, which lasts less than a second, is easy to miss, and if you’re in a loud public place you’ll miss the sound too. The result is not knowing whether you took a picture, taking another one “just to make sure”, and ending up with duplicate photos in the Camera Roll. This is what happened to me, and I’ve heard the same complaint from other users as well. I think this is bad design in the name of change, and I hope that Apple will return to a more obvious camera animation.
\nI am, on the other hand, a fan of the new Photos app. Photos (either from your device or other devices’ Photo Streams) are available in a single Photos tab that organizes items by Years, Collections, and Moments. The last two are smart groupings that divide photos that were taken in different places while still sorting them by time, filtering down by single days when you reach the Moments view.
\n\nMoments, in particular, are more effective than a simple vertical list of photos (what the app used to be) because they provide a logical organization of your photos without you having to do anything about it: your device’s camera already has the time and location information to do the heavy lifting for you. You can tap on the location to view photos on a map (make sure to pinch the photos on the map view to see some cool animations), or, better yet, share a specific moment (or selected photos inside a moment) to Facebook or iCloud.
\nThe new iCloud shared streams are good, and I plan to use them with my family a lot. In iOS 7, you can create a private photo stream shared with selected users, and everyone will gain the ability to upload photos and videos to the stream if you enable the setting. Then, every user will be able to like photos, leave comments, and members of the stream will receive a notification every time there is activity. To catch up on recent activity in a stream, there is an aptly named Activity view. Streams can be published on the web at a public iCloud.com webpage so users who don’t have iOS devices will be able to view them. Overall, it’s a useful and intuitive functionality that I will use with my parents to let them see what I’m seeing on a vacation or a particular day without having to rely on email or message threads. That’s a powerful idea, beautifully developed in iOS 7.
\nMy opinion of Maps has only slightly improved from last year. The app has been redesigned with a white theme, but the map views have stayed the same. For my area, there is still no Flyover or 3D support, but there seem to be more recent businesses listed in the search results. However, the app is still inferior to Google Maps when it comes to parsing search queries and finding results, sometimes bringing up results that are in a different region because it isn’t as smart as Google at matching my input. The app also often picks routes that aren’t the best ones available, and, as I noted last year, voice navigation still uses the system’s language, not the Siri language, which is, in my opinion, a bad decision (for voice features, iOS should pick the language the user sets for Siri, not the interface language). There is a night mode now, but I can’t recommend Maps because of a color theme. For me and for my area, Google Maps remains superior for search results, quality of voice navigation, listed businesses, and traffic information.
\nSiri, a feature that I didn’t initially like and that I’ve criticized on multiple occasions, is much improved in iOS 7. I’m actually using Siri quite a bit more now, and I was surprised by the quality of the Italian voice, its increased speed, the clean new design, and the new functions. Notably, Siri is now a black translucent panel like Notification Center, showing light text on a dark background. There still isn’t a live text transcription akin to Google’s, but at least there is more immediate visual feedback with an audio waveform. In the past few weeks, Siri for iOS 7 has been much faster than its iOS 6 counterpart, and I wonder if this is the reason Apple is now confident enough to say Siri is out of beta.
\nI have noticed that Siri has gotten better at understanding the Italian language as well. The assistant is more capable when it comes to pronouns and subordinate clauses, although it still struggles with conjugations and more advanced sentence constructions (that’s the more advanced stuff though, and it’s understandable). The new commands that Siri supports in iOS 7 are useful: you can change settings, get and return missed phone calls, see what’s playing in iTunes Radio, and tell Siri how to pronounce your name (it did get Federico right on first try, to be fair).
\n\nThe best addition, though, is integration with Wikipedia. When I first demoed Siri to friends two years ago, they would always try to ask common questions like “how many people live in Italy” or “what is a pizza”, and Siri would provide a shortcut to a web search because it didn’t know how to parse that information. With Wikipedia information, you can now run Q&As within Siri, asking the most disparate questions and getting spoken results back with inline text and image previews directly from Wikipedia.
\nAs MacStories readers know, I never used Apple’s Music app much because all the music I need is on Rdio. I only keep a couple of albums in the Music app, and that’s about it. With iOS 7, the Music app has gotten a visual refresh that, to my inexperienced eye, looks mostly good, especially in the new “wall of artists” landscape mode.
\n\nTo my surprise, I’ve been liking iTunes Radio, Apple’s new Pandora/Rdio-like feature to serve up stations of songs that are chosen by iTunes and based on your tastes. You can create stations starting from a specific artist, song, or genre; stations are synced across devices with your iTunes account; and, you can tell the app to play more songs like the current one, never play the current song again, or add it to the new synced wish list of iOS 7. iTunes Radio is, essentially, a gateway to purchase more content on the iTunes Store, which Apple makes easy to do by prominently featuring a Buy button in the title bar.
\n\nBesides Apple’s strategy, though, iTunes Radio is quite good. The algorithm is accurate thanks to Apple’s years of experience in analyzing customer purchases and aggregate listening habits through the Genius feature of iTunes, so, as you keep liking/disliking songs and tuning stations toward hits, variety, or discovery mode, iTunes Radio will learn from you and present songs that it knows you’ll like or at least not hate. In my three months of testing, my Oasis and Bloc Party stations have turned into an endless stream of songs that I really like, so I guess that’s encouraging. I’m not moving away from Rdio, but I enjoy listening to Apple’s iTunes Radio every once in a while, and I think that a lot of people will like it (especially the free version, where ads aren’t that intrusive).
\niOS 7 hasn’t negatively impacted the way I work from my iPad and iPhone. Some changes have made my workflow better, there are some annoyances with the OS’ performance, and, for everything else, I’ll have to wait for third-party developers to update the apps I use.
\nI haven’t switched from Google Chrome to Safari as my main browser, primarily because Safari still doesn’t come with an inter-app communication system like x-callback-url, which I’ve come to rely upon. There are, however, some changes that I truly like in Safari: most notably, the address bar is now a unified URL + search field, and, on the iPhone, bookmarks are available as website apple-touch-icons right below the address bar when you select it. If you have a lot of bookmarklets installed in Safari, it’s very convenient to be able to tap the address bar and launch one. This is a much better bookmarks menu than iOS 6, and it works especially well in conjunction with Safari’s new Shared Links feature, which collects all links shared in your Twitter timeline, right in Safari. I like to open Shared Links, find something, then send it to other apps with a bookmarklet.
\n\nAnd yet, in spite of Safari’s clean new UI, its Reader and Shared Links features, and the new bookmarks menu, I can’t bring myself to leave Chrome. Google has done an excellent job with the x-callback-url integration between its apps and third parties, and I find Google’s sync superior to Apple’s (faster, more reliable), even under iOS 7. The new tab carousel view in Safari for iOS 7 is beautiful, but not practical for me: I open a lot of tabs on a daily basis, and to close them in Safari for iOS 7 I have to swipe them away, remembering that only swiping to the left is supported.
\n\nIf you want to achieve some sort of “close all tabs” feature, there is a workaround to enable Private mode and disable it right away, so that all your tabs will be closed. Compare this to Google Chrome, which automatically closes a tab that was triggered by x-callback-url when you go back to another app (like Tweetbot) and that lets you swipe tabs away in both directions. I won’t get the speed improvements, text selection support on iPad, and better bookmark access on iPhone, but my workflow depends on Chrome’s inter-app communication/sync and I can’t switch to Safari. Plus, I’m curious to see what Google will do with Chrome and iOS 7-specific features.
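\nFor readers unfamiliar with the scheme: an x-callback-url is just a custom URL with a standardized path and a set of x-parameters that tell the target app where to return when it’s done. As a minimal sketch (Chrome’s `googlechrome-x-callback` scheme and `/open/` action are real; the `tweetbot://` success URL is only an example of a calling app):

```python
from urllib.parse import quote

def chrome_open_url(url, x_success=None):
    """Build an x-callback-url asking Google Chrome for iOS to open a page.

    'googlechrome-x-callback' is Chrome's x-callback-url scheme; the
    optional x-success parameter is the URL Chrome opens when the user
    taps the back button, returning them to the calling app.
    """
    callback = ("googlechrome-x-callback://x-callback-url/open/?url="
                + quote(url, safe=""))
    if x_success:
        # Percent-encode the callback too, so its "://" doesn't break parsing.
        callback += "&x-success=" + quote(x_success, safe="")
    return callback

print(chrome_open_url("https://www.macstories.net", x_success="tweetbot://"))
```

Percent-encoding both URLs matters: a raw “://” inside the url parameter would confuse the receiving app’s parser.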
\niOS 7’s App Store is a huge improvement over the performance-plagued mess that was the iOS 6 App Store. From last year:
\n\nI’m not alone in thinking the App Store on iOS 6 needs serious improvements before being ready for an optimal and stress-free customer experience. As the app is an HTML wrapper for content and styles Apple can fix “server-side”, there’s hope Apple will push fixes to the App Store without software updates. For now, however, the new App Store feels like a step backwards in terms of speed and reliability, and this is not good news for third-party developers who need users to pleasantly browse and discover software through Apple’s storefront.
\nIn iOS 7, the App Store is now native and, as such, it comes with smooth scrolling, fast opening times for app descriptions, sections, and charts, a finally usable Purchased tab that loads thousands of apps in seconds, and an overall unprecedented fluidity. I am serious when I say that the new App Store app is one of my favorite additions to iOS 7: I browse the Store to find apps, see sections, and read updates on a daily basis, and the iOS 6 App Store was terrible for that. The layout hasn’t changed, but the App Store is now a proper Apple app.
\n\nI still think several aspects of the App Store could be improved, though. Search, for instance, is still based on a card layout and a ranking algorithm that, supposedly, Apple has been improving to deliver more accurate and genuine results. The persistence of cards is interesting, as the Top Charts have switched back to the original, pre-iOS 6 vertical layout, and I would like to see Apple make the same change to regular search too. As I argued in July, the new Near Me feature sounds good for populated areas and for tourists who visit a lot of attractions or attend events, but, in practice, it’s always empty for me. When it does have something, it features my local newspaper’s app, which is terrible. I don’t think that Near Me will prove to be a substantial addition to the App Store in the next few months; it seems like the sort of feature built for tech journalists who live in San Francisco and always try new apps, not “normal” folks who commute each day from home to work. It’s a neat idea, but ultimately useless in my opinion, and I think that Apple should consider replacing it with a more flexible Genius feature (this time, one that actually works).
\niOS 7 now lets the App Store automatically update apps for you, and, overall, I like this addition. It can be disabled from Settings > iTunes & App Store, but I’d suggest leaving it on to avoid the stress of seeing a red badge on the App Store icon and having to remember to update apps manually. It’s just too convenient to always have your apps up to date, with iOS 7 taking care of downloading updates in the background and letting you know in Notification Center which apps were updated.
\n\nAt the same time, I’d caution against automatic app updates if you’re the type of user who wants to know exactly what kind of update you’re about to install: if you don’t like it when developers remove a feature you really like from your favorite app, it may be better to disable automatic updates and retain control over how apps are updated. Ideally, Apple could add more granular controls to disable automatic updates for specific apps, but I understand why they simply added an on/off switch in iOS 7.0.
\nI don’t use Apple’s Mail app because I need the push notifications of the Gmail app and Mailbox (as well as the ability to open links in Google Chrome), but the Mail app has received other nice additions in iOS 7. First off, the app supports Dynamic Type, Apple’s new system-wide control for font sizes. Configurable in Settings > General > Text Size, Dynamic Type allows apps that have been built with the iOS 7 SDK (and that support the feature) to intelligently scale text at various sizes automatically, without developers having to care about advanced attributes like kerning and ligatures because Apple’s iOS 7 typographic engine will take care of them. Obviously, the Mail app takes advantage of the feature, so you can adjust the text size to your liking in the Settings, and you’ll get a bigger or smaller size in Mail messages.[8]
\nSearch in the Mail app still isn’t as fast as Mailbox’s, but Apple added a feature to create custom mailboxes on the main screen of the app to facilitate the process of, for instance, looking for a specific message in All Mail. Just tap the Edit button in Mail, then Add Mailbox, and choose Gmail’s All Mail folder as the root folder for the new mailbox. You’ll get a custom mailbox on Mail’s initial screen, from where you can start a new search that looks through every message of your account. Apple also includes built-in shortcuts for messages that are CC’d to you, or ones that contain attachments.
\nFor power users, Apple added support for message URLs in iOS 7. A feature that OS X has supported since Leopard but that few people know about, message URLs allow you to open a specific message in Apple’s Mail app using the message:// URL scheme. MacStories readers may be familiar with these URLs because I rely on them for some of my Evernote scripts and automation tools.
\nIn iOS 7, if you have a message URL that corresponds to a message, the URL will correctly open it directly in Mail. There are two limitations: the message has to be already downloaded in the Mail app, and, of course, you have to know the URL. So far, I haven’t found a way to create URLs to reference Mail messages on iOS, but the ones you create on your Mac through AppleScript and Mail.app will continue to work on iOS 7 devices. Therefore, if you have scripts that generate these URLs to, say, attach them to OmniFocus or Evernote, you’ll be able to tap them and open the associated message on an iPhone or iPad. I look forward to seeing whether developers will figure out a way to generate message:// URLs on iOS.
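\nThe format of these URLs is simple enough to sketch: a Mail message URL is the message’s Message-ID header, angle brackets included, percent-encoded after the message:// scheme (the same form the common AppleScript workflows produce on OS X). A minimal illustration, with a made-up Message-ID:

```python
from urllib.parse import quote

def mail_message_url(message_id):
    """Turn a Message-ID header value into a Mail.app message:// URL.

    The URL is the percent-encoded Message-ID, angle brackets included.
    Header values may arrive with or without the surrounding <>.
    """
    mid = message_id.strip()
    if not mid.startswith("<"):
        mid = "<" + mid + ">"
    return "message://" + quote(mid, safe="")

# A hypothetical Message-ID, for illustration only:
print(mail_message_url("ABC123@mail.example.com"))
# message://%3CABC123%40mail.example.com%3E
```

Tapping a URL in this form on iOS 7 opens the corresponding message in Mail, provided the message has already been downloaded on the device.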
\nI spend most of my time writing and taking notes on iOS, but, unfortunately, Apple hasn’t made changes to iOS’ text editing and selection mechanisms, which, in my opinion, are showing their age. I still find it utterly cumbersome to place and move the cursor on iOS 7, and, due to the keyboard’s new lighter “glass” design, I think the keys have lost too much contrast, making it harder to quickly peek at the keyboard if you, like me, can’t type without looking every once in a while. Personally, I still get confused by the Caps Lock key, which has lost the white glow and blue highlight of iOS 6 in favor of a more subdued black outline and white key color.
\n\nI wasn’t sure about it initially, but I now think that the new multitasking view of iOS 7 is a good step forward. iOS 7 eschews app icons in the multitasking tray and brings up full app previews when you double-click the Home button to switch between apps. While fewer apps are shown on screen (both on the iPhone and iPad, you get 3 icons/previews), the new multitasking view gives more context as to what you’re really switching to, the state in which you left an app, and new content available in an app. In iOS 7, in fact, apps can update their latest snapshots in the background if something has changed; an obvious example is that you can quickly look at the multitasking view to see if you’ve received a new message without opening the Messages app. While the animation of the multitasking view is slower than iOS 6’s, the slightly reduced speed is compensated by the increased convenience of switching to an app by tapping the large snapshot in the middle of the screen. One small touch that I like is that iOS 7’s multitasking puts the most recent app in the middle of the screen, which is the most comfortable area to tap, so switching back and forth between two apps is easier. The only problem I’ve encountered is that, due to the iPad mini’s limited amount of memory, apps are often closed in the background, and the multitasking view gives you the illusion of switching to a “live” app when you’ll actually be forced to wait for the app to relaunch.
\nIn iOS 7, removing an app from the app switcher (by swiping up) stops its background service. The new Background App Refresh system is one of my favorite iOS 7 features and one that I’ve long wished for. I haven’t been able to try many third-party apps with support for background refresh, but, based on what I’ve seen so far, I believe it’ll be an excellent addition to my workflow with minimal impact on battery life.
\nIn iOS 6, it used to be that apps could transfer data in the background only for a limited amount of time (usually 10 minutes) and receive push notifications for new content available remotely. In iOS 6, if Instacast started downloading podcast episodes and 10 minutes passed, you would receive an alert that required you to open the app again or the download session would time out. In OmniFocus, you couldn’t get the app to always be in sync with the Omni Sync Server in the background. In Instapaper, Marco Arment had to come up with a workaround to make the app fetch your latest articles when the user’s location changed, not any time there were new items in your account.
\nIn iOS 7, the system has been completely overhauled. Through a new Background Transfer service, apps can now enqueue multiple downloads in the background, leave them running, and the OS will keep them going even after an app is closed or a device is restarted. There are no time restrictions, so you don’t have to launch apps every 10 minutes – you can just leave them in the app switcher, and they will always download new content for you, in the background, automatically. This is great for apps like Pocket Casts and Instacast, which can now always present your up-to-date content without any additional manual operation.
\nIn general, through background fetch, developers now have a way to periodically launch apps and check for new content. When using background fetch, iOS 7 opportunistically determines when it’s appropriate to check and fetch new data: by analyzing user patterns and other criteria like battery life, network, and location, iOS 7 can pick the best time to run the background fetch, coalesce fetches from multiple apps, and then use the Transfer service to download files in the background if necessary. If new content is critical and of immediate importance, developers can send a remote notification (optionally making it a silent one, so a device won’t ring) to tell iOS 7 to fetch updated content right away.
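\nA silent push of this kind is, at its core, just a flag in the notification payload. As a sketch (the “aps” dictionary and “content-available” key are part of Apple’s push notification payload format; the custom “article-id” key is a hypothetical example of app-specific data), a silent payload looks like this:

```json
{
  "aps": {
    "content-available": 1
  },
  "article-id": "12345"
}
```

Omitting the usual “alert”, “sound”, and “badge” keys is what keeps the device quiet; the “content-available” flag is what tells iOS 7 to wake the app in the background so it can fetch the new content.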
\nFor developers, Apple is now offering proper tools to manage background updates and downloads by making iOS do the hard work of checking device and network conditions. On the user’s side, what I have noticed with the handful of iOS 7-ready apps I could try is that, as long as you keep apps in the app switcher, everything seems to “just work”. It’s great to be able to launch Pocket Casts and find your latest episodes waiting for you without having to wait for a refresh.
\nI’m excited to see how developers will integrate with the new background refresh APIs to make switching between apps and working with them even more seamless and intuitive. I can’t wait to see the day when Day One will always fetch my latest diary entries, when Tweetbot won’t force me to reload timelines and DMs, or when Evernote won’t have to sync every time I launch it.
\nIn my experience, the iOS 7 Golden Master seed that Apple released on September 10th has been more problematic than the GM seeds for iOS 5 (2011) and iOS 6 (2012). Assuming that the version Apple released today is the same as the Golden Master released to developers a week ago, I can say that iOS 7 has been fairly smooth and fast on my iPhone 5, with occasional system crashes and reboots. On the iPad mini, I have experienced several hiccups in terms of animations and overall performance, with SpringBoard crashes, reboots, and hard resets.[9]
\nIn the past week, I’ve had two random crashes of the SpringBoard on my iPhone 5, and one when using the Phone app to make a phone call. In my workflow, I haven’t noticed any other reproducible serious bugs that prevented me from using my iPhone 5 on a daily basis. On the iPhone 5, animations are, in general, fluid and smooth, if only a bit too slow in certain areas. I can safely recommend updating an iPhone 5 to iOS 7, keeping in mind that, in this first build, crashes and reboots may occur occasionally.
\nI would have given some extra weeks of development to iOS 7 for the iPad. On my iPad mini, aside from an overall inferior level of polish of the interface, graphical glitches, UI inconsistencies, and slower performance, I’ve seen Home screen crashes and hard resets, which forced me to re-configure my iCloud account and Apple ID. I still managed to get my work done on my iPad mini when I was writing in Editorial, but issues were frequent when moving across multiple apps.
\nBoth on the iPhone and iPad, I’ve experienced issues with Reminders synchronization, with completed or new items not syncing changes back and forth, even after I deleted my iCloud account and added a new one. Reminders sync worked mostly well when using Reminders on just an iPhone and iPad, but it started presenting issues when configuring Reminders for OS X and for the new iOS 7-inspired iCloud.com website with my iCloud account.
\nIn many ways, iOS 7 for iPad feels more “beta” than iOS 7 for the iPhone. There are a few rough spots with design issues and inconsistencies on the iPhone, but, in the iPad GM release, there are design problems that go beyond performance.
\n\nMany areas of the OS and Apple apps feel like scaled-up pieces of the iPhone UI that don’t take advantage of the iPad’s larger canvas, such as folders, which are now full-screen and paginated but present the same amount of apps at once as the iPhone. Notification Center doesn’t try anything clever with the new full-screen view and the extra space provided by the iPad’s screen. The Music app is a larger replica of the iPhone’s version, not attempting more audacious features such as inline album expansion a la iTunes 11, but instead spacing out large rows of whitespace and forcing users to navigate inside multiple screens.
\nThe spatiality of apps in the multitasking view is broken by the iPad’s multitasking gestures: while I wouldn’t make a big deal of the fact that the three-finger swipe up is no longer mapped to an actual direct slide-up of the interface (as it was on iOS 6), the four-finger and five-finger horizontal swipes to move between apps are just downright confusing when compared to the multitasking view’s layout. When swiping to the right, you’re taken from App A to App B, and you can swipe to the left to return from App B to App A. In the multitasking view, however, App B takes App A’s position after the first switch, and vice versa. In my opinion, this is confusing only as a high-level design consideration (i.e. the gesture works, and most people won’t care), but it shows that it’s time for Apple to rethink several of its old UX and UI choices on the iPad.
\nI would have liked to see Apple experiment with brand new ideas for iPad interfaces and gesture interaction. My hope is that Apple will iterate quickly with 7.0.1 and 7.0.2 releases containing bug fixes and performance improvements, saving possible design and interaction improvements for 7.1 and future major updates.
\nToday, you’ll read many stories saying that iOS 7 is iPhone OS 1.0 for the modern age: a reset of the iOS design language, a fresh start for Apple, and a new opportunity for all developers. To an extent, that’s certainly true. But such a statement can only be representative of reality if the history of iOS 7 is also properly contextualized. iOS 7 is the new iPhone OS 1.0…with six years of iOS software changes and advancements behind it. That’s a big difference.
\nI think that Apple’s greatest accomplishment is making iOS 7 feel new and cleaner without losing any of the powerful functionality that allows people to use iPhones and iPads as computers. In spite of having more features than iOS 6, Apple’s new design language and structure get rid of much UI cruft and make iOS more organized, precise, fluid, and less rigid. When I think of iOS 7 after three months of intensive usage, I don’t get upset about the icons (though they’re still odd) or the gradients; I associate this update with flexibility, ease of use, and fewer annoyances. It’s extremely hard to add features without complicating a user interface, and I think that Apple achieved its goal here.
\nIt’s important to remember that Apple’s audience for iOS 7 isn’t the vocal minority of people who point out every design inconsistency, but the millions of customers who rely on iPhones and iPads every day for communication, work, entertainment, creation, consumption, and everything in between. A design critique is always necessary – especially for a company like Apple that prides itself upon higher design standards – but it’s not the only metric that should be used to judge iOS 7. That would be inconclusive and shortsighted.
\niOS 7 comes with powerful additions like Control Center, better Siri, AirDrop, an improved Notification Center, and Background App Refresh out of the box, and these will be enough for most people to upgrade to iOS 7. Less promoted new features like FaceTime Audio (for audio-only FaceTime calls), the ability to (finally) check cellular data usage for individual apps, and new options for Find My iPhone will also substantially improve the everyday experience of all iOS users, whether they’re “average users” or geeks.
\nIt’s when the developer APIs are taken into account, however, that Apple’s advantage over the competition is easy to see: iOS 7 now lets developers implement animations that respect the laws of physics, easy controls for typographic styles, better text layout and reflow controls, background fetch and download APIs, new graphic rendering engines, and much more. Looking ahead at the iPhone 5s, iOS 7 will also let developers access accurate motion data from a dedicated motion coprocessor that is always on with less strain on a device’s battery. iOS 7 is powerful as it is, and even more so for the people who will make the apps that we use.
\nIn the short term, you’ll see thousands of apps adopting Apple’s iOS 7 design style with few custom variations on Apple’s take. This is completely normal: it happened before with the launch of the iPhone App Store in 2008 and the iPad in 2010. Developers want to be on this new platform today, and the easiest way to do it is to follow Apple’s guidelines for a fast and relatively painless transition, leaving more risk-taking design ideas for later. I wouldn’t be surprised to start seeing iOS 7 app designs that move away from Apple’s default look in 3 or 4 months, once the dust has settled on iOS 7’s design and the general public is more familiar with the new OS.
\nYou should also start seeing apps that take advantage of background refresh and better text controls and layouts starting today. Expect a lot of new animations in your favorite apps, major updates to existing apps sold as separate paid versions in the App Store, and, occasionally, an abuse of transitions and translucencies. A lot of new apps will launch as iOS 7-only, and, in a matter of 12 months, I don’t think there will be many popular apps that still support iOS 6. After the initial wave of App Store releases this week and next week, I expect to see text editors with support for custom shortcuts with Bluetooth keyboards, new photo apps with slow motion modes and other effects, and updates to fitness-oriented apps to support the 5s’ M7 chip and CoreMotion API.
\niOS 7 could have used more development time on the iPad, more UI polish in some iPad apps and features, and fewer minor bugs that can annoy people with an attention to detail. I hope that Apple will release bug fixes and performance improvements soon.
\nIt’ll be interesting to see where Apple will take iOS 7 next year. The new design gives the company new areas to explore, such as improvements to Notification Center to support actionable notifications (coming in Mavericks), inline traffic and weather conditions in Calendar (also in Mavericks), or personalization of shortcuts in Control Center (currently, Control Center’s panel in Settings has only two options). With popular user requests such as background downloads, Control Center, and new multitasking now out of the way, it seems reasonable to hope that Apple will consider revamping interoperability and communication between apps in the future. A Siri API has long been rumored, and I like to think that the special widget that iTunes Radio has in Control Center (with controls to like songs, etc) will someday be opened to third-party developers. Text editing and selection, system clipboard, and Open In are still unchanged from the early days of iPhone OS, and now’s the right time to address them.
\niOS 7.0 is beautiful, coherent, structured, and powerful. Apple has made improvements to key areas of the OS that allow me to use iCloud and Siri more, and third-party developers now have the tools to make apps that will let me be even more efficient from my iPad and iPhone. I think that iOS 7 is a great step in the right direction: making the iOS interface more versatile both in terms of design and functionality.
\nI can’t wait to see what’s next.
", "content_text": "iOS 7, released today, is a deep reimagination of Apple’s mobile platform: using familiarity and the need for a reset as catalysts, iOS 7 represents Apple’s attempt to make iOS ready for the future. iOS 7 is, effectively, the epitome of a large company that knows it’s time to get rid of cruft and inconsistencies to bring a new order to a platform that has grown exponentially in the past five years. For developers, iOS 7 brings powerful new tools that will allow for a new generation of more flexible, intelligent, and versatile apps. iOS 7 is not perfect: there are rough spots and some wrong assumptions, but it’s not flawed or, as many will argue in the next few weeks, a “mistake”. It would be extremely silly and shortsighted to judge iOS 7 by the look of its application icons or the gradients Apple has decided to use on some graphics. More than any other Apple product, iOS 7 isn’t just defined by how it looks: iOS 7’s new look is devoted to functionality – to how things work.\nIt’s difficult for me to offer a comprehensive review of iOS 7 today, because I have only been able to test a fraction of the third-party apps I will use on a daily basis with my iPhone and iPad mini. Mirroring the concept of “design is how it works”, I would say that, for me, iOS isn’t just how Apple’s apps work on it – it’s increasingly about how apps from third-party developers can take advantage of it.\nI have been running iOS 7 on my iPhone 5 since Apple released the first beta in June. I later installed the OS on my iPad mini, and have been working with an iOS 7-only setup ever since. 
As MacStories readers know, I primarily work from my iOS devices, which helped me get a good idea of how iOS 7 will change the way I write, take photos, respond to emails, listen to music and podcasts, and all the other things that I use iOS for.[1] Fortunately, I had the chance to test a good amount of third-party apps that solidified my thoughts on iOS 7 and the way it impacts my digital life and workflow.\nIt was also hard to get ahold of fellow iOS 7 users in my town. While I imagine that it would be easier to come across a nerd running an iOS 7 beta at a bar in San Francisco, I didn’t have much luck in Viterbo, Italy. I tested new features like AirDrop – which allows you to share files and information locally with other iOS 7 devices – with my iPhone and iPad, and, in the past week, managed to convince my girlfriend to install iOS 7 on her iPhone.\nI needed to provide this context: my livelihood directly depends on iOS and how I can work from my iPhone and iPad without having to use my Mac. Therefore, if you’re looking for a list of new features and smaller details of iOS 7 (and there are many), bookmark this article. My “review” of iOS 7 will focus on my thoughts on the update, how it made my iPhone and iPad better devices, and what I believe iOS’ future will be going forward.\nPrinciples of Design\nUntil iOS 6, the visual appearance of the operating system was largely related to interface elements and ornamentations aimed at suggesting interaction through the use of real world metaphors. 
Buttons were shaped like physical, capsule-style buttons, linen drapes adorned the top and bottom of the device for Notification Center and Multitasking, and apps like Notes and Podcasts hinted at their primary functionality by recreating distant relatives of the physical world with pixels on a screen.\nThe design of iOS 7 is based on what I call “The Four C Principles”:\nContent\nColor\nContext\nClarity\nThere are several other concepts, guidelines, and deviations from the main theme at play in iOS 7, but, overall, everything leads back to the four key principles above.\nIn iOS 7, content is the star of the show. Content comes first; everything else is secondary, playing a supporting role. Content can be text, your photos, videos, gameplay, your music, tweets – content is what you want to produce or consume. Content is what you access when you use an iPhone or iPad, for leisure or work. iOS 7 wants to reclaim content from the interface and present it back to you, elegant and uncluttered. In iOS 7, the interface is deferential to your content.\nFrom June:\nThe terminology Apple has chosen is different: in Apple’s parlance, iOS 7 is precise, sharp, coherent, simple, and efficient with a focus on clarity, vitality, motion, structure, color, context. Content is paramount; the interface defers control to the user, rather than claiming it. To quote Jony Ive, iOS finds “profound and enduring” beauty in simplicity, “a sense of purpose” in the way the interface takes a step back, reassesses its role, and comes forward again, subdued, neutral, more intimately “connected” to the hardware that lies underneath the OS.\nThe visual change of iOS 7 is immediately clear from the Lock screen: gone is the iconic arrow of “Slide to unlock”, leaving room for a chevron that points to a “slide to unlock” label that you can drag to move the Lock screen away and get to your Home screen or Passcode unlock screen. 
There’s a subtle animation that runs across the chevron and text, indicating that you can hold and move those elements; if you just tap them, they bounce, suggesting that something is waiting for you behind them. The keypad no longer resembles a physical keypad – it’s a grid of tappable numbers.\n\nThe predominance of content in iOS 7 is well exemplified by the OS’ use of whitespace, clean and neutral typography, and use of available space on a device’s screen. Navigation bars and toolbars tend to be white or black, depending on the color theme of the application, and they blend with the system’s status bar, now an element of design that integrates with apps and doesn’t make them feel like objects inside another part of the UI. By the same logic, content extends from edge to edge, using every available pixel to make apps feel more spacious and less constrained. In email messages, inline images go from edge to edge; in Settings, white section headers extend to the full width of the display.\nIn iOS 7, the edges of the display are even more relevant: in iOS 6, only the top edge could be used to access Notification Center, whereas now the bottom edge can be used to activate the new Control Center, and there is a system-wide gesture to swipe from the left edge of the display to navigate back to the previous screen of an app.\nThis approach reveals the intent to ensure software can be true to the hardware it runs on: by bringing pixels to the edge of the screen and making those edges areas that can be swiped to access system features or navigate through content, the hardware and software act more as a whole – they complement (instead of limit) each other. 
It’s an invisible interplay under the hood, but, to me, it’s a new cohesion that simply makes sense.\nA side effect of freeing the fundamentals of the iOS interface from real world metaphors, photorealistic objects, and 3D elements like buttons and glossy bars is that iOS 7 comes with borderless buttons and relies on color and text to indicate interactivity and selection.\nIn a typical iOS 7 app – such as Apple’s Mail – the blue color indicates interactivity and black text means “content”. The Back button, still available in the upper left corner, isn’t contained in a capsule: now, buttons are generally pieces of colored text that lose opacity when tapped and are desaturated when inactive. This choice may be confusing at first: in the iOS 6 era, buttons shaped as glossy, “fat” buttons easily meant “tap me here”. In iOS 7, their simpler, textual look is initially disorienting, but, in my experience, with time the color/text association becomes more natural and obvious. Text labels as tappable items/buttons have some advantages in my opinion:\nWithout a button drawn on screen, the text of a button doesn’t have physical limits – because it doesn’t have to fit inside a button of a specific shape or size. This is especially important for international users: a button’s text that looks good in English may not necessarily fit well inside a button shape in German or Chinese. While strings of text are still cut off by iOS if too long for a navigation bar or toolbar, removing the boundaries of a button’s shape gives developers more room for their localizations.\nBy reducing the use of images to indicate interactivity, apps should feel less cluttered. 
In theory, making interfaces less reliant on heavy, custom images should also make apps snappier and more responsive, because the OS doesn’t need to animate several images at once.\nWithout borders, buttons often feel more integrated with an app’s toolbar.\nBesides interactivity, color also plays a big role in how it gives your device and your apps a unique identity.\n\nApple apps tend to use a distinct color scheme that should help users better differentiate and remember them. Notes uses yellow buttons and navigation items with a subtle letterpress effect[2] while Calendar has a white/red theme that is applied throughout the entire UI; Messages’ blue and green tones and gradients for chat bubbles are obviously different from Mail’s black & white look, and the Music app’s new pink color scheme is one of my favorites. For third-party apps, Apple is advising developers to consider color as one of the key elements for their app’s branding; while I’m not sure that color will be enough to avoid apps that all look similar to each other in the long term, from what I’ve seen so far it certainly looks like developers are listening.\nColor is more notable in how it completely changes the look of your iOS 7 device through wallpapers. As I detailed in my initial overview in June, iOS 7 relies heavily on translucencies and blurring to hint at portions of the user interface hidden behind what’s currently in the foreground. I wrote:\nThen there is translucency. In iPhone OS 1.0 and up until iOS 6, Apple used solid and glossy bars and controls, with the occasional transparency effect that, in the end, tended to feel like a gimmick and not a core part of the experience. In iOS 7, essentially every element that is layered on top of others is translucent, so that you’ll be able to get a hint of what’s underneath. 
In the process of considering translucency as a catalyst for context and “giving a sense of place” to the user, Apple has taken apart the layers that iOS accumulated over the years and rebuilt them from the ground up while adding new user features.\nLook at Apple’s exploded view of iOS 7: your device and your wallpaper share the same level of importance in the OS’ new hierarchy and structure. Apps and software features like Notification Center and Control Center exist on top of the wallpaper level, and as such they inherit its most basic quality: color. In practice, this means that the simple act of changing your wallpaper will also personalize (through translucency) your experience with a device. Control Center will look different depending on your wallpaper; the numeric keypad will gain outlines based on the wallpaper’s color; in apps like Reminders for iPad, the wallpaper will influence the look of the sidebar, making an app feel like an extension of your device, which is exactly the point.\nAnd yet, in spite of Apple’s fancy exploded view, there is one element that can overlay the system wallpaper: your contact’s photos. This is a simple, delightful touch that I deeply appreciate. iOS 7 has new circle-shaped profile pictures for your contacts, which show up in the Contacts and Phone apps, as well as Game Center. I’m particularly a fan of the fact that your favorite contacts can have profile pictures next to their name, and I also like the editing UI for adding and cropping a photo you want to assign to someone. But more than the tools, I like that when you’re calling someone, that contact’s photo becomes the blurred wallpaper behind the keypad and Phone UI. Effectively, you’ll see a blend of blurred colors behind the keypad, but, because you know what the real contact photo is supposed to be like, the feeling of communicating with that person will be reinforced. 
It’s a more personal, human experience than the static, plastic, glossy buttons of iOS 6, and it’s gorgeous to look at.\n\nThe use of translucency in operating systems isn’t new, but iOS 7 takes it to another level by combining its reliance on color and cleaner interface elements with your content to increase the sense of context that using an app gives you.\nHere’s an example: when I’m using my iPhone, I’m typically listening to music, either on Rdio or the new iTunes Radio, while reading or just browsing around my favorite websites and subreddits. When I follow a link and Safari/Chrome launches, I don’t like watching a blank page for a few seconds, so I bring up Control Center with a swipe from the bottom edge to double check the title of a song or maybe give it a “star” with iTunes Radio’s widget. Because Control Center is a semi-transparent panel, I don’t have to look at the progress bar in the browser’s address field to know if my webpage has finished loading – I’ll just see the webpage’s content peeking from below Control Center.\nThe new translucent bars are another good example of how much iOS 7 values context. Again, in an app like Safari (but also the App Store and Mail) the status bar and toolbar are translucent, showing a portion of blurred content with a subtle transparency effect. This isn’t just good-looking (I personally love the effect), it’s also useful in that, if there’s, say, a photo below the text you’re reading, you’ll know before scrolling to it. iOS 7’s graphics layout engine is smart in this regard as it treats photos and text differently: photos will be blurred with their primary colors, but, in order to avoid confusion with overlapping icons and letters, if text is behind a bar, iOS 7 will automatically increase the opacity of its UI to not show blurred text. 
This is the reason why, in apps like Mail and Safari, photos that underlap navigation bars and toolbars will be blurred, but text won’t show through the UI, cluttering it.\nIn February 2012, I wrote about the problem with the iOS Home screen:\nThe problem Apple needs to overcome is that the Home screen tries to be a real object while providing access to the gates of the digital world. To reinvent it, Apple needs to tear apart the whole concept and rebuild it from the ground up.\nWhen I first installed iOS 7 three months ago, I thought that translucencies were cool, but ultimately useless, if not detrimental to the user experience. Today, I think that, if used correctly and strictly where necessary, blurs and translucency can provide utility in how they hint at content available under the interface. Furthermore, I’ve seen developers (and Apple) using blurring effects to show the new spatiality of the OS: iOS 7 and iOS 7-ready apps have a clear structure based on layers of content and interface, and these effects can help make that structure more obvious.\nThe simplification of the Home screen that I envisioned in 2012 happens in iOS 7 with a refined layout that uses blurs, layers, and transparencies to tell you how elements are arranged in the new structure. Apple has thought this through to the smallest detail, such as making the app icon that you held to activate wiggling mode be on top of the one next to it.\n\nWhat I don’t get is the new parallax effect. In June, it was described as a way for your device to feel “alive” in your hands by tilting the interface according to your hand’s movements, making for cool implementations such as being able to see behind the icons on top of a wallpaper. I wrote:\nParallax effects haven’t been enabled everywhere in the OS yet, and I’m looking forward to understanding whether developers will have APIs to implement them in their apps. 
In practice, parallax was described as being able to see behind the icons while using a photo of your family as wallpaper. Simple and human.\nIt is indeed a neat effect that makes for a great demo to friends on a Saturday night when the alcohol intake has reached the maximum allowed for the week and you want to show them “how much phones have progressed over the years, now they follow your eyes too”. I get that. But its novelty wears off fast, and I have yet to see a practical implementation that considerably augments the user experience in a way that goes beyond the “it’s cool” factor.\nApple has applied parallax to various iOS elements like alert dialogs, icons, badges, and it has provided developers with APIs to add parallax and other motion effects to their apps. While parallax contributes to iOS 7’s feeling of depth and layering, I think it’s mostly a gimmick, and not as effective in communicating certain aspects of the user experience as translucency, color, precise typography, or animations.\nApple has built a physics engine into iOS 7. I had a feeling that Apple had an interest in making interfaces feel more human when they first showed iBooks Author, which, back then, was highly reminiscent of Mike Matas’ work on Push Pop Press (a technology that was later acquired by Facebook). Like other past experimentations that are referenced in the new OS’ design[3], iOS 7’s take on physics is a full-on assault: Apple uses its new engine in several apps, and developers have been given an API to integrate it in their own apps. This new engine has helped Apple (and will help third-party developers) create animations and transitions that give the operating system a new kind of personality, one that is perhaps less immediately visible, but more tangible when using iOS 7 apps.\n\niOS 7’s guiding principle of clarity uses, among other elements, motion to convey information and express functionality to the user. 
It starts with minor details: when you press the Home or Power button to wake a device, the wallpaper subtly zooms in and the current time and date fade in before the rest of the Lock screen UI. Besides looking nice, it’s a useful touch as most people tend to wake their iPhones to look at the time.\nOn the Lock screen, two flat lines at the top and bottom indicate activation areas for Notification Center and Control Center: swipe, and the indicator starts following the panel, becoming an up/down chevron when the panel is fully revealed to communicate that you need to swipe up or down again to dismiss the view.\nWhen you unlock a device, app icons fly onto the Home screen, revealing their new, slightly tweaked shape that follows Apple’s new grid system to achieve a more harmonious, “tappable” feeling. Many have criticized the new icons’ shape and grid, but I think they are a massive improvement over the old iOS 6 icons, which now feel rigid to my eyes. Especially in apps like iTunes Store, App Store, and Messages, the combination of Apple’s color palette, squircle, and grid makes for more fun, approachable icons.\n\nAs you start using an iPhone with iOS 7, you’ll begin seeing all sorts of other animations and transitions that leverage the previously mentioned color, context, and content principles to tell the user what’s happening on screen in a delightful new way. When you tap an app, the transition between the Home screen and the app’s launch screen zooms into the icon; when you close an app, iOS zooms out, and the Home screen comes up again. Or when you do something as trivial as tapping the folder button to move an email message into a folder, the current message shrinks and goes into a toolbar at the top of the screen.\niOS 7 is full of animations that contextualize actions through the use of natural motion. With a mix of color, translucency, and physics the OS’ animation system seems more human. 
If Apple was trying to make iOS 7 feel “alive” on the screen, I think that animations and transitions do a much better job than parallax in this regard – they are always there, always part of the system, rather than making (some sort of) sense only when you tilt a device. They don’t have the over the top, look-at-this-real-moving-object aspect of iOS 6’s Passbook paper shredder, but because they aren’t limited by physical metaphors, they can respond to different, varying qualities such as gravity, mass, or springiness.\nWhen Apple does try to more heavily imitate the real world, such as in the new Weather app for iPhone, both the interface and animation system are done tastefully and thoughtfully. In Weather, conditions become part of the user interface with clouds, fog, sunrays, snowflakes, hail, or snow reflecting or bouncing off text and slowly moving in the background. Again, it comes down to providing context to the user and creating software that makes information more human, easily understandable, and obvious.\niOS 7’s animations are digital, but, under the surface, they are more real than shredders and torn bits of paper. They don’t try to replicate our physical world through pixels and photorealism – Apple is simply recreating the physical rules that govern our world. That’s a profound, far-reaching vision.\nMy only problem with iOS 7’s animations is that some of them take too long to execute and complete, and they can become boring after a while. I agree with Marco Arment when he says that some animations are too heavy-handed and patronizing after the first dozen times you’ve seen them.\nGenerally speaking, I think it would be better if Apple sped up the Home screen and multitasking animations. But I also think that it’s important to not lose perspective and understand that the majority of iOS users (read: not geeks and developers) won’t likely notice the extra milliseconds that Apple has built into some of the animations. 
In fact, I’d tend to say that most users will actually like the obvious, in-your-face animations because they make iOS feel new and different. iOS 7 being the major departure from the past that it is, it’s important to make the transitions stand out to instill the new basics of the OS in the average user. The speed of some of the animations is a bitter medicine for geeks, but a necessary means to say “this is what’s happening on screen on the new iOS”. With time, as users grow more accustomed to iOS 7, my hope is that Apple will start cutting the execution time of animations, making the OS feel snappier.\niPhone OS 1.0 was designed for a world that didn’t know smartphones and tablets would redefine our digital lives. It was all about making touch interactions obvious through visual cues and metaphors that could easily transition the user from the physical world onto the reality existing inside multitouch screens. Without a mouse or a physical keyboard, Apple’s best decision was to make iPhone OS 1.0 futuristic and advanced but at the same time reminiscent of its apps’ analog counterparts. The Notes app was a legal pad and the OS had buttons shaped like capsules. It had to be done, and it was amazing.\nSix years have passed since iPhone OS 1.0. The tech world has changed, design trends have matured, and Apple has kept adding feature after feature to iOS, struggling to find physical metaphors to explain additions like Passbook and the player in the Podcasts app. At what point do fun designs become an exercise in Photoshop skills and, overall, just gaudy and tacky?\nApple finds itself in a unique position now, trying to keep the familiarity of iOS 6 while imagining the next five or ten years of the platform. Last week, Craig Federighi stressed how iOS 7 will become the most popular mobile operating system in the world thanks to over 700 million iOS devices sold to date. 
Even if we assume that just half of those devices will upgrade to iOS 7, that would make for 350 million users who know how to operate iOS. How do you balance the need to explain iOS to new generations of future customers with millions of existing users who want iOS to be more useful and more delightful?\nFrom a design standpoint, part of iOS 7 comes down to taste, and the rest is based on principles that should allow further growth of the operating system for the next several years. I don’t think that iOS 7’s design language is misguided or poorly managed: there are some false steps and rough spots, but I think that iOS’ new design will allow Apple to be more flexible and innovative with the features they’ll add in the next few years.\niOS 7’s new design doesn’t tell the whole story, though. In the end, people use iPhones and iPads – they don’t spend hours looking at them and over-analyzing their UI designs. It’s important to remember that, after all the discussions about typography and animations, iPhones and iPads have to work for us. They have to be useful.\niOS 7 has powerful new features and developer tools. Aside from my personal preference for the new look, functionality is my favorite part of iOS 7. But with some caveats.\nLiving with iOS 7\nAs I’ve previously shared on MacStories and The Prompt, when I installed the first beta of iOS 7 I decided to run an experiment with my workflow and see if I could move my task management system from OmniFocus to Apple’s Reminders. I was intrigued by iOS 7’s new Today view in Notification Center, which gives you an overview of your day by grouping weather information with events from your Calendar and todos from Reminders. It’s like a summarized assistant, in textual form, up in Notification Center, available anywhere. It sounded like a good deal, so I figured I could try it.\n\nIt really worked for me. 
I’ve come to rely on the Today view as a way to quickly see all relevant information for a day, and also a brief glimpse of tomorrow thanks to the iCloud Calendar integration. The Today view works for me for various reasons:\nThe date is prominently featured at the top with a natural format: day of the week, day of the month, and month. This is how I want to know the current date, and now it’s only a swipe away in Notification Center.\nA quick weather summary is available under the date. It’s just the current condition, plus expected low and high temperatures for the day.\nEvents and reminders are listed at the bottom. If there’s an upcoming event, iOS 7 will tell you “you have event x in x minutes” and then provide a list of events. If you have events for the next day, iOS 7 will tell you what the first of those future events is.\nReminders are color-coded (for Reminders lists) and interactive. As you complete a reminder, you can pull down Notification Center, swipe to the Today view, and check it off from there without looking for it in the Reminders app. With iCloud, the completed reminder is synced across devices so you can forget about it.\nIt would be great to have third-party apps in the Today view, but, to my surprise, I’ve been fine with Apple apps in Notification Center’s Today view for a personal overview of my day and the things I have to do. Notification Center has gone from a linen-themed panel to a full-screen translucent view on the iPad, and, even if it’s not Apple’s most original design ever, it is functional.\nBack in June and July, I had to drive each morning to a local gym for my daily physical therapy session. After a few days of sessions, iOS 7 noticed a pattern in my behavior and, without my configuring anything, it started telling me (through the Today view) how many minutes it would take me to drive to the gym at a specific time between 11 AM and 12 PM. 
Depending on traffic conditions, iOS 7 would adjust the estimate from 5 minutes (normal time) to 7 or 10 minutes. I don’t know how iOS 7 figured out the traffic information (my town, Viterbo, doesn’t have traffic information in Apple Maps), but it worked and it was accurate. My best guess is that iOS 7 used the new Frequent Locations feature (available in Settings > Privacy > Location Services > System Services > Frequent Locations) to understand my driving behavior, driving times, and daily patterns to improve the information it would feed to Notification Center and the Today view.\nI don’t understand why Apple didn’t go the extra mile and enhance every calendar event that contains location information with the weather and directions. Apple knows how to triangulate and parse this data, because tapping on a location in a Calendar event takes you to the Maps app that displays directions (either driving or walking depending on your default preference, a new feature of iOS 7) and looking up a location in the Weather system should be trivial. Most of all, OS X Mavericks’ Calendar will come with exactly this functionality. Instead, iOS 7’s Calendar requires you to jump to the Maps app and displays no weather data, while the Today view can learn from your patterns but can’t display a summary of directions and weather forecasts for each event. It seems like a missed opportunity, especially because, again, Apple is doing it in Mavericks.\nEven more surprising is the decision to completely hide all-day events from the Today view. I typically create all-day events for days when I know something important will happen and require my complete attention for several hours (a monthly check-up with my doctor, or a big app release) and they are hidden from the Today view because Apple said so. I don’t see any reason why all-day events – which, as the name suggests, are important! – shouldn’t appear in a view called Today. 
It doesn’t make any sense.\nI like iOS 7’s Today view in Notification Center, but it would be much better and more useful if it handled all-day events and embedded weather and driving information next to each event. I could understand Apple’s decision to restrict Today summaries to built-in apps only, and I could even see why adding weather and directions could clutter the UI[4], but the lack of all-day events is a silly choice.\nAs a consequence of my increased usage of iCloud Calendar and Reminders, I’ve tried to live with Apple’s native Calendar and Reminders apps. Calendar doesn’t work for me, but I like some things about the Reminders app.\nOn the iPhone, the Calendar app starts with an elegant year view that lets you drill down into single months and then days with transitions that are smooth and fast. The animations help contextualize the action of entering a different view by using a zoom in/out effect that is consistent with the in/out transition of the Home screen and app icons. There are delightful touches such as the way a month’s name follows you along from the year layout to month view or how a selected day shifts the entire week to the upper portion of the screen, revealing a day’s list of events in the bottom half. Days with events have gray dots, there is a Today shortcut, and you can swipe horizontally to switch between days. The app is extremely polished, focused, and precise.\n\nThe problem is that, besides good looks, I don’t like the way the Calendar app works for me. First off, it doesn’t support natural language input like Fantastical and Calendars 5, and I can’t stand adding new events by having to tap menus and operate spinners to set locations and duration. Second, unlike Fantastical’s excellent DayTicker, Apple’s Calendar for iPhone shows a single day view for each day – if you have an empty day, you’ll see empty hours. 
The view that I like – a list of upcoming events in chronological order, with no empty days or hours – is available by tapping the search icon, which is inconvenient to do every time. Fantastical and Calendars 5 start with a list of all my upcoming events (and, in Calendars 5, reminders); in Apple’s Calendar, I have to remember to tap the search icon to see that list, and there is no natural language support. iOS 6 displayed the List view as a tab in the bottom toolbar, and I think Apple should add it back in that original position (possibly giving the option to set it as the default launch view).\nIt doesn’t get any better on the iPad. On the iPad mini, the app doesn’t come with the playful transitions of its iPhone counterpart, and it adds a Day view that, in a larger layout, lists a single day’s worth of events…with empty hours and empty days if there are no events. The list view is tucked away inside a popover. On iOS 7 for the iPad, popovers are generally white and borderless, which often makes it hard to distinguish them from the rest of the (mostly white) interface. Popovers are a fantastic piece of UX on the iPad, but I feel like iOS 6’s popovers, with borders and well-defined bounds, were easier to separate from the content underneath.\nOverall, I’m much faster at adding and managing events with Fantastical, Agenda, and Readdle’s Calendars 5, which may not be as good-looking as Apple’s app, but at least they are more efficient for a power user like me. For people who usually add a couple of events per week, I think that Apple’s app will be fine, but for everybody else I would recommend, as on iOS 6, looking for a third-party alternative.[5]\n\nThe Reminders app is good, though, for me, it comes with a minor but annoying issue. On the iPhone, the app gets rid of leather but keeps the paper texture to display lists as cards that you can tap to open, close, or swipe to take a peek at the first item of a list when the cards are closely grouped together. 
The lists’ names use Apple’s new API for a nice letterpress effect, and the system wallpaper acts as the background in the app, which is a nice touch for user personalization. Thanks to the cards layout, you can tap & hold a card and rearrange it vertically just by dragging it. On the iPad, Apple eschewed the stacked cards representation and opted for a more classic, but convenient, split layout that shows lists in a sidebar on the left.\n\nThe options for creating and managing reminders are unchanged, but there is a handy addition for location-based reminders: you can tweak the radius of a location. The “When I arrive” and “When I leave” settings are now displayed on top of a map view that lets you hold your finger on screen to change the radius of a location’s trigger from 100 meters up to kilometers. If you enlarge the radius far enough, you can create location reminders that will alert you when you leave the country.[6]\nThe reason I truly like the Reminders app is the Scheduled view, available by tapping the alarm clock icon that resides in the top right corner on the iPhone, or at the bottom of the sidebar on the iPad. The Scheduled view gives you a simple, direct visualization of all your reminders that have due dates, sorted in chronological order from top to bottom. It gathers reminders from all lists, and it’s a great way to see all your reminders in one place.\n\nAlso worthy of mention is iOS 7’s alert dialog for due reminders. When a reminder’s alert fires off (possibly using one of iOS 7’s new, futuristic, beautifully synth-based sounds), you get a dialog with two buttons: Options and Close. The Close one is displayed with bold text, making it the default choice for most users – close it, complete the reminder, then open the app or the Today view and check it off. But if you tap Options, you get an expanded dialog with shortcuts to be reminded again in 15 minutes, mark the reminder as completed right away, or open it. 
This is a superior design that speeds up the process of acting on alerts, and I wish Apple would do the same for more kinds of notifications.\nMy complaint about the Reminders app is that it doesn’t support tappable URLs. Because I use Reminders as my main todo system and write for the web, several of my reminders include URLs in the notes, and iOS 7 doesn’t let me tap them to open them in the browser. To add insult to injury, Apple knows how to do URL matching, because URLs are tappable in both the new Notes app and Calendar. For tappable URLs, I’d recommend using something like Agenda (which also lets you open links in alternative browsers like Chrome), or Due (which has its own sync system, but comes with many other nice touches).\nSpeaking of the Notes app: it retains the simplicity of its iOS 6 ancestor, it gets rid of leather and yellow paper, and it murdered Marker Felt with the fury of a letterpress machine (the letterpress effect is the same as in Reminders). What I don’t like is that a note’s title isn’t repeated in the title bar, which makes it easy to lose context in longer notes. It supports iOS 7’s new Back gesture for navigation and AirDrop, but, in my tests, AirDrop led to duplicate notes on the receiver’s end, which wasn’t cool. I think that some of the Notes animations, especially on the iPad mini, are a bit rough and unfinished (such as swipe to delete).\nI haven’t been able to use AirDrop much in real-life scenarios, but, from what I could test in my limited home environment, I think it’ll be a great addition for peer-to-peer sharing that will obviate the need for cumbersome solutions like Dropbox and Mail when you just need to share a document or piece of data with a friend or colleague next to you.\nLike its OS X counterpart, AirDrop for iOS uses encrypted, ad-hoc WiFi communication to share files between compatible devices nearby. 
AirDrop is supported only on Apple’s most recent devices, like the iPhone 5 and later, iPad 4th gen, iPad mini, and iPod touch 5th gen. In iOS 7, AirDrop settings live in Control Center, where you can tell iOS 7 to make yourself visible through AirDrop to everyone nearby, just contacts, or nobody. If you choose the Contacts setting, only people who are in your contact list and using an iCloud account will show up.\nUsing AirDrop is extremely easy, and I believe it’ll supplant awkward web-based sharing solutions for things like photos and URLs. Available by default in the new share sheet, people visible through AirDrop show up with profile pictures; tap one (or multiple ones) and they will get a request to accept what you’re sharing; when done, the file or information opens in the default iOS system app on the receiver’s device, and you’ll see a progress bar fill the outline of that person’s avatar, followed by a “Sent” label in the share sheet. Everything is familiar if you’re coming from OS X, and even if you’re not, it’s easy to use and intuitive once you’ve tried it a couple of times.\n\nI found some interesting touches in iOS 7’s AirDrop implementation worth noting. AirDrop’s alert dialogs can contain inline pictures, which is a neat way to see a preview of the file you’re receiving (such as a photo) directly in the AirDrop confirmation dialog. So, say you’ve left AirDrop visible to everyone in a public place and someone tries to send you an inappropriate photo: you can see a preview of the photo before accepting, decline the request, and turn off AirDrop. Apple’s inline preview system is well done, as it supports snapshots for web pages shared from Safari, icons for App Store apps, and a screenshot of a location’s view for Maps sharing. 
It’s a nice technical addition that, in practice, ends up enhancing AirDrop’s security and user experience.\n\nBy default, AirDrop tries to open a received file or bit of data in the system app that is associated with it. So, for instance, a photo will be received and added to the Photos app, an app’s link will open in the App Store, a map in Maps, and so forth. For files and data that AirDrop can’t launch in a default app, however, Apple added an instance of its Open In menu to the AirDrop confirmation dialog. In iOS 7, the Open In system hasn’t been redesigned, and it lives in the share sheet in the form of application icons. When sharing a document like a .txt file through AirDrop, iOS 7 will ask you to accept the file, and then choose an app to open it with. I would have preferred an Android-like option to always default certain file types to a specific app, but I learned a long time ago not to expect this sort of feature from Apple (it’d always be welcome, though).\nI’ve always advocated for a version of AirDrop for iOS devices, and its implementation on iOS 7 doesn’t disappoint. Within the existing limitations of iOS that haven’t been addressed in 7.0 (the Open In system, the lack of user-configurable default apps), AirDrop “just works” thanks to peer-to-peer sharing that is fast and doesn’t require passwords or uploading to cloud services. I would like to see simplified management of the “contacts-only” setting, but, otherwise, I think that AirDrop sharing will immensely improve things like local photo and video sharing for everyone.\nThey’re not actionable like some of their Mavericks counterparts, but I’ve found myself liking iOS 7’s new banner notifications. 
They haven’t gained any functionality that iOS 6 didn’t already have, but they’re now translucent and, for apps that have been designed following Apple’s guidelines, they’ll cover the exact upper portion of an app where the status bar and navigation controls should be. You can really tell when an app hasn’t been designed using Apple’s advised size for the navigation bar, because banner notifications will cover parts of the interface they shouldn’t.\nOne minor addition that I do appreciate is that you can pull down a banner notification to reveal it in the full-size Notification Center. It makes for a neat way to see a single notification with more context relative to others sent by the same app; if you want to immediately dismiss a banner notification, you can swipe it up and it’ll quickly go away.\nI haven’t been using the Missed view of Notification Center at all. According to Apple, the new view is supposed to show you only alerts that you haven’t addressed in the past 24 hours, but, in practice, I always ended up opening the default All view and cleaning notifications from there.\nWhich leads me to my two personal favorite features of iOS 7’s Notification Center: sync and Lock screen access. The latter is obvious and convenient: you can now pull down from the status bar in your device’s Lock screen to access the real Notification Center, so if you want to manage your missed alerts from the Lock screen while waiting in line at the grocery store, you can now do that. But that’s not the best part of Notification Center.\nWhat I found truly great is notification sync through (what I assume is) iCloud. Imagine this: you receive a text message on your iPhone, but because you also have an iPad, you get it on that device as well. Now you have the same notification on two devices. Prior to iOS 7, you’d have to manually address the notification on both the iPhone and the iPad. No more. 
On iOS 7, once you’ve addressed a notification on one device, it syncs to the other device automatically, removing the notification from Notification Center. Try it with the Messages app: get a notification, read it on one device, and watch what happens to that notification on the other device’s Notification Center. You’ll see it disappear with no manual intervention.\nNotification sync is amazing if you, like me, rely heavily on apps that send a lot of notifications (like Messages) for daily communication. It’s the way Notification Center should have worked from the start, and I can’t go back to a system that doesn’t sync notifications across devices. I don’t know if only apps built with the iOS 7 SDK will be able to take advantage of this feature, but I wasn’t able to sync notifications for iOS 6 apps like Mailbox and Dropbox on my iOS 7 devices.\nI’ve also found myself interacting with my iPhone using gestures rather than buttons – more than I used to with iOS 6. Control Center has been a fantastic addition for me, if only to access the Flashlight, the music playback controls, and Bluetooth/WiFi shortcuts. I wish that iOS 7 let me activate the Flashlight during a FaceTime call (it would be useful to show things to people if you’re in a dark room), but, in general, I’ve come to rely on Control Center so much, it seems crazy to think iOS didn’t have this functionality before. The only gripe I have with Control Center is that it’s hard to activate when the keyboard is shown on screen, because you’ll end up inadvertently hitting some keys before the panel comes up. When the keyboard is visible, I think that the gesture recognition should be improved.\n\nAside from Notification Center, the other feature activated with a pull-down gesture is Spotlight. Previously confined to a separate page on the SpringBoard, Spotlight is now available by pulling down on the Home screen, from any page. 
If you’re on screen #2 and you want to search for something, you can do it. I believe this is a better design than iOS 6’s, as it makes Spotlight more accessible without having to go to a separate area, albeit certainly more hidden for first-time users, because there is no indicator suggesting that Spotlight is available by swiping down.\nThanks to Control Center’s Camera shortcut, I’ve taken a lot more photos and selfies. The Camera app has been redesigned on iOS 7, with a black interface that focuses on the fact that you can swipe to change between four camera modes: Video, Photo, Square, and Pano. HDR, Flash, and the switch button are still available at the top, the Camera Roll is at the bottom left, and a new button in the bottom right lets you access live photo filters.\nFilters and Square are really meant to complement each other for Instagram users and people who like to apply filters to photos without Instagram.[7] I am no expert on photography or filters – I never use them in my photos – but I guess it’s nice that the grid displays them in real time, and that the Photo and Square modes can have separate filters: set a filter for Photo, another for Square, quit the Camera, launch it again, and each mode will still have the filter you chose. When a filter is active, the button is colored (as opposed to desaturated).\nI’m conflicted about the photo-taking experience. Swiping between modes is a better solution than iOS 6’s various buttons, but the app needs more visual feedback when pressing the shutter. In iOS 6, you’d get a sound and an animation showing what you’d see on a real camera; in iOS 7, the eschewal of real-life objects has led Apple to replace the shutter animation with a brief flash of the screen accompanied by the same sound. The problem is that the new flash animation, which lasts less than a second, is easy to miss, and if you’re in a loud public place you’re going to miss the sound notification too. 
That will result in you not knowing whether you took a picture or not, and therefore taking another one “just to make sure”, only to end up with duplicate photos in the Camera Roll because the first picture had actually been taken. This is what happened to me, and I’ve heard the same complaint from other users as well. I think this is bad design in the name of change, and I hope that Apple will return to a more obvious camera animation.\nI am, on the other hand, a fan of the new Photos app. Photos (either from your device or other devices’ Photo Streams) are available in a single Photos tab that organizes items by Years, Collections, and Moments. The last two are smart groupings that divide photos that were taken in different places while still sorting them by time, filtering down to single days when you reach the Moments view.\n\nMoments, in particular, are more effective than a simple vertical list of photos (what the app used to be) because they provide a logical organization of your photos without you having to do anything about it: your device’s camera already has the time and location information to do the heavy lifting for you. You can tap on the location to view photos on a map (make sure to pinch the photos on the map view to see some cool animations), or, better yet, share a specific moment (or selected photos inside a moment) to Facebook or iCloud.\nThe new iCloud shared streams are good, and I plan to use them with my family a lot. In iOS 7, you can create a private photo stream shared with selected users, and, if you enable the setting, everyone will gain the ability to upload photos and videos to the stream. Every user can then like photos and leave comments, and members of the stream receive a notification every time there is activity. To catch up on recent activity in a stream, there is an aptly named Activity view. 
Streams can be published on the web at a public iCloud.com webpage, so users who don’t have iOS devices will be able to view them. Overall, it’s a useful and intuitive functionality that I will use with my parents to let them see what I’m seeing on a vacation or a particular day without having to rely on email or message threads. That’s a powerful idea, beautifully developed in iOS 7.\nMy opinion of Maps has only slightly improved from last year. The app has been redesigned with a white theme, but the map views have stayed the same. For my area, there is still no Flyover or 3D support, but there seem to be more recent businesses listed in the search results. However, the app is still inferior to Google Maps when it comes to parsing search queries and finding results, sometimes bringing up results that are in a different region because it isn’t as smart as Google at matching my input. The app also often picks routes that aren’t the best ones available, and, as I noted last year, voice navigation still uses the system’s language, and not the Siri language, which is, in my opinion, a bad decision (for voice features, iOS should pick the language the user sets for Siri, not the interface language). There is a night mode now, but I can’t recommend Maps on the strength of a color theme alone. For me and for my area, I believe Google Maps is superior for search results, quality of voice navigation, listed businesses, and traffic information.\nA feature that I didn’t initially like and that I’ve criticized on multiple occasions, Siri, is much improved in iOS 7. I’m actually using Siri quite a bit more now, and I was surprised by the quality of the Italian voice, its increased speed, clean new design, and new functions. Notably, Siri is now a black translucent panel like Notification Center, showing light text on a dark background. There still isn’t live text transcription akin to Google’s, but at least there is more immediate visual feedback with an audio waveform. 
In the past few weeks, Siri for iOS 7 has been much faster than its iOS 6 counterpart, and I wonder if this is the reason Apple is now confident enough to say Siri is out of beta.\nI have noticed that Siri has gotten better at understanding the Italian language as well. The assistant is more capable when it comes to pronouns and subordinate clauses, although it still struggles with conjugations and more advanced sentence constructions (that’s the more advanced stuff though, and it’s understandable). The new commands that Siri supports in iOS 7 are useful: you can change settings, get and return missed phone calls, see what’s playing in iTunes Radio, and tell Siri how to pronounce your name (it did get Federico right on the first try, to be fair).\n\nThe best addition, though, is integration with Wikipedia. When I first demoed Siri to friends two years ago, they would always try to ask common questions like “how many people live in Italy” or “what is a pizza”, and Siri would provide a shortcut to a web search because it didn’t know how to parse that information. With Wikipedia integration, you can now run Q&As within Siri, asking all sorts of questions and getting spoken results back with inline text and image previews directly from Wikipedia.\nAs MacStories readers know, I never used Apple’s Music app much because all the music I need is on Rdio. I only keep a couple of albums in the Music app, and that’s about it. With iOS 7, the Music app has gotten a visual refresh that, to my inexperienced eye, looks mostly good, especially in the new “wall of artists” landscape mode.\n\nTo my surprise, I’ve been liking iTunes Radio, Apple’s new Pandora/Rdio-like feature to serve up stations of songs that are chosen by iTunes and based on your tastes. 
You can create stations starting from a specific artist, song, or genre; stations are synced across devices with your iTunes account; and you can tell the app to play more songs like the current one, never play the current song again, or add it to the new synced wish list of iOS 7. iTunes Radio is, essentially, a gateway to purchase more content on the iTunes Store, which Apple makes easy to do by prominently featuring a Buy button in the title bar.\n\nBesides Apple’s strategy, though, iTunes Radio is quite good. The algorithm is accurate thanks to Apple’s years of experience in analyzing customer purchases and aggregate listening habits through the Genius feature of iTunes, so, as you keep liking/disliking songs and tuning stations to hits, variety, or discovery mode, iTunes Radio will learn from you and present you with songs that it knows you’ll like or at least not hate. In my three months of testing, my Oasis and Bloc Party stations have turned into an endless stream of songs that I really like, so I guess that’s encouraging. I’m not moving away from Rdio, but I enjoy listening to Apple’s iTunes Radio every once in a while, and I think that a lot of people will like it (especially in the free version, where ads aren’t that intrusive).\nWorking with iOS 7\niOS 7 hasn’t negatively impacted the way I work from my iPad and iPhone. Some changes have made my workflow better, there are some annoyances with the OS’ performance, and, for everything else, I’ll have to wait for third-party developers to update the apps I use.\nI haven’t switched from Google Chrome to Safari as my main browser, primarily because Safari still doesn’t come with an inter-app communication system like x-callback-url, which I’ve come to rely upon. There are, however, some changes that I truly like in Safari: most notably, the address bar is now a unified URL + search field, and, on the iPhone, bookmarks are available as website apple-touch-icons right below the address bar when you select it. 
If you have a lot of bookmarklets installed in Safari, it’s very convenient to be able to tap the address bar and launch one. This is a much better bookmarks menu than iOS 6’s, and it works especially well in conjunction with Safari’s new Shared Links feature, which collects all links shared in your Twitter timeline, right in Safari. I like to open Shared Links, find something, then send it to other apps with a bookmarklet.\n\nAnd yet, in spite of Safari’s clean new UI, Reader and Shared Links features, and new bookmarks menu, I can’t bring myself to leave Chrome. Google has done an excellent job with the x-callback-url integration between its apps and third parties, and I find Google’s sync to be superior (faster, more reliable) than Apple’s, even under iOS 7. The new tab carousel view in Safari for iOS 7 is beautiful, but not practical for me: I open a lot of tabs on a daily basis, and to close them in Safari for iOS 7 I have to swipe them away, remembering that only swiping to the left is supported.\n\nIf you want to achieve some sort of “close all tabs” feature, there is a workaround: enable Private mode and disable it right away, and all your tabs will be closed. Compare this to Google Chrome, which automatically closes a tab that was triggered by x-callback-url when you go back to another app (like Tweetbot) and which lets you swipe tabs away in both directions. I won’t get the speed improvements, text selection support on the iPad, and better bookmark access on the iPhone, but my workflow depends on Chrome’s inter-app communication and sync, and I can’t switch to Safari. Plus, I’m curious to see what Google will do with Chrome and iOS 7-specific features.\niOS 7’s App Store is a huge improvement over the performance-riddled mess that was the iOS 6 App Store. From last year:\nI’m not alone in thinking the App Store on iOS 6 needs serious improvements before being ready for an optimal and stress-free customer experience. 
As the app is an HTML wrapper for content and styles Apple can fix “server-side”, there’s hope Apple will push fixes to the App Store without software updates. For now, however, the new App Store feels like a step backwards in terms of speed and reliability, and this is not good news for third-party developers who need users to pleasantly browse and discover software through Apple’s storefront.\nIn iOS 7, the App Store is now native and, as such, it comes with smooth scrolling, fast opening times for app descriptions, sections, and charts, a finally usable Purchased tab that loads thousands of apps in seconds, and an overall unprecedented fluidity. I am serious when I say that the new App Store app is one of my favorite additions to iOS 7: I browse the Store to find apps, see sections, and read updates on a daily basis, and the iOS 6 App Store was terrible for that. The layout hasn’t changed, but the App Store is now a proper Apple app.\n\nI still think several aspects of the App Store could be improved, though. Search, for instance, is still based on a cards layout and a ranking algorithm that, supposedly, Apple should be improving to lead to more accurate and genuine results. The persistence of cards is interesting, as the Top Charts have switched back to the original, pre-iOS 6 vertical layout, and I would like to see Apple make the same change to regular search too. As I argued in July, the new Near Me feature sounds good for populated areas and tourists who visit a lot of attractions or attend events, but, in practice, it’s always empty for me. When it does have something, it features my local newspaper’s app, which is terrible. I don’t think that Near Me will prove to be a substantial addition to the App Store in the next few months, and I think that Apple should consider replacing it with a more flexible Genius feature (this time, one that actually works). 
It seems like the sort of feature built for tech journalists who live in San Francisco and always try new apps, not “normal” folks who commute each day from home to work. It’s a neat idea, but ultimately useless in my opinion.\niOS 7 now lets the App Store automatically update apps for you, and, overall, I like this addition. It can be disabled from Settings > iTunes & App Store, but I’d suggest leaving it on to take the stress out of seeing a red badge on the App Store and having to remember to update apps manually. It’s just too convenient to always have your apps up to date, with iOS 7 taking care of downloading updates in the background all the time and letting you know which apps were updated in Notification Center.\n\nAt the same time, I’d caution against automatic app updates if you’re the type of user who wants to know exactly what kind of update you’re about to install: if you don’t like it when developers remove a feature you really like from your favorite app, maybe it would be better for you to disable automatic updates and retain control over how apps are updated. Ideally, Apple could add more granular controls to disable automatic updates for certain apps, but I understand why they simply added an on/off switch in iOS 7.0.\nI don’t use Apple’s Mail app because I need the push notifications of the Gmail app and Mailbox (as well as the ability to open links in Google Chrome), but the Mail app has received other nice additions in iOS 7. First off, the app supports Dynamic Type, Apple’s new system-wide control for font sizes. Configurable in Settings > General > Text Size, Dynamic Type allows apps that have been built with the iOS 7 SDK (and that support the feature) to intelligently scale text at various sizes automatically, without developers having to care about advanced attributes like kerning and ligatures, because Apple’s iOS 7 typographic engine will take care of them. 
Obviously, the Mail app takes advantage of the feature, so you can adjust the text size to your liking in the Settings, and you’ll get a bigger or smaller size in Mail messages.[8]\nSearch in the Mail app still isn’t as fast as the one in Mailbox, but Apple added a feature to create custom mailboxes in the main screen of the app to facilitate the process of, for instance, looking for a specific message in All Mail. Just tap the Edit button in Mail, then Add Mailbox, and choose Gmail’s All Mail folder as the root folder for the new mailbox. You’ll get a custom mailbox in Mail’s initial screen, from where you can start a new search that looks through every message of your account. Apple also includes built-in shortcuts for messages that are CC’d to you, or ones that contain attachments.\nFor power users, Apple added support for message URLs in iOS 7. A feature that OS X has supported since Leopard but that few people know about, message URLs allow you to open a specific message in Apple’s Mail app using the message:// URL scheme. MacStories readers may be familiar with these URLs because I rely on them for some of my Evernote scripts and automation tools.\nIn iOS 7, if you have a message URL that corresponds to a message, the URL will correctly open it directly in Mail. There are two limitations: the message has to be already downloaded in the Mail app, and, of course, you have to know the URL. So far, I haven’t found a way to create URLs to reference Mail messages on iOS, but the ones you create on your Mac through AppleScript and Mail.app will continue to work on iOS 7 devices. Therefore, if you have scripts that generate these URLs to, say, attach them to OmniFocus or Evernote, you’ll be able to tap them and open the associated message on an iPhone or iPad. 
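For readers curious about the format: a message URL is just the message:// scheme followed by the message’s RFC 822 Message-ID, with the surrounding angle brackets percent-encoded. Here’s a minimal sketch in Python – the helper name is mine, and it assumes you already have the Message-ID (for example, pulled from Mail via AppleScript):

```python
from urllib.parse import quote

def message_url(message_id: str) -> str:
    # Hypothetical helper: wrap an RFC 822 Message-ID in
    # percent-encoded angle brackets (%3C ... %3E), the form
    # used by Mail's message:// URLs.
    mid = message_id.strip("<>")
    return "message://" + quote("<" + mid + ">", safe="@.")

print(message_url("<1234ABCD@example.com>"))
# message://%3C1234ABCD@example.com%3E
```

A URL like this, generated on a Mac and attached to an OmniFocus task or Evernote note, is what makes the tap-to-open behavior described above work on iOS 7.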
I look forward to seeing whether developers will figure out a way to generate message:// URLs on iOS.\nI spend most of my time writing and taking notes on iOS, but, unfortunately, Apple hasn’t made changes to the iOS text editing and selection mechanisms, which, in my opinion, are showing their age. I still find it utterly cumbersome to place and move the cursor on iOS 7, and, due to the keyboard’s new lighter “glass” design, I think that keys have lost too much contrast, making it harder to quickly peek at the keyboard if you, like me, can’t type without looking every once in a while. Personally, I still get confused by the Caps Lock key, which has lost the white glow and blue highlight of iOS 6 to leave room for a more subdued black outline and white key color.\n\nI wasn’t sure about it initially, but I now think that the new multitasking view of iOS 7 is a good step forward. iOS 7 eschews app icons in the multitasking tray and brings full app previews when double-clicking the Home button to switch between apps. While fewer apps are shown on screen (on both the iPhone and iPad, you get 3 icons/previews), the new multitasking view gives more context as to what you’re really switching to, the state in which you’ve left an app, and new content available in an app. In iOS 7, in fact, apps can update their latest snapshots in the background if something has changed; an obvious example is that you can quickly look at the multitasking view to see if you’ve received a new message without opening the Messages app. While the animation of the multitasking view is slower than iOS 6’s, the slightly reduced speed is compensated for by the increased convenience of switching to an app by tapping the large snapshot in the middle of the screen. One small touch that I like is that iOS 7’s multitasking puts the most recent app in the middle of the screen, which is the most comfortable area to tap, so that switching back and forth between two apps can be easier. 
The only problem I’ve encountered is that, due to the iPad mini’s limited memory, apps are often closed in the background, and the multitasking view gives you the illusion of switching to a “live” app when you’re actually forced to wait because the app has to relaunch.\nIn iOS 7, removing an app from the app switcher (by swiping up) stops its background service. The new Background App Refresh system is one of my favorite iOS 7 features and one that I’ve long wished for. I haven’t been able to try many third-party apps with support for background refresh, but, based on what I’ve seen so far, I believe it’ll be an excellent addition to my workflow with minimal impact on battery life.\nIn iOS 6, it used to be that apps could transfer data in the background only for a limited amount of time (usually 10 minutes) and receive push notifications for new content available remotely. If Instacast started downloading podcast episodes and 10 minutes passed, you would receive an alert that required you to open the app again or the download session would time out. In OmniFocus, you couldn’t get the app to always be in sync with the Omni Sync Server in the background. In Instapaper, Marco Arment had to come up with a workaround to make the app fetch your latest articles when the user’s location changed, not any time there were new items in your account.\nIn iOS 7, the system has been completely overhauled. Through a new Background Transfer service, apps can now enqueue multiple downloads in the background, leave them running, and the OS will take care of keeping them going even after an app is closed or a device is restarted. There are no time restrictions, so you don’t have to launch apps every 10 minutes – you can just leave them in the app switcher, and they will always download new content for you, in the background, automatically.
This is great for apps like Pocket Casts and Instacast, which can now always present your up-to-date content without any additional manual operation.\nIn general, through background fetch, developers now have a way to periodically launch apps and check for new content. When using background fetch, iOS 7 will opportunistically determine when it’s appropriate to check for and fetch new data: by analyzing user patterns and other criteria like battery life, network, and location, iOS 7 can pick the best time to run the background fetch, coalesce fetches from multiple apps, and then use the Transfer service to download files in the background if necessary. If new content is critical and of immediate importance, developers can send a remote notification (optionally making it a silent one, so a device won’t ring) to tell iOS 7 to fetch updated content right away.\nFor developers, Apple is now offering proper tools to manage background updates and downloads by making iOS do the hard work of checking device and network conditions. On the user’s side, what I have noticed with the handful of iOS 7-ready apps I could try is that, as long as you keep apps in the app switcher, everything seems to “just work”. It’s great to be able to launch Pocket Casts and find your latest episodes waiting for you without having to wait for a refresh.\nI’m excited to see how developers will integrate with the new background refresh APIs to make switching between apps and working with them even more seamless and intuitive. I can’t wait to see the day when Day One will always fetch my latest diary entries, when Tweetbot won’t force me to reload timelines and DMs, or when Evernote won’t have to sync every time I launch it.\nStability, Polish, and iPad\nIn my experience, the iOS 7 Golden Master seed that Apple released on September 10th has been more problematic than GM seeds for iOS 5 (2011) and iOS 6 (2012).
Assuming that the version Apple released today is the same as the Golden Master released to developers a week ago, I can say that iOS 7 has been fairly smooth and fast on my iPhone 5, with occasional system crashes and reboots. On the iPad mini, I have experienced several hiccups in terms of animations and overall performance, with SpringBoard crashes, reboots, and hard resets.[9]\nIn the past week, I’ve had two random crashes of the SpringBoard on my iPhone 5, and one when using the Phone app to make a phone call. In my workflow, I haven’t noticed any other reproducible serious bugs that prevented me from using my iPhone 5 on a daily basis. On the iPhone 5, animations are, in general, fluid and smooth, if a bit too slow in certain areas. I can safely recommend updating an iPhone 5 to iOS 7, keeping in mind that, in this first build, crashes and reboots may occur occasionally.\nI would have given iOS 7 for the iPad some extra weeks of development. On my iPad mini, aside from a generally lower level of interface polish, graphical glitches, UI inconsistencies, and slower performance, I’ve seen Home screen crashes and hard resets, which forced me to re-configure my iCloud account and Apple ID. I still managed to get my work done on my iPad mini when I was writing in Editorial, but issues were frequent when moving across multiple apps.\nBoth on the iPhone and iPad, I’ve experienced issues with Reminders synchronization, with completed or new items not syncing changes back and forth, even after I deleted my iCloud account and added a new one. Reminders sync worked mostly well when using Reminders on just an iPhone and iPad, but it started presenting issues when configuring Reminders for OS X and for the new iOS 7-inspired iCloud.com website with my iCloud account.\nIn many ways, iOS 7 for iPad feels more “beta” than iOS 7 for the iPhone.
There are a few rough spots with design issues and inconsistencies on the iPhone, but, in the iPad GM release, there are design problems that go beyond performance.\n\nMany areas of the OS and Apple apps feel like scaled-up pieces of the iPhone UI that don’t take advantage of the iPad’s larger canvas, such as folders, which are now full-screen and paginated but present the same number of apps at once as the iPhone. Notification Center doesn’t try anything clever with the new full-screen view and the extra space provided by the iPad’s screen. The Music app is a larger replica of the iPhone’s version, not attempting more audacious features such as inline album expansion a la iTunes 11, but instead spacing out large rows of whitespace and forcing users to navigate across multiple screens.\nThe spatiality of apps in the multitasking view is broken by the iPad’s multitasking gestures: while I wouldn’t make a big deal of the fact that the three-finger swipe up is no longer mapped to an actual direct slide-up of the interface (as it was on iOS 6), the four-finger and five-finger horizontal swipes to move between apps are just downright confusing when compared to the multitasking view’s layout. When swiping to the right, you’re taken from App A to App B, and you can swipe to the left to return from App B to App A. However, in the multitasking view, the first time you switch, App B becomes App A, and vice versa. In my opinion, this is confusing only as a high-level design consideration (i.e. the gesture works, and normal people won’t care), but it shows that it’s time for Apple to rethink several of its old UX and UI choices on the iPad.\nI would have liked to see Apple experiment with brand new ideas for iPad interfaces and gesture interaction.
My hope is that Apple will iterate quickly with 7.0.1 and 7.0.2 releases with bug fixes and performance improvements, saving possible design and interaction improvements for 7.1 and future major updates.\nTo The Future\nToday, you’ll read many stories saying that iOS 7 is iPhone OS 1.0 for the modern age: a reset of the iOS design language, a fresh start for Apple, and a new opportunity for all developers. To an extent, that’s certainly true. But such a statement can only be representative of reality if the history of iOS 7 is also properly contextualized. iOS 7 is the new iPhone OS 1.0…with six years of iOS software changes and advancements behind it. That’s a big difference.\nI think that Apple’s greatest accomplishment is to make iOS 7 feel new and cleaner without losing any of the powerful functionality that allows people to use iPhones and iPads as computers. In spite of having more features than iOS 6, Apple’s new design language and structure get rid of much UI cruft and make iOS more organized, precise, and fluid, and less rigid. When I think of iOS 7 after three months of intensive usage, I don’t get upset about the icons (though they’re still odd) or the gradients; instead, I associate this update with flexibility, ease of use, and fewer annoyances. It’s extremely hard to add features without complicating a user interface, and I think that Apple achieved its goal here.\nIt’s important to remember that Apple’s audience for iOS 7 isn’t the vocal minority of people who point out every design inconsistency, but the millions of customers who rely on iPhones and iPads every day for communication, work, entertainment, creation, consumption, and everything in between. A design critique is always necessary – especially for a company like Apple that prides itself on higher design standards – but it’s not the only metric that should be used to judge iOS 7.
That would be inconclusive and shortsighted.\niOS 7 comes with powerful additions like Control Center, better Siri, AirDrop, improved Notification Center, and Background App Refresh out of the box, and these will be enough for most people to upgrade to iOS 7. Less promoted new features like FaceTime Audio (for audio-only FaceTime calls), the ability to check cellular usage for individual apps (finally), and new options for Find My iPhone will also substantially improve the everyday experience of all iOS users, whether they’re “average users” or geeks.\nIt’s when the developer APIs are taken into account, however, that Apple’s advantage over the competition becomes easy to see: iOS 7 now lets developers implement animations that respect the laws of physics, easy controls for typographic styles, better text layout and reflow controls, background fetch and download APIs, new graphic rendering engines, and much more. Looking ahead to the iPhone 5s, iOS 7 will also let developers access accurate motion data from a dedicated motion coprocessor that is always on with less strain on a device’s battery. iOS 7 is powerful as it is, and even more so for the people who will make the apps that we use.\nIn the short term, you’ll see thousands of apps adopting Apple’s iOS 7 design style with few custom variations on Apple’s take. This is completely normal, as it happened before with the launch of the iPhone App Store in 2008 and the iPad in 2010: developers want to be on this new platform today, and the easiest way to do it is to follow Apple’s guidelines for a fast and relatively painless transition, leaving more risk-taking design ideas for later.
I wouldn’t be surprised to start seeing iOS 7 app designs that move away from Apple’s default look in 3 or 4 months, once the dust has settled on iOS 7’s design and the general public is more familiar with the new OS.\nYou should also see apps that take advantage of background refresh and better text controls and layouts starting today. Expect a lot of new animations in your favorite apps, major updates to existing apps sold as separate paid versions in the App Store, and, occasionally, an abuse of transitions and translucencies. A lot of new apps will launch as iOS 7-only, and, within 12 months, I don’t think there will be many popular apps that still support iOS 6. After the initial wave of App Store releases this week and next week, I expect to see text editors with support for custom shortcuts with Bluetooth keyboards, new photo apps with slow motion modes and other effects, and updates to fitness-oriented apps to support the 5s’ M7 chip and CoreMotion API.\niOS 7 could have used more development time on the iPad, more UI polish in some iPad apps and features, and fewer minor bugs that can annoy people with an attention to detail. I hope that Apple will release bug fixes and performance improvements soon.\nIt’ll be interesting to see where Apple will take iOS 7 next year. The new design gives the company new areas to explore, such as improvements to Notification Center to support actionable notifications (coming in Mavericks), inline traffic and weather conditions in Calendar (also in Mavericks), or personalization of shortcuts in Control Center (currently, Control Center’s panel in Settings has only two options). With popular user requests such as background downloads, Control Center, and new multitasking now out of the way, it seems reasonable to hope that Apple will consider revamping interoperability and communication between apps in the future.
A Siri API has long been rumored, and I like to think that the special widget that iTunes Radio has in Control Center (with controls to like songs, etc) will someday be opened to third-party developers. Text editing and selection, system clipboard, and Open In are still unchanged from the early days of iPhone OS, and now’s the right time to address them.\n7.0\niOS 7.0 is beautiful, coherent, structured, and powerful. Apple has made improvements to key areas of the OS that allow me to use iCloud and Siri more, and third-party developers now have the tools to make apps that will let me be even more efficient from my iPad and iPhone. I think that iOS 7 is a great step in the right direction: making the iOS interface more versatile both in terms of design and functionality.\nI can’t wait to see what’s next.\n\nYes, I do also make phone calls and send text messages. I don’t keep the Phone and Messages apps on my Home screen, but I am, indeed, still using the iPhone as a real phone as well. ↩︎\nAlso an API available to third-party developers. ↩︎\nLast year, when Apple added a subtle motion effect to the Camera icon in the iOS 6 Lock screen, many noted how that concept should have been used in more areas of iOS that felt too static. Yep. ↩︎\nIf that’s the reason, at least put that information in the Calendar app. But it’s not there either. ↩︎\nA nice change: iOS 7’s new time-picking wheel lets you set individual minutes, and not five-minute increments, for alerts. ↩︎\nIn my tests, I couldn’t bring the radius outside the European continent. I guess there goes my plan to be notified by Reminders to call my mom when I’ll decide to flee Europe and take to the sea. ↩︎\nWhy would they do that? ↩︎\nI’ve talked to several third-party developers, and Dynamic Type seems to be one of the most popular SDK additions because it takes so much pain away from typographic settings. 
Apple’s engine is aimed at making text look great at any size, and it allows developers to specify styles semantically using nomenclatures such as “Headline 1” or “Caption 2”. Text is dynamically made bold when smaller, thin when larger; Apple’s engine will make size-specific adjustments to character space and weight, as well as line spacing, to ensure that text will always be legible at any size. Dynamic Type has Accessibility built-in, and there’s even an option to increase contrast between normal and bold text for visually impaired users. In short: Dynamic Type is a huge help for developers, and apps that integrate with it are much more flexible with their font sizes and typographic styles because they can follow a system-wide setting that is consistent across other iOS 7 apps. It’s great. ↩︎\nBy hard reset, I mean when an iOS device crashes, reboots, and forces you to go through the initial setup again without losing data (apps, documents, etc.). ↩︎\n\nAccess Extra Content and Perks\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2013-09-18T12:57:57-04:00", "date_modified": "2018-03-20T13:21:49-04:00", "authors": [ { "name": "Federico
Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iOS 7", "iOS Reviews", "Featured", "stories" ] }, { "id": "http://www.macstories.net/?p=31124", "url": "https://www.macstories.net/stories/the-ios-6-trilemma/", "title": "The iOS 6 Trilemma", "content_html": "In some ways, iOS 6 is not a major update. And yet, in others, it’s possibly the biggest thing to happen to iOS since iPhone OS 1. Both of these assertions have far-reaching consequences for the users, third-party developers, and Apple itself.
\nIn June, soon after the official announcement and preview of iOS 6, I concluded my general overview of the software with four questions. Looking back at that article now, those questions are more relevant than ever.
\nThe answer to three of them is “no”. The last one – whether Google will release a standalone Maps application for iOS on the App Store – could be a “most certainly yes”, but we don’t know any more details.
\niOS 6 is a controversial release, in that over the following days we’ll likely witness several news outlets and independent bloggers declare Apple’s doom or absolute genius, depending on the Internet clique they choose to side with. I think that, in this case, the truth lies somewhere in the middle – a gray area that needs calm and thorough consideration. At the same time, I also believe that the “controversial” nature of iOS 6 needs to be analyzed for its various facets and the reasons behind it. Why did Apple choose these features for iOS 6? What does the user make of it?
\nI have been testing iOS 6 for the past weeks, and I have (slowly) come to the conclusion that there’s no easy way to cover this update with a traditional review strategy. Instead, I have decided to take a look at the software from multiple perspectives, understanding the possible implications, downsides, and improvements for each one of them.
\nWhile last year I would have answered the question “Should I upgrade to iOS 5 right away?” with a resounding “Yes”, this year I’m not so sure about the “right away” part.
\nIf you’re a third-party developer who makes apps for iOS, you should be relatively excited about iOS 6, and I don’t think I need to tell you why. The developers I spoke with about iOS 6 told me that, in this release, there are some great changes, ranging from little goodies that fix long-standing annoyances of the SDK to bigger, newer, and more important APIs. It was a shared sentiment among the developers I contacted that there would be plenty of room for experimentation as soon as the update’s adoption rate rises to a level where going iOS 6-only can be considered.
\nTake, for instance, the additions to the Event Kit framework. Starting today, you’ll begin to see apps integrating with Apple’s Reminders, as, thanks to iOS 6, third-party apps can directly access Reminders data. Apps can now view and share to-do lists in the Reminders app, with options to create and modify reminders, assign due dates and priorities – even set location- and time-based alarms. I have personally tested a couple of upcoming updates to popular third-party apps that implement the new Event Kit framework, and it’s a great addition.
\nIt’s easy to understand why. By accessing Apple’s Reminders app and its data, developers are essentially given access to a “platform” for to-dos and alerts that syncs through iCloud on the iPhone, iPad, Mac, and, now, the web. Think of all those “simple to-do list apps” that have populated the App Store for the past four years: now, instead of implementing their own sync mechanism (which has costs both in terms of realization and future development) or having to rely on external services like Dropbox, they can just add Reminders support and get all the benefits of iCloud. iOS users know how Reminders works; instead of choosing the syncing system they prefer, it’s likely that, with iOS 6 and third-party support, users will simply pick “an interface” for their Reminders. Don’t like how Apple’s Reminders presents your lists? You can try another app that has (more or less) the same functionalities, with a different design. Just like users have been able to pick their preferred calendar client, with iOS 6 they’ll have options for “Reminders clients” as well.
\nWhile Event Kit is the most “user facing” addition to the SDK, there are several other improvements worth mentioning. iOS 6 has native Facebook integration: like Twitter in iOS 5, users can sign into their Facebook accounts from the Settings, and start posting status updates, media, or links to Facebook without needing a dedicated client. Developers who want to integrate Facebook into their apps will be able to leverage single sign-on to let users quickly log in with their Facebook credentials, as they are already saved on the device. Furthermore, this new social framework can essentially give any app decent Facebook features thanks to “share sheets” – little pop-up menus that enable users to post to Facebook from anywhere on iOS. It means developers won’t have to create their own custom Facebook login systems anymore.
\nThere are other notable additions to the SDK, such as In-App Purchases for any kind of iTunes content (music, books, etc); Smart Banners for Safari, a way to easily redirect users to the App Store if they visit a developer’s webpage; new Camera APIs with access to face detection and exposure (among other things); and, obviously, APIs for Passbook and Maps, Apple’s two new apps in iOS 6. Not to mention improvements to layout and presentation techniques, which will pave the way for richer, more engaging apps that use text-based lists less and employ “more visual” elements such as tiles and grids.
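\nFor reference, the Smart Banner mentioned above requires no app-side code at all – it’s a single meta tag on the developer’s webpage that Safari on iOS 6 reads to offer the app at the top of the page (the app-id value below is a placeholder):

```html
<!-- Safari on iOS 6 shows a banner linking to this app's App Store page. -->
<meta name="apple-itunes-app" content="app-id=123456789">
```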
\nBut developing apps that take advantage of iOS’ latest features would be useless without a platform to promote them and sell them.
\nIn May I took a look at the first four years of App Store, speaking to various third-party developers about their experiences and concerns with Apple’s ecosystem, then counting 600,000 apps for iPhone and iPad, and now well over 700,000 according to recent data from Apple. Among the several suggestions and critiques made by developers, nearly all of them expressed their wish for a more social, truly curated, and more searchable App Store that would supersede the structure Apple imagined in 2008.
\nThe App Store has grown exponentially over the past four years, and the overall presentation of software needs to take into account the sheer number of different apps available to users today. At the same time, Apple shouldn’t simply use automated algorithms to showcase noteworthy apps – it should give less importance to charts (which can be easily tricked) and “curate” software more with custom sections, weekly picks, and recommendations.
\niOS 6 only partially addresses these concerns, and, unfortunately, makes some functionalities of the App Store inexplicably worse.
\nOn iOS 6, the App Store has a new look. Gone is the black, flat look of the bottom tab bar to leave room for dark gray, almost Tapbots-like tabs with a subtle 3D effect. The blue toolbar, a marquee graphic element of iOS since 2007, is also gone in favor of the same dark gray scheme that, App Store aside, has been implemented in the iTunes Store as well.
\nIn terms of content organization, the front page has been completely redesigned both on the iPhone and iPad. Sharing the same “Featured” name as the pre-iOS 6 days, the four banners that were displayed at the top as small images (iPhone) or rotating galleries (iPad) have been replaced by a swipable carousel of larger images that, on the iPad, sports a Coverflow-like interface design and animations.
\nBelow the featured gallery, Apple is now showcasing “New & Noteworthy” apps alongside “What’s Hot”, a custom section that changes every week, and a second set of swipable “mini banners” for things like “Education” and “Apps Made by Apple”. On the front page, there’s obviously room for Apple’s recently relaunched “Free App of the Week” and “Editor’s Choice” initiatives.
\nApple’s reshuffling of App Store content and sections doesn’t stop at the front page. The tab bar, the primary way of navigating the App Store, has been reorganized to always show five areas of interaction: Featured, Charts, Genius, Search, and Updates. The most visible change is that Categories are gone from their spotlight position in the tab bar, replaced by Charts, and that Genius, once relegated to the front page’s top toolbar, is now an active element always looking for the user’s attention. This change is interesting for two reasons: first, it seems to suggest Apple wants to invest in its Genius algorithm more, perhaps hoping it’ll prove to be a feasible system to bring personalized app recommendations to the user. I can’t confirm this theory, as the Genius section I tested with the iOS 6 GM for this review was completely unusable – tapping on the “Not interested” buttons didn’t do anything – so I’m not sure how Genius is supposed to work. Better algorithms for automated, personalized recommendations were among the developers’ wishes for iOS 6, so we’ll see how this will play out in the next months.
\nMore importantly, the inclusion of Charts in the bottom toolbar speaks to Apple’s preference for displaying content that is doing well in the Top Charts, rather than new apps available inside specific categories. Previously, there used to be a “Top 25” tab and another one for Categories; now, the Charts tab is immediately after Featured, and Categories have disappeared from the tab bar.
\nAll is not lost, though. Single categories are still accessible on the iOS 6 App Store: in the revamped Charts area, you can view the Top Free, Top Paid, and Top Grossing charts for a single category by tapping the Categories button in the upper left corner; once in there, you’ll be able to choose the chart you want to consult, and tap “See All” to view all results in a list with infinite scrolling (in my tests, laggy and slow).
\nThe good news for developers is that, with iOS 6, browsing single categories (not their top charts) may now be more accessible and visible to the end user: whereas in iOS 5 some always ignored the Categories tab in the bottom bar, a button to access every category is now available directly from the front page; on the iPhone, the button is clearly labelled “Categories”, and on the iPad there’s a series of tabs at the top starting with All Categories, then Books, Business, and a “More” button to display other Categories in alphabetical order. Following this new organization scheme, it seems like Apple is trying to position the front page – the Featured tab – as a “front page of all categories”; going further down into the category list will open a “mini front page” with New, What’s Hot, and Paid sections. In iOS 6, the lack of sorting options for categories persists, as there’s no way to sort items from a specific category by price, release date, or any other parameter.
\nIn theory, the changes mentioned so far may sound like a definitive improvement over the iOS 5 App Store. Unfortunately, in actual testing, it does seem as if the new App Store layout engine was, at best, rushed and unpolished. Generally speaking, scrolling performance isn’t good: lists, carousels, charts, horizontal lists – they all stop working every once in a while, requiring a forced restart of the App Store application. There are also inconsistencies across the entire presentation of content: in the iPad’s Charts area, three charts are displayed in landscape mode – Paid, Free, and Top Grossing. This column layout is, at first, intriguing, but while using it you’ll notice that lists don’t scroll particularly well. Because the main window is split into 3 columns, to scroll back to the top of a list quickly you’ll have to tap on the portion of the status bar above that column; this system is consistent with the behavior of Settings.app on the iPad (also a multi-column layout), but on the App Store, the gesture is sometimes unresponsive.
\nI’m not alone in thinking the App Store on iOS 6 needs serious improvements before being ready for an optimal and stress-free customer experience. As the app is an HTML wrapper for content and styles Apple can fix “server-side”, there’s hope Apple will push fixes to the App Store without software updates. For now, however, the new App Store feels like a step backwards in terms of speed and reliability, and this is not good news for third-party developers who need users to pleasantly browse and discover software through Apple’s storefront.
\nSo far, I have covered the content reorganization process that went into the iOS 6 App Store, and mentioned some of the changes developers once wished for, but that didn’t come with this new version. Let’s now touch upon the other two most important changes of the iOS 6 App Store: how apps are presented, and search. And let’s start with the good news.
\nTapping on an app’s icon in the new App Store reveals a completely redesigned, cleaner, and full-featured description view with more options. On the iPad, this window is modal, offering a simple way to go back to the main store interface.
\n\n\nThe new description view is organized in two areas: at the top, icon, name, rating, Buy and Share buttons. In the lower portion of the window, there are three tabs for Details, Ratings and Reviews, and Related. Personally, I believe the new app description window is the best change Apple brought to the App Store, with some clever touches that should help both users and developers alike. Separating “Details” from “Rating and Reviews”, for one, allows for a cleaner design that actually has more information displayed in each area.
\nThe Details view starts with a swipable gallery of screenshots; on the iPhone, swiping on the first screenshot will automatically scroll the window down, placing the three aforementioned tabs at the top; if you want, you can tap on screenshots to display them in full-screen mode. Description and What’s New are located below with a “More” button to expand the text inline; regardless of whether an app is available as an update or not, the changes of “What’s New” are now always visible for every app. Trying to understand the latest changes to an app was one of the most common annoyances of iOS, and I’m glad this has been fixed in iOS 6.
\nUnderneath the new and integrated Description area, Apple is displaying information about an app (Seller, Category, Updated, etc), Developer Info, and, a new feature, Version History. This is a particularly welcome addition, as it lets you check out the full changelog of every version of an app released to date, from the most recent one to the oldest. Version History is instrumental in showing which developers are really committed to the development of an app, as opposed to those who release an app, forget about it, and never update it. I personally try to avoid applications that haven’t been updated in a while, as that implies their developers don’t care about them much. At the same time, Version History provides an effortless and precise way to check whether or not a functionality you’re looking for has been added to an app.
\nOverall, I’m a fan of the new Details area. I believe it does a much better job of describing and showing an app’s feature set than the old App Store did.
\nThe current Ratings and Reviews tab, on the other hand, is a mixed bag. Its top section is promising, with a prominent Like button to recommend an app on Facebook. This is one of the perks of integrating Facebook at a system-wide level, and it is a simple yet effective way to share an app’s direct link with your friends. But directly below the new Facebook Like button there are the actual user-written reviews…which I never read. I know most users take a peek at reviews to instantly see whether or not an app is good, but, in my experience, I always stumble upon customers complaining about prices (usually in the $3 range) and missing features no one said would be available. So, I don’t read reviews.
\nThe third tab, “Related”, hosts the familiar “Customers also bought” recommendations we’ve previously seen on iTunes.
\nBut, in my opinion, the best change Apple brought to the single app view isn’t a tab, or a new design, or some new algorithm. It’s a button. Specifically, the new Share button in the top right, which allows you to send an app’s link via email, Messages, Twitter, and Facebook.
\nAll these new actions – visualized through iOS 6’s new, icon-oriented share sheet – should dramatically increase the ease of sharing for customers, possibly driving more sales towards developers. The “Copy Link” action is also a long time coming – a “finally” that I thought I’d never say about the App Store.
\nAnd then there’s Search. In a somewhat curious attempt to bring a mix of the Genius UI and Chomp to search, results are now displayed as “cards”. These new cards show the first screenshot of an app inline, giving an immediate idea of the kind of software a user is about to check out. Alongside the screenshot, the icon, name, and ratings are also displayed, as well as a button to Buy/Download (or Open, if the app is already installed on the device). While this sounds great, there’s a big problem with such a UI revamp.
\nOn the iPhone, only one search result is displayed per page; to move to the next results, you have to swipe. On the iPad it gets slightly better: Apple has abandoned the integrated iPad Apps & iPhone Apps interface for separate tabs (iPad Apps is the default one) and the aforementioned cards; thanks to the iPad’s larger screen, 6 cards are shown simultaneously. At the top, there are also search filters, which are still mysteriously absent from the iPhone’s App Store.
\nThe issue I have with this new interface for searching App Store apps is both conceptual and technical. Firstly, I believe that, on the iPhone, limiting a page to displaying only one result goes against the very idea of searching: imagine if Google displayed only one link per page, or if iTunes on the computer visualized only the “top hit” result, forcing you to swipe to see more. On the iPhone, the new App Store search interface has an information density problem that, as of this writing, still hasn’t been addressed with an option to return to the classic list view. Whereas the old App Store search visually suggested that, yes, various results were available for your query, the new search UI forcibly puts the focus on the first result alone. We’re moving from 5–6 results displayed on a screen to just one, and I’m not sure swiping horizontally will be as intuitive as simply scrolling to load more results. Surely, it’ll be more tiring.
\nSecond, the new search interface just doesn’t work as advertised from a technical standpoint. Perhaps Apple will improve it with fixes on its remote servers (again, just like the other annoyances mentioned above), but as of right now, swiping between results is far from an optimal experience. On iOS 5, the “infinite scrolling” Apple used for search more or less worked as expected: if you had a decent Internet connection, you could see results (app names and icons) load within seconds. On iOS 6, cards are slow to swipe through, animations sometimes fail to finish properly, and, depending on your Internet connection, it’ll take a few seconds to load results next to the first few cards you’ve loaded. Overall, swiping between cards isn’t nearly as fast as scrolling through a vertical list of results.
\n\nI find the iPad version to be a viable compromise. Cards are still being used, but at least six of them are always visible, in both landscape and portrait modes. Instead of swiping, you scroll to load more cards, and, generally, the system seems to perform better on the iPad than on the iPhone. Also, the search filters available on the iPad help considerably in refining your search and getting to the results you want (or need).
\nI think I get why Apple is switching to the cards layout for search. By turning results into cards, they’re enlarging tap areas, making it, in theory, easier for the user to tap on a result without accidentally loading another one. Simultaneously, by placing the first screenshot of an application inline with search results, they are trying to increase the “recognizability” of apps – which, to date, have always been associated with their icons on the App Store. And yet, for as much as I’m trying to understand Apple’s reasoning with cards, I can’t help but feel the feature was rushed out the door. The current display of cards on the iPhone doesn’t make sense as a search interface: it works on the iPad, and I can say it does work for Genius, where you’re consciously going through a list of recommendations one by one, manually. But App Store search, with over 450,000 apps made exclusively for the iPhone, isn’t suited to cards – an interface paradigm that clashes with the act of looking for information while getting a rapid overview of it. Can you imagine cards being applied to the Top Charts?
\nThe underlying problem goes even deeper: if Apple had a perfect search algorithm that always found exactly the kind of app a user was looking for, cards could work better. They wouldn’t be a great alternative to lists, but users would notice the issue less if results were top-notch. But Apple doesn’t have perfect search results right now. Apple doesn’t even have good search results, as the algorithm they’re using returns the most curious choices for a variety of queries. The YouTube app is the fifth result for “YouTube”; “Apple” doesn’t return apps made by Apple (which also happens to be a section inside the App Store); looking for “Twitter” places Tweetbot around the #20 position, with a bunch of games and Instagram apps before it. And then there’s stuff like this still happening on a weekly basis, cluttering search with software users don’t need.
\nIf cards are meant to highlight the first results, then those have to be state-of-the-art results. But nothing has changed from May: search is still dumb. And now, on the iPhone, it’s got a new interface that makes it even worse.
\nOn the technical side, iOS 6 comes with some great changes for developers. Additions to the SDK such as the new Event Kit framework and Camera APIs will allow them to build cool new features and apps, whilst Facebook integration will undoubtedly prove to be a solid way to let users enjoy social functionality without being annoyed by login screens and confirmation dialogs. On the other hand, though, there are still some lingering issues and questions that developers will have to keep in mind while considering iOS 6. In spite of the numerous improvements, several developers I talked to were still largely unsatisfied with the current state of iCloud storage, which keeps posing challenges for developers of document-based apps or software that needs to sync large and complex databases across multiple devices. Others were more skeptical about starting to require iOS 6 from their users, as they’ll need to see what the new OS’ adoption rate looks like before making the jump to developing iOS 6-only apps. Hopefully, with over-the-air updates, the majority of active iOS users will update within the next few weeks.
\nDevelopers I contacted welcomed Apple’s approach to Facebook integration, which will hopefully help them register more sales thanks to users actively sharing links to their apps from iOS devices. Overall, they also appreciated the (many) subtle tweaks Apple made to the App Store’s system-wide integration on iOS 6: for instance, iTunes links received via email will now open a modal window to download an app immediately from Mail, rather than yanking users out to the App Store. Moreover, Apple is now allowing users to download app updates without entering a password (thus likely increasing the percentage of users who upgrade to the latest version of an app), and it is displaying a “New” badge on the icons of newly-downloaded applications.
\nHowever, developers are concerned by App Store search on the iPhone, and the constant changes Apple is making to its search and ranking algorithms, which, despite Apple’s efforts, haven’t so far drastically improved the accuracy of App Store results. With over 700,000 available apps and a new interface, developers who make a living out of selling software hope Apple is listening.
\nIt’s easy to look at iOS 6 from Apple’s perspective: self-preservation.
\nSince Steve Jobs came back to the company in 1997, Apple has increasingly prioritized control over features in some key areas of its products and services. Control over the software that is sold on the App Store. Control over the dock connector and accessory market for portable devices. Control over software updates, not dictated by the carriers anymore. Control over the user experience, with guidelines on what is possible to do with apps for the iPhone and iPad.
\nSometimes, however, in order to ensure basic functionality for a product, Apple had to give up some of that control, either because they were not ready with their own alternative, or because the market didn’t offer one. Samsung played a big role in providing components for the iPhone up until this year, when Apple decided to look at other options, in addition to making the A6 processor a truly custom core. On iOS, they baked Google Search into Mobile Safari because it was (and still is) the world’s most popular search engine, but in iOS 5 they started implementing more specific third-party services (such as Wolfram|Alpha) to provide users with more accurate, precise, and immediate results, while at the same time slowly moving away from Google.
\nIn iOS 6, Apple went ahead and removed YouTube and Google Maps from the operating system, offering their own, new standalone Maps application.
\nFor Apple, there can only be one big fish in the pond: themselves. Therefore, most of Google’s legacy built-in functionality had to go. But once again, Apple had to compromise on other things.
\nThey had to choose the lesser of two evils. With iOS devices selling millions of units every month and an iPhone 5 coming soon (and off to a great start), should Apple have allowed Google – its biggest competitor in the mobile OS space – to keep a strategic position on iOS with built-in apps and search? Or would it be better to cut deals with the smaller guys – Wolfram, Yelp, OpenTable, Yahoo, Rotten Tomatoes – to build a more focused search experience that happens outside of Google’s system? Should Apple have allowed Google to keep making money off search queries performed through Mobile Safari?
\nAs we’ve seen, Apple chose the second option. They touted Google integration in the iPhone introductory keynote and used most of Google’s services for as long as they needed them; once they lost their leverage – i.e. once Google had built a strong mobile platform of its own – they had to look for alternatives. Those alternatives are still based on third-party services Apple doesn’t own, but ones they can likely control more thanks to better deals, agreements, and a lack of conflicts of interest (Yelp doesn’t make smartphones or mobile OSes).
\nThey started with Maps. By using a bevy of third-party services and licensed data, they built an experience they can control. And not just in terms of visual appearance – Apple developed the Google Maps app, but couldn’t modify the map tiles, for example – they effectively created a pipe to turn the location of iOS users into aggregate data they can use to develop more in-house location services in the future. Think of a crowd-sourced traffic database, only on a much larger scale, with much more information available about users’ locations, habits, devices, and preferences. Just an example: by aggregating Maps and usage data, Apple could monitor what percentage of iPhone owners visit certain retail chains in the US, cut deals with those chains, and convince them to provide Passbook offers for loyal customers who have an iPhone. This is the kind of data that, before, Google would have likely kept for itself to build better services and more lucrative advertising deals. That was money and control over the user experience drifting away from Apple.
\nSimilarly, every search users performed on iOS used to go exclusively through Google. Starting with iOS 5 and even more with iOS 6, Apple is offering an alternative way of searching: Siri.
\nFor some kinds of data, Siri is more focused and to-the-point than Google’s mobile search results. For restaurant reviews and reservations, for instance, Siri offers a more elegant interface for browsing available options around you; for sports information and players’ data, the Apple-designed Yahoo Sports integration is better-looking and faster than finding updated stats on Google.com; for questions about movies, geography, or word definitions, Siri is quicker than going to Safari to type your question. As Graham explained in August, this has increased Apple’s reliance on third-party services, but the company has managed to keep their branding as minimal as possible, making users think these queries are, in fact, happening thanks to Apple. And for all we know, the data processed by Siri uses other services’ databases, but goes through Apple’s pipe. Again: control over data, and control over the user experience. For the time being, Apple sees this as an acceptable compromise. The lesser of two evils.
\n“Right from day one, the Weather, Stocks, YouTube and Maps apps were inherently powered by Yahoo and Google – but the apps were designed by Apple and there was no overt branding. Weather and Stocks to this day exhibit just a single, Y! icon in the corner, whilst there was very minimal branding in the YouTube and Maps apps. Similarly, all the services that are integrated with Siri feature small and subtle branding that is almost invisible at a glance.”
\n- The Rise Of Third Party Services And Fall Of Google In iOS
\nObviously, Apple wasn’t ready to completely eliminate Google from the position of default search engine for Safari, and probably never will – Google’s dominance in the space far exceeds the probability of another search engine gaining the same amount of users and data in the short term. But Apple can take steps to make Google as invisible as possible, all while “guiding” users towards workflows and interactions that don’t require Google at all.
\nIn Safari for iOS 6, a new tab opened on the iPad won’t automatically place the cursor in the search bar – it’ll place it in the address bar. And speaking of the search bar: it doesn’t have “Google” (or Yahoo, or Bing) written on it anymore. Just “search”. It’s subtle, but it’s the kind of change that suggests search itself is a feature, not Google’s search. Other Safari features, such as iCloud Tabs and constant bookmark synchronization, will allow people to keep the websites they visit always available across devices, thus decreasing the need for searching again. And as mentioned above, Siri should prove itself worthy of consideration for some types of search – those that Google still hasn’t fully optimized for the instant-on, get-results-fast nature of mobile.
\nApple can’t do everything. They can’t build hardware, software, and services while simultaneously thinking about search, sports, weather, restaurants, mathematical computation, and social. Not even with $100+ billion in the bank can they afford to apply their proverbial level of focus and care to dozens of different skills at once. They need partnerships. And they need to pick them carefully. The right horses to ride going forward.
\niOS 6 is the epitome of a company that has set out to find an equilibrium between data and presentation, control and user experience.
\nTo build a stronger structure, sometimes you have to start over. From the foundation. To ensure its long-term survival and expansion in key areas such as location and search, after iOS 5 Apple found itself wondering whether they should:
\n1) Ditch Google Maps and offer something of their own, or
\n2) Keep on improving iOS with major new “tentpole” user features without rethinking anything.
\nAll while (3) providing developers with new APIs to build apps for the richest software ecosystem on the planet.
\nThe first option would have given them data and control, at the cost of complaints from users who would find out that Apple’s Maps isn’t nearly as full-featured and accurate as Google Maps yet (personally, I don’t believe the theory that it was Google that pulled out of the iOS deal).
\nThe second option would have kept users happy: everybody loves new features. However, it would have put the future of the platform at risk, at least in some areas.
\nAs for the third option: it’s really obvious – the best thing Apple can do is to give developers more and better tools to build apps for its ecosystem.
\nAs the principle goes, among three favorable options only two are possible at the same time. Rethink core functionalities, Offer major new user features, Keep developers happy: Pick two.
\nWe talk about Apple’s self-preservation plans and company strategies, but the flip side is – people don’t care about this stuff. This is nerd talk at its finest – details that we, as tech writers, focus on because we’re striving for detail and reporting accuracy. But those millions of people who buy an iPhone every month don’t care about Apple’s deals with Google or the importance of Siri in shifting away from Google search. They don’t know about ad revenue, APIs, or aggregate data.
\nLet’s get real. People just want things to work. We, the tech press and the nerds, care about the gossip and details going on behind the scenes. But the rest of the world doesn’t. My friends, your friends, your neighbor and that guy who’s friends with your brother-in-law – they use their iPhones. They don’t want to know about the drama of the people who make them.
\nAs a user, I find iOS 6 to be a different kind of upgrade than, say, iOS 5 or iOS 4. The latest two major updates to iOS brought OS-redefining user additions like multitasking, folders, Notification Center, Siri, iMessage, and Reminders. And before them, iPhone OS 2 and 3 introduced the App Store and copy & paste. Apple has a history of bringing major, user-facing features to iOS with the new version it releases every year.
\niOS 6 swaps the Maps application with a new one, adds Facebook integration on the lines of iOS 5’s Twitter one, and then focuses on refining everything else.
\nSome parts of iOS 6 are painstakingly refined. Which isn’t a surprise given that, at this point, this operating system has been available for five years, and in development for many more.
\nSafari, already a fine browser, is now even faster thanks to JavaScript improvements, has full-screen support for landscape mode on iPhone, and can contain more tabs on the iPad (up to 24, available inside a popover). I already liked the possibility of syncing Safari tabs across my Macs running Mountain Lion, but as I expected in my review, iCloud Tabs’ full potential is only truly revealed when you start using the feature across OS X and iOS. Now I find myself seamlessly opening pages I had on my Mac while reading on the iPad, and vice versa.
\nThe speed improvements of iOS 6’s Safari are noticeable and impressive, and I like subtle refinements and additions such as the possibility to simply copy a URL (finally) and the fact that new tabs on the iPhone aren’t appended at the end anymore.
\niOS 6 comes with a new “share sheet” that offers a visual replacement for many of the “action/sharing” buttons that previously triggered list-based, vertically-oriented textual menus. The new sheet offers a grid-based sharing menu for Safari – inspired by the iOS Home screen – with icons for Mail, Messages, Twitter, and Facebook, as well as Print, Bookmark, and Reading List. The sheet has also been implemented in Photos, allowing you to share an image to Photo Stream or assign it to a contact, among other options. It works in Notes, in every app that uses the “Open In…” feature and has been updated for iOS 6, and even in Maps.
\n\nTracing the path Apple took to achieve immediacy and user-friendliness with app icons, the new share sheet makes sharing and opening files into other apps faster and more intuitive. It’s a reasonable change.
\nFacebook integration means iOS 6 can officially “talk” to the social network I use every day to keep in touch with my non-geek friends who don’t use Twitter. While I have mentioned the implications for apps and developers, from a user’s perspective I appreciate the simplicity of Facebook sharing in iOS 6. It works exactly like iOS 5’s Twitter integration, only with a different color scheme and more privacy options. Since I started testing iOS 6, I have used the new “Tap to Post” Notification Center widget to quickly send status updates to Facebook, but I’ve also uploaded pictures from the Photos app. I like how Apple made Facebook’s complex privacy settings as simple as a “Choose Audience” toggle that lets me set updates to Public or Friends-only.
\nWhile the new Facebook app is a terrific improvement over the old version, I still prefer using iOS 6 if I need to send off a quick update or photo. Because I have had a bad experience with letting other services mess with my contacts, I haven’t activated the option to “Update Contacts” using Facebook and the iOS Address Book. Single sign-on will be huge for game developers and video discovery apps with a social component.
\nMail received the same VIP feature I outlined for Mountain Lion. I use this functionality sporadically. Instead, I’m glad to see things like separate Archive / Delete actions (tap & hold to reveal both options), the ability to attach photos and videos to a message, and easier access to drafts (tap & hold the compose button) finding their way into Mail on iOS 6. The “attach media” feature is particularly interesting, as it’s been implemented through the standard copy & paste menu, and it’s somewhat reminiscent of an old concept for an iOS Services Menu that I’m still looking forward to (maybe someday).
\nAs a user, I appreciate Apple’s renewed interest in well-presented, understandable privacy settings that are simple to control and change at any time. In older versions of iOS, you could only see apps that were accessing your location or Twitter accounts. Following this year’s Address Book privacy fiasco, Apple went back to the drawing board and designed an entirely new Privacy section that clearly lists the apps that are accessing your location, contacts, calendars, reminders, photos, Bluetooth Sharing, and Twitter and Facebook accounts. You can revoke permissions at any time, and everything is elegantly presented with app icons and no complicated menus or poorly-worded dialogs.
\nAnd then there’s the huge list of minor improvements, hidden features, and subtle refinements that Apple has been adding to almost every aspect of the OS. It almost feels as if Apple looked in every corner, listened to users’ feedback, and fixed all those little and not-so-little things that, for years or months, have annoyed iPhone and iPad users. Separate “send” and “receive at” email addresses for iMessage; phone number support for iMessage and FaceTime on devices like the iPod touch and iPad, through iCloud; Notification Center in the cloud, with unread notifications automatically dismissed if read on another device. And there’s more: per-account Mail signatures with HTML support; Lost Mode for Find My iPhone and iCloud.com; FaceTime on 3G (it worked reliably in my tests on 3 Italia’s network in my town); the totally-reworked iTunes Store with previews that keep playing even if you navigate sections and close the app.
\nAnd many, many other things that we’ve collected in a separate post.
\nJust because iOS is getting some new features doesn’t mean I have to use them all. As much as I’m intellectually curious and like to know about new things, there are some functionalities that I just can’t fit into my workflow and routine.
\nDo Not Disturb isn’t for me. Located in Settings, DND allows you to set a “quiet mode” for calls and alerts. If a device is locked, they will be silenced and a moon icon will appear in the status bar; then, when you turn off DND, they’ll be there in Notification Center waiting for you to catch up. Apple really cared about the details of Do Not Disturb: there’s a setting to always allow calls from Favorite contacts (or a group) even if DND is on, and another that will let a second call in a row from the same person within three minutes go through DND’s wall. As you can imagine, there’s also a scheduling option to, say, automatically activate DND at night, so you won’t be bothered by calls or sound alerts.
\nFor various personal reasons, if a call comes in at night, not only do I want to be disturbed, I have to make sure I can wake up to answer and see what’s going on. I don’t get random accidental calls at 4 AM, so if my mom is calling at that time of the night, I want to know why. Similarly, if I’m meeting someone but my doctor is calling, he’s got to come to terms with that – I have to take the call. Do Not Disturb is not for me because, by nature, I want to know why people are looking for me. But I can see why this feature will appeal to a lot of users who aren’t like me.
\nSimilarly, while I know it’ll be great for some people, the “Remind Me Later/Reply with Message” feature added to the Phone app isn’t for me. I don’t receive that many phone calls throughout the day to be honest, and the ones I do get are either from my parents, my doctor, or a friend of mine.
\nI’d rather just let the phone ring and go to voicemail than send a text to my friend saying “Can’t talk right now…Ti richiamo tra poco” (“I’ll call you back soon”). Because, personal preference aside, my iPhone is set to English, yet my friends are Italian, and Settings.app doesn’t let me modify the first string of the “Reply with Message” option. So, I’ve ended up with a mix of strange-sounding, possibly-cool-for-some-people messages that begin with “Can’t talk right now…” and end with “scusa mamma ho da fare” (“sorry mom, I’m busy”). I actually sent one of these to my dad, and it was awkward. I’ve always thought combining Italian and English sounded kind of weird.
\nI recently migrated my entire photo library to Dropbox, and set up an automated workflow to send new photos to the service from my iPhone, iPad, and Mac. For this reason, I am not interested in trying out iOS’ new “Shared Photo Streams” feature, because I know I wouldn’t use it. With a shared Photo Stream, users can set up a private album to share with their friends (via email) over iCloud; members of a shared Photo Stream can “like” photos, post comments, and check out everything on the web or from their iOS devices. It is a pretty neat feature – it also supports notifications for new comments – but I have already chosen to trust Dropbox with my memories.
\nIf I were a parent or a teacher, I’d be excited about the new Guided Access functionality introduced in iOS 6. As part of the OS’ Accessibility improvements, Guided Access can limit an iOS device to just one app by disabling the Home button and even certain tap areas of the UI. I have played around with it a little, and I see how, for many, this will change the way iOS devices are handed over to kids.
\nPanorama sounds cool, in theory. It’s a new feature of the Camera app that lets you save “panoramic shots” by capturing 240 degrees of what’s around you in one single motion. The software will then take care of processing the images, stitching them together, and it’ll try its best to make sense out of the small details that make up the 28+ megapixel photos you’ll have in your Camera Roll. In actual testing, Panorama has been nothing more than a “cool thing to check out once” for me, maybe because I don’t typically think about taking panoramic shots with my iPhone. I have been trying apps like Pano and 360 Panorama over the years, but I kept coming back to Instagram or Apple’s Camera.
\nFrom what I’ve seen, Panorama seems to work as advertised with good light conditions, but I have some doubts about Apple’s claims that it can figure out the correct exposure on its own. Maybe so with the iPhone 5.
\nPassbook
\nOn paper, Passbook is all kinds of promising. A new app in iOS 6, it wants to collect all your boarding passes, loyalty cards, movie tickets, gift cards, and more in a single place. It’s not a payment system by itself: rather, it’s an iOS-enhanced solution to collect redemption codes and tickets issued by others (such as United Airlines, Starbucks, Fandango) and use them when appropriate. Instead of keeping cards in your physical wallet, you can set up Passbook to become your central repository for this kind of content. Developers can build Passbook-compatible cards and tickets using bar code formats such as QR (as well as PDF417 and Aztec) and a set of APIs that, among other things, will allow them to add or update passes in a user’s Passbook.
\nPotentially, Passbook could be the first step towards the “digitization of your wallet” many have been predicting for years. It’s not NFC – it’s a software solution that uses bar codes and iOS technologies to transmit information about you to a retailer or organization that issued a pass. However, in spite of its theoretical potential for all the parties involved (including Apple), I can’t offer more than a brief description of what Passbook is, because I haven’t been able to try it. The Passbook app links to a section of the App Store listing Passbook-compatible apps, which, in the period during which I tested iOS 6, weren’t available. I’m sure users will see apps adding Passbook support starting today, but being based in Italy, I’m also skeptical about how long Italian users will have to wait before trying Passbook. When it comes to this kind of innovation in retail and local institutions, Italy is typically behind the curve, with slow adoption times and bureaucratic restrictions that prevent small businesses from embracing the latest technologies. I don’t know when we’ll start seeing names like Trenitalia, Alitalia, Spizzico, or Autogrill add Passbook support; based on what I’ve heard so far, there haven’t been any discussions about this, so don’t hold your breath for major Italian organizations adding Passbook support soon.
\nMy hope is that private companies like Ryanair and individuals will set an example for the rest of Italian businesses and organizations to follow. My girlfriend runs a local dance organization based in a gym here in Viterbo, where there are also Pilates and yoga classes. If she and the gym owner decided to support Passbook, they could set up a “loyalty system” for the oldest members, with discounts and monthly promotions. They could update the passes over the air, send push notifications, and give passes to new members via email or a webpage. This would be incredibly easier, more engaging, and more fun than the current pen-and-paper system they use for loyalty cards and discounts. Or again, there’s a friend of mine who, every year, organizes a series of cultural events here in Viterbo during the summer; instead of tickets – or in addition to them – he could set up iOS passes, relying on geofences to send push notifications to an iPhone’s Lock screen when people are near the area of an event (there are several locations with multiple “stages” to choose from).
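\nUnder the hood, a pass is just a signed .pkpass bundle whose heart is a pass.json manifest. As a rough sketch of how a loyalty card like the gym scenario above might be described – all identifiers, values, and coordinates here are invented for illustration, not taken from a real pass – the manifest could look something like this:

```json
{
  "formatVersion": 1,
  "passTypeIdentifier": "pass.com.example.gymcard",
  "serialNumber": "member-0042",
  "teamIdentifier": "ABCDE12345",
  "organizationName": "Example Gym",
  "description": "Example Gym loyalty card",
  "barcode": {
    "format": "PKBarcodeFormatQR",
    "message": "member-0042",
    "messageEncoding": "iso-8859-1"
  },
  "locations": [
    {
      "latitude": 42.4175,
      "longitude": 12.1067,
      "relevantText": "Welcome back! Show this card at the front desk."
    }
  ],
  "storeCard": {
    "primaryFields": [
      { "key": "classes", "label": "CLASSES LEFT", "value": "8" }
    ]
  }
}
```

The `locations` array is what would drive the geofenced Lock screen messages mentioned above: when the device is near one of the listed coordinates, the pass surfaces its `relevantText` without the issuer writing any app code.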
\nI hope that forward-looking individuals, private companies, and small-business owners will consider experimenting with Passbook. I hope that initiatives like Passdock (made in Italy) will promote the adoption of this technology for organizations that don’t want to hire developers to set up a one-time event or gift card.
\nPassbook can be a lot of great things, but I can’t use it right now.
\nFor all the improvements that went into Mail, there have been some changes that I don’t really understand or appreciate. Firstly, pull to refresh. Initially created by Loren Brichter for Tweetie, pull to refresh went on to become the de facto way to check for updates in all kinds of applications. And now, following various open-source takes on the subject and a patent held (but not enforced) by Twitter, Apple has decided to use pull to refresh in Mail.
\nI have two issues with Apple’s implementation. Since its inception, pull to refresh was used as a clever method to check for updates in a list of messages sorted vertically from newest to oldest. It was perfect for a Twitter client: as users reached the top of the timeline, they could continue scrolling to check for updates and load new tweets. But since pull to refresh gained popularity, developers started abusing it in all sorts of ways, defeating its original purpose. I can’t help but have the same feeling about Apple’s use of the feature – that it is somehow forced, added only because “users are now familiar with it”. This, however, doesn’t necessarily mean that pull to refresh is the right metaphor for checking emails: while displayed in the same vertical orientation, email is not Twitter. I am not checking for new messages every 20 seconds while I’m waiting in line, and even if I were, email refresh times are still slow compared to Twitter.
\nIf I had to nitpick, I’d say the implementation itself feels off. In the original pull to refresh, and in the several iterations by other developers, the “refresh” action is triggered when you release the pull animation. Instead, Apple’s version starts refreshing the main view as soon as you pull beyond a certain threshold, whether or not you’ve lifted your finger off the screen. It’s a minor detail, but it’s something I noticed immediately after years of pull to refresh in other apps.
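The distinction is easy to model. In this hypothetical Python sketch (the threshold value is assumed), a classic Tweetie-style control only fires once the finger is released past the threshold, while a Mail-style control fires the moment the drag crosses it:

```python
THRESHOLD = 60.0  # pull distance in points needed to arm a refresh (assumed value)

def classic_trigger(pull_distance: float, finger_down: bool) -> bool:
    """Tweetie-style: refresh fires on release, after pulling past the threshold."""
    return pull_distance >= THRESHOLD and not finger_down

def mail_trigger(pull_distance: float, finger_down: bool) -> bool:
    """iOS 6 Mail-style: refresh fires as soon as the pull crosses the
    threshold, whether or not the finger has left the screen."""
    return pull_distance >= THRESHOLD

# Mid-drag, past the threshold, finger still on the screen:
print(classic_trigger(75.0, finger_down=True))  # False – still waiting for release
print(mail_trigger(75.0, finger_down=True))     # True – already refreshing
```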
\nMy second problem with Mail is the “swoosh” sound for a message that’s been sent. Up until iOS 6, the sound effect used to go off after a message was actually sent – i.e. when the “loading bar” had filled completely and the message had actually reached the server. In iOS 6, for some reason, the “sent” sound plays as soon as you hit Send, even if the message hasn’t been sent anywhere yet. This change has already had repercussions on my email workflow. For instance, on a couple of occasions I thought I had sent a message – the sound played – but I actually hadn’t, because I had poor 3G and my connection was slow. Even worse, if for some reason you lose your Internet connection right after hearing the “swoosh” sound effect, your message won’t be sent at all, and it’ll be saved to your drafts. I think playing Mail’s sound effect in advance is confusing, as users have associated it with a message that has really been sent. Imagine if Apple decided to do the same for Messages.
\nOn iOS 6, the phone number and email address you use for FaceTime are synced across your devices. While this sounds great in theory, I had to disable FaceTime on my iPad, as I kept experiencing a bug that caused my iPad to keep ringing even after I answered a FaceTime call on my iPhone. I hope Apple fixes this in the next few days now that iOS 6 is public.
\nThe iPad finally has a Clock app on iOS 6. Like the iPhone app, it is divided into four areas: World Clock, Alarm, Stopwatch, and Timer. The first two are really good-looking: World Clock takes advantage of the larger screen to display multiple cities on a planisphere with time and weather information. You can scroll between clocks at the top, tap on one to bring it up in full-screen, and the world map even displays areas of light and dark. Alarm is interesting, too, as it uses a calendar layout to visualize all the alarms you’ve activated for the week. There is a visual consistency between World Clock and Alarm.
\nStopwatch and Timer, on the other hand, feel like they were made by someone else for another app. While the same sections in the Clock app for iPhone somehow fit together with the rest of the app, on the iPad they take a completely different UI approach, in stark contrast with World Clock and Alarm. They have flat, round, glowing buttons that I haven’t seen anywhere else on iOS, and that seem to be taken straight out of a post-modern microwave oven control system. I won’t get into a debate about digital interfaces and skeuomorphism here, but to me, the Timer UI is especially baffling, as it’s completely different from any other iPad UI Apple has designed to date.
\nSpeaking of design, iOS 6 brings a new option to the system status bar, which can “blend” with the top bar of applications using a different shade of the same color. This is up to developers to implement, and you can see it in action in Apple apps like Settings and Mail on the iPhone. I am not a fan of this UI choice: I think that, in most cases, it diminishes the contrast between status bar data (time, battery, etc.) and the background, while simultaneously making it more distracting and “in the way”. I like the simplicity and elegance of Safari and Maps, which use the new blue gradient and silver UI, respectively, with a black status bar that is easy on the eyes.
\nI agree with the explanation Loren Brichter wrote back in June:
\n\nPhilosophically the beauty of these devices is that because they are a screen, they become whatever you are doing with them. Minimizing extraneous hardware, and extraneous system interface elements should be a goal. A distracting status bar is antithetical to that.
Siri
\nItalian is a difficult language to master. Being a Neo-Latin language, it comes with several lexical and grammatical rules related to affixes and suffixes for gender, number, and tense, definite, indefinite, and partitive articles, prepositions, word accents, pronouns, and many, many other complexities that not even most Italians know. Put simply: it’s incredibly hard to build software that understands the meaning and context of what we’re saying, like Siri, in Italian.
\nI approached iOS 6’s Italian Siri with a mix of cautious curiosity and skepticism. I haven’t really been using Siri in English, since I would look completely out of place here in Italy, talking to Siri in English in front of my friends and family. Plus, it just wouldn’t be natural. I was looking forward to using the Italian version of Siri to speed up some common actions like sending a message, checking the weather, or finding a gas station nearby.
\nI won’t go into the details of the Italian words and expressions that Siri doesn’t understand or is unable to process. But I will say this: during the period I tested iOS 6, when Siri wasn’t down due to server issues (and there were many), I noticed that it struggles to keep up with the many ways of asking for something in Italian. Because Italian is an inflected language, verbs have conjugations; and because verbs have those grammatical variations, when you start mixing them with pronouns, prepositions, and auxiliary verbs, you can get to the same sentence in dozens of different ways – all of them syntactically correct.
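To get a feel for how quickly those variations multiply, consider a toy sketch. This Python snippet – the Italian fragments are illustrative, not an actual Siri test set – combines a handful of interchangeable openers, verb forms, word orders, and closers, and already produces dozens of distinct, syntactically valid ways to phrase a single request:

```python
from itertools import product

# Toy grammar: interchangeable pieces for asking a hypothetical
# assistant to send a message. Fragments invented for illustration.
openers    = ["", "Siri, ", "per favore, "]
verb_forms = ["manda", "invia", "puoi mandare", "potresti inviare", "vorrei mandare"]
recipients = ["un messaggio a Maria", "a Maria un messaggio"]
closers    = ["", " appena possibile"]

# Assemble every combination into a sentence, dropping empty pieces.
variants = {" ".join(filter(None, (o.strip(", "), v, r))) + c
            for o, v, r, c in product(openers, verb_forms, recipients, closers)}

print(len(variants))  # dozens of phrasings for one and the same intent
```

Each combination carries the same intent – send a message to Maria – yet a parser has to recognize all of them, before even touching conjugations, clitic pronouns, or subordinate clauses.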
\nAs I expected, the “beta” of Italian Siri isn’t meant for “commands” that go beyond the degree of complexity set by Apple in the “example phrases” of Siri’s information popover. In my tests, commands built with simple verb-object-time structures like Apple’s demo ones were okay for Siri (although it often failed to recognize them correctly, and I don’t have a strong dialect), but other sentences – spoken like an Italian normally would, with prepositions, pronouns, and articles – didn’t go through. Italians reading this right now know what I mean. Try dictating a message with Siri using “che” (the “that” Shawn Blanc recently mentioned) with a subordinate clause – it won’t parse it, because it doesn’t understand we’re using indirect speech with prepositions, which need to be excluded from the actual message.
\n\nFor commands issued with moderately complex “real” Italian sentences and schemes people use on a daily basis, this first version of Siri is a letdown. Neo-Latin languages are hard to figure out for actual people, and Siri has a long way to go as a virtual Italian assistant.
\nWith simple do this/do that commands, I’m actually quite impressed by the additions Apple brought to Siri in iOS 6. When it worked, I was able to check on Serie A results and player stats (curiously, players didn’t have profile pictures), open apps, get movie information, and watch trailers. I also managed to check on some restaurants in my town using the new Yelp integration, but unfortunately the Yelp database in Viterbo is very limited: there aren’t many reviews – let alone photos – and restaurants here don’t support OpenTable reservations.
\nOverall, if you keep the sentence structure simple enough, the new features of Italian Siri mostly work. And they’re nice.
\nI have found two big issues with Siri, besides questionable support for Italian. The first is that, if you keep your device set to English (like I do) but Siri set to Italian, iOS can get confused by the two languages. For instance, sometimes I get movie descriptions in Italian, other times in English, and turn-by-turn navigation, spoken by Siri, will be in English despite Siri being set to Italian. It’s confusing, as Maps’ English interface overrides Siri’s Italian setting.
\nMy second issue is Maps.
\nFor a complete overview of Maps, I recommend you check out Cody’s extensive look at the application. Instead, these are my impressions of Apple’s new Maps in Italy.
\nAbove, I wrote that users just want stuff to work. They don’t care about deals between companies or the strategic importance of moving away from Google. Many of those who will upgrade to iOS 6 will find a worse Maps application than the one they used to have.
\nApple Maps are pretty. The standard view is all vector-based, so you can zoom in and out without losing any detail. Their icons are colored and nice. But besides that, just about everything else is a step backwards from Google Maps here in Italy, and especially in my area.
\n\nWhen zooming out, major road names aren’t displayed on the map; Google spends a great deal of time processing the data it has available, trying to understand traffic, signs, and directions based on the footage stored on its computers. In my area, names of less trafficked roads pop up earlier than other major roads and intersections; the labels are “too close” to each other; the satellite view has less detail than Google’s. Maybe Apple Maps have “less clutter” compared to Google Maps, but I still don’t understand how Apple chooses which road names to display first – less important roads shouldn’t show up when zoomed out.
\nThe label for Viterbo is the same size as those of nearby smaller villages. Sometimes the label disappears completely from the map.
\nEvery once in a while, looking for a location in Viterbo takes me to another city in Italy, even though I entered the name correctly. Occasionally, this happens with local businesses too: I hit the “Locate” button, type in the name of a store in Viterbo, and suddenly I’m in Rome.
\nIn Viterbo, Google had basic support for traffic information and alternative routes. Despite my having activated the option, Apple Maps never showed me traffic in Viterbo, nor did they offer alternative routes. I checked in other, bigger Italian cities, and traffic was supported there, alongside multiple routes. This gives me hope that support for more cities is coming.
\n\nLocal business search is inferior to Google Maps. Apple has nicer icons to differentiate businesses, but the results are fewer and out of date. I did some research, and it turns out that, while Google uses PagineGialle.it for its business listings – PagineGialle is the biggest directory provider in Italy, one almost every business owner relies on to show up in searches – Apple uses data from other sources, such as Acxiom. This explains why Google Maps has recent results and information (phone numbers, URLs) for businesses that are still running, while Apple Maps has several outdated results, with basic info like phone numbers and webpage addresses often missing.
\nEven worse, Apple Maps doesn’t know of the existence of several institutions in my town like schools and government offices. Google does.
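The road-label behavior described above boils down to a priority cutoff per zoom level. Here is a minimal Python sketch of how such filtering is generally expected to work – the road names, ranks, and zoom-to-cutoff mapping are all invented for illustration, not Apple’s actual algorithm:

```python
# Hypothetical road-label filter: lower rank = more important road.
ROADS = [
    ("A1 Autostrada",      1),  # motorway
    ("SS675 Superstrada",  2),  # trunk road
    ("Via Cassia",         3),  # regional road
    ("Strada di campagna", 5),  # minor rural road
]

def labels_for_zoom(zoom: int) -> list[str]:
    """When zoomed out (low zoom), surface only high-priority roads;
    minor roads should appear only as the user zooms in."""
    max_rank = max(1, zoom - 7)  # assumed mapping from zoom level to rank cutoff
    return [name for name, rank in ROADS if rank <= max_rank]

print(labels_for_zoom(8))   # zoomed out: only the motorway
print(labels_for_zoom(12))  # zoomed in: every road, down to rural lanes
```

The complaint, in these terms, is that iOS 6’s Maps behaves as if the ranks were scrambled: rural lanes surface at zoom levels where only the motorway should.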
\nIt’ll be interesting to see if, alongside a standalone Maps app, Google will also release an SDK for developers to integrate into their apps. Because Maps is a system framework on iOS, every app that uses Map Kit will automatically switch from Google Maps to Apple Maps in iOS 6. This is a fundamental change for AR apps like Where To, or, in general, other apps with a strong location component.
\nWhere To and Facebook using Apple Maps, no app update needed.
\nWhat’s really bad, in my opinion, is the removal of Street View from iOS. I used Street View all the time to explore cities I wanted to visit and to get a general idea of what walking and driving around there would be like. Now, Apple has its own take on 3D browsing, called Flyover: an aerial 3D representation of major city areas in satellite view. Meaning, you won’t get to “virtually walk” through actual photos of a city; you’ll see a 3D rendering of the cities Apple supports today (with a “flying around” animation Scott Forstall really seemed proud of). And it’s a poor representation at that, with slow loading times, laggy animations, and a weird “apocalypse effect” if you try to zoom in too closely.
\nDon’t look at Apple’s promotional images – try it for yourself.
\nGhost cars?
\nGoogle’s version of Piazza del Popolo in Rome Vs. Piazza del Popolo after a nuclear disaster according to Apple.
\n\n\n
If you try to use Flyover like Street View – as a way to see what walking around a city would be like – prepare to do a lot of pinching, zooming, and panning, gestures that, most of the time, iOS 6 won’t recognize correctly. As a side note, you could use iOS 5’s Street View with one hand.
\nAnd if Flyover isn’t available in your city, like in Viterbo (for context, here’s the current coverage of Street View around the world), you’ll get an ugly, flat, “3D” satellite view. If 3D buildings aren’t supported, the 3D button shouldn’t be available at all.
\n\nEven if Flyover had better imagery and animations, and were faster and more responsive, I don’t think it would be as useful as Street View. It’s a cool toy. But to get the job done – to explore cities as humans would – I have to use Street View on my computer. Humans drive and walk; they don’t fly. Flyover sounds good in marketing material – “stunning 3D images with animations!” – but in actual usage, it’s terrible.
\nMichael Degusta wrote an excellent overview of the Google features Maps is losing in iOS 6:
\n\nOn the plus side, at least people are getting turn-by-turn directions and Apple’s Flyover feature in exchange, right? Not so fast: 20 countries (population: 3.2 billion) are losing transit, traffic, or street view and getting neither turn-by-turn nor Flyover. The biggest losers are Brazil, India, Taiwan, and Thailand (population: 1.5 billion) which overnight will go from being countries with every maps feature (transit, traffic, and street view) to countries with none of those features, nor any of the new features either.
\nIt gets worse. Even in countries where turn-by-turn and/or Flyover are available, the iPhone 3GS, iPhone 4, and the 4th generation iPod touch won’t support them. These devices are owned by tens of millions of users who may update over-the-air when prompted, only to find they’ve lost features and haven’t even gained any of the marquee Maps features in return.
And this is where I come back to my initial point. People want their devices to work. Normal people use these things to plan trips, go to work, wake up in the morning, catch the bus to go to school. These devices have changed and improved many aspects of people’s lives. We’re not playing games here anymore. The tech press is so entrenched in itself that we have forgotten normal people use their iPhones and iPads not for “reviews” and “exclusives” – they use them to do stuff. To get the kids to school on time. To learn a city’s landmarks and must-see locations before going there.
\nHow are we going to tell these people that, because of Apple’s strategy, they’ll have to cope with an inferior version of Maps?
\nHow do we tell students that public transit directions are no longer available, that they’ll have to use separate “App Store apps” – which aren’t available yet?
\nCan we justify Apple Maps in the name of the greater good?
\nPersonally, I can’t. Because while I could go on and let my friends read the “Company” section above and try to make them understand that, yes, that’s why Apple had to ditch Google, the truth is – they don’t care. They are going to update to iOS 6 because they’re curious, just like everybody else, and they’re going to ask about “the Maps app that doesn’t work anymore”.
\nThey won’t say “Yeah, but at least Apple has more control now”.
\nThey won’t say “At least Apple’s icons are nicer”.
\nThose are things bloggers like me write. The “normal people” will hate that the new Maps app isn’t as good as before.
\nAnd we’ll have to tell them that “It’ll get better soon”. Because it isn’t great from day one.
\nIt’s time we stop giving Apple a free pass on everything. Enough with the sugar-coating. Earlier this year, someone argued that the privacy fiasco was actually a good thing because it could have been worse on Android. What kind of explanation is that? It was bad, period. With Maps, I have a similar black & white view. The current version of Maps is, from a data standpoint, a step backwards from Google Maps, with the exception of turn-by-turn navigation, which is a great addition.
\nAs a tech writer, I understand the importance of Apple’s new Maps, and I applaud their decision to build their own solution for the future. I also understand that, while Apple could have licensed some Google technologies such as Street View and public transit directions to include as options in the new Maps, they didn’t want to.
\nAs a user, I can’t help but think that this needed more time, that maps simply aren’t ready in many areas, and that Google Maps was just better.
\nAs the saying goes, you can’t please everyone. When you’re building software and you’re targeting multiple audiences at once, you can’t build at the same level for all of them. And that’s how I see iOS 6.
\nFor the most part, Apple built iOS 6 for itself and to guarantee the prosperity of certain areas of the OS. Areas such as search, increasingly shifting to Siri, and location. As a reflection of these actions, Apple gave new goodies to developers, as it understands the importance of a healthy developer ecosystem that sets iOS apart from the competition. And last, Apple refined iOS 6 for the users, polishing many inconsistencies, but lacking the Notification Center and iMessage breakthroughs of iOS 5. From this perspective, I’d say that, of the three audiences, Apple mainly focused on two: the company and the developers.
\nThat’s not necessarily a bad thing. Users don’t need to learn a new UI or app every year, especially when they’ve become accustomed to how things work. Adding features for the sake of adding is not innovation. Users want their devices to keep working with the same degree of functionality, which is why I see Maps as a real, tangible problem today. But this doesn’t mean iOS 6 isn’t a notable update. iOS 6 is a good improvement over iOS 5 with several welcome refinements and additions like Facebook, more languages for Siri, and a faster Safari.
\nIn my opinion, iOS 6 has, right now, worse Maps and App Store search; especially for Maps, if you rely on features like Street View and public transit directions, I can’t recommend the update until an official Google Maps app comes out.
\nFor everything else, iOS 6 improves on almost every aspect of the operating system, and sets the stage for a stronger platform in the future.
\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.
\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.
\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;
\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;
\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.
\n\nJoin Now", "content_text": "In some ways, iOS 6 is not a major update. And yet, in others, it’s possibly the biggest thing to happen to iOS since iPhone OS 1. Both of these assertions have far-reaching consequences for the users, third-party developers, and Apple itself.\nIn June, soon after the official announcement and preview of iOS 6, I concluded my general overview of the software with four questions. Looking back at that article now, those questions are more relevant than ever.\n“Will the App Store redesign also bring new curation and search features, as many developers asked?”\n“We didn’t quite get the “silver” system theme that was rumored; Apple seems to be moving towards blue gradients as a standard UI element, but not everyone’s liking the change for toolbars and status bars. Will they reconsider or improve upon today’s beta in terms of looks come Fall 2012?”\n“With a new iPhone likely to be released in October, will we see even more features being added to the OS to take advantage of the new device’s hardware?”\n“Will Google release a standalone Maps application?”\nThe answer to three of them is “no”. The last one – whether Google will release a standalone Maps application for iOS on the App Store – could be a “most certainly yes”, but we don’t know any more details.\niOS 6 is a controversial release, in that through the following days we’ll likely witness several news outlets and independent bloggers declare Apple’s doom or absolute genius, depending on the Internet clique they choose to side with. I think that, in this case, the truth lies somewhere in the middle – a gray area that needs a calm and thorough consideration. At the same time, I also believe that the “controversial” nature of iOS 6 needs to be analyzed for its various facets and reasons of existence. Why did Apple choose these features for iOS 6? 
What does the user make of it?\nI have been testing iOS 6 for the past few weeks, and I have (slowly) come to the conclusion that there’s no easy way to cover this update with a traditional review strategy. Instead, I have decided to take a look at the software from multiple perspectives, understanding the possible implications, downsides, and improvements for each one of them.\nWhile last year I would have answered the question “Should I upgrade to iOS 5 right away?” with a resounding “Yes”, this year I’m not so sure about the “right away” part.\niOS 6: The Developer\nIf you’re a third-party developer who makes apps for iOS, you should be relatively excited about iOS 6, and I don’t think I need to tell you why. The developers I spoke with about iOS 6 told me that, in this release, there are some great changes, ranging from little goodies that fix long-standing annoyances of the SDK to bigger, new, and more important APIs. It was a shared sentiment among the developers I contacted that there will be plenty of room for experimentation as soon as the update’s adoption rate rises to a level where going iOS 6-only can be considered.\nTake, for instance, the additions to the Event Kit framework. Starting today, you’ll begin to see apps integrating with Apple’s Reminders, as, thanks to iOS 6, third-party apps can directly access Reminders data. Apps can now view and share to-do lists in the Reminders app, with options to create and modify reminders, assign due dates and priorities – even set location and time-based alarms. I have personally tested a couple of upcoming updates to popular third-party apps that implement the new Event Kit framework, and it’s a great addition.\nIt’s easy to understand why. By accessing Apple’s Reminders app and its data, developers are essentially given access to a “platform” for to-dos and alerts that syncs through iCloud on the iPhone, iPad, Mac, and, now, the web. 
Think of all those “simple to-do list apps” that have populated the App Store for the past four years: now, instead of implementing their own sync mechanism (which has costs both in terms of initial implementation and future development) or having to rely on external services like Dropbox, they can just add Reminders support and get all the benefits of iCloud. iOS users know how Reminders works; instead of choosing the syncing system they prefer, it’s likely that, with iOS 6 and third-party support, users will simply pick “an interface” for their Reminders. Don’t like how Apple’s Reminders presents your lists? You can try another app that has (more or less) the same functionality, with a different design. Just like users have been able to pick their preferred calendar client, with iOS 6 they’ll have options for “Reminders clients” as well.\nWhile Event Kit is the most “user facing” addition to the SDK, there are several other improvements worth mentioning. iOS 6 has native Facebook integration: like Twitter in iOS 5, users can sign into their Facebook accounts from Settings, and start posting status updates, media, or links to Facebook without the need for a dedicated client. Developers who want to integrate Facebook into their apps will be able to leverage single sign-on to let users quickly log in with their Facebook credentials, as they are already saved on the device. Furthermore, this new social framework can essentially give any app decent Facebook features thanks to “share sheets” – little pop-up menus that enable users to post to Facebook from anywhere on iOS. 
It means developers won’t have to create their own custom Facebook login systems anymore.\nThere are other notable additions to the SDK, such as In-App Purchases for any kind of iTunes content (music, books, etc.); Smart Banners for Safari, a way to easily redirect users to the App Store if they visit a developer’s webpage; new Camera APIs with access to face detection and exposure (among other things); and, obviously, APIs for Passbook and Maps, Apple’s two new apps in iOS 6. Not to mention improvements to layout and presentation techniques, which will pave the way for richer, more engaging apps that use text-based lists less and employ “more visual” elements such as tiles and grids.\nBut developing apps that take advantage of iOS’ latest features would be useless without a platform to promote and sell them.\nThe App Store\nIn May I took a look at the first four years of the App Store, speaking to various third-party developers about their experiences and concerns with Apple’s ecosystem, then counting 600,000 apps for iPhone and iPad, and now well over 700,000 according to recent data from Apple. Among the several suggestions and critiques made by developers, nearly all of them expressed their wish for a more social, truly curated, and more searchable App Store that would supersede the structure Apple imagined in 2008.\n\n(Click for full-size)\nThe App Store has grown exponentially over the past four years, and the overall presentation of software needs to take into account the sheer number of different apps available to users today. At the same time, Apple shouldn’t simply use automated algorithms to showcase noteworthy apps – it should give less importance to charts (which can be easily tricked) and “curate” software more with custom sections, weekly picks, and recommendations.\niOS 6 only partially addresses these concerns, and, unfortunately, makes some functionality of the App Store inexplicably worse.\nOn iOS 6, the App Store has a new look. 
Gone is the black, flat look of the bottom tab bar to leave room for dark gray, almost Tapbots-like tabs with a subtle 3D effect. The blue toolbar, a marquee graphic element of iOS since 2007, is also gone in favor of the same dark gray scheme that, App Store aside, has been implemented in the iTunes Store as well.\nIn terms of content organization, the front page has been completely redesigned both on the iPhone and iPad. Sharing the same “Featured” name as the pre-iOS 6 days, the four banners that were displayed at the top as small images (iPhone) or rotating galleries (iPad) have been replaced by a swipable carousel of larger images that, on the iPad, sports a Coverflow-like interface design and animations.\nBelow the featured gallery, Apple is now showcasing “New & Noteworthy” apps alongside “What’s Hot”, a custom section that changes every week, and a second set of swipable “mini banners” for things like “Education” and “Apps Made by Apple”. On the front page, there’s obviously room for Apple’s recently relaunched “Free App of the Week” and “Editor’s Choice” initiatives.\nApple’s reshuffling of App Store content and sections doesn’t stop at the front page. The tab bar, the primary way of navigating the App Store, has been reorganized to always show five areas of interaction: Featured, Charts, Genius, Search, and Updates. The most visible change is that Categories are gone from their spotlight position in the tab bar, replaced by Charts, and that Genius, once relegated to the front page’s top toolbar, is now an active element always looking for the user’s attention. This change is interesting for two reasons: first, it seems to suggest Apple wants to invest in its Genius algorithm more, perhaps hoping it’ll prove to be a feasible system to bring personalized app recommendations to the user. 
I can’t confirm this theory, as the Genius section I tested with the iOS 6 GM for this review was completely unusable – tapping on the “Not interested” buttons didn’t do anything – so I’m not sure how Genius is supposed to work. Better algorithms for automated, personalized recommendations were among the developers’ wishes for iOS 6, so we’ll see how this will play out in the next months.\nMore importantly, the inclusion of Charts in the bottom toolbar speaks to Apple’s preference for displaying content that is doing well in the Top Charts, rather than new apps available inside specific categories. Previously, there used to be a “Top 25” tab and another one for Categories; now, the Charts tab is immediately after Featured, and Categories have disappeared from the tab bar.\nEverything’s not lost, though. Single categories are still accessible on the iOS 6 App Store: in the revamped Charts area, you can view the Top Free, Top Paid, and Top Grossing charts for a single category by tapping the Categories button in upper left corner; once in there, you’ll be able to choose the chart you want to consult, and tap “See All” to view all results in a list with infinite scrolling (in my tests, laggy and slow).\nThe good news for developers is that, with iOS 6, browsing single categories (not their top charts) may now be more accessible and visible to the end user: whereas in iOS 5 some always ignored the Categories tab in the bottom bar, a button to access every category is now available directly from the front page; on the iPhone, the button is clearly labelled “Categories”, and on the iPad there’s a series of tabs at the top starting with All Categories, then Books, Business, and a “More” button to display other Categories in alphabetical order. 
Following this new organization scheme, it seems like Apple is trying to position the front page – the Featured tab – as a “front page of all categories”; going further down into the category list will open a “mini front page” with New, What’s Hot, and Paid sections. The lack of sorting options for categories persists in iOS 6, as there’s still no way to sort items from a specific category by price, release date, or any other parameter.

In theory, the changes mentioned so far may sound like a definitive improvement over the iOS 5 App Store. Unfortunately, in actual testing, it does seem as if the new App Store layout engine is, at best, rushed and unpolished. Generally speaking, scrolling performance isn’t good: lists, carousels, charts, horizontal lists – they all stop working every once in a while, requiring a forced restart of the App Store application. There are also inconsistencies across the entire presentation of content: in the iPad’s Charts area, three charts are displayed in landscape mode – Paid, Free, and Top Grossing. This column layout is, at first, intriguing, but while using it you’ll notice that the lists don’t scroll particularly well. Because the main window is split into three columns, to quickly scroll back to the top of a list you’ll have to tap on the portion of the status bar above that column; this system is consistent with the behavior of Settings.app on the iPad (also a multi-column layout), but on the App Store, the gesture is sometimes unresponsive.

I’m not alone in thinking the App Store on iOS 6 needs serious improvements before being ready for an optimal and stress-free customer experience. As the app is an HTML wrapper for content and styles Apple can fix “server-side”, there’s hope Apple will push fixes to the App Store without software updates.
For now, however, the new App Store feels like a step backwards in terms of speed and reliability, and this is not good news for third-party developers who need users to pleasantly browse and discover software through Apple’s storefront.

Searching For A Better Search

So far, I have covered the content reorganization that went into the iOS 6 App Store, and mentioned some of the changes developers once wished for, but that didn’t come with this new version. Let’s now touch upon the other two most important changes of the iOS 6 App Store: how apps are presented, and search. And let’s start with the good news.

Tapping on an app’s icon in the new App Store reveals a completely redesigned, cleaner, and full-featured description view with more options. On the iPad, this window is modal, offering a simple way to go back to the main store interface.

The new description view is organized in two areas: at the top, the icon, name, rating, and Buy and Share buttons. In the lower portion of the window, there are three tabs for Details, Ratings and Reviews, and Related. Personally, I believe the new app description window is the best change Apple brought to the App Store, with some clever touches that should help users and developers alike. Separating “Details” from “Ratings and Reviews”, for one, allows for a cleaner design that actually displays more information in each area.

The Details view starts with a swipable gallery of screenshots; on the iPhone, swiping on the first screenshot will automatically scroll the window down, placing the three aforementioned tabs at the top; if you want, you can tap on screenshots to display them in full-screen mode. Description and What’s New are located below, with a “More” button to expand the text inline; regardless of whether an app has a pending update or not, the “What’s New” changes are now always visible for every app.
Figuring out the latest changes to an app was one of the most common annoyances of iOS, and I’m glad this has been fixed in iOS 6.

Underneath the new and integrated Description area, Apple displays information about an app (Seller, Category, Updated, etc.), Developer Info, and, a new feature, Version History. This is a particularly welcome addition, as it lets you check out the full changelog of every version of an app released to date, from the most recent to the oldest. Version History is instrumental in showing which developers are truly committed to the development of an app, as opposed to those who release an app, forget about it, and never update it. I personally try to avoid applications that haven’t been updated in a while, as that implies their developers don’t care much about them. At the same time, Version History provides an effortless and precise way to check whether or not a functionality you’re looking for has been added to an app.

Overall, I’m a fan of the new Details area. I believe it does a much better job at describing and showing an app’s feature set than the old App Store did.

The Ratings and Reviews tab, on the other hand, is a mixed bag. Its top section is promising, with a prominent Like button to recommend an app on Facebook. This is one of the perks of integrating Facebook at a system-wide level, and it is a simple yet effective way to share an app’s direct link with your friends. But directly below the new Facebook Like button there are the actual user-written reviews…which I never read. I know most users take a peek at reviews to instantly see whether or not an app is good, but, in my experience, I always stumble upon customers complaining about prices (usually within the range of $3) and missing features no one said would be available.
So, I don’t read reviews.

The third tab, “Related”, hosts the familiar “Customers also bought” recommendations we’ve previously seen on iTunes.

But, in my opinion, the best change Apple brought to the single app view isn’t a tab, or a new design, or some new algorithm. It’s a button. Specifically, the new Share button in the top right, which allows you to send an app’s link via email, Messages, Twitter, and Facebook.

All these new actions – presented through the new, more icon-oriented share sheet of iOS 6 – should dramatically increase the ease of sharing for customers, possibly driving more sales towards developers. The “Copy Link” action is also a long time coming, a “finally” that I thought I’d never say about the App Store.

And then there’s Search. In a somewhat curious attempt to bring a mix of the Genius UI and Chomp to search, results are now displayed as “cards”. These new cards show the first screenshot of an app inline, giving an immediate idea of the kind of software a user is about to check out. Alongside the screenshot, the icon, name, and ratings are also displayed, as well as a button to Buy/Download (or Open, if the app is already installed on the device). While this sounds great, there’s a big problem with such a UI revamp.

On the iPhone, only one search result is displayed per page; to move to the next result, you have to swipe. On the iPad it gets slightly better: Apple has abandoned the integrated iPad Apps & iPhone Apps interface for separate tabs (iPad Apps is the default one) and the aforementioned cards; thanks to the iPad’s larger screen, six cards are shown simultaneously. At the top, there are also search filters, which are still mysteriously absent from the iPhone’s App Store.

The issue I have with this new interface for searching App Store apps is both conceptual and technical.
First, I believe that, on the iPhone, limiting a page to displaying only one result goes against the very idea of searching: imagine if Google only displayed one link per page, or if iTunes on the computer only visualized the “top hit” result, forcing you to swipe to see more. On the iPhone, the new App Store search interface has an information density problem, and, as of this writing, there’s still no option to return to the classic list view. Whereas the old App Store search visually suggested that, yes, various results were available for your query, the new search UI forcibly puts the focus on the first result alone. We’re moving from 5–6 results displayed on a screen to just one, and I’m not sure swiping horizontally will be as intuitive as simply scrolling to load more results. Surely, it’ll be more tiring.

Second, the new search interface just doesn’t work as advertised from a technical standpoint. Perhaps Apple will improve it with fixes on its remote servers (again, just like the other annoyances mentioned above), but as of right now, swiping between results is far from an optimal experience. On iOS 5, the “infinite scrolling” Apple used for search more or less worked as expected: if you had a decent Internet connection, you could see results (app names and icons) load within seconds. On iOS 6, cards are slow to swipe through, animations sometimes fail to finish properly, and, depending on your Internet connection, it’ll take a few seconds to load results beyond the first few cards you’ve loaded. Overall, swiping between cards isn’t nearly as fast as scrolling through a vertical list of results.

I find the iPad version to be a viable compromise. Cards are obviously being used, but at least six of them are always visible, both in landscape and portrait modes. Instead of swiping, you have to scroll to load more cards, and, generally, the system seems to perform better on the iPad than the iPhone.
Also, the search filters available on the iPad help considerably in refining your search and getting to the results you want (or need).

I think I get why Apple is switching to the cards layout for search. By turning results into cards, they’re enlarging tap areas, making it, in theory, easier for the user to tap on a result without accidentally loading another one. At the same time, by placing the first screenshot of an application inline with search results, they are trying to increase the “recognizability” of apps – which, to date, have always been associated with their icons on the App Store. And yet, for as much as I’m trying to understand Apple’s reasoning with cards, I can’t help but have the feeling the feature was rushed out the door. The current display of cards on the iPhone doesn’t seem to make sense as a search interface: it works on the iPad, and I can say it does work for Genius, where you’re consciously going through a list of recommendations one by one, manually. But App Store search, with over 450,000 apps exclusively made for the iPhone, isn’t suited to cards – an interface paradigm that clashes with the act of looking for information while getting a rapid overview of it. Can you imagine cards being applied to the Top Charts?

The underlying problem goes even deeper: if Apple had a perfect search algorithm that managed to always find exactly the kind of app a user was looking for, cards could work better. They wouldn’t be a great alternative to lists, but users would notice the issue less if results were top-notch. But Apple doesn’t have perfect search results right now. Apple doesn’t even have good search results, as the algorithm they’re using returns the most curious choices for a variety of queries.
The YouTube app is the fifth result for “YouTube”; “Apple” doesn’t return apps made by Apple (which also happens to be a section inside the App Store); looking for “Twitter” places Tweetbot around the #20 position, with a bunch of games and Instagram apps before it. And then there’s stuff like this still happening on a weekly basis, cluttering search with software users don’t need.

If cards are meant to highlight the first results, then those have to be state-of-the-art results. But nothing has changed since May: search is still dumb. And now, on the iPhone, it’s got a new interface that makes it even worse.

For The Developers

On the technical side, iOS 6 comes with some great changes for developers. Additions to the SDK such as the new Event Kit framework and Camera APIs will allow them to build cool new features and apps, whilst Facebook integration will undoubtedly prove to be a solid way to let users enjoy social functionality without being annoyed by login screens and confirmation dialogs. On the other hand, though, there are still some lingering issues and questions that developers will have to keep in mind while considering iOS 6. In spite of the numerous improvements, several developers I talked to were still largely unsatisfied with the current state of iCloud storage, which keeps posing various challenges for developers of document-based apps or software that needs to sync large and complex databases across multiple devices. Others were more skeptical about starting to require iOS 6 from their users, as they’ll need to see what the new OS’ adoption rate looks like before making the jump to developing iOS 6-only apps. Hopefully, with over-the-air updates, the majority of active iOS users will update within the next few weeks.

Developers I contacted welcomed Apple’s approach to Facebook integration, which will hopefully help them register more sales thanks to users actively sharing links to their apps from iOS devices.
Overall, they also appreciated the (many) subtle tweaks Apple made to the App Store’s system-wide integration in iOS 6: for instance, iTunes links received via email will now open a modal window to download an app immediately from Mail, rather than yanking users out to the App Store. Moreover, Apple is now allowing users to download app updates without entering a password (thus likely increasing the percentage of users who upgrade to the latest version of an app), and it is displaying a “New” badge on the icons of newly-downloaded applications.

However, developers are concerned by App Store search on the iPhone, and the constant changes Apple is making to its search and ranking algorithms, which, despite Apple’s efforts, haven’t so far drastically improved the accuracy of App Store results. With over 700,000 available apps and a new interface, developers who make a living out of selling software hope Apple is listening.

iOS 6: The Company

It’s easy to look at iOS 6 from Apple’s perspective: self-preservation.

Since Steve Jobs came back to the company in 1997, Apple has increasingly prioritized control over features in some key areas of its products and services. Control over the software that is sold on the App Store. Control over the dock connector and accessory market for portable devices. Control over software updates, no longer dictated by the carriers. Control over the user experience, with guidelines on what is possible to do with apps for the iPhone and iPad.

Sometimes, however, in order to ensure basic functionality for a product, Apple had to give up some of that control, either because they were not ready with their own alternative, or because the market didn’t offer any possible alternatives. Samsung played a big role in providing components for the iPhone up until this year, when Apple decided to look at other options, besides designing the A6 processor as a truly custom core.
On iOS, they baked Google Search into Mobile Safari because it was (and still is) the world’s most popular search engine, but in iOS 5 they started implementing more specific third-party services (such as Wolfram|Alpha) to provide users with more accurate, precise, and immediate results, while at the same time slowly moving away from Google.

In iOS 6, Apple went ahead and removed YouTube and Google Maps from the operating system, offering its own new, standalone Maps application.

For Apple, there can only be one big fish in the pond: themselves. Therefore, most of Google’s legacy built-in functionality had to go. But once again, Apple had to compromise on other things.

They had to choose the lesser of two evils. With iOS devices selling millions of units every month and an iPhone 5 coming soon (and off to a great start), should Apple have allowed Google – its biggest competitor in the mobile OS space – to keep a strategic position on iOS with built-in apps and search? Or, would it be better to cut deals with the smaller guys – Wolfram, Yelp, OpenTable, Yahoo, Rotten Tomatoes – to build a more focused search that happens outside of Google’s system? Should Apple have allowed Google to keep making money off search queries performed through Mobile Safari?

As we’ve seen, Apple chose the second option. They touted Google integration in the iPhone introductory keynote and used most of Google’s services for as long as they needed them; once they lost their leverage – once Google had built a strong mobile platform of its own – they had to look for alternatives. Alternatives which are still based on third-party services they don’t own, but which they can likely control more thanks to better deals, agreements, and a lack of conflicts of interest (Yelp doesn’t make smartphones or mobile OSes).

They started with Maps. By using a bevy of third-party services and licensed data, they built an experience they can control.
Not just in terms of visual appearance – Apple developed the Google Maps app, but couldn’t, for example, modify the map tiles – they effectively created a pipe to turn the location of iOS users into aggregate data they can use to develop more in-house location services in the future. Think of a crowd-sourced traffic database, only on a much larger scale, with much more information available about users’ locations, habits, devices, and preferences. Just an example: by aggregating Maps and usage data, Apple could monitor what percentage of iPhone owners visit certain retail chains in the US, cut deals with those chains, and convince them to provide Passbook offers for loyal customers who have an iPhone. This is the kind of data that, before, Google would have likely kept for itself to build better services and more lucrative advertising deals. That was money and control over the user experience drifting away from Apple.

Similarly, every search users performed on iOS used to go exclusively through Google. Starting with iOS 5, and even more so with iOS 6, Apple is offering an alternative way of searching: Siri.

For some kinds of data, Siri is more focused and to-the-point than Google’s mobile search results. For restaurant reviews and reservations, for instance, Siri offers a more elegant interface for browsing available options around you; for sports information and players’ data, the Apple-designed Yahoo Sports integration is better-looking and faster than finding updated stats on Google.com; for questions about movies, geography, or word definitions, Siri is quicker than going to Safari to type your question. As Graham explained in August, this has increased Apple’s reliance on third-party services, but the company has managed to keep their branding as minimal as possible, making users think these queries are, in fact, happening thanks to Apple. And for all we know, the data processed by Siri uses other services’ databases, but it goes through Apple’s pipe.
Again: control over data, and control over the user experience. For the time being, Apple sees this as an acceptable compromise. The lesser of two evils.

“Right from day one, the Weather, Stocks, YouTube and Maps apps were inherently powered by Yahoo and Google – but the apps were designed by Apple and there was no overt branding. Weather and Stocks to this day exhibit just a single, Y! icon in the corner, whilst there was very minimal branding in the YouTube and Maps apps. Similarly, all the services that are integrated with Siri feature small and subtle branding that is almost invisible at a glance.”

– The Rise Of Third Party Services And Fall Of Google In iOS

Obviously, Apple wasn’t ready to completely eliminate Google from the position of default search engine for Safari, and probably never will be – Google’s dominance in the space far exceeds the probability of another search engine gaining the same amount of users and data in the short term. But Apple can take steps to make Google as invisible as possible, all while “guiding” users towards workflows and interactions that don’t require Google at all.

In Safari for iOS 6, a new tab opened on the iPad won’t automatically place the cursor in the search bar – it’ll place it in the address bar. And speaking of the search bar: it doesn’t have “Google” (or Yahoo, or Bing) written on it anymore. Just “search”. It’s subtle, but it’s the kind of change that suggests search itself is a feature, not Google’s search. Other Safari features, such as iCloud Tabs and constant bookmark synchronization, will allow people to keep the websites they visit always available across devices, thus decreasing the need to search again. And as mentioned above, Siri should prove itself worthy of consideration for some types of search – those that Google still hasn’t fully optimized for the instant-on, get-results-fast nature of mobile.

Apple can’t do everything.
They can’t build hardware, software, and services while simultaneously thinking about search, sports, weather, restaurants, mathematical computation, and social. Not even with $100+ billion in the bank can they afford to put their proverbial level of focus and care on dozens of different skills at once. They need partnerships. And they need to pick them carefully. The right horses to ride going forward.

iOS 6 is the epitome of a company that has set out to find an equilibrium between data and presentation, control and user experience.

To build a stronger structure, sometimes you have to start over. From the foundation. To ensure its long-term survival and expansion in key areas such as location and search, after iOS 5 Apple found itself wondering whether it should:

1) Ditch Google Maps and offer something of its own, or

2) Keep on improving iOS with major new “tentpole” user features without rethinking anything.

All while (3) providing developers with new APIs to build apps for the richest software ecosystem on the planet.

The first option would have given them data and control, at the cost of complaints from users who would find out Apple’s Maps isn’t nearly as full-featured and accurate as Google Maps yet (personally, I don’t believe the theory that it was Google that pulled out of the iOS deal).

The second option would have kept users happy: everybody loves new features. However, it would have put the future of the platform at risk, at least in some areas.

As for the third option, it’s really obvious: the best thing Apple can do is give developers more and better tools to build apps for its ecosystem.

As the principle goes, among three favorable options only two are possible at the same time. Rethink core functionalities, offer major new user features, keep developers happy: pick two.

iOS 6: The User

We talk about Apple’s self-preservation plans and company strategies, but the flip side is – people don’t care about this stuff.
This is nerd talk at its finest – details that we, as tech writers, focus on because we’re striving for detail and reporting accuracy. But the millions of people who buy an iPhone every month don’t care about Apple’s deals with Google or the importance of Siri in shifting away from Google search. They don’t know about ad revenue, APIs, or aggregate data.

Let’s get real. People just want things to work. We, the tech press and the nerds, care about the gossip and details going on behind the scenes. But the rest of the world doesn’t. My friends, your friends, your neighbor, and that guy who’s friends with your brother-in-law – they use their iPhones. They don’t want to know about the drama of the people who make them.

As a user, I find iOS 6 to be a different kind of upgrade than, say, iOS 5 or iOS 4. The last two major updates to iOS brought OS-redefining user additions like multitasking, folders, Notification Center, Siri, iMessage, and Reminders. And before them, iPhone OS 2 and 3 introduced the App Store and copy & paste. Apple has a history of bringing major, user-facing features to iOS with the new version it releases every year.

iOS 6 swaps the Maps application for a new one, adds Facebook integration along the lines of iOS 5’s Twitter integration, and then focuses on refining everything else.

Some parts of iOS 6 are painstakingly refined. Which isn’t a surprise given that, at this point, this operating system has been available for five years, and in development for many more.

The Things I Like

Safari, already a fine browser, is now even faster thanks to JavaScript improvements, has full-screen support for landscape mode on the iPhone, and can hold more tabs on the iPad (up to 24, available inside a popover). I already liked the possibility of syncing Safari tabs across my Macs running Mountain Lion, but as I expected in my review, iCloud Tabs’ full potential is only truly revealed when you start using the feature across OS X and iOS.
Now I find myself seamlessly opening pages I had on my Mac while reading on the iPad, and vice versa.

The speed improvements of iOS 6’s Safari are noticeable and impressive, and I like subtle refinements and additions such as the ability to simply copy a URL (finally) and the fact that new tabs on the iPhone aren’t appended at the end anymore.

iOS 6 comes with a new “share sheet” that offers a visual replacement for many of the “action/sharing buttons” that previously triggered list-based, vertically-oriented textual menus. The new sheet offers a grid-based, icon-oriented sharing menu for Safari, with icons for Mail, Messages, Twitter, and Facebook, as well as Print, Bookmark, and Reading List. The sheet has also been implemented in Photos, allowing you to share an image to Photo Stream or assign it to a contact, among other options. It works in Notes, in every app that uses the “Open In…” feature and has been updated for iOS 6, and even in Maps.

Tracing the path Apple took to achieve immediacy and user-friendliness with app icons, the new share sheet makes sharing and opening files into other apps faster and more intuitive. It’s a reasonable change.

Facebook integration means iOS 6 can officially “talk” to the social network I use every day to keep in touch with my non-geek friends who don’t use Twitter. While I have mentioned the implications for apps and developers, from a user’s perspective I appreciate the simplicity of Facebook sharing in iOS 6. It works exactly like iOS 5’s Twitter integration, only with a different color scheme and more privacy options. Since I started testing iOS 6, I have used the new “Tap to Post” Notification Center widget to quickly send status updates to Facebook, but I’ve also uploaded pictures from the Photos app.
I like how Apple made Facebook’s complex privacy settings as simple as a “Choose Audience” toggle that lets me set updates to Public or Friends-only.

While the new Facebook app is a terrific improvement over the old version, I still prefer using iOS 6’s built-in sharing if I need to send off a quick update or photo. Because I have had bad experiences with letting other services mess with my contacts, I haven’t activated the option to “Update Contacts” using Facebook and the iOS Address Book. Single sign-on will be huge for game developers and video discovery apps with a social component.

Mail received the same VIP feature I outlined in my Mountain Lion review. I use this functionality sporadically. Instead, I’m glad to see things like separate Archive/Delete options (tap & hold to reveal both), the ability to attach photos and videos to a message, and easier draft access (tap & hold the compose button) finding their way to Mail in iOS 6. The “attach media” feature is particularly interesting, as it’s been implemented through the standard copy & paste menu, and it’s somewhat reminiscent of an old concept for an iOS Services Menu that I’m still looking forward to (maybe someday).

As a user, I appreciate Apple’s renewed interest in well-presented, understandable privacy settings that are simple to control and change at any time. In older versions of iOS, you could only see apps that were accessing your location or Twitter accounts. Following this year’s Address Book privacy fiasco, Apple went back to the drawing board and designed an entirely new Privacy section that clearly lists the apps that are accessing your location, contacts, calendars, reminders, photos, Bluetooth sharing features, and Twitter and Facebook accounts.
You can revoke permissions at any time, and everything is elegantly presented with app icons and no complicated menus or poorly-worded dialogs.

And then there’s the huge list of minor improvements, hidden features, and subtle refinements that Apple has added to almost every aspect of the OS. It almost feels as if Apple looked in every corner, listened to users’ feedback, and fixed all those little and not-so-little things that, for years or months, have annoyed iPhone and iPad users. Separate “send” and “receive at” email addresses for iMessage; phone number support for iMessage and FaceTime on devices like the iPod touch and iPad, through iCloud; Notification Center in the cloud, with unread notifications automatically dismissed if read on another device. And there’s more: per-account Mail signatures with HTML support; Lost Mode for Find My iPhone and iCloud.com; FaceTime over 3G (it worked reliably in my tests on 3 Italia’s network in my town); the totally-reworked iTunes Store with previews that keep playing even if you navigate between sections and close the app.

And many, many other things that we’ve collected in a separate post.

The Things I Don’t Use

Just because iOS is getting some new features doesn’t mean I have to use them all. As much as I’m intellectually curious and like to know about new things, there are some functionalities that I just can’t fit into my workflow and routine.

Do Not Disturb isn’t for me. Located in Settings, DND allows you to set a “quiet mode” for calls and alerts. If a device is locked, they will be silenced and a moon icon will appear in the status bar; then, when you turn off DND, they’ll be there in Notification Center waiting for you to catch up. Apple really cared about the details of Do Not Disturb: there’s a setting to always allow calls from Favorite contacts (or a group) even if DND is on, and another that will let a second call in a row from the same person within three minutes go through DND’s wall.
As you can imagine, there’s also a scheduling option to, say, automatically activate DND at night, so you won’t be bothered by calls or sound alerts.

For various personal reasons, if a call comes in at night, not only do I want to be disturbed, I have to make sure I can wake up to answer and see what’s going on. I don’t get random accidental calls at 4 AM, so if my mom is calling at that time of the night, I want to know why. Similarly, if I’m meeting someone but my doctor is calling, he’s got to come to terms with that – I have to take the call. Do Not Disturb is not for me because, by nature, I want to know why people are looking for me. But I can see why this feature will appeal to a lot of users who aren’t like me.

Similarly, while I know it’ll be great for some people, the “Remind Me Later/Reply with Message” feature added to the Phone app isn’t for me. I don’t receive that many phone calls throughout the day, to be honest, and the ones I do get are either from my parents, my doctor, or a friend of mine.

I’d rather just let the phone ring and go to voicemail than send a text to my friend saying “Can’t talk right now…Ti richiamo tra poco” (“I’ll call you back in a bit”). Because, personal preference aside, my iPhone is set to English, yet my friends are Italian, and Settings.app doesn’t let me modify the first string of the “Reply with Message” option. So, I’ve ended up with a mix of strange-sounding, possibly-cool-for-some-people messages that begin with “Can’t talk right now…” and end with “scusa mamma ho da fare” (“sorry mom, I’m busy”). I actually sent one of these to my dad, and it was awkward. I’ve always thought combining Italian and English sounded kind of weird.

I recently migrated my entire photo library to Dropbox, and set up an automated workflow to send new photos to the service from my iPhone, iPad, and Mac. For this reason, I am not interested in trying out iOS’ new “Shared Photo Streams” feature, because I know I wouldn’t use it.
With a shared Photo Stream, users can set up a private album to share with their friends (via email) over iCloud; members of a shared Photo Stream can “like” photos, post comments, and check out everything on the web or from their iOS devices. It is a pretty neat feature – it also supports notifications for new comments – but I have chosen to trust someone else with my memories.\nIf I were a parent or a teacher, I’d be excited about the new Guided Access functionality introduced in iOS 6. As part of the OS’ Accessibility improvements, Guided Access can limit an iOS device to just one app by disabling the Home button and even certain tap areas of the UI. I have played around with it a little, and I see how, for many, this will change the way iOS devices are handed over to kids.\nPanorama sounds cool, in theory. It’s a new feature of the Camera app that lets you save “panoramic shots” by capturing 240 degrees of what’s around you in one single motion. The software will then take care of processing the images, stitching them together, and trying its best to make sense of the small details that make up the 28+ megapixel photos you’ll have in your Camera Roll. In actual testing, Panorama has been nothing more than a “cool thing to check out once” for me, maybe because I don’t typically think about taking panoramic shots with my iPhone. I have tried apps like Pano and 360 Panorama over the years, but I kept coming back to Instagram or Apple’s Camera.\nFrom what I’ve seen, Panorama seems to work as advertised in good lighting conditions, but I have some doubts about Apple’s claims that it can figure out the correct exposure on its own. Maybe so with the iPhone 5.\nPassbook\nOn paper, Passbook is all kinds of promising. A new app in iOS 6, it wants to collect all your boarding passes, loyalty cards, movie tickets, gift cards, and more in a single place. 
It’s not a payment system by itself: rather, it’s an iOS-enhanced solution to collect redemption codes and tickets issued by others (such as United Airlines, Starbucks, or Fandango) and use them when appropriate. Instead of keeping cards in your physical wallet, you can set up Passbook to become your central repository for this kind of content. Developers can build Passbook-compatible cards and tickets using bar code formats such as QR (as well as PDF417 and Aztec) and a set of APIs that, among other things, will allow them to add or update passes in a user’s Passbook.\nPotentially, Passbook could be the first step towards the “digitalization of your wallet” many have been predicting for years. It’s not NFC – it’s a software solution that uses bar codes and iOS technologies to transmit information about you to the retailer or organization that issued a pass. However, in spite of its theoretical potential for all the parties involved (including Apple), I can’t offer more than a brief description of what Passbook is, because I haven’t been able to try it. The Passbook app links to a section of the App Store listing Passbook-compatible apps, which, in the period during which I tested iOS 6, weren’t available. I’m sure users will see apps adding Passbook support starting today, but being based in Italy, I’m also skeptical about how long Italian users will have to wait before they can try Passbook. When it comes to this kind of innovation in retail and local institutions, Italy is typically behind the curve, with slow adoption times and bureaucratic restrictions that prevent small businesses from embracing the latest technologies. 
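As an aside on the developer side of this: at the heart of every pass is a pass.json file describing the card and its bar code, which is then bundled with images and a signature. A minimal sketch of that structure, built here as a Python dictionary – the identifiers and field values are hypothetical placeholders, not from any real pass – might look like this:\n
```python
import json

# A hypothetical, minimal pass.json payload for a loyalty card.
# The pass type, team identifier, and values below are made-up examples.
pass_json = {
    "formatVersion": 1,
    "passTypeIdentifier": "pass.com.example.loyalty",  # hypothetical
    "teamIdentifier": "ABCDE12345",                    # hypothetical
    "serialNumber": "member-0001",
    "organizationName": "Example Gym",
    "description": "Gym loyalty card",
    "barcode": {
        "format": "PKBarcodeFormatQR",  # PDF417 and Aztec are also options
        "message": "member-0001",
        "messageEncoding": "iso-8859-1",
    },
    # "storeCard" is one of the pass styles; each style carries labeled fields.
    "storeCard": {
        "primaryFields": [
            {"key": "points", "label": "POINTS", "value": 120}
        ]
    },
}

print(json.dumps(pass_json, indent=2))
```
\nA sketch like this is only the data half of the story: the bundle also has to be signed with a pass certificate before iOS will accept it, which is exactly the kind of setup work services like Passdock aim to take off small businesses’ hands.\n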
I don’t know when we’ll start seeing names like Trenitalia, Alitalia, Spizzico, or Autogrill add Passbook support; based on what I’ve heard so far, there haven’t been any discussions about this, so don’t hold your breath for major Italian organizations adding Passbook support soon.\nMy hope is that private companies like Ryanair, as well as individuals, will set an example for the rest of Italian businesses and organizations to follow. My girlfriend runs a local dance organization that’s based in a gym here in Viterbo, where there are also Pilates and yoga classes. If she and the gym owner decided to support Passbook, they could set up a “loyalty system” for the oldest members with discounts and monthly promotions. They could update the passes over the air, send push notifications, and give passes to new members via email or a webpage. This would be far easier, more engaging, and more fun than the current pen-and-paper system they use for loyalty cards and discounts. Then there’s a friend of mine who, every year, organizes a series of cultural events here in Viterbo during the summer; instead of tickets – or in addition to them – he could set up iOS passes, relying on geofences to send push notifications to an iPhone’s Lock screen when people are near the area of the event (there are several locations with multiple “stages” to choose from).\nI hope that forward-looking individuals, private companies, and small-business owners will consider experimenting with Passbook. I hope that initiatives like Passdock (made in Italy) will promote the adoption of this technology for organizations that don’t want to hire developers to set up a one-time event or gift card.\nPassbook can be a lot of great things, but I can’t use it right now.\nThe Things I Don’t Like\nFor all the improvements that went into Mail, there have been some changes that I don’t really understand or appreciate. Firstly, pull to refresh. 
Initially created by Loren Brichter for Tweetie, pull to refresh went on to become the de facto way to check for updates in all kinds of applications. And now, following various open-source takes on the subject and a patent held (but not enforced) by Twitter, Apple has decided to use pull to refresh in Mail.\nI have two issues with Apple’s implementation. Since its inception, pull to refresh was used as a clever method to check for updates in a list of messages sorted vertically from newest to oldest. It was perfect for a Twitter client: as users reached the top of the timeline, they could continue scrolling to check for updates and load new tweets. But as pull to refresh gained popularity, developers started abusing it in all sorts of ways, defeating its original purpose. I can’t help but have the same feeling about Apple’s use of the feature – that it is somehow forced and has been added only because “users are now familiar with it”. Familiarity, however, doesn’t necessarily mean that pull to refresh is the right metaphor for checking emails: while displayed in the same vertical orientation, email is not Twitter. I am not checking for new messages every 20 seconds while I’m waiting in line, and even if I were, email refresh times are still slow compared to Twitter.\nIf I had to nitpick, I’d say the implementation itself feels off. In the original pull to refresh, and in the several iterations by other developers, the “refresh” action is triggered when you release the pull animation. Instead, Apple’s version starts refreshing the main view as soon as you pull beyond a certain threshold, whether or not you’ve lifted your finger off the screen. It’s a minor detail, but it’s something I noticed immediately after years of pull to refresh in other apps.\nMy second problem with Mail is the “swoosh” sound for a message that’s been sent. Up until iOS 6, the sound effect used to go off after a message was actually sent – e.g. 
when the “loading bar” had been completely filled and the message was really off to the server. In iOS 6, for some reason, the “sent” sound plays as soon as you hit Send, even if the message hasn’t been sent anywhere yet. This change has already had a series of repercussions on my email workflow. For instance, on a couple of occasions I thought I had sent a message – the sound played – but I really hadn’t, because I had poor 3G and my connection was slow. Even worse, if for some reason you lose your Internet connection right after hearing the “swoosh” sound effect, your message won’t be sent at all, and it’ll be saved in Drafts. I think playing Mail’s sound effect in advance is confusing, as users have associated it with a message that’s really been sent. Imagine if Apple decided to do the same for Messages.\nOn iOS 6, the phone number and email address you use for FaceTime are synced across your devices. While this sounds great in theory, I had to disable FaceTime on my iPad, as I kept experiencing a bug that would leave my iPad ringing even after I answered a FaceTime call on my iPhone. I hope Apple fixes this in the next few days now that iOS 6 is public.\nThe iPad finally has a Clock app on iOS 6. Like the iPhone app, it is divided into four areas: World Clock, Alarm, Stopwatch, and Timer. The first two are really good looking: World Clock takes advantage of the larger screen to display multiple cities on a planisphere with time and weather information. You can scroll between clocks at the top, tap on one to bring it up in full-screen, and the world map even displays areas of light and dark. Alarm is interesting, too, as it uses a calendar layout to visualize all the alarms you’ve activated for the week. There is a visual consistency between World Clock and Alarm.\nStopwatch and Timer, on the other hand, feel like they were made by someone else for another app. 
While the same sections in the Clock app for iPhone somehow fit together with the rest of the app, on the iPad they take a completely different UI approach, in stark contrast with World Clock and Alarm. They have flat, round, glowing buttons that I haven’t seen anywhere else on iOS, and that seem to be taken straight out of a post-modern microwave oven control system. I won’t get into a debate about digital interfaces and skeuomorphism here, but to me, the Timer UI is especially baffling, as it’s completely different from any other iPad UI Apple has designed to date.\nSpeaking of design, iOS 6 brings a new option to the system status bar, which can “blend” with the top bar of applications using a different shade of the same color. This is up to developers to implement, and you can see it in action in Apple apps like Settings and Mail on the iPhone. I am not a fan of this UI choice: I think that, in most cases, it diminishes the contrast between status bar data (time, battery, etc.) and the background, whilst simultaneously making it more distracting and “in the way”. I like the simplicity and elegance of Safari and Maps, which use the new blue gradient and silver UI, respectively, with a black status bar that is easy on the eyes.\nI agree with the explanation Loren Brichter wrote back in June:\nPhilosophically the beauty of these devices is that because they are a screen, they become whatever you are doing with them. Minimizing extraneous hardware, and extraneous system interface elements should be a goal. A distracting status bar is antithetical to that.\nSiri\nItalian is a difficult language to master. Being a Neo-Latin language, it comes with several lexical and grammatical rules related to affixes and suffixes for gender, number, and tense; definite, indefinite, and partitive articles; prepositions; word accents; pronouns; and many, many other complexities that not even most Italians know. 
Put simply: it’s incredibly hard to build software that understands the meaning and context of what we’re saying, like Siri does, in Italian.\nI approached iOS 6’s Italian Siri with a mix of cautious curiosity and skepticism. I haven’t really been using Siri in English, since I would look completely out of place here in Italy, talking to Siri in English in front of my friends and family. Plus, it just wouldn’t be natural. I was looking forward to the Italian version of Siri speeding up some common actions like sending a message, checking the weather, or finding a gas station nearby.\nI won’t go into the details of the Italian words and expressions that Siri doesn’t understand or is unable to process. But I will say this: during the period I tested iOS 6, when Siri wasn’t down due to server issues (and there were many), I noticed that it struggles to keep up with the many ways of asking for something in Italian. Because Italian is an inflected language, verbs have conjugations; and because verbs have those grammatical variations, when you start mixing them with pronouns, prepositions, and auxiliary verbs, you can get to the same sentence in dozens of different ways – all of them syntactically correct.\nAs I expected, the “beta” of Italian Siri isn’t meant for “commands” that go beyond the degree of complexity marked by Apple in the “example phrases” of Siri’s information popover. In my tests, commands built with simple verb-object-time structures like Apple’s demo ones were okay for Siri (although it often failed to recognize them correctly, and I don’t have a strong dialect), but other sentences – spoken like an Italian normally would, with prepositions, pronouns, and articles – didn’t go through. Italians reading this right now know what I mean. 
Try dictating a message with Siri using “che” (the Italian “that” Shawn Blanc recently mentioned) with a subordinate clause – Siri won’t parse it, because it doesn’t understand that we’re using indirect speech with prepositions, which need to be excluded from the actual message.\n\nFor commands issued with moderately complex “real” Italian sentences and structures people use on a daily basis, this first version of Siri is a letdown. Neo-Latin languages are hard for actual people to figure out, and Siri has a long way to go as a virtual Italian assistant.\nWith simple do this/do that commands, though, I’m actually quite impressed by the additions Apple brought to Siri in iOS 6. When it worked, I was able to check on Serie A results and player stats (curiously, players didn’t have profile pictures), open apps, get movie information, and watch trailers. I also managed to check on some restaurants in my town using the new Yelp integration, but unfortunately the Yelp database in Viterbo is very limited: there aren’t many reviews – let alone photos – and restaurants here don’t support OpenTable reservations.\nOverall, if you keep the sentence structure simple enough, the new features of Italian Siri mostly work. And they’re nice.\nI have found two big issues with Siri, besides questionable support for Italian. The first is that, if you keep your device set to English (like I do) but Siri set to Italian, it can get confused between the two languages. For instance, sometimes I get movie descriptions in Italian, other times in English, and turn-by-turn navigation, spoken by Siri, will be in English in spite of Siri being set to Italian. It’s confusing, as Maps’ English interface overrides Siri’s Italian setting.\nMy second issue is Maps.\nMaps\nFor a complete overview of Maps, I recommend you check out Cody’s extensive look at the application. These, instead, are my impressions of Apple’s new Maps in Italy.\nAbove, I wrote that users just want stuff to work. 
They don’t care about deals between companies or the strategic importance of moving away from Google. Many of those who upgrade to iOS 6 will find a worse Maps application than the one they used to have.\nApple Maps are pretty. The standard view is all vector-based, so you can zoom in and out without losing any detail. The icons are colorful and nice. But besides that, just about everything else is a step backwards from Google Maps here in Italy, and especially in my area.\n\nWhen zooming out, major road names aren’t displayed on the map; Google spends a great amount of time computing the data they have available, trying to understand traffic, signs, and directions based on the footage they have stored on their computers. In my area, names of less trafficked roads pop up earlier than other major roads and intersections; the labels are “too close” to each other; the satellite view has less detail than Google’s. Maybe Apple Maps have “less clutter” when compared to Google Maps; but I still don’t understand how they choose which road names to display first – less important roads shouldn’t show up when zooming out of view.\nThe label for Viterbo is the same size as the ones for nearby smaller villages. Sometimes its label disappears completely from the map.\nEvery once in a while, looking for a location in Viterbo takes me to another city in Italy, even though I entered the name correctly. Occasionally, this happens with local businesses too: I hit the “Locate” button, type in the name of a store in Viterbo, and suddenly I’m in Rome.\nIn Viterbo, Google had basic support for traffic information and alternative routes. Despite having the option activated, Apple Maps never showed me traffic in Viterbo, nor did they suggest other routes. I checked in other, bigger Italian cities, and traffic was supported there, alongside multiple routes. This gives me hope that support for more cities is coming.\n\nLocal business search is inferior to Google Maps. 
Apple has nicer icons to differentiate businesses, but the results are fewer and out of date. I did some research, and it turns out that, while Google is using PagineGialle.it for its business listings – PagineGialle is the biggest provider in Italy, one almost every business owner considers using to show up in search – Apple is using data from other sources, such as Acxiom. This explains why Google Maps has recent results and information (phone numbers, URLs) for businesses that are still running, while Apple Maps has several outdated results, with basic info like phone numbers and webpage addresses often missing.\nEven worse, Apple Maps doesn’t know of the existence of several institutions in my town, like schools and government offices. Google does.\nIt’ll be interesting to see if, alongside a standalone Maps app, Google will also release an SDK for developers to integrate into their apps. Because Maps is a system framework on iOS, every app that uses Map Kit will automatically switch from Google Maps to Apple Maps in iOS 6. This is a fundamental change for AR apps like Where To, or, in general, other apps with a strong location component.\nWhere To and Facebook using Apple Maps, no app update needed.\nWhat’s really bad, in my opinion, is the removal of Street View from iOS. I used Street View all the time to explore cities I wanted to visit and to get a general idea of what walking and driving around there would be like. Now, Apple has its own take on 3D browsing, called Flyover: an aerial 3D representation of major city areas in satellite view. Meaning, you won’t get to “virtually walk” through actual photos of a city; you’ll see a 3D rendering of some cities that Apple is supporting today (with a “flying around” animation Scott Forstall really seemed proud of). 
And it’s a poor representation at that, with slow loading times, laggy animations, and a weird “apocalypse effect” if you try to zoom in too closely.\nDon’t look at Apple’s promotional images – try it for yourself.\nGhost cars?\nGoogle’s version of Piazza del Popolo in Rome vs. Piazza del Popolo after a nuclear disaster, according to Apple.\n\nIf you try to use Flyover like Street View – as a way to see what walking around a city would be like – prepare to do a lot of pinching, zooming, and panning that, most of the time, iOS 6 won’t recognize correctly. As a side note, you could use iOS 5’s Street View with one hand.\nAnd if Flyover isn’t available in your city, like in Viterbo (for context, here’s the current coverage of Street View around the world), you’ll get an ugly, flat, “3D” satellite view. If 3D buildings aren’t supported, the 3D button shouldn’t be available at all.\n\nEven if Flyover had better imagery and animations, and was faster and more responsive, I don’t think it would be as useful as Street View. It’s a cool toy. But to get the job done – to explore cities as humans would – I have to use Street View on my computer. Humans drive and walk; they don’t fly. Flyover sounds good in marketing material – “stunning 3D images with animations!” – but in actual usage, it’s terrible.\nMichael Degusta wrote an excellent overview of the Google features Maps is losing in iOS 6:\nOn the plus side, at least people are getting turn-by-turn directions and Apple’s Flyover feature in exchange, right? Not so fast: 20 countries (population: 3.2 billion) are losing transit, traffic, or street view and getting neither turn-by-turn nor Flyover. The biggest losers are Brazil, India, Taiwan, and Thailand (population: 1.5 billion) which overnight will go from being countries with every maps feature (transit, traffic, and street view) to countries with none of those features, nor any of the new features either.\nIt gets worse. 
Even in countries where turn-by-turn and/or Flyover are available, the iPhone 3GS, iPhone 4, and the 4th generation iPod touch won’t support them. These devices are owned by tens of millions of users who may update over-the-air when prompted, only to find they’ve lost features and haven’t even gained any of the marquee Maps features in return.\nAnd this is where I come back to my initial point. People want their devices to work. Normal people use these things to plan trips, go to work, wake up in the morning, catch the bus to go to school. These devices have changed and improved many aspects of people’s lives. We’re not playing games here anymore. The tech press is so entrenched in itself that we have forgotten normal people use their iPhones and iPads not for “reviews” and “exclusives” – they use them to do stuff. To get the kids to school on time. To learn a city’s landmarks and must-see locations before going there.\nHow are we going to tell these people that, because of Apple’s strategy, they’ll have to cope with an inferior version of Maps?\nHow do we tell students that public transit directions are gone, and that they’ll have to use separate “App Store apps” – which aren’t available yet?\nCan we justify Apple Maps in the name of the greater good?\nPersonally, I can’t. Because while I could go on and let my friends read the “Company” section above and try to make them understand that, yes, that’s why Apple had to ditch Google, the truth is – they don’t care. They are going to update to iOS 6 because they’re curious, just like everybody else, and they’re going to ask about “the Maps app that doesn’t work anymore”.\nThey won’t say “Yeah, but at least Apple has more control now”.\nThey won’t say “At least Apple’s icons are nicer”.\nThose are things bloggers like me write. The “normal people” will hate that the new Maps app isn’t as good as before.\nAnd we’ll have to tell them that “It’ll get better soon”. 
Because it’s not great from day one.\nIt’s time we stopped giving Apple a free pass on everything. Enough with the sugar-coating. Earlier this year, someone argued that the privacy fiasco was actually a good thing because it could have been worse on Android. What kind of explanation is that? It was bad, period. With Maps, I have a similarly black & white view. The current version of Maps is, from a data standpoint, a step backwards from Google Maps, with the exception of turn-by-turn navigation, which is a great addition.\nAs a tech writer, I understand the importance of Apple’s new Maps, and I applaud the decision to build their own solution for the future. I also understand that, while Apple could have licensed some Google technologies such as Street View and public transit directions to include as options in the new Maps, they didn’t want to.\nAs a user, I can’t help but think that this needed more time, that the maps simply aren’t ready in many areas, and that Google Maps was just better.\nThe Trilemma\nAs the saying goes, you can’t please everyone. When you’re building software and you’re targeting multiple audiences at once, you can’t build at the same level for all of them. And that’s how I see iOS 6.\nFor the most part, Apple built iOS 6 for itself, to guarantee the prosperity of certain areas of the OS – areas such as search, increasingly shifting to Siri, and location. As a reflection of these actions, Apple gave new goodies to developers, as it understands the importance of a healthy developer ecosystem that sets iOS apart from the competition. And last, Apple refined iOS 6 for users, polishing many inconsistencies but without the Notification Center and iMessage breakthroughs of iOS 5. From this perspective, I’d say that, of three favorable options, Apple mainly focused on two: the company and the developers.\nThat’s not necessarily a bad thing. 
Users don’t need to learn a new UI or app every year, especially when they’ve become accustomed to how things work. Adding features for the sake of adding them is not innovation. Users want their devices to keep working with the same degree of functionality, which is why I see Maps as a real, tangible problem today. But this doesn’t mean iOS 6 isn’t a notable update. iOS 6 is a good improvement over iOS 5, with several welcome refinements and additions like Facebook integration, more languages for Siri, and a faster Safari.\nIn my opinion, iOS 6 has, right now, worse Maps and worse App Store search; for Maps especially, if you rely on features like Street View and public transit directions, I can’t recommend the update until an official Google Maps app comes out.\nFor everything else, iOS 6 improves on almost every aspect of the operating system, and sets the stage for a stronger platform in the future.", "date_published": "2012-09-19T12:50:14-04:00", "date_modified": "2018-03-20T13:25:57-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", 
"avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iOS 6", "iOS Reviews", "Featured", "stories" ] } ] }