{ "version": "https://jsonfeed.org/version/1.1", "user_comment": "This feed allows you to read the posts from this site in any feed reader that supports the JSON Feed format. To add this feed to your reader, copy the following URL -- https://www.macstories.net/tag/ipad-at-10/feed/json/ -- and add it your reader.", "home_page_url": "https://www.macstories.net/tag/ipad-at-10/", "feed_url": "https://www.macstories.net/tag/ipad-at-10/feed/json/", "language": "en-US", "title": "iPad at 10 – MacStories", "description": "Apple news, app reviews, and stories by Federico Viticci and friends.", "items": [ { "id": "https://www.macstories.net/?p=62869", "url": "https://www.macstories.net/stories/the-future-of-the-ipad/", "title": "The Future of the iPad", "content_html": "

\n

It’s been an eventful decade for the iPad. But what’s next?

\n

This week’s iPad at 10 celebration has centered primarily on the past. We’ve explored the device’s influence in accessibility and education, heard developers’ stories, outlined some of the most impactful apps from the decade, considered one of the most overlooked iPad models, and more. But as the week closes out, we turn our attention from the past and present to what lies ahead.

\n

For the longest time, the iPhone’s shadow loomed large over the iPad. The iPad Pro began to change that, iPadOS solidified that shift, and now the device is forging its own path as a modular computer.

\n

There’s never been a more exciting time to use the iPad. Yet as far as the device has come, we remain optimistic that its best days are still ahead.

\n

Before wrapping up this anniversary week, we have to consider the future of the iPad.

\n

\n

Modularity and Even Larger iPad Pros

\n

Federico: As I shared yesterday in my Modular Computer piece, I strongly believe the iPad hardware story should continue over the next decade with a focus on modularity. If there’s anything we’ve learned from the iPad Pro line since its debut five years ago – and particularly over the last couple years – it’s that Apple is increasingly leaning into modularity as a differentiating factor from the Mac. The company took its first steps in this direction with the original Smart Keyboard and Apple Pencil; they continued to extend the iPad’s flexibility with the adoption of USB-C in 2018; and just a few weeks ago, they brought deep, extensive integration with mice and trackpads to iPadOS. While a Mac will always be a Mac – the type of computer you buy is what it’s always going to be – an iPad can be continuously transformed by the extra hardware paired with it. This, I believe, is the most interesting hardware angle to consider right now for the future of the device.

\n

I see a handful of different ways Apple may go about this strategy over the next few years. Aside from the upcoming Magic Keyboard (yet another accessory that is set to fundamentally transform interactions with the iPad Pro), I see proper integration with external displays as the next big item to tick on Apple’s checklist. Right now, the iPad Pro can mirror its UI to external displays via USB-C, HDMI, or AirPlay; the default UI mirroring mode is limited, however, by pillarboxing, which prevents apps from fully taking advantage of an external display. Some apps can output full-screen UI to a monitor via an old API originally designed for games, but that secondary UI cannot be interacted with using the new system pointer.
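
\n

To make the current options concrete, here is a minimal sketch – assuming a plain UIKit app, with placeholder class and method names – of the classic second-screen approach that old API relies on: watch for an external UIScreen and attach a dedicated UIWindow to it. This is illustration only, not an Apple or MacStories sample.

```swift
import UIKit

// A minimal sketch of the classic second-screen approach: observe UIScreen
// connect/disconnect notifications and attach a dedicated UIWindow to the
// external display. Class and method names are placeholders.
final class ExternalDisplayCoordinator {
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func startObserving() {
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attachWindow(to: screen)
        })
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main
        ) { [weak self] _ in
            self?.externalWindow = nil   // tear down when the monitor goes away
        })
    }

    private func attachWindow(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen                       // render on the external display
        window.rootViewController = makeExternalRootController()
        window.isHidden = false                      // show the full-screen UI
        externalWindow = window
    }

    private func makeExternalRootController() -> UIViewController {
        // Placeholder: whatever full-screen, non-mirrored UI the app wants on the monitor.
        let controller = UIViewController()
        controller.view.backgroundColor = .black
        return controller
    }
}
```

On iPadOS 13, scene-based apps receive a connected monitor as a separate UIWindowScene session instead, but the idea is the same: a dedicated window whose contents are independent of what’s mirrored from the iPad’s own screen.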

\n

Here’s what I think Apple should do: the company should introduce a new API that supports iPadOS’ UIScene technology for multiwindowing and allow users to choose which windows (window in the sense of an iPad app window – more details here) should be placed on an external display; unlike the current UI mirroring mode, those windows should be able to fill the entire display and users should be able to control them using the new native pointer. The fact that Apple added pointer support in iPadOS 13.4 rather than waiting for 14 later this year makes me somewhat optimistic about getting a new integration with external displays in the near future.
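
\n

For context on what such an API would build on, this is roughly how UIScene multiwindowing is exposed to apps on iPadOS 13 today: an app asks the system to activate a new scene session described by an NSUserActivity. In the sketch below, requestSceneSessionActivation is a real iPadOS 13 API, the activity type is made up, and the commented-out `targetDisplay` option is purely hypothetical – it only stands in for the external-display targeting proposed above.

```swift
import UIKit

// A minimal sketch of the UIScene multiwindowing building block the proposal
// would extend. The activity type is illustrative; `targetDisplay` does not exist.
func openDocumentInNewWindow(_ documentURL: URL) {
    let activity = NSUserActivity(activityType: "net.example.openDocument")
    activity.userInfo = ["url": documentURL.absoluteString]

    let options = UIScene.ActivationRequestOptions()
    // options.targetDisplay = .external   // hypothetical: what the proposed API might add

    UIApplication.shared.requestSceneSessionActivation(
        nil,                    // nil asks the system to create a new scene session
        userActivity: activity, // describes which window/content to open
        options: options,
        errorHandler: { error in
            print("Scene activation failed: \(error)")
        }
    )
}
```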

\n

USB is the other angle to keep an eye on for the future of the iPad as a modular computer. It was great to see Apple add support for connecting USB drives to the device in iPadOS 13, but more can be done in this regard to further open the iPad to different use cases. Put simply, I believe you should be able to use any peripheral you plug into a Mac with an iPad as well. Whether it’s an external audio interface, a scanner, or an optical drive, you shouldn’t be forced to “get a Mac” just to use a USB accessory that currently doesn’t work with the iPad. As I also suggested yesterday, Apple has the opportunity here to rethink how the necessary drivers for these accessories are distributed for iPadOS.

\n

By focusing on modularity, Apple can continue making the iPad a more versatile and extensible platform without fundamentally changing its nature as a tablet. Adding support for external displays, pointing devices, and USB accessories doesn’t change the fact you can still pick up an iPad and hold it in your hands – and I don’t ever want that basic aspect of the experience to change. Otherwise I would, in fact, just get a Mac.

\n

At the same time, however, I can’t help but wish for an even bigger iPad Pro (somewhere between 14” and 16”) that could be advertised as a “desk and couch” tablet, specifically optimized for drawing and productivity apps. With an even bigger display, such an iPad Pro could comfortably support up to three apps in Split View, for example, and allow professional applications such as LumaFusion and Photoshop (and maybe even Logic and Xcode by Apple?) to offer desktop-class interfaces on a device that supports both touch and external input.

\n

Then, of course, there’s the idea of a “drafting table iPad” that would essentially be Apple’s answer to the Microsoft Surface Studio. I would buy that product in a heartbeat, but I understand if Apple wants to start “small” and release a 15” iPad Pro first. Regardless, as soon as a larger iPad Pro comes out, I know I’m going to be very interested in one.

\n

The Cascading Benefits of the iPad Pro

\n

John: The iPad Pro’s importance to the entire iPad line can’t be overstated. Whether you use Apple’s largest, most powerful tablet or not, if you have any iPad, the Pro has likely affected the way you use it.

\n

It has been over four years since the first iPad Pro was delivered to customers. In that time, the Pro’s accessories, which felt like high-end exclusives when they were released, have trickled down to other models of iPad. Their expansion across the lineup isn’t something that is written about very often, but when you look at Apple’s entire iPad lineup, it’s remarkable just how impactful those accessories have become. The Apple Pencil and Smart Keyboard have become part of what an iPad is.

\n

The Apple Pencil is probably the most important accessory available for any Apple product today, because it’s what sets Apple’s tablets apart from their competition. Other companies make styluses, but even now, the first-generation Apple Pencil’s tight integration of hardware and software has no peer.

\n

iPads like the mini and 10.2-inch iPad don’t support the Pro’s second-generation Pencil, but even so, it’s an accessory that allows users to greatly expand the way they use the iPad through activities like drawing and note-taking. I find myself using the Apple Pencil that I bought for my 2015 iPad Pro with my iPad mini all the time for note-taking and navigating its UI. The first-generation Pencil is less convenient to charge than the current model, but it still works like a champ almost five years later.

\n

The Smart Keyboard has had a similar impact. It’s not available for the mini, and I doubt it ever will be given the device’s size, but it transforms the 10.2-inch iPad into an ultraportable device that quickly switches personalities. Without a USB-C port, the modularity of the device isn’t as flexible as the Pro, but the core of the experience that Federico described in his story yesterday is just as true for these iPads.

\n

When I look at what the future might hold for the iPad, I expect this accessory trickle-down trend to continue. When Apple announced the 2020 iPad Pro last month, it showed off the new cantilevered Magic Keyboard. The keyboard is expensive at $349 for the 12.9-inch iPad Pro, but I expect that like the Apple Pencil, that price will come down over time and we’ll see a version of it available for other iPads too.

\n

I also expect we will eventually see USB-C trickle down to the entire iPad line as a way to enhance its modularity. The Lightning connector has served Apple well, but USB-C is playing a larger and larger role in the adaptability of the iPad Pro. Based on what we’ve seen with Apple’s own accessories, I expect it’s only a matter of time before USB-C plays a similar role across the entire iPad line.

\n

A USB-C port alone isn’t exciting, but its implications are. There will always be differences between Apple’s least and most expensive iPads, but by bringing the core experience of accessories like the Apple Pencil and Smart Keyboard to all iPads, Apple is setting up a future where the primary differentiator between devices will be their screen size, not their capabilities. The Pro models will continue being the place where new features are introduced first, and there will be differences at the margins like there are now between the screen technologies used in each model, but the prospect of every iPad gaining the flexibility of the Pro is within reach and exciting.

\n

Becoming the Best Computer for the Most People

\n

Ryan: Earlier this week, in a story about developing apps for iPad, I stated that “the device’s very nature – a slab of glass that becomes its software – evokes countless possibilities.” As I think about the future of the iPad, I can’t help but go back to pulling that same thread. While advancements in hardware are important, they’re ultimately just a means to the end of enabling new experiences, new ways for that slab of glass to transform into something unique. In other words, the hardware’s evolution is all about serving the software.

\n

Often when a device is 10 years into its life, the time for innovation and experimentation has passed. But for the iPad, it’s been less than a year since Apple set it on a fresh path with the advent of iPadOS. The new OS signals a new sense of independence and identity for the device, and iPadOS 13.4 proves that Apple is just getting started with evolving the iPad’s software. While I don’t expect this fall’s version 14 to offer much that’s new or revolutionary, since the past year has already brought so much change, I’m more confident than ever that the future of iPadOS is bright.

\n

As the iPad enters this next decade, it carries tremendous opportunity to become the computer of the future. This is where I believe Apple’s focus for the device, and especially its OS, will lie moving forward: making it the best computer for most people.

\n

What will that require?

\n

Rethinking multitasking. This subject has been covered a lot by plenty of different people in recent months. I even outlined my own concept for how Apple could evolve multitasking to make it more accessible. I think my context menu-centric approach could work just as well, if not better, now that we have proper mouse and trackpad support on iPad. I won’t rehash the details of my idea – you can read the article for all the specifics – but instead simply reiterate that something does need to change before the iPad can become the best computer for the most people. Multitasking operations like Split View and Slide Over need to be easier to learn, but without losing any of their current capabilities. While I think iPadOS 14 is likely too soon to see these changes, I expect them no later than version 15 next year.

\n

More consistent app experiences. Besides rethinking multitasking, I think the next best thing Apple can do for the iPad’s future is to create more system APIs and tools to help developers offer consistent experiences across their apps. This was a major point emphasized by the developers I recently spoke with: too many app features have to be built from scratch by developers with entirely custom implementations. Things like modal sheets, advanced keyboard controls, file system integration, drag and drop, and multiwindow are all functions that developers in the last decade have had to build custom versions of because Apple didn’t offer native solutions. A lot of positive change has taken place in this regard, but there’s plenty more to be done. The more that Apple invests in designing key functionality that can then be given to developers through new APIs, the better the experience will be for iPad users, who will find that the apps they use share a common design ethos. I’m not saying every app should be identical, or that developers shouldn’t innovate in their own ways, only that common functions should be done the Apple-prescribed “iPad way,” with the help of APIs from the mothership (one small example follows this list).

\n

Continued expansion of sales models. This is again a continuation of something Apple’s already doing, but developers could use more ways of structuring their businesses than are available today. Things have gotten better ever since Phil Schiller took responsibility for the App Store, with changes like the expansion of subscription options to more apps. However, not every app is a good fit for a subscription, so Apple should offer more avenues for apps to build sustainable businesses in alternative ways.

\n

Apple leading the way with its apps. I use a lot of Apple apps because I think the company, for the most part, does a great job with its first-party offerings. I’m not in the camp that believes the company can’t make good apps anymore. That said, though, I do think Apple could do more to inspire developers toward greater creativity on the platform. The early days of the iPad were marked by remarkable experiences like GarageBand and full-fledged versions of iWork. While there have been some great app redesigns of late, such as the revamped App Store and Books apps, it’s been a very long time since Apple created brand new iPad apps that served to inspire. I hope that changes soon.
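
\n

As a small illustration of the “use a system API instead of a custom implementation” argument from the consistency item above, here is a minimal sketch of one such shared building block – keyboard shortcuts via UIKeyCommand, a real UIKit API. The view controller and actions are hypothetical.

```swift
import UIKit

// A minimal sketch: declaring keyboard shortcuts with UIKeyCommand instead of
// hand-rolling key handling. The controller and selectors are illustrative only.
final class NotesListViewController: UIViewController {
    // Participate in the responder chain so the commands are picked up.
    override var canBecomeFirstResponder: Bool { true }

    override var keyCommands: [UIKeyCommand]? {
        [
            UIKeyCommand(title: "New Note", action: #selector(createNote), input: "n", modifierFlags: .command),
            UIKeyCommand(title: "Search", action: #selector(focusSearch), input: "f", modifierFlags: .command)
        ]
    }

    @objc private func createNote() { /* present a new note editor */ }
    @objc private func focusSearch() { /* move focus to the search field */ }
}
```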

\n

There’s more that could be said about the iPad’s software, including how its OS and apps could evolve, but these few things listed would, I believe, go a long way toward making iPad the primary computer for a lot more people.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "date_published": "2020-04-04T12:02:55-04:00", "date_modified": "2020-04-05T11:41:06-04:00", "authors": [ { "name": "MacStories Team", "url": "https://www.macstories.net/author/macstoriestaff/", "avatar": "https://secure.gravatar.com/avatar/a5941d3461ac64d3d2687017df01306d?s=512&d=mm&r=g" } ], "tags": [ "iPad", "iPad at 10", "stories" ] }, { "id": "https://www.macstories.net/?p=62846", "url": "https://www.macstories.net/stories/modular-computer/", "title": "Modular Computer: iPad Pro as a Tablet, Laptop, and Desktop Workstation", "content_html": "

My iPad Pro desktop setup.

\n

When I started my iPad-only journey in 2012, I was stuck in a hospital bed and couldn’t use my Mac. It’s a story I’ve told many times before: I had to figure out a way to get work done without a Mac, and I realized the iPad – despite its limited ecosystem of apps and lackluster OS at the time – granted me the computing freedom I sought. At a time when I couldn’t use a desk or connect to a Wi-Fi network, a tablet I could hold in my hands and use to communicate with remote colleagues over a cellular connection was all I needed. Over time, however, that state of necessity became a choice: for a few years now, I’ve preferred working on my iPad Pro and iPadOS (née iOS) in lieu of my Mac mini, even when I’m home and have access to my desk and macOS workstation.

\n

The more I think about it, the more I come to this conclusion: the iPad, unlike other computers running a “traditional” desktop OS, possesses the unique quality of being multiple things at once. Hold an iPad in your hands, and you can use it as a classic tablet; pair it with a keyboard cover, and it takes on a laptop form; place it on a desk and connect it to a variety of external accessories, and you’ve got a desktop workstation revolving around a single slab of glass. This multiplicity of states isn’t an afterthought, nor is it the byproduct of happenstance: it was a deliberate design decision on Apple’s part based on the principle of modularity.

\n

In looking back at the past decade of iPad and, more specifically, the past two years of the current iPad Pro line, I believe different factors contributed to making the iPad Pro Apple’s first modular computer – a device whose shape and function can optionally be determined by the extra hardware paired with it.

\n

The original iPad Pro showed how Apple was willing to go beyond the old “just a tablet” connotation with the Apple Pencil and Smart Keyboard. Three years later, the company followed up on the iPad Pro’s original vision with a switch to USB-C which, as a result, opened the iPad to a wider ecosystem of external accessories and potential configurations. At the same time, even without considerable software enhancements by Apple, the creativity of third-party developers allowed iPad apps to embrace external displays and new file management functionalities. And lastly, just a few weeks ago, Apple unveiled iPadOS’ native cursor mode, finally putting an end to the debate about whether the iPad would ever support the desktop PC’s classic input method.

\n

The intersection of these evolutionary paths is the modern iPad Pro, a device that fills many roles in my professional and personal life. Ever since I purchased the 2018 iPad Pro1, I’ve been regularly optimizing my setup at home and on the go to take advantage of the device’s versatility. I’ve tested dozens of different keyboards, purchased more USB-C hubs than I care to admit, and tried to minimize overhead by designing a system that lets me use the same external display and keyboard with two different computers – the Mac mini and iPad Pro.

\n

At the end of this fun, eye-opening process, I’ve ended up with a computer that is greater than the sum of its parts. By virtue of its modular nature, I find my custom iPad Pro setup superior to a traditional laptop, and more flexible than a regular desktop workstation.

\n

So how exactly did I transform the iPad Pro into this new kind of modular computer? Let’s dig in.

\n

\n


Tablet Mode

\n

At its core, the iPad Pro is still very much a tablet. And despite the number of desktop-oriented accessories I’m going to cover in this story, I still enjoy the simple act of unplugging everything from my iPad Pro – including its Smart Keyboard Folio cover – and sitting on the couch to read longform articles and books, watch videos, or take care of MacStories admin tasks with the Apple Pencil.

\n

It may be a trite statement a decade into the iPad’s existence, but no traditional portable computer, from Apple or other companies, beats the iPad’s inherent simplicity when it comes to holding a large screen in your hands and controlling it with multitouch. In spite of such obviousness, I feel like I should reiterate this sentiment as, somewhere along the conversation surrounding post-PC hybrids and “what’s a computer” rhetoric, we may have lost track of the tablet’s basic appeal.

\n

The 12.9” iPad Pro is not a lightweight tablet: its footprint makes it impossible to operate with one hand, and when you compare it to the sheer portability of an iPad mini or Kindle, you’d be hard-pressed not to consider it an unwieldy affair. At the same time though, the big iPad Pro makes for an amazing “couch tablet” experience: watching YouTube videos2 and reading manga are fantastic activities to perform on a 12.9” display resting on your lap; whenever I need to edit an article that’s going to be published on MacStories, I like to put the iPad Pro in portrait mode on my lap (so I see more text onscreen), load up our team’s GitHub repository as an external location in iA Writer (more details here), and use a custom MacStories preview template to edit and read the piece as it would look on our website. The illusion of holding an article in my hands is reinforced by the iPad Pro’s near edge-to-edge design, a unique trait that I don’t appreciate as much when I use the device as a “computer” on a desk, paired with an external keyboard.

\n

The large iPad Pro is fantastic for reading longform stories.

\n

To make the iPad Pro’s tablet experience more enjoyable and flexible, a few months ago I installed an anti-glare screen protector. Initially, I tested version 2 of the popular Paperlike matte screen protector, but didn’t like its somewhat complex installation procedure and rough texture.3 Then, at the recommendation of some MacStories readers, I started testing the Moshi iVisor screen protector and fell in love with it. This screen protector is a thin film that can be applied to the iPad Pro’s display in 30 seconds; amazingly, it leaves no air bubbles, can be washed and reused, has a smooth texture that is much less aggressive than the Paperlike’s, and, more importantly, adds a matte layer on top of the iPad Pro’s display that nearly eliminates all reflections.

\n

I started looking into matte screen protectors for a couple reasons. First, I’ve always found it annoying I couldn’t read with my iPad Pro while sitting outside on our large balcony without seeing all kinds of reflections on the tablet’s screen; additionally, looking ahead at summer 2020 and iOS/iPadOS review season, I didn’t want to be in the same situation as last year – trying to desperately find some shade under a beach umbrella in a vain attempt to edit my annual review on the iPad Pro’s reflective screen. If Apple allowed a more expensive, build-to-order matte display for the iPad Pro, I would absolutely go for it because I like working outside in the spring and summer here in Italy. In the absence of an official option, I had to find a third-party alternative.

\n

I’ve been using my iPad Pro with the Moshi iVisor matte screen protector for about three months now; not only has this modification vastly improved my experience with using the iPad under direct sunlight or other light sources, but when I look at an iPad without a matte screen protector, I don’t know why I didn’t try this approach years ago. Unlike the Paperlike 2, the iVisor can be installed in seconds and easily re-applied without creating air bubbles, and I can still swipe with my fingers across the display without feeling too much friction. Color fidelity and image crispness are somewhat impacted by the addition of a screen protector, but, again, I’ve found the iVisor to perform better than the Paperlike in this regard too.

\n

Image quality remains crisp and vivid even with the Moshi screen protector on.

\n

The Moshi iVisor up close.

\n

Even though it’s not advertised for those who seek a paper-like experience when handwriting or sketching on iPad Pro, I’ve found the iVisor to add just the right amount of friction for the Apple Pencil too. I’ve never been a heavy user of the Apple Pencil myself (John has far more creative use cases for it when it comes to brainstorming and mind-mapping), but it’s my go-to accessory whenever I have to review and sign PDF documents from my accountant. When I have to do that, I like to grab my iPad Pro and Apple Pencil, relax on the couch (if you have to read boring legal documents, you might as well do it from the comfort of a sofa), and annotate in PDF Viewer. I could do this with my Mac mini, or with a MacBook, but nothing beats the simple act of holding a document and signing it with an Apple Pencil.

\n

Throughout the rest of this story, you’re going to see and read about various pieces of additional hardware I’ve used to unlock different modes for my iPad Pro. I felt it was equally important, however, to emphasize that one of those many modes still is, perhaps surprisingly to some, to use the iPad Pro as a tablet. No keyboard, no trackpad, no external display: just a screen I can hold in my hands – the only way it used to be years ago, and something I don’t ever want the iPad to lose.

\n

Laptop Mode: Custom Smart Keyboard Folio

\n

My relationship with Apple’s Smart Keyboard Folio was off to a rocky start in November 2018, but, as I wrote last year, with time I’ve grown to appreciate the simplicity and reliability of Apple’s slim keyboard cover. Unlike Bluetooth keyboards, the Smart Keyboard feels like an integral component of the iPad Pro: by eschewing Bluetooth pairing and battery life concerns, the Smart Keyboard is always ready to type as soon as you open it; like the second-generation Apple Pencil, the act of letting the Smart Keyboard make contact with the iPad Pro means the two can work together. The Smart Keyboard has its limits4, but I’ve enjoyed using it for what it is: a full-size keyboard that requires no pairing and adds minimal bulk to my iPad Pro while also offering essential protection for its display. In my mind, no other keyboard accessory ticks all these boxes.

\n

That’s not to say I was just willing to accept the Smart Keyboard Folio’s design limitations. As I also detailed last year, I employed the so-called Kickstand Method5 to mod the Smart Keyboard Folio with two small metal kickstands, which allowed me to use the iPad in software keyboard mode (propped up at a slight angle) without removing the Smart Keyboard Folio.

\n

For context:

\n

\n Following a tip from a MacStories reader a few months ago, I came across these metal kickstands by Spigen, which are available for around $10 each on Amazon. These kickstands are designed for smartphones: you attach the base of the kickstand via its built-in adhesive tape to the back of your phone’s case; then, when you need to watch a video or have a video call and would like to keep the screen propped up on a desk, you can just flip open the kickstand – which is sturdy and reliable – and set your phone down. It’s a simple, unobtrusive, robust design that is perhaps a bit more utilitarian than a PopSocket.

\n

But back to the idea I received on Twitter a while back: what if instead of using a kickstand with an iPhone, I attached two kickstands in parallel to the back of the Smart Keyboard Folio so that, with the cover folded on itself, they would prop up the iPad at an angle, thus creating the “touch mode” Apple didn’t support out of the box?\n

\n

And here’s what the result looked like at the time:

\n

iPad Pro and kickstands.

\n

Using the iPad Pro in software keyboard mode without removing the Smart Keyboard Folio.

\n

Now, some of you may have thought that was just an experiment that wouldn’t last long. And I wouldn’t blame you – it is my job, after all, to try multiple accessories and apps, most of which don’t stick around for long. But the Kickstand Method is different: not only have I been using the original set of kickstands to prop up my iPad whenever I want to type with the software keyboard, sign documents with the Apple Pencil, or just change input methods for a while, but I liked the results so much, a few months ago I added a second set of kickstands to the back of the Smart Keyboard Folio. To help you visualize everything, here’s what my Smart Keyboard Folio looks like now:

\n

The new stickers are the result of WWDC 2019 plus an order of hundreds of assorted stickers from Amazon.

\n

As you can see, the new kickstands – also placed parallel to each other – sit lower than the original set. With the two additional kickstands, I can now prop up the iPad Pro in “movie mode”: the base of the Smart Keyboard Folio folds on itself so it lays flat on the back of the device; the kickstands rest on the back of the cover, creating a 50-degree angle that is ideal for watching videos, playing games, or FaceTiming with the iPad Pro’s large display.

\n

My new angle for the iPad Pro thanks to additional kickstands.

\n

Together with adding a memory foam layer to the AirPods Pro silicone tips, these kickstands are my favorite mod I’ve ever applied to a tech accessory. As I explained last year, these kickstands have added practically no weight to my iPad Pro and haven’t dented the Smart Keyboard Folio’s back cover at all. In return, they’ve allowed me to get more out of the Smart Keyboard Folio since I can use it in more contexts that wouldn’t otherwise be supported without the additional viewing angles created by the kickstands. Furthermore, these kickstands are also behind my favorite setup (which I’ll describe later): the iPad Pro laying almost flat on my desk next to the UltraFine 4K display.

\n

My newfound appreciation for the Smart Keyboard Folio notwithstanding, I, like other iPad users, am ready to say goodbye and switch to the upcoming Magic Keyboard, which will feature a freely adjustable viewing angle, built-in trackpad, and backlit keys. But I also have questions.

\n

Will the Magic Keyboard be moddable enough to support folding it on itself and using the Kickstand Method again? (I don’t think it will.) Will it be too heavy to carry around? (I think it’ll be heavier than the Smart Keyboard Folio, but not incredibly so.) Will it be stable enough to use on my lap? (I believe Apple has figured this out.) I don’t have definite answers to any of these questions yet, but I’m keen to find out next month.

\n

Until that happens, I wanted to reiterate how much I’ve enjoyed working with my modded Smart Keyboard Folio for the past year. I typed my entire iOS and iPadOS 13 review on it, and I’ve carried it around with me to WWDC, the beach, and countless car trips.6 Two years ago I never thought I’d say this, but the Smart Keyboard Folio has been the best iPad keyboard experience I’ve had to date.

\n

USB DAC Mode

\n

This particular configuration of my iPad Pro is an extremely niche one, and I believe the majority of MacStories readers will not be interested in it, but I wanted to mention it because it’s fun, geeky, and, in a way, oddly fascinating.

\n

As I explained in my recent coverage of Meta for Mac, for the past year I’ve been building a personal music collection by purchasing my favorite albums in the lossless FLAC format. To enjoy these albums, I have a high-resolution Sony Walkman music player which I pair with my absolute favorite over-ear headphones – the Sony MDR-Z1R – via a balanced cable. If you’re curious about all the details behind my setup, I covered everything in the December 2019 episode of the MacStories Unplugged podcast for Club MacStories members; in short, I’ve been discovering sonic details of my favorite songs I never knew existed, and it’s nice to disconnect from the Internet every once in a while and simply enjoy listening to music without the distractions of Twitter or the endless catalog of a music streaming service. It’s pure audio bliss, it’s nice, and we could all use more nice things these days.

\n

Sometimes, however, I want to listen to music with my good Sony headphones and continue working on my iPad Pro at the same time. So after some research (and thanks to the help of Connected listeners), I realized it was possible to use the Sony Walkman as an external DAC (a digital-to-analog converter) that can drive my headphones with high-res music coming from the iPad Pro’s USB-C port.

\n

My music library is stored on a Samsung T5 SSD that is connected to my Mac mini server, which is always running in the background and shared on our local network. This means I can access the contents of the T5 drive via SFTP and SMB, allowing me to connect to it from the iPad’s Files app and other iPad file managers as well. To listen to my music library in its original format with my Sony headphones, I can’t use the iPad alone: the MDR-Z1R come with a 4.4mm balanced audio cable, which needs to be plugged into the Walkman. Here’s where things get interesting: the Walkman supports a USB DAC mode, which lets the device connect to a computer and act as an audio passthrough for the headphones connected to it. And as it turns out, the iPad Pro’s USB-C port supports all of this – provided you use the right apps to start music playback.

\n

Here’s what I did to listen to high-resolution music (both FLAC files in 16/24-bit at 44.1/48/96 kHz and DSD) from the iPad Pro:

\n

When I want to listen to an album in high-resolution and work on the iPad at the same time, all I have to do is enable DAC mode on the Walkman and connect it via USB to the iPad Pro; the iPad will see the Walkman as an external audio device and set it as default output. Then, I can open Neutron, browse my external music library, and start playback. Audio will be passed in its original lossless format from the iPad to the Walkman to my headphones, so I can continue working while retaining the ability to control playback from my keyboard’s media keys.

\n

My Sony Walkman as an external DAC for the iPad Pro.

\n

Unfortunately, Neutron is – how can I say this – not the prettiest app around. The app’s UI is…confusing at best, and it doesn’t scale well to the iPad’s large display. But, from an audio engine perspective, Neutron is incredible: the app is powered by a 32/64-bit audio rendering engine that delivers high-resolution audio via Lightning or USB-C without frequency resampling and with all DSP effects applied – even if the audio stream is coming wirelessly from a server.

\n

Neutron is not a good-looking app. However, I always leave the app running in the background, so that doesn’t bother me much.

\n

Neutron makes up for its poor UI with an incredible audio processing engine.

\n

Neutron is the only app I’ve found that can source audio files from network locations such as SMB or FTP, and it even offers native DSD and gapless playback. If you’re an audiophile, you know what all this means and, like me, you’d be willing to forgive the app’s poor UI in exchange for its astounding functionality. Just take a look at the list of features on Neutron’s website to see what I mean.

\n

Using the iPad Pro with an external DAC for high-resolution audio is, quite possibly, the definition of a niche use case. Regardless, this continues to prove my point: it’s also thanks to USB-C that the iPad Pro now supports a variety of accessories, which in turn has allowed the device to fit into new and different setups.

\n

Desk Mode

\n

Ever since I upgraded my home office with a new desk, Mac mini, and UltraFine 4K monitor in November 2018, I’ve been working toward a single goal: building a setup that would allow me to use the same external display and keyboard with two different computers and OSes – the Mac mini and iPad Pro. Same desk, two vastly different experiences. It took me a while, but thanks to the improvements in iPadOS 13.4 (and a late realization on my part), I’m happy to say I finally have the “desktop iPad Pro” setup I’ve long desired.

\n

First, an explanation is in order for those who may land on this section without being familiar with my tastes and needs. Most days, I only turn on my Mac mini to let it run homebridge and Plex in the background, and I primarily get work done on my iPad Pro. A couple times a week, I also record podcasts on my Mac mini; I could do this from my iPad Pro, but because it’s a more involved setup, I only use my iPad Pro to record podcasts when I do not have access to my desk. The Mac is still essential to an important part of my work, but it’s actively used for less than six hours each week.

\n

In case it wasn’t clear, I enjoy working7 on my iPad Pro more than the Mac mini. Or more specifically, I prefer the design, interactions, and app ecosystem of iPadOS to macOS. For this reason, when I was rethinking my home office two years ago, I had an idea:

\n

What if I could create a – you guessed it – modular setup that supported both macOS and iPadOS with minimal adjustments necessary?

\n

My desktop setup.

\n

Enter the UltraFine 4K display, which, thanks to a single USB-C cable, can work as an external monitor both for my Mac mini and 2018 iPad Pro. I’ve written about this display and my penchant for using an iPad Pro with an external monitor (and all its limitations) several times before, most notably here and here. Here’s a key section from last year:

\n

\n With a single USB-C cable (compatible with USB 3.1 Gen. 2 speeds), the iPad Pro can mirror its UI to an external 4K monitor, support second-screen experiences for apps that integrate with that API, and be charged at the same time. In the case of the UltraFine 4K display, the monitor can also act as a USB hub for the iPad Pro thanks to its four USB-C ports in the back; as I mentioned last year, this allows me to plug the Magic Keyboard (which I normally use via Bluetooth with the Mac mini) into the UltraFine and use it to type on the iPad Pro. To the best of my knowledge, there are no portable USB-C hubs that support 4K@60 mirroring to an external display via USB-C’s DisplayPort alt mode.

\n

Despite the fact that I can’t touch the UltraFine to control the iOS interface or use a trackpad to show a pointer on it, I’ve gotten used to working with iOS apps on the big screen while the iPad sits next to the keyboard, effectively acting as a giant trackpad with a screen. For instance, when I want to concentrate on writing while avoiding neck strain or eye fatigue, I just plug the iPad Pro into the UltraFine, connect the Magic Keyboard in the back, and type in iA Writer on a larger screen. No, pillarboxing is not ideal, but the bigger fonts and UI elements are great for my eyesight, and I still get to work on iOS, which is the operating system I prefer for my writing tasks.\n

\n

Keep in mind the second quoted paragraph, as it’s going to be relevant in a bit.

\n

Now, the reason I want to use my iPad Pro with an external display is simple enough: even with pillarboxing, it’s bigger and doesn’t cause neck strain if I have to type for several consecutive hours. I get to enjoy the benefits of iPadOS and all my favorite apps while using a large display that sits at eye level and is more comfortable than hunching down at a desk to look at my iPad’s smaller display.

\n

When I last wrote about using the iPad Pro with an external display last year, I had the one-cable-to-rule-them-all ordeal pretty much figured out (yay USB-C!), but the external keyboard was a problem: I didn’t want to manually unpair the Magic Keyboard from the Mac mini every time I wanted to use it with the iPad Pro. Additionally, the iPad didn’t support mice and trackpads – via AssistiveTouch or otherwise. Suffice to say, a lot has changed over the past few months.

\n

The first aspect I was able to fix8 is the keyboard. I’ve stopped using the Magic Keyboard and I now use the Logitech MX Keys, a Bluetooth keyboard that supports switching between multiple devices with the press of a button. There’s plenty to like about the MX Keys besides its multi-device pairing: it’s backlit, its build quality is terrific, it charges via USB-C, and keys have a bit more travel than the Magic Keyboard. The only downside, at least for me, is that the MX Keys cannot be purchased in a compact layout that omits the number pad on the right side of the keyboard, which I never use.

\n

Unlike the Magic Keyboard, the MX Keys comes with media keys to activate specific functions such as volume control and media playback. Source: Logitech.

\n

The most important feature of the MX Keys, as I mentioned above, is the ability to quickly switch between multiple paired devices – in my case, the Mac mini and iPad Pro. When I want to work with the iPad Pro at my desk, I press the ‘1’ switch button, and the iPad instantly connects to the MX Keys; when it’s time to use the Mac mini, I press ‘2’ and the keyboard pairs with the Mac again. It’s that simple, and I wish Apple’s own extended Magic Keyboard offered a similar function, perhaps based on the company’s proprietary wireless chips.

\n

Which brings me to pointer support in iPadOS 13.4 and controlling content mirrored from an iPad onto an external display.

\n

The New Pointer

\n

In my Beyond the Tablet story last year, which was published before the debut of iPadOS and its support for USB/Bluetooth mice via Accessibility, I described why I enjoyed connecting my iPad Pro to the UltraFine 4K to focus on one task at a time, particularly writing. If I was spending the majority of my time typing in iA Writer, then not having a way to control the iPad’s UI shown on the monitor without touching the iPad’s screen was not an issue – I could just keep typing with the keyboard. I also noted how I could keep the iPad propped up at an angle next to the UltraFine thanks to its custom kickstands and use my right hand or the Apple Pencil for the occasional touch interaction with the display.

\n

Besides the placement of my iPad Pro, all of this has changed with the release of iPadOS 13.4 and its native integration with mice and, more importantly, the Magic Trackpad 2. I don’t mean to speak ill of last year’s AssistiveTouch-powered mouse integration – a feature designed for the Accessibility community that also garnered mainstream attention – but it never quite clicked for me (no pun intended) since it didn’t allow for full, system-wide control of the iPadOS interface. The new native pointer does, and it’s a game-changer for anyone seeking to turn their iPad Pro into a desktop workstation.

\n

The most important difference from last year’s Accessibility feature is the pointer: it is deeply embedded within the system’s UI, which has been updated to recognize the pointer and adapt certain UI elements to it. When you control an iPad with a mouse by using it as a pointing device with AssistiveTouch, the indicator displayed onscreen is, effectively, a virtual replica of your finger; the iPadOS UI has no idea that a pointer is moving onscreen because that Accessibility pointer only supports tap events, which prevents it from having access to features such as scrolling inertia, hover state, and multitouch. By contrast, the pointer in iPadOS 13.4 is entirely integrated with UIKit and dynamically adapts to different UI elements and areas by employing a mix of shapes, visual effects, and magnetic snaps.

\n

By default, iPadOS’ new pointer is a tiny floating dot that moves across the screen and intelligently changes its hue depending on the color of content underneath (a simple touch, but a clever one). The difference between the AssistiveTouch pointer and iPadOS 13.4’s flavor is immediately clear as soon as you start moving it around: all elements of the iPad’s UI can be interacted with using the pointer, from the small separator between apps in Split View and the clock in the status bar to the dock and apps on the Home screen. This is true system-wide integration between the interface and an external input mechanism – a first for Apple’s touch OS. And it’s not just that the pointer can click on any item it hovers over; unlike traditional desktop pointers, there’s a symbiotic relationship between the iPadOS UI and the pointer: interface elements visually react to the pointer, which often changes shape, momentum, color, and behavior based on the target underneath.

\n

Developers can create custom pointer effects and features (and I’ll cover some examples later in this section), but I want to highlight a couple default behaviors that made me realize how Apple’s iPadOS pointer is the perfect blend of utility and whimsy.
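
To make this more concrete, here is a minimal sketch of what adopting the pointer API could look like for a custom view. The view class and its contents are hypothetical; the lift, highlight, and hover effects it references are the built-in ones Apple exposes to developers in iPadOS 13.4.

```swift
import UIKit

// Hypothetical tile view adopting the iPadOS 13.4 pointer API (UIPointerInteraction).
final class ArtworkTileView: UIView, UIPointerInteractionDelegate {

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Attach a pointer interaction so the system asks this view for a style on hover.
        addInteraction(UIPointerInteraction(delegate: self))
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        addInteraction(UIPointerInteraction(delegate: self))
    }

    // Called as the pointer moves over the view; return the effect (and optional shape) to apply.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        guard let view = interaction.view else { return nil }
        // .lift raises the view under the pointer, similar to Home screen icons;
        // .highlight and .hover are the other built-in effects described above.
        return UIPointerStyle(effect: .lift(UITargetedPreview(view: view)))
    }
}
```
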

\n

As soon as the pointer flies over a text field, it quickly squishes – Knight Bus-style – to transform itself into an insertion point. This type of pointer makes it easy to perform fine-grained text selections by clicking and dragging a text selection onscreen; if you pay close attention to it, you’ll notice that the insertion point even “snaps” to individual lines in a block of text, almost as if magnetically attracted to them. The animation is fun, and selecting text becomes much easier and faster than doing so via touch – especially since Apple (bafflingly enough) got rid of the magnification loupe for text selection in iOS and iPadOS 13.

\n
\n
\n

Selecting text with the system pointer.

\n
\n

The aforementioned snapping behavior is what Apple has officially labeled pointer magnetism, and it highlights another difference from traditional desktop pointers: an increased visual affordance when navigating the iPadOS UI. Whenever the pointer arrives in the proximity of a UI element that has been updated for this new interaction method, the pointer’s blob transforms again, accelerates toward the element, and snaps to it, usually highlighting it with a translucent, rounded rectangle. There are a variety of visual effects developers can employ for buttons and other elements that react to the pointer, including parallax and color changes. Try hovering with the pointer over the toolbars of Notes and Mail, or perhaps over the multitasking controls in Split View, and you’ll see how it dynamically changes its appearance, all while selected elements bounce, wiggle, and react to the pointer to visually communicate that, yes, they’re selected and waiting for interaction. Once again, whimsical and useful context at the same time.
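
For standard controls, a full delegate isn’t even necessary: UIButton can simply be opted into pointer support, with an optional closure to customize the effect. A rough sketch, with hypothetical button names, of how an app might configure the kinds of effects described above:

```swift
import UIKit

// Hypothetical toolbar buttons; both opt into pointer support, one customizes the effect.
func configurePointerSupport(boldButton: UIButton, shareButton: UIButton) {
    // Default behavior: the highlight effect plus the magnetic snapping described above.
    boldButton.isPointerInteractionEnabled = true

    // Custom behavior: a hover effect with a shadow, in the spirit of toolbar glyphs in Notes and Mail.
    shareButton.isPointerInteractionEnabled = true
    shareButton.pointerStyleProvider = { button, _, proposedShape in
        UIPointerStyle(effect: .hover(UITargetedPreview(view: button), prefersShadow: true),
                       shape: proposedShape)
    }
}
```
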

\n
\n
\n

Various types of pointer effects.

\n
\n

In using the new system pointer since the release of iPadOS 13.4, I’ve realized that, unlike others, I appreciate its constant state of transformation because it works well with the underlying aesthetic of the iPadOS UI. Unlike on macOS, most buttons in UIKit have no visible shape or contour – they’re just glyphs.9 Arguably, those “buttons” are easier to interact with when you’re not using a mouse or trackpad because you can simply reach out with your finger and touch them to see what happens. But the pointer introduces a layer of abstraction between the interface and user: suddenly you’re not manipulating the screen anymore – you’re controlling a virtual pointer that translates a movement of your finger into an action onscreen. This separation between UI and user is what has stumped novice PC users for decades, and it’s why modern smartphones and tablets are generally considered more user-friendly than traditional computers.

\n

And here lies the core of Apple’s idea with pointer support in iPadOS 13.4, and why I ultimately believe they’ve done an excellent job with it: rather than merely mimicking the macOS pointer, Apple sought to find a middle ground between the inherent usability of a touch UI and the abstraction of a pointer. The result is an adaptive pointer that communicates context and the addition of a new layer between the device’s UI and the user – the visual effects that show you what’s being selected and what may happen if you scroll or perform a click. It may seem obvious in hindsight – the best innovations often do – but I believe this was a massive, multi-year undertaking for the UIKit team at Apple; it’s no surprise the system is mature enough to support a variety of integrations for developers and that the company’s explanation is thoughtful and thorough. They must have spent years getting all these pieces in place.

\n

In practice, the net result of Apple’s pointer efforts is a comprehensive system that lets me fully control the iPadOS UI mirrored on my UltraFine 4K monitor without ever touching the iPad Pro. This has fundamentally altered the ergonomics of my setup and improved how quickly I can get work done with multiple apps in this configuration.

\n

With a native pointer, I can finally select text with higher precision than multitouch without taking my hands off the keyboard and trackpad in front of me. iA Writer, my favorite text editor, supported text selection with the pointer without requiring an app update; obviously, both Mail and Notes (two apps I also use on a daily basis) worked with the pointer right away as soon as I updated to iPadOS 13.4. Even once it transforms into an insertion point, the pointer can perform other functions: if I hold down on the trackpad to left-click then drag, I can drag a text selection somewhere else; if I right-click instead (a two-finger click on the Magic Trackpad 2), the native copy and paste menu comes up. And that’s not all: again without requiring an update, buttons in the enhanced copy and paste menu of iA Writer support the pointer too, so I can hover over them and see which one I selected at a glance.

\n

iA Writer’s enhanced copy and paste menu works with the pointer out of the box, just like other standard UIKit elements in the app.

\n

Another interesting side benefit of system-wide pointer support: it’s not unusual for text-heavy iPad apps, particularly those veering toward the “pro” side of the market, to implement their own custom text selection engines. I understand why some developers do this, but it often does more harm than good to the user experience as custom selections always differ in odd ways from the default iPadOS text selection mechanism, breaking muscle memory. Since iPadOS 13.4 came out, I’ve seen a handful of these apps switch back to the native text selection API to fully support the new pointer; for example, the excellent Textastic, whose advanced text editor can now be fully controlled with a mouse or trackpad.

\n

In my experience using the iPad Pro with an external display, the most remarkable aspect of pointer integration across the system is how many third-party apps have received support “for free”, without requiring an update, simply by virtue of supporting native frameworks and APIs. This creates a virtuous cycle that encourages developers to adopt modern APIs as soon as possible, thus making the ecosystem stronger and allowing Apple to introduce new functionalities that already work with existing components.

\n

NetNewsWire, which I’ve been testing as my go-to RSS reader lately, supported the pointer as soon as 13.4 launched: in the app, I can select and click toolbar items, swipe with two fingers on the trackpad to show actions for individual articles, and even right-click to show the context menu. Similarly, in Twitter for iPad I can click on tweets, switch between views in the sidebar, and even right-click on tweets to bring up the native context menu.10 Developers of apps with custom UI elements may want to look into the new pointer API for further customizations; overall, I’ve been impressed by how many third-party apps mostly worked out of the box with the pointer in iPadOS 13.4.

\n

NetNewsWire supported the pointer right away.

\n

Speaking of context menus, I like what Apple has done to adapt them to the pointer. Before iPadOS 13.4, if I wanted to invoke a context menu with my Logitech mouse, I had to assign a long-press gesture to a button, click it, wait for the “fake” gesture to perform, then interact with the menu. The native pointer has brought a much better activation method: by default, right-clicking opens a context menu (or copy and paste menu if in a text field); the menu comes up immediately without waiting (there’s no fake long-press to wait for), and it doesn’t carry the context menu’s preview panel, thus replicating the look of a desktop contextual menu. I love how fast invoking these menus is now, and I appreciate that I can hover over each button in the menu before making a choice. The clunkiness involved with showing context menus was one of the pain points of the old mouse integration based on AssistiveTouch; in iPadOS 13.4, right-clicking in an iPad app to show a context menu feels just as natural as its Mac counterpart.
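
This is also part of why so many apps picked up sensible right-click behavior for free: any view that already adopted the context menu API introduced in iOS 13 responds to a secondary click. A hedged sketch of that adoption, with a hypothetical handler class and placeholder menu actions:

```swift
import UIKit

// Hypothetical article cell handler; the menu actions are placeholders.
final class ArticleCellMenuHandler: NSObject, UIContextMenuInteractionDelegate {

    func attachMenu(to view: UIView) {
        view.addInteraction(UIContextMenuInteraction(delegate: self))
    }

    // With a pointer attached, a secondary click presents this menu immediately,
    // without the long-press delay and without the preview panel.
    func contextMenuInteraction(_ interaction: UIContextMenuInteraction,
                                configurationForMenuAtLocation location: CGPoint) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
            UIMenu(title: "", children: [
                UIAction(title: "Mark as Read", image: UIImage(systemName: "checkmark.circle")) { _ in /* … */ },
                UIAction(title: "Share", image: UIImage(systemName: "square.and.arrow.up")) { _ in /* … */ }
            ])
        }
    }
}
```
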

\n

When activated via the pointer, context menus come up immediately and don’t embed a preview of the selected item.

\n

Multitouch and Updated Apps

\n

What is going to bring the desktop iPad experience even closer to a traditional desktop computer, however, is iPadOS 13.4’s support for multitouch gestures and developers building deeper pointer integrations in their apps.

\n

As I mentioned above, for the past couple weeks I’ve been using Apple’s Magic Trackpad 2 to control my iPad Pro whenever it’s connected to the UltraFine 4K display. I also own a Logitech MX Master 3 mouse with configurable buttons that can be paired with the iPad Pro; after testing both peripherals, I soon realized the multitouch trackpad was going to help me navigate the system and switch between apps more quickly – something I was discouraged from doing with the old setup last year.

\n

In addition to the ability to control UI elements in apps with an adaptive pointer, Apple built full support for navigation across the system through a combination of multitouch gestures and swipe gestures toward specific areas of the screen. The three most common actions when working with multiple apps on iPad – opening the app switcher, moving between apps, and going back to the Home screen – can be performed with three-finger swipes: swipe up with three fingers and pause to open the app switcher, swipe left or right with three fingers to move between apps, and swipe up with three fingers to return to the Home screen.

\n

I find these gestures intuitive, reliable, and consistent with their touch counterparts when I’m using the iPad without a trackpad. Swiping up with three fingers and pausing to show the app switcher, then scrolling horizontally with two fingers to pick different apps instantly clicked for me – almost as if support for multitouch trackpads had always been part of iPadOS. After having used these gestures, I don’t think I could ever go back to a trackpad without support for three-finger swipes.11

\n

Other gestures Apple baked into iPadOS 13.4 may need some additional fine-tuning. These are the gestures that require you to quickly “slide” with the pointer into a specific area of the screen: the Home indicator to show the dock; the upper right corner to show Control Center; the right edge of the screen to open Slide Over; the upper left corner to view notifications. In my experience, showing the dock is fine, but Control Center, Slide Over, and notifications often fail to activate on the first slide into the associated corner. To overcome this, I’ve started sliding the pointer into the corner twice – first to place the pointer, then to activate the specific function – which seems to trigger the gesture more consistently. I wonder if Apple could tweak the momentum required to activate these features so they always appear immediately.

\n

Despite these initial struggles with sliding the pointer into such “hot corners” (different from Hot Corners, an actual feature of macOS and iPadOS’ Accessibility), I also want to call out how nice it is to interact with Slide Over via the Magic Trackpad. Once Slide Over is open, I can use the same three-finger swipe gestures mentioned above to cycle between apps and close individual apps in the Slide Over stack; alternatively, I can hover with the pointer over the pulling indicator at the top of a Slide Over app, let the pointer attach to it, then drag the app to the other side of the screen or drop it into Split View. These interactions are consistent with the iPad’s existing gesture vocabulary, but they can be performed from a trackpad without touching the screen at all – another reason why I can’t imagine using non-multitouch-enabled trackpads with iPadOS.

\n
\n
\n

Interacting with Slide Over using the pointer.

\n
\n

Pointer and trackpad integration dramatically improves interactions with apps in the context of an iPad Pro used at a desk. Based on what I’ve seen and tested so far, third-party developers have already begun taking advantage of the pointer and two-finger gestures in useful ways.

\n

In an upcoming version of iA Writer, my favorite text editor, you’ll be able to swipe horizontally with two fingers anywhere in the text editor to show and hide the document library. This may not seem like a big deal until you realize how much faster it is to do so from a trackpad instead of having to select the ‘Back’ button in the upper left corner of the app to show the library. Furthermore, iA is also adding support for renaming files by clicking on their title in the title bar, which becomes a highlighted element on hover – a great approach I’d love to see in more apps in the future.
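
I don’t know how iA implemented the gesture internally, but iPadOS 13.4 does let apps route trackpad scrolling into an ordinary pan gesture recognizer, so a sketch along these lines seems plausible (the controller and the library methods are hypothetical):

```swift
import UIKit

// Hypothetical editor controller; showDocumentLibrary()/hideDocumentLibrary() are placeholders.
final class EditorViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handleTrackpadSwipe(_:)))
        // Let two-finger trackpad (and Magic Mouse) scrolling drive this recognizer…
        pan.allowedScrollTypesMask = .continuous
        // …even when no fingers are touching the screen.
        pan.maximumNumberOfTouches = 0
        view.addGestureRecognizer(pan)
    }

    @objc private func handleTrackpadSwipe(_ gesture: UIPanGestureRecognizer) {
        guard gesture.state == .ended else { return }
        let horizontal = gesture.translation(in: view).x
        if horizontal > 80 {
            showDocumentLibrary()
        } else if horizontal < -80 {
            hideDocumentLibrary()
        }
    }

    private func showDocumentLibrary() { /* reveal the library pane */ }
    private func hideDocumentLibrary() { /* dismiss the library pane */ }
}
```
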

\n

The upcoming version of iA Writer for iPad will let you click the document’s name in the title bar to rename it. I want this pointer interaction in every app now.

\n

I’ve also been impressed by the deep pointer integration in the latest version of Fantastical for iPad. No matter which view you’re using, you can now hover over events/tasks and they’ll respond to the pointer with a subtle bounce effect that makes the UI feel alive even without touching the screen. But there’s more: you can also select buttons in the upper toolbar and segmented control and – my favorite detail – hover with the pointer over individual days in the sidebar’s mini calendar. As you do that, selected days will be highlighted with a square indicator; do it quickly enough, and you’ll get the result shown in the video below.

\n
\n
\n

Fantastical’s excellent support for the iPadOS 13.4 pointer.

\n
\n

Pointer support has added a new dimension to Fantastical on iPad, which allows me to use it on my UltraFine 4K monitor without compromises. Fantastical is a perfect example of the kind of deep integration with custom effects I’d like to see more iPad developers consider going forward.

\n

Another excellent example is Broadcasts, Steve Troughton-Smith’s new Internet radio app. Broadcasts features both default pointer effects (for instance, highlighting toolbar buttons on hover) and custom ones such as the lift animation that occurs when hovering over radio stations in the main grid. Additionally, Troughton-Smith was even able to bring tooltips – a classic Mac feature – to iPadOS when the pointer has snapped and paused on top of a button.

\n
\n
\n

Broadcasts features rich pointer integration, keyboard shortcuts, and Mac-like context menus.

\n
\n

Broadcasts also offers a settings screen to choose whether the app should be mirrored on an external display or output full-screen content. More apps should offer a similar option.

\n

Indeed, besides enabling iPad apps to be fully controlled without touching the device, pointer integration also means developers can easily replicate features from macOS. Nowhere is this more apparent than Screens, Edovia’s popular VNC client that lets you control a Mac/PC from your iPad. Screens has already been updated with pointer integration, and this is where things get kind of amazing in terms of iPadOS and external displays.

\n

When I work with the iPad Pro at my desk, I may have to occasionally check on my Mac mini to monitor its Plex server or transfer FLAC files to my Walkman. I could unplug the iPad Pro’s USB-C cable from the UltraFine display and plug the Mac mini’s cable in again to do this, but there’s a more elegant way to handle it.

\n

With my Mac mini running in the background, I can open Screens on the iPad Pro, which instantly logs me into macOS with my credentials. Here’s why this setup is incredible: Screens for iPad supports full-screen output on external displays (more in the next section), which means I can interact with a full-screen macOS UI on the UltraFine display that is actually being transmitted from an iPad app over USB-C. In the latest version of Screens for iPad, I can use the Magic Trackpad to click-and-drag macOS windows, right-click to open contextual menus, and otherwise use the native macOS pointer from my iPad without even seeing the iPadOS pointer on my external display. It’s a mind-bending setup, but it works beautifully – you’d be forgiven if you looked at the photo below and thought I was using macOS and the iPad Pro next to each other. In reality, that’s just my iPad Pro running Screens in external display mode along with a Magic Trackpad 2.

\n

Not a Mac.

\n

Effectively, this is macOS as an app. Thanks to the pointer API in iPadOS 13.4, the folks at Edovia have been able to emulate classic macOS interactions from a trackpad connected to the iPad. In my experience, the approximation is close enough: were it not for the loss of image quality due to the VNC protocol, you’d be fooled into thinking you’re using macOS from a Mac physically plugged into the UltraFine display. Still, because performance and image quality are good enough, as a result of this Screens update I’ve only plugged the Mac mini into the external display twice this week to record AppStories and Connected.

\n

Full-Screen Apps

\n

In future versions of iPadOS, I would love the ability to get rid of pillarboxing when the iPad is connected to an external display. As I described last year, I’ve grown used to the black bars that appear at the sides of my UltraFine 4K display, and the benefits of this setup, at least for me, outweigh the issue; still, I’d welcome the ability to output full-screen app UIs to the external display and control them with a trackpad.

\n

While we wait for iPadOS to properly support external displays, however, it is possible to get an idea of what such a system might look like by using apps that take advantage of an existing API to output full-screen content on external displays. I don’t use this particular mode every day, but it has its niche, and a handful of developers have devised some pretty clever implementations for it.

\n

Originally launched several years ago and primarily designed for gaming purposes, Apple’s second screen API (based on UIScreen) allows iPhone and iPad games to output full-screen content on a display connected via AirPlay or a cable. The idea, as we explored in an old MacStories article, was to let iPhone and iPad users play games on a big screen by using their touch devices as controllers. The API was never really improved by Apple, but that didn’t stop developers of certain productivity apps from exploiting it for potentially unforeseen use cases.
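
For context, adopting that API is fairly straightforward, which is part of why it has survived this long: an app listens for screen connection notifications and places its own window on the external UIScreen, replacing the default mirrored (and pillarboxed) output. A minimal sketch, not the exact code any of the apps below use:

```swift
import UIKit

// Sketch of the legacy "second screen" approach; FullScreenContentViewController is hypothetical.
final class ExternalDisplayCoordinator {
    private var externalWindow: UIWindow?

    func start() {
        // Handle a monitor that's already attached when the app launches.
        if let screen = UIScreen.screens.dropFirst().first {
            attachWindow(to: screen)
        }
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification,
                                               object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attachWindow(to: screen)
        }
        NotificationCenter.default.addObserver(forName: UIScreen.didDisconnectNotification,
                                               object: nil, queue: .main) { [weak self] _ in
            // Tear down the window when the display goes away.
            self?.externalWindow = nil
        }
    }

    private func attachWindow(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen                          // move the window to the external display
        window.rootViewController = FullScreenContentViewController()
        window.isHidden = false                         // showing it replaces the mirrored output
        externalWindow = window
    }
}
```
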

\n

The iPad apps that integrate with this API are few, and because this technology hasn’t been integrated with the iPadOS pointer yet, it is not possible to “navigate” with the pointer from the iPad’s screen to the external monitor, as you would when using a Mac connected to an additional display. In addition to the aforementioned implementation in Screens, however, I also have a couple other examples I’d like to highlight.

\n

MindNode, my favorite mind-mapping app (which I use every year to outline my iOS reviews), added support for displaying full-screen map content on a connected monitor last year. If MindNode detects an external monitor, you can choose between two modes: you can mirror the entire app to the external display (with pillarboxing), or output a full-screen map to it. If you pick the latter, you can also decide whether the full-screen map should follow the zoom and scroll levels of the iPad (keeping both versions of the map in lockstep) or if you want to lock the viewport on the external display.

\n

MindNode’s external display settings.

\n

MindNode can output a full-screen map to the external display.

\n

With the ability to control the viewport, I can lock the external display to a specific branch of the map, which lets me focus on the details of an individual section, while the iPad shows me the entire map, and vice versa. Even though I cannot move the pointer to the external display and directly control the map there, I’ve found this feature beneficial for those times when I want to keep a section of a map as reference in front of me, so I can stay in my text editor on the iPad.

\n

The most impressive implementation of the second screen API, however, is the one in Shiftscreen, a new utility by indie developer Yannik Schrade that’s been entirely built around this technology. Shiftscreen is an iPad app that lets you open webpages and documents – it’s a custom browser and document preview tool. That’s not exciting at all, but here’s the twist: Shiftscreen lets you open these webpages and documents as multiple windows on external monitors, where they’ll be displayed in full-screen, without pillarboxing. On the iPad, you manage these open windows and select different zoom levels to make webpages bigger or smaller; on the external monitor, you see your open webpages and documents (any document that can be previewed with Quick Look is supported by the app) in full-screen.

\n

At a glance, Shiftscreen may seem like a utility designed for teachers or people who often do presentations and may want to output a webpage or PDF document onto an external display in full-screen. The app’s developer has considered that use case, which is why Shiftscreen, in addition to a virtual touchpad to control content on the connected monitor, also has a laser pointer feature. But the use cases for Shiftscreen go far beyond lectures and presentations, and I’ve been taking advantage of the app in a bunch of interesting ways.

\n

First, due to a legacy aspect of the iPad’s multitasking system, it is possible to use two apps in Split View and let one of them output full-screen content on an external monitor; however, that app has to be placed on the left side of the Split View.12 With this in mind, I can open iA Writer or Notes on my iPad, create a Split View with Shiftscreen on the left, and take notes while looking at a big, full-screen webpage on my UltraFine 4K display. Shiftscreen doesn’t have native integration with the iPadOS pointer yet (although it should be coming soon), but clicking and dragging on its virtual touchpad is good enough for now.

\n

You can manage windows open on the external display from the Shiftscreen app on the iPad.

\n

The full-screen webpage is more comfortable than a tiny Safari window in Split View on my iPad, and I find this an excellent way to get research done on a particular topic. Similarly, because Shiftscreen can also open documents in full-screen, I can open a version of a PDF document in full-screen on the UltraFine and edit a copy of it on my iPad Pro.

\n

What if my research requires me to watch a video though? Thanks to the most recent update to the app, I can navigate with the Shiftscreen browser to the YouTube website, click on a video, and play it back in full-screen on the external monitor while continuing to work on my iPad Pro. This is nothing revolutionary for Mac users, but it’s never been possible on the iPad before, and developer Yannik Schrade found an ingenious solution to the problem by repurposing an old API originally designed for games.

\n

Watching YouTube videos in full-screen on an external display via Shiftscreen.

\n

My favorite use of Shiftscreen involves Markdown and a combination of apps. Stay with me, because this is a wild one.

\n

As I detailed last year, I use Working Copy to save articles from iA Writer and share my drafts with the MacStories team. Working Copy comes with a Preview feature that lets you preview Markdown documents as HTML; in the Preview menu, you can also enable an External URL option that allows Working Copy to create a local web server where the preview will be displayed. This local web server runs at a specific address (something like 192.168.1.1), and you can copy its URL to paste it in Safari and see the preview generated by Working Copy. In the latest version of the app, there’s also the option to let this local web server run in the background, which I’ve unlocked so Working Copy can always keep its local web server active.

\n

I can use Shiftscreen to open a remote preview generated from Working Copy as a full-screen webpage on an external monitor.

\n

I think you know where this is going. After copying the local web server URL for a document’s preview in Working Copy, I can put Working Copy in Slide Over and dismiss it. Then, I can put iA Writer and Shiftscreen in Split View, with Shiftscreen on the left so it outputs full-screen content on the external monitor. Because Working Copy is keeping the preview server running in the background, I can paste the preview’s external URL in Shiftscreen, which will open a webpage in full-screen on the external display. This way, I can edit a document in iA Writer and simultaneously look at a styled, full-screen preview13 of it on my UltraFine 4K monitor, which I can scroll with Shiftscreen. I can do all of this with my iPad Pro, a single USB-C cable, and a combination of apps in Slide Over and Split View.

\n

At this point in the evolution of the iPad and its operating system, I believe Apple is aware of the fact that certain users like to pair their iPads with external displays to get work done. As I noted above, the default UI mirroring system is limited by pillarboxing; at the same time, the API to present full-screen content on external monitors is old, poorly documented, and not integrated with the new reality of iPadOS multiwindow and pointer support. I may have found some ways to take advantage of apps that use the existing full-screen content API, but I look forward to Apple releasing a new, more powerful, fully multitasking- and pointer-enabled version of it in the future.

\n

In the making of this story, the iPad Pro has been sitting (propped up at a slight angle) on the right side of my desk, connected to the UltraFine 4K display with a single USB-C cable. All my interactions with iPadOS took place from the MX Keys keyboard in front of me and the Magic Trackpad 2 in between the keyboard and the iPad. The only time I had to physically touch the iPad was to confirm a purchase from the App Store by double-clicking the iPad’s side button – that’s how comprehensive the new pointer system is in iPadOS 13.4.

\n

For someone with a setup similar to mine – an iPad Pro, a keyboard, and an external monitor – the Magic Trackpad 2 is, right now, the single best accessory that can be paired with iPadOS. The combination of the trackpad, the pointer system designed by Apple, and support from third-party developers makes working with the iPad Pro at a desk not only feasible, but even fun – with a level of precision and nimble interactions that were previously inconceivable for iPadOS.

\n

Modular Future

\n

Looking ahead at the next decade of iPad, I believe Apple will continue to evolve the iPad Pro line with a focus on modularity. A modular computer enables a new kind of fluid, versatile user interaction – one that can scale across different contexts, input systems, and form factors. Take a look at the existing iPad Pro and its support for USB-C, keyboards, displays, and trackpads – as I did in this story – and you can already see this strategy at play today, right now.

\n

For this ideal modular future to come to fruition, however, the iPadOS platform will need to grow in some key areas. As I explained above, the iPad’s support for external monitors needs proper integration with the pointer and multiple windows, allowing users to freely move windows across displays and interact with apps on an external display using a pointing device. The fact that Apple added a full-featured system-wide pointer this year makes me think this will happen sooner rather than later. On a similar note, while extensive, iPadOS’ trackpad options could learn a few things from macOS, where all trackpad settings are exposed in one preference panel with clear instructions; the Mac also supports Hot Corners, a feature that is not integrated with iPadOS’ native pointer yet, and which could further simplify iPad multitasking.

\n

More broadly speaking, I expect Apple to continue opening up the iPad’s USB-C port to more types of devices. Whether it’s a scanner, printer, MIDI interface, game controller, or multitrack audio interface, iPad users should be able to plug any USB accessory into the iPad Pro and have it work just like on macOS. Apple has an opportunity to rethink how the necessary drivers for these accessories should be installed (could they be extensions delivered via the App Store?). Regardless of the company’s approach, it should happen; when it comes to USB-C, I don’t see why the iPad Pro shouldn’t be as flexible as the Mac.

\n

If my journey with the iPad over the last eight years – and more specifically with the 2018 iPad Pro – has taught me anything, it’s this: I love working with a modular computer that can be a tablet, laptop, and desktop workstation at the appropriate time. The freedom to choose how I want to hold, touch, or type on my iPad Pro is unparalleled. No other computer lends itself to being used at the beach, at a desk, on the couch, or in my hands as seamlessly and elegantly as the iPad Pro does.

\n

At its core, the iPad Pro is still a tablet; with the right additions, however, it’s also become the modular computer I didn’t know I needed – and now I can’t imagine using anything else. I can’t wait to see where this approach takes me over the next 10 years.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n
\n
1. I bought the 1 TB model because it offered two extra GBs of RAM. In hindsight, that was a good call. ↩︎
2. For those times when I don’t want to take control of the living room TV because someone else is watching it. ↩︎
3. The Paperlike is, well, supposed to feel like real paper for users who write and sketch with the Apple Pencil a lot, but I’m not one of those people. ↩︎
4. Two of them, backlight illumination and adjustable viewing angles, will be fixed by the upcoming Magic Keyboard. ↩︎
5. Isn’t it fun when you can just make up a nickname for something and slap the so-called qualifier on it? ↩︎
6. Remember when we could freely go out and drive around? Good times. ↩︎
7. Where by work I also mean “business tasks” that are typically involved with running a company that go beyond “just typing in a text editor”. Some people seem to think that running MacStories only involves “being a blogger”; I wish they were right. ↩︎
8. Until a few weeks ago, I couldn’t get any third-party Bluetooth keyboards to be recognized by macOS’ login screen after a system shutdown. As it turns out, many third-party keyboards don’t work after a Mac has been shut down if you have FileVault enabled, since the startup disk is encrypted and doesn’t have access to the necessary Bluetooth drivers to recognize a third-party keyboard. After disabling FileVault on my Mac mini, I can now type my password with the MX Keys at startup just fine. ↩︎
9. Yes, there is an Accessibility setting to enable button shapes, but that’s optional. ↩︎
10. Note how, because of Mac Catalyst, the Twitter team achieved feature parity across its iPad and Mac apps for features such as context menus and keyboard shortcuts. ↩︎
11. Which is why, unfortunately, I can’t recommend the new Brydge Pro+ keyboard for now – its trackpad is limited to two-finger swipes. ↩︎
12. Back in the days of iOS 9, the app on the left was considered the “primary” one. ↩︎
13. In Working Copy, you can put a file called md-styles.css in a repo, and all Markdown documents in that repo will use it as a default stylesheet when previewed. ↩︎
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "My iPad Pro desktop setup.\nWhen I started my iPad-only journey in 2012, I was stuck in a hospital bed and couldn’t use my Mac. It’s a story I’ve told many times before: I had to figure out a way to get work done without a Mac, and I realized the iPad – despite its limited ecosystem of apps and lackluster OS at the time – granted me the computing freedom I sought. At a time when I couldn’t use a desk or connect to a Wi-Fi network, a tablet I could hold in my hands and use to comunicate with remote colleagues over a cellular connection was all I needed. Over time, however, that state of necessity became a choice: for a few years now, I’ve preferred working on my iPad Pro and iPadOS (née iOS) in lieu of my Mac mini, even when I’m home and have access to my desk and macOS workstation.\nThe more I think about it, the more I come to this conclusion: the iPad, unlike other computers running a “traditional” desktop OS, possesses the unique quality of being multiple things at once. Hold an iPad in your hands, and you can use it as a classic tablet; pair it with a keyboard cover, and it takes on a laptop form; place it on a desk and connect it to a variety of external accessories, and you’ve got a desktop workstation revolving around a single slab of glass. This multiplicity of states isn’t an afterthought, nor is it the byproduct of happenstance: it was a deliberate design decision on Apple’s part based on the principle of modularity.\nIn looking back at the past decade of iPad and, more specifically, the past two years of the current iPad Pro line, I believe different factors contributed to making the iPad Pro Apple’s first modular computer – a device whose shape and function can optionally be determined by the extra hardware paired with it.\nThe original iPad Pro showed how Apple was willing to go beyond the old “just a tablet” connotation with the Apple Pencil and Smart Keyboard. Three years later, the company followed up on the iPad Pro’s original vision with a switch to USB-C which, as a result, opened the iPad to a wider ecosystem of external accessories and potential configurations. At the same time, even without considerable software enhancements by Apple, the creativity of third-party developers allowed iPad apps to embrace external displays and new file management functionalities. And lastly, just a few weeks ago, Apple unveiled iPadOS’ native cursor mode, finally putting an end to the debate about whether the iPad would ever support the desktop PC’s classic input method.\nSupported By\nConcepts\n\nConcepts: Where ideas take shape\nThe intersection of these evolutionary paths is the modern iPad Pro, a device that fills many roles in my professional and personal life. Ever since I purchased the 2018 iPad Pro1, I’ve been regularly optimizing my setup at home and on the go to take advantage of the device’s versatility. I’ve tested dozens of different keyboards, purchased more USB-C hubs than I care to admit, and tried to minimize overhead by designing a system that lets me use the same external display and keyboard with two different computers – the Mac mini and iPad Pro.\nAt the end of this fun, eye-opening process, I’ve ended up with a computer that is greater than the sum of its parts. By virtue of its modular nature, I find my custom iPad Pro setup superior to a traditional laptop, and more flexible than a regular desktop workstation.\nSo how exactly did I transform the iPad Pro into this new kind of modular computer? 
Let’s dig in.\n\nTable of ContentsTablet ModeLaptop Mode: Custom Smart Keyboard FolioUSB DAC ModeDesk ModeThe New PointerMultitouch and Updated AppsFull-Screen AppsModular FutureTablet Mode\nAt its core, the iPad Pro is still very much a tablet. And despite the number of desktop-oriented accessories I’m going to cover in this story, I still enjoy the simple act of unplugging everything from my iPad Pro – including its Smart Keyboard Folio cover – and sitting on the couch to read longform articles and books, watch videos, or take care of MacStories admin tasks with the Apple Pencil.\nIt may be a trite statement a decade into the iPad’s existence, but no traditional portable computer, from Apple or other companies, beats the iPad’s inherent simplicity when it comes to holding a large screen in your hands and controlling it with multitouch. In spite of such obviousness, I feel like I should reiterate this sentiment as, somewhere along the conversation surrounding post-PC hybrids and “what’s a computer” rhetoric, we may have lost track of the tablet’s basic appeal.\nThe 12.9” iPad Pro is not a lightweight tablet: its footprint makes it impossible to operate with one hand, and when you compare it to the sheer portability of an iPad mini or Kindle, you’d be hard-pressed not to consider it an unwieldy affair. At the same time though, the big iPad Pro makes for an amazing “couch tablet” experience: watching YouTube videos2 and reading manga are fantastic activities to perform on a 12.9” display resting on your lap; whenever I need to edit an article that’s going to be published on MacStories, I like to put the iPad Pro in portrait mode on my lap (so I see more text onscreen), load up our team’s GitHub repository as an external location in iA Writer (more details here), and use a custom MacStories preview template to edit and read the piece as it would look on our website. The illusion of holding an article in my hands is reinforced by the iPad Pro’s near edge-to-edge design, a unique trait that I don’t appreciate as much when I use the device as a “computer” on a desk, paired with an external keyboard.\nThe large iPad Pro is fantastic for reading longform stories.\nTo make the iPad Pro’s tablet experience more enjoyable and flexible, a few months ago I installed an anti-glare screen protector. Initially, I tested version 2 of the popular Paperlike matte screen protector, but didn’t like its somewhat complex installation procedure and rough texture.3 Then, at the recommendation of some MacStories readers, I started testing the Moshi iVisor screen protector and fell in love with it. This screen protector is a thin film that can be applied to the iPad Pro’s display in 30 seconds; amazingly, it leaves no air bubbles, can be washed and reused, has a smooth texture that is much less aggressive than the Paperlike’s, and, more importantly, adds a matte layer on top of the iPad Pro’s display that nearly eliminates all reflections.\nI started looking into matte screen protectors for a couple reasons. First, I’ve always found it annoying I couldn’t read with my iPad Pro while sitting outside on our large balcony without seeing all kinds of reflections on the tablet’s screen; additionally, looking ahead at summer 2020 and iOS/iPadOS review season, I didn’t want to be in the same situation as last year – trying to desperately find some shade under a beach umbrella in a vain attempt to edit my annual review on the iPad Pro’s reflective screen. 
If Apple allowed a more expensive, build-to-order matte display for the iPad Pro, I would absolutely go for it because I like working outside in the spring and summer here in Italy. In the absence of an official option, I had to find a third-party alternative.\nI’ve been using my iPad Pro with the Moshi iVisor matte screen protector for about three months now; not only has this modification vastly improved my experience with using the iPad under direct sunlight or other light sources, but when I look at an iPad without a matte screen protector, I don’t know why I didn’t try this approach years ago. Unlike the Paperlike 2, the iVisor can be installed in seconds and easily re-applied without creating air bubbles, and I can still swipe with my fingers across the display without feeling too much friction. Color fidelity and image crispness is somewhat impacted by the addition of a screen protector, but, again, I’ve found the iVisor to perform better than the Paperlike in this regard too.\nImage quality remains crisp and vivid even with the Moshi screen protector on.\nThe Moshi iVisor up close.\nEven though it’s not advertised for those who seek a paper-like experience when handwriting or sketching on iPad Pro, I’ve found the iVisor to add just the right amount of friction for the Apple Pencil too. I’ve never been a heavy user of the Apple Pencil myself (John has far more creative use cases for it when it comes to brainstorming and mind-mapping), but it’s my go-to accessory whenever I have to review and sign PDF documents from my accountant. When I have to do that, I like to grab my iPad Pro and Apple Pencil, relax on the couch (if you have to read boring legal documents, you might as well do it from the comfort of a sofa), and annotate in PDF Viewer. I could do this with my Mac mini, or with a MacBook, but nothing beats the simple act of holding a document and signing it with an Apple Pencil.\nThroughout the rest of this story, you’re going to see and read about various pieces of additional hardware I’ve used to unlock different modes for my iPad Pro. I felt it was equally important, however, to emphasize that one of those many modes still is, perhaps surprisingly to some, to use the iPad Pro as a tablet. No keyboard, no trackpad, no external display: just a screen I can hold in my hands – the only way it used to be years ago, and something I don’t ever want the iPad to lose.\nLaptop Mode: Custom Smart Keyboard Folio\nMy relationship with Apple’s Smart Keyboard Folio was off to a rocky start in November 2018, but, as I wrote last year, with time I’ve grown to appreciate the simplicity and reliability of Apple’s slim keyboard cover. Unlike Bluetooth keyboards, the Smart Keyboard feels like an integral component of the iPad Pro: by eschewing Bluetooth pairing and battery life concerns, the Smart Keyboard is always ready to type as soon as you open it; like the second-generation Apple Pencil, the act of letting the Smart Keyboard make contact with the iPad Pro means the two can work together. The Smart Keyboard has its limits4, but I’ve enjoyed using it for what it is: a full-size keyboard that requires no pairing and adds minimal bulk to my iPad Pro while also offering essential protection for its display. In my mind, no other keyboard accessory ticks all these boxes.\nThat’s not to say I was just willing to accept the Smart Keyboard Folio’s design limitations. 
As I also detailed last year, I employed the so-called Kickstand Method5 to mod the Smart Keyboard Folio with two small metal kickstands, which allowed me to use the iPad in software keyboard mode (propped up at a slight angle) without removing the Smart Keyboard Folio.\nFor context:\n\n Following a tip from a MacStories reader a few months ago, I came across these metal kickstands by Spigen, which are available for around $10 each on Amazon. These kickstands are designed for smartphones: you attach the base of the kickstand via its built-in adhesive tape to the back of your phone’s case; then, when you need to watch a video or have a video call and would like to keep the screen propped up on a desk, you can just flip open the kickstand – which is sturdy and reliable – and set your phone down. It’s a simple, unobtrusive, robust design that is perhaps a bit more utilitarian than a PopSocket.\n But back to the idea I received on Twitter a while back: what if instead of using a kickstand with an iPhone, I attached two kickstands in parallel to the back of the Smart Keyboard Folio so that, with the cover folded on itself, they would prop up the iPad at an angle, thus creating the “touch mode” Apple didn’t support out of the box?\n\nAnd here’s what the result looked like at the time:\niPad Pro and kickstands.\nUsing the iPad Pro in software keyboard mode without removing the Smart Keyboard Folio.\nNow, some of you may have thought that was just an experiment that wouldn’t last long. And I wouldn’t blame you – it is my job, after all, to try multiple accessories and apps, most of which don’t stick around for long. But the Kickstand Method is different: not only have I been using the original set of kickstands to prop up my iPad whenever I want to type with the software keyboard, sign documents with the Apple Pencil, or just change input methods for a while, but I liked the results so much, a few months ago I added a second set of kickstands to the back of the Smart Keyboard Folio. To help you visualize everything, here’s what my Smart Keyboard Folio looks like now:\nThe new stickers are the result of WWDC 2019 plus an order of hundreds of assorted stickers from Amazon.\nAs you can see, the new kickstands – also placed parallel to each other – sit lower than the original set. With the two additional kickstands, I can now prop up the iPad Pro in “movie mode”: the base of the Smart Keyboard Folio folds on itself so it lays flat on the back of the device; the kickstands rest on the back of the cover, creating a 50-degree angle that is ideal for watching videos, playing games, or FaceTiming with the iPad Pro’s large display.\nMy new angle for the iPad Pro thanks to additional kickstands.\nTogether with adding a memory foam layer to the AirPods Pro silicone tips, these kickstands are my favorite mod I’ve ever applied to a tech accessory. As I explained last year, these kickstands have added practically no weight to my iPad Pro and haven’t dented the Smart Keyboard Folio’s back cover at all. In return, they’ve allowed me to get more out of the Smart Keyboard Folio since I can use it in more contexts that wouldn’t otherwise be supported without the additional viewing angles created by the kickstands. 
Furthermore, these kickstands are also behind my favorite setup (which I’ll describe later): the iPad Pro laying almost flat on my desk next to the UltraFine 4K display.\nMy newfound appreciation for the Smart Keyboard Folio notwithstanding, I, like other iPad users, am ready to say goodbye and switch to the upcoming Magic Keyboard, which will feature a freely adjustable viewing angle, built-in trackpad, and backlit keys. But I also have questions.\nWill the Magic Keyboard be moddable enough to support folding it on itself and using the Kickstand Method again? (I don’t think it will.) Will it be too heavy to carry around? (I think it’ll be heavier than the Smart Keyboard Folio, but not incredibly so.) Will it be stable enough to use on my lap? (I believe Apple has figured this out.) I don’t have definite answers to any of these questions yet, but I’m keen to find out next month.\nUntil that happens, I wanted to reiterate how much I’ve enjoyed working with my modded Smart Keyboard Folio for the past year. I typed my entire iOS and iPadOS 13 review on it, and I’ve carried it around with me to WWDC, the beach, and countless car trips.6 Two years ago I never thought I’d say this, but the Smart Keyboard Folio has been the best iPad keyboard experience I’ve had to date.\nUSB DAC Mode\nThis particular configuration of my iPad Pro is an extremely niche one, and I believe the majority of MacStories readers will not be interested in it, but I wanted to mention it because it’s fun, geeky, and, in a way, oddly fascinating.\nAs I explained in my recent coverage of Meta for Mac, for the past year I’ve been building a personal music collection by purchasing my favorite albums in the lossless FLAC format. To enjoy these albums, I have a high-resolution Sony Walkman music player which I pair with my absolute favorite over-ear headphones – the Sony MDR-Z1R – via a balanced cable. If you’re curious about all the details behind my setup, I covered everything in the December 2019 episode of the MacStories Unplugged podcast for Club MacStories members; in short, I’ve been discovering sonic details of my favorite songs I never knew existed, and it’s nice to disconnect from the Internet every once in a while and simply enjoy listening to music without the distractions of Twitter or the endless catalog of a music streaming service. It’s pure audio bliss, it’s nice, and we could all use more nice things these days.\nSometimes, however, I want to listen to music with my good Sony headphones and continue working on my iPad Pro at the same time. So after some research (and thanks to the help of Connected listeners), I realized it was possible to use the Sony Walkman as an external DAC (a digital-to-analog converter) that can drive my headphones with high-res music coming from the iPad Pro’s USB-C port.\nMy music library is stored on a Samsung T5 SSD that is connected to my Mac mini server, which is always running in the background and shared on our local network. This means I can access the contents of the T5 drive via SFTP and SMB, allowing me to connect to it from the iPad’s Files app and other iPad file managers as well. To listen to my music library in its original format with my Sony headphones, I can’t use the iPad alone: the MDR-Z1R come with a 4.4mm balanced audio cable, which needs to be plugged into the Walkman. Here’s where things get interesting: the Walkman supports a USB DAC mode, which lets the device connect to a computer and act as an audio passthrough for the headphones connected to it. 
And as it turns out, the iPad Pro’s USB-C port supports all of this – provided you use the right apps to start music playback.\nHere’s what I did to listen to high-resolution music (both FLAC files in 16/24-bit at 44.1/48/96 kHZ and DSD) from the iPad Pro:\nFirst, I purchased Sony’s micro USB adapter for the Walkman’s proprietary port;\nI bought a cheap and short USB-C to micro USB cable from Amazon;\nI experimented with different audio players for iPad, and eventually settled on Neutron.\nWhen I want to listen to an album in high-resolution and work on the iPad at the same time, all I have to do is enable DAC mode on the Walkman and connect it via USB to the iPad Pro; the iPad will see the Walkman as an external audio device and set it as default output. Then, I can open Neutron, browse my external music library, and start playback. Audio will be passed in its original lossless format from the iPad to the Walkman to my headphones, so I can continue working while retaining the ability to control playback from my keyboard’s media keys.\nMy Sony Walkman as an external DAC for the iPad Pro.\nUnfortunately, Neutron is – how can I say this – not the prettiest app around. The app’s UI is…confusing at best, and it doesn’t scale well to the iPad’s large display. But, from an audio engine perspective, Neutron is incredible: the app is powered by a 32/64-bit audio rendering engine that delivers high-resolution audio via Lightning or USB-C without frequency resampling and with all DSP effects applied – even if the audio stream is coming wirelessly from a server.\nNeutron is not a good-looking app. However, I always leave the app running in the background, so that doesn’t bother me much.\nNeutron makes up for its poor UI with an incredible audio processing engine.\nNeutron is the only app I’ve found that can source audio files from network locations such as SMB or FTP, and it even offers native DSD and gapless playback. If you’re an audiophile, you know what all this means and, like me, you’d be willing to forgive the app’s poor UI in exchange for its astounding functionality. Just take a look at the list of features on Neutron’s website to see what I mean.\nUsing the iPad Pro with an external DAC for high-resolution audio is, quite possibly, the definition of a niche use case. Regardless, this continues to prove my point: it’s also thanks to USB-C that the iPad Pro now supports a variety of accessories, which in turn has allowed the device to fit into new and different setups.\nDesk Mode\nEver since I upgraded my home office with a new desk, Mac mini, and UltraFine 4K monitor in November 2018, I’ve been working toward a single goal: building a setup that would allow me to use the same external display and keyboard with two different computers and OSes – the Mac mini and iPad Pro. Same desk, two vastly different experiences. It took me a while, but thanks to the improvements in iPadOS 13.4 (and a late realization on my part), I’m happy to say I finally have the “desktop iPad Pro” setup I’ve long desired.\nFirst, an explanation is in order for those who may land on this section without being familiar with my tastes and needs. Most days, I only turn on my Mac mini to let it run homebridge and Plex in the background, and I primarily get work done on my iPad Pro. A couple times a week, I also record podcasts on my Mac mini; I could do this from my iPad Pro, but because it’s a more involved setup, I only use my iPad Pro to record podcasts when I do not have access to my desk. 
The Mac is still essential to an important part of my work, but it’s actively used for less than six hours each week.\nIn case it wasn’t clear, I enjoy working7 on my iPad Pro more than the Mac mini. Or more specifically, I prefer the design, interactions, and app ecosystem of iPadOS to macOS. For this reason, when I was rethinking my home office two years ago, I had an idea:\nWhat if I could create a – you guessed it – modular setup that supported both macOS and iPadOS with minimal adjustments necessary?\nMy desktop setup.\nEnter the UltraFine 4K display, which, thanks to a single USB-C cable, can work as an external monitor both for my Mac mini and 2018 iPad Pro. I’ve written about this display and my penchant for using an iPad Pro with an external monitor (and all its limitations) several times before, most notably here and here. Here’s a key section from last year:\n\n With a single USB-C cable (compatible with USB 3.1 Gen. 2 speeds), the iPad Pro can mirror its UI to an external 4K monitor, support second-screen experiences for apps that integrate with that API, and be charged at the same time. In the case of the UltraFine 4K display, the monitor can also act as a USB hub for the iPad Pro thanks to its four USB-C ports in the back; as I mentioned last year, this allows me to plug the Magic Keyboard (which I normally use via Bluetooth with the Mac mini) into the UltraFine and use it to type on the iPad Pro. To the best of my knowledge, there are no portable USB-C hubs that support 4K@60 mirroring to an external display via USB-C’s DisplayPort alt mode.\n Despite the fact that I can’t touch the UltraFine to control the iOS interface or use a trackpad to show a pointer on it, I’ve gotten used to working with iOS apps on the big screen while the iPad sits next to the keyboard, effectively acting as a giant trackpad with a screen. For instance, when I want to concentrate on writing while avoiding neck strain or eye fatigue, I just plug the iPad Pro into the UltraFine, connect the Magic Keyboard in the back, and type in iA Writer on a larger screen. No, pillarboxing is not ideal, but the bigger fonts and UI elements are great for my eyesight, and I still get to work on iOS, which is the operating system I prefer for my writing tasks.\n\nKeep in mind the second quoted paragraph, as it’s going to be relevant in a bit.\nNow, the reason I want to use my iPad Pro with an external display is simple enough: even with pillarboxing, it’s bigger and doesn’t cause neck strain if I have to type for several consecutive hours. I get to enjoy the benefits of iPadOS and all my favorite apps while using a large display that sits at eye level and is more comfortable than hunching down at a desk to look at my iPad’s smaller display.\nWhen I last wrote about using the iPad Pro with an external display last year, I had the one-cable-to-rule-them-all ordeal pretty much figured out (yay USB-C!), but the external keyboard was a problem: I didn’t want to manually unpair the Magic Keyboard from the Mac mini every time I wanted to use it with the iPad Pro. Additionally, the iPad didn’t support mice and trackpads – via AssistiveTouch or otherwise. Suffice to say, a lot has changed over the past few months.\nThe first aspect I was able to fix8 is the keyboard. I’ve stopped using the Magic Keyboard and I now use the Logitech MX Keys, a Bluetooth keyboard that supports switching between multiple devices with the press of a button. 
There’s plenty to like about the MX Keys besides its multi-device pairing: it’s backlit, its build quality is terrific, it charges via USB-C, and keys have a bit more travel than the Magic Keyboard. The only downside, at least for me, is that the MX Keys cannot be purchased in a compact layout that omits the number pad on the right side of the keyboard, which I never use.\nUnlike the Magic Keyboard, the MX Keys comes with media keys to activate specific functions such as volume control and media playback. Source: Logitech.\nThe most important feature of the MX Keys, as I mentioned above, is the ability to quickly switch between multiple paired devices – in my case, the Mac mini and iPad Pro. When I want to work with the iPad Pro at my desk, I press the ‘1’ switch button, and the iPad instantly connects to the MX Keys; when it’s time to use the Mac mini, I press ‘2’ and the keyboard pairs with the Mac again. It’s that simple, and I wish Apple’s own extended Magic Keyboard offered a similar function, perhaps based on the company’s proprietary wireless chips.\nWhich brings me to pointer support in iPadOS 13.4 and controlling content mirrored from an iPad onto an external display.\nThe New Pointer\nIn my Beyond the Tablet story last year, which was published before the debut of iPadOS and its support for USB/Bluetooth mice via Accessibility, I described why I enjoyed connecting my iPad Pro to the UltraFine 4K to focus on one task at a time, particularly writing. If I was spending the majority of my time typing in iA Writer, then not having a way to control the iPad’s UI shown on the monitor without touching the iPad’s screen was not an issue – I could just keep typing with the keyboard. I also noted how I could keep the iPad propped up at an angle next to the UltraFine thanks to its custom kickstands and use my right hand or the Apple Pencil for the occasional touch interaction with the display.\nBesides the placement of my iPad Pro, all of this has changed with the release of iPadOS 13.4 and its native integration with mice and, more importantly, the Magic Trackpad 2. I don’t mean to speak ill of last year’s AssistiveTouch-powered mouse integration – a feature designed for the Accessibility community that also garnered mainstream attention – but it never quite clicked for me (no pun intended) since it didn’t allow for full, system-wide control of the iPadOS interface. The new native pointer does, and it’s a game-changer for anyone seeking to turn their iPad Pro into a desktop workstation.\nThe most important difference from last year’s Accessibility feature is the pointer: it is deeply embedded within the system’s UI, which has been updated to recognize the pointer and adapt certain UI elements to it. When you control an iPad with a mouse by using it as a pointing device with AssistiveTouch, the indicator displayed onscreen is, effectively, a virtual replica of your finger; the iPadOS UI has no idea that a pointer is moving onscreen because that Accessibility pointer only supports tap events, which prevents it from having access to features such as scrolling inertia, hover state, and multitouch. 
By contrast, the pointer in iPadOS 13.4 is entirely integrated with UIKit and dynamically adapts to different UI elements and areas by employing a mix of shapes, visual effects, and magnetic snaps.\n\nThere’s a symbiotic relationship between the iPadOS UI and the pointer.\n\nBy default, iPadOS’ new pointer is a tiny floating dot that moves across the screen and intelligently changes its hue depending on the color of content underneath (a simple touch, but a clever one). The difference between the AssistiveTouch pointer and iPadOS 13.4’s flavor is immediately clear as soon as you start moving it around: all elements of the iPad’s UI can be interacted with using the pointer, from the small separator between apps in Split View and the clock in the status bar to the dock and apps on the Home screen. This is true system-wide integration between the interface and an external input mechanism – a first for Apple’s touch OS. And it’s not just that the pointer can click on any item it hovers over; unlike traditional desktop pointers, there’s a symbiotic relationship between the iPadOS UI and the pointer: interface elements visually react to the pointer, which often changes shape, momentum, color, and behavior based on the target underneath.\nDevelopers can create custom pointer effects and features (and I’ll cover some examples later in this section), but I want to highlight a couple default behaviors that made me realize how Apple’s iPadOS pointer is the perfect blend of utility and whimsy.\nAs soon as the pointer flies over a text field, it quickly squishes – Knight Bus-style – to transform itself into an insertion point. This type of pointer makes it easy to perform fine-grained text selections by clicking and dragging a text selection onscreen; if you pay close attention to it, you’ll notice that the insertion point even “snaps” to individual lines in a block of text, almost as if magnetically attracted to them. The animation is fun, and selecting text becomes much easier and faster than doing so via touch – especially since Apple (bafflingly enough) got rid of the magnification loupe for text selection in iOS and iPadOS 13.\n\n \nSelecting text with the system pointer.\n\nThe aforementioned snapping behavior is what Apple has officially labeled pointer magnetism, and it highlights another difference from traditional desktop pointers: an increased visual affordance when navigating the iPadOS UI. Whenever the pointer arrives in the proximity of a UI element that has been updated for this new interaction method, the pointer’s blob transforms again, accelerates toward the element, and snaps to it, usually highlighting it with a translucent, rounded rectangle. There are a variety of visual effects developers can employ for buttons and other elements that react to the pointer, including parallax and color changes. Try hovering with the pointer over the toolbars of Notes and Mail, or perhaps over the multitasking controls in Split View, and you’ll see how it dynamically changes its appearance, all while selected elements bounce, wiggle, and react to the pointer to visually communicate that, yes, they’re selected and waiting for interaction. Once again, whimsical and useful context at the same time.\n\n \nVarious types of pointer effects.\n\nIn using the new system pointer since the release of iPadOS 13.4, I’ve realized that, unlike others, I appreciate its constant state of transformation because it works well with the underlying aesthetic of the iPadOS UI. 
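To make those custom effects a bit more concrete, here is a minimal, hypothetical sketch of the UIPointerInteraction API introduced in iPadOS 13.4; the view controller and button below are placeholder names of mine, not code from Apple or from any of the apps discussed.

```swift
import UIKit

// A minimal, hypothetical sketch of UIPointerInteraction adoption on
// iPadOS 13.4. The class and button names are placeholders, not code
// from any app mentioned in this story.
final class ToolbarViewController: UIViewController, UIPointerInteractionDelegate {
    private let shareButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)
        shareButton.frame = CGRect(x: 20, y: 60, width: 44, height: 44)
        view.addSubview(shareButton)

        // Attaching a pointer interaction is what lets the pointer morph,
        // snap, and highlight when it reaches this control.
        shareButton.addInteraction(UIPointerInteraction(delegate: self))
    }

    // Called when the pointer enters the button's region; returning a
    // .highlight effect produces the translucent rounded rectangle and
    // the magnetic snap described above.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        guard let view = interaction.view else { return nil }
        return UIPointerStyle(effect: .highlight(UITargetedPreview(view: view)))
    }
}
```

Returning a different UIPointerEffect, such as .lift or .hover, is how apps produce the parallax and raised-card looks described above; many standard UIKit bar buttons pick up sensible defaults without any of this code.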
Differently from macOS, most buttons in UIKit have no visible shape or contour – they’re just glyphs.9 Arguably, those “buttons” are easier to interact with when you’re not using a mouse or trackpad because you can simply reach out with your finger and touch them to see what happens. But the pointer introduces a layer of abstraction between the interface and user: suddenly you’re not manipulating the screen anymore – you’re controlling a virtual pointer that translates a movement of your finger into an action onscreen. This separation between UI and user is what has stumped novice PC users for decades and why modern smartphones and tablets are generally considered more user-friendly than traditional computers.\nAnd here lies the core of Apple’s idea with pointer support in iPadOS 13.4, and why I ultimately believe they’ve done an excellent job with it: rather than merely mimicking the macOS pointer, Apple sought to find a middle ground between the inherent usability of a touch UI and the abstraction of a pointer. The result is an adaptive pointer that communicates context and the addition of a new layer between the device’s UI and the user – the visual effects that show you what’s being selected and what may happen if you scroll or perform a click. It may seem obvious in hindsight – the best innovations often do – but I believe this was a massive, multi-year undertaking for the UIKit team at Apple; it’s no surprise the system is mature enough to support a variety of integrations for developers and that the company’s explanation is thoughtful and thorough. They must have spent years getting all these pieces in place.\nIn practice, the net result of Apple’s pointer efforts is a comprehensive system that lets me fully control the iPadOS UI mirrored on my UltraFine 4K monitor without ever touching the iPad Pro. This has fundamentally altered the ergonomics of my setup and improved how quickly I can get work done with multiple apps in this configuration.\nWith a native pointer, I can finally select text with higher precision than multitouch without taking my hands off the keyboard and trackpad in front of me. iA Writer, my favorite text editor, supported text selection with the pointer without an app update required; obviously, both Mail and Notes (two apps I also use on a daily basis) worked with the pointer right away as soon as I updated to iPadOS 13.4. Even once it transforms into an insertion point, the pointer can perform other functions: if I hold down on the trackpad to left-click then drag, I can drag a text selection somewhere else; if I right-click instead (a two-finger click on the Magic Trackpad 2), the native copy and paste menu comes up. And that’s not all: again without requiring an update, buttons in the enhanced copy and paste menu of iA Writer support the pointer too, so I can hover over them and see which one I selected at a glance.\niA Writer’s enhanced copy and paste menu works with the pointer out of the box, just like other standard UIKit elements in the app.\nAnother interesting side benefit of system-wide pointer support: it’s not unusual for text-heavy iPad apps, particularly those veering toward the “pro” side of the market, to implement their own custom text selection engines. I understand why some developers do this, but it often does more harm than good to the user experience as custom selections always differ in odd ways from the default iPadOS text selection mechanism, breaking muscle memory. 
Since iPadOS 13.4 came out, I’ve seen a handful of these apps switch back to the native text selection API to fully support the new pointer; for example, the excellent Textastic, whose advanced text editor can now be fully controlled with a mouse or trackpad.\nIn my experience with using the iPad Pro through an external display, the most remarkable aspect of pointer integration across the system is how many third-party apps have received support “for free”, without requiring an update, simply by virtue of supporting native frameworks and APIs. This creates a virtuous cycle that encourages developers to adopt modern APIs as soon as possible, thus making the ecosystem stronger and allowing Apple to introduce new functionalities that already work with existing components.\nNetNewsWire, which I’ve been testing as my go-to RSS reader lately, supported the pointer as soon as 13.4 launched: in the app, I can select and click toolbar items, swipe with two fingers on the trackpad to show actions for individual articles, and even right-click to show the context menu. Similarly, in Twitter for iPad I can click on tweets, switch between views in the sidebar, and even right-click on tweets to bring up the native context menu.10 Developers of apps with custom UI elements may want to look into the new pointer API for further customizations; overall, I’ve been impressed by how many third-party apps mostly worked out of the box with the pointer in iPadOS 13.4.\nNetNewsWire supported the pointer right away.\nSpeaking of context menus, I like what Apple has done to adapt them to the pointer. Before iPadOS 13.4, if I wanted to invoke a context menu with my Logitech mouse, I had to assign a long-press gesture to a button, click it, wait for the “fake” gesture to perform, then interact with the menu. The native pointer has brought a much better activation method: by default, right-clicking opens a context menu (or copy and paste menu if in a text field); the menu comes up immediately without waiting (there’s no fake long-press to wait for), and it doesn’t carry the context menu’s preview panel, thus replicating the look of a desktop contextual menu. I love how fast invoking these menus is now, and I appreciate that I can hover over each button in the menu before making a choice. The clunkiness involved with showing context menus was one of the pain points of the old mouse integration based on AssistiveTouch; in iPadOS 13.4, right-clicking in an iPad app to show a context menu feels just as natural as its Mac counterpart.\nWhen activated via the pointer, context menus come up immediately and don’t embed a preview of the selected item.\nStand Mode\nThis different mode for my iPad Pro doesn’t take place at my desk, but because it is largely based on the same accessories, it’s worth a mention here. For those times when I want to work at the kitchen table but retain the ability to keep the iPad at eye level and avoid neck strain, I put the device in a Klearlook vertical stand and use it with the MX Keys keyboard and Magic Trackpad 2. The setup looks like this:\nThe iPad Pro and a vertical stand.\nI bought the Klearlook stand at the recommendation of my friend Myke Hurley last year: it is a minimal, relatively sturdy piece of hardware with a weighted base that lets me adjust the height of the screen in front of me and work on the iPad Pro without an external monitor and without staring down at it. 
Before iPadOS 13.4, I would often have to lift my hands off the keyboard to touch the iPad’s display to perform specific functions; now, my hands always stay on the keyboard and trackpad and I can control the entire iPadOS UI with the pointer. This is yet another example of a different mode that would be impossible to achieve with a non-modular computer.\nMultitouch and Updated Apps\nWhat is going to bring the desktop iPad experience even closer to a traditional desktop computer, however, is iPadOS 13.4’s support for multitouch gestures and developers building deeper pointer integrations in their apps.\nAs I mentioned above, for the past couple weeks I’ve been using Apple’s Magic Trackpad 2 to control my iPad Pro whenever it’s connected to the UltraFine 4K display. I also own a Logitech MX Master 3 mouse with configurable buttons that can be paired with the iPad Pro; after testing both peripherals, I soon realized the multitouch trackpad was going to help me navigate the system and switch between apps more quickly – something I was discouraged from doing with the old setup last year.\nIn addition to the ability to control UI elements in apps with an adaptive pointer, Apple built full support for navigation across the system through a combination of multitouch gestures and swipe gestures toward specific areas of the screen. The three most common actions when working with multiple apps on iPad – opening the app switcher, moving between apps, and going back to the Home screen – can be performed with three-finger swipes:\nA three-finger vertical swipe goes back to the Home screen;\nA three-finger vertical swipe and hold shows the app switcher;\nA three-finger horizontal swipe switches between apps.\nI find these gestures intuitive, reliable, and consistent with their touch counterparts when I’m using the iPad without a trackpad. Swiping up with three fingers and pausing to show the app switcher, then scrolling horizontally with two fingers to pick different apps instantly clicked for me – almost as if support for multitouch trackpads had always been part of iPadOS. After having used these gestures, I don’t think I could ever go back to a trackpad without support for three-finger swipes.11\nOther gestures Apple baked into iPadOS 13.4 may need some additional fine-tuning. These are the gestures that require you to quickly “slide” with the pointer into a specific area of the screen: the Home indicator to show the dock; the upper right corner to show Control Center; the right edge of the screen to open Slide Over; the upper left corner to view notifications. In my experience, showing the dock is fine, but Control Center, Slide Over, and notifications often fail to activate on the first slide into the associated corner. To overcome this, I’ve started sliding the pointer into the corner twice – first to place the pointer, then to activate the specific function – which seems to trigger the gesture more consistently. I wonder if Apple could tweak the momentum required to activate these features so they always appear immediately.\nDespite these initial struggles with sliding the pointer into such “hot corners” (different from Hot Corners, an actual feature of macOS and iPadOS’ Accessibility), I also want to call out how nice it is to interact with Slide Over via the Magic Trackpad. 
Once Slide Over is open, I can use the same three-finger swipe gestures mentioned above to cycle between apps and close individual apps in the Slide Over stack; alternatively, I can hover with the pointer over the pulling indicator at the top of a Slide Over app, let the pointer attach to it, then drag the app to the other side of the screen or drop it into Split View. These interactions are consistent with the iPad’s existing gesture vocabulary, but they can be performed from a trackpad without touching the screen at all – another reason why I can’t imagine using non-multitouch-enabled trackpads with iPadOS.\n\n \nInteracting with Slide Over using the pointer.\n\nPointer and trackpad integration dramatically improves interactions with apps in the context of an iPad Pro used at a desk. Based on what I’ve seen and tested so far, third-party developers have already begun taking advantage of the pointer and two-finger gestures in useful ways.\nIn an upcoming version of iA Writer, my favorite text editor, you’ll be able to swipe horizontally with two fingers anywhere in the text editor to show and hide the document library. This may not seem like a big deal until you realize how much faster it is to do so from a trackpad instead of having to select the ‘Back’ button in the upper left corner of the app to show the library. Furthermore, iA is also adding support for renaming files by clicking on their title in the title bar, which becomes a highlighted element on hover – a great approach I’d love to see in more apps in the future.\nThe upcoming version of iA Writer for iPad will let you click the document’s name in the title bar to rename it. I want this pointer interaction in every app now.\nI’ve also been impressed by the deep pointer integration in the latest version of Fantastical for iPad. No matter which view you’re using, you can now hover over events/tasks and they’ll respond to the pointer with a subtle bounce effect that makes the UI feel alive even without touching the screen. But there’s more: you can also select buttons in the upper toolbar and segmented control and – my favorite detail – hover with the pointer over individual days in the sidebar’s mini calendar. As you do that, selected days will be highlighted with a square indicator; do it quickly enough, and you’ll get the result shown in the video below.\n\n \nFantastical’s excellent support for the iPadOS 13.4 pointer.\n\nPointer support has added a new dimension to Fantastical on iPad, which allows me to use it on my UltraFine 4K monitor without compromises. Fantastical is a perfect example of the kind of deep integration with custom effects I’d like to see more iPad developers consider going forward.\nAnother excellent example is Broadcasts, Steve Troughton-Smith’s new Internet radio app. Broadcasts features both default pointer effects (for instance, when hovering over toolbar buttons to highlight them) as well as custom ones such as the lift animation that occurs when hovering over radio stations in the main grid. Additionally, Troughton-Smith was even able to bring tooltips – a classic Mac feature – to iPadOS when the pointer has snapped and paused on top of a button.\n\n \nBroadcasts features rich pointer integration, keyboard shortcuts, and Mac-like context menus.\n\nBroadcasts also offers a setting screen to choose whether the app should be mirrored on an external display or output full-screen content. 
More apps should offer a similar option.\nIndeed, besides enabling iPad apps to be fully controlled without touching the device, pointer integration also means developers can easily replicate features from macOS. Nowhere is this more apparent than Screens, Edovia’s popular VNC client that lets you control a Mac/PC from your iPad. Screens has already been updated with pointer integration, and this is where things get kind of amazing in terms of iPadOS and external displays.\nWhen I work with the iPad Pro at my desk, I may have to occasionally check on my Mac mini to monitor its Plex server or transfer FLAC files to my Walkman. I could unplug the iPad Pro’s USB-C cable from the UltraFine display and plug the Mac mini’s cable in again to do this, but there’s a more elegant way to handle it.\nWith my Mac mini running in the background, I can open Screens on the iPad Pro, which instantly logs me into macOS with my credentials. Here’s why this setup is incredible: Screens for iPad supports full-screen output on external displays (more in the next section), which means I can interact with a full-screen macOS UI on the UltraFine display that is actually being transmitted from an iPad app over USB-C. In the latest version of Screens for iPad, I can use the Magic Trackpad to click-and-drag macOS windows, right-click to open contextual menus, and otherwise use the native macOS pointer from my iPad without even seeing the iPadOS pointer on my external display. It’s a mind-bending setup, but it works beautifully – you’d be forgiven if you looked at the photo below and thought I was using macOS and the iPad Pro next to each other. In reality, that’s just my iPad Pro running Screens in external display mode along with a Magic Trackpad 2.\nNot a Mac.\nEffectively, this is macOS as an app. Thanks to the pointer API in iPadOS 13.4, the folks at Edovia have been able to emulate classic macOS interactions from a trackpad connected to the iPad. In my experience, the approximation is close enough: were it not for the loss of image quality due to the VNC protocol, you’d be fooled into thinking you’re using macOS from a Mac physically plugged into the UltraFine display. Still, because performance and image quality are good enough, as a result of this Screens update I’ve only plugged the Mac mini into the external display twice this week to record AppStories and Connected.\nBut wait, there’s more\nBackup Files to USB Drives from iPad Pro\n\nBy subscribing to Club MacStories you’ll receive MacStories Weekly, our Club-exclusive email newsletter. This week, we’ve got a special iPad at 10 issue that expands on the stories we’re publishing this week including:\nA deep dive into John’s iPad mini Home screen and why it’s organized the way it is\nA collection of favorite iPad games\nAnd more\nAdditionally, I’ve also come up with a system to easily back up files from an iPad Pro to a folder stored on an external USB drive. This system requires two taps to run, and files are always backed up in the same destination folder on an external USB drive without having to select the folder every time. This system is based on Scriptable; in today’s issue of MacStories Weekly (#218), Club members will be able to download the script that makes this possible and read more details about my implementation.\nYou can unlock all of these perks with a Club MacStories subscription, starting at $5/month. 
And in doing so, you’ll also get access to the complete archive of Club MacStories with over four years worth of exclusive content.\nFull-Screen Apps\nIn future versions of iPadOS, I would love the ability to get rid of pillarboxing when the iPad is connected to an external display. As I described last year, I’ve grown used to the black bars that appear at the sides of my UltraFine 4K display, and the benefits of this setup, at least for me, outweigh the issue; still, I’d welcome the ability to output full-screen app UIs to the external display and control them with a trackpad.\nWhile we wait for iPadOS to properly support external displays, however, it is possible to get an idea of what such a system might look like by using apps that take advantage of an existing API to output full-screen content on external displays. I don’t use this particular mode every day, but it has its niche, and a handful of developers have devised some pretty clever implementations for it.\nOriginally launched several years ago and primarily designed for gaming purposes, Apple’s second screen API (based on UIScreen) allows iPhone and iPad games to output full-screen content on a display connected via AirPlay or a cable. The idea, as we explored in an old MacStories article, was to let iPhone and iPad users play games on a big screen by using their touch devices as controllers. The API was never really improved by Apple, but that didn’t stop developers of certain productivity apps from exploiting it for potentially unforeseen use cases.\nThe iPad apps that integrate with this API are few, and because this technology hasn’t been integrated with the iPadOS pointer yet, it is not possible to “navigate” with the pointer from the iPad’s screen to the external monitor, as you would when using a Mac connected to an additional display. In addition to the aforementioned implementation in Screens, however, I also have a couple other examples I’d like to highlight.\nMindNode, my favorite mind-mapping app (which I use every year to outline my iOS reviews) added support for displaying full-screen map content on a connected monitor last year. If MindNode detects an external monitor, you can choose between two modes: you can mirror the entire app to the external display (with pillarboxing), or output a full-screen map to it. If you pick the latter, you can also decide whether the full-screen map should follow the zoom and scroll levels of the iPad (keeping both versions of the map in lockstep) or if you want to lock the viewport on the external display.\nMindNode’s external display settings.\nMindNode can output a full-screen map to the external display.\nWith the ability to control the viewport, I can lock the external display to a specific branch of the map, which lets me focus on the details of an individual section, while the iPad shows me the entire map, and vice versa. Even though I cannot move the pointer to the external display and directly control the map there, I’ve found this feature beneficial for those times when I want to keep a section of a map as reference in front of me, so I can stay in my text editor on the iPad.\nThe most impressive implementation of the second screen API, however, is the one in Shiftscreen, a new utility by indie developer Yannik Schrade that’s been entirely built around this technology. Shiftscreen is an iPad app that lets you open webpages and documents – it’s a custom browser and document preview tool. 
That’s not exciting at all, but here’s the twist: Shiftscreen lets you open these webpages and documents as multiple windows on external monitors, where they’ll be displayed in full-screen, without pillarboxing. On the iPad, you manage these open windows and select different zoom levels to make webpages bigger or smaller; on the external monitor, you see your open webpages and documents (any document that can be previewed with Quick Look is supported by the app) in full-screen.\nAt a glance, Shiftscreen may seem like a utility designed for teachers or people who often do presentations and may want to output a webpage or PDF document onto an external display in full-screen. The app’s developer has considered that use case, which is why Shiftscreen, in addition to a virtual touchpad to control content on the connected monitor, also has a laserpoint feature. But the use cases for Shiftscreen go far beyond lectures and presentations, and I’ve been taking advantage of the app in a bunch of interesting ways.\nFirst, due to a legacy aspect of the iPad’s multitasking system, it is possible to use two apps in Split View and let one of them output full-screen content on an external monitor; however, that app has to be placed on the left side of the Split View.12 With this in mind, I can open iA Writer or Notes on my iPad, create a Split View with Shiftscreen on the left, and take notes while looking at a big, full-screen webpage on my UltraFine 4K display. Shiftscreen doesn’t have native integration with the iPadOS pointer yet (although it should be coming soon), but clicking and dragging on its virtual touchpad is good enough for now.\nYou can manage windows open on the external display from the Shiftscreen app on the iPad.\nThe full-screen webpage is more comfortable than a tiny Safari window in Split View on my iPad, and I find this an excellent way to get research done on a particular topic. Similarly, because Shiftscreen can also open documents in full-screen, I can open a version of a PDF document in full-screen on the UltraFine and edit a copy of it on my iPad Pro.\nWhat if my research requires me to watch a video though? Thanks to the most recent update to the app, I can navigate with the Shiftscreen browser to the YouTube website, click on a video, and play it back in full-screen on the external monitor while continuing to work on my iPad Pro. This is nothing revolutionary for Mac users, but it’s never been possible on the iPad before, and developer Yannik Schrade found an ingenious solution to the problem by working around an old API designed for games.\nWatching YouTube videos in full-screen on an external display via Shiftscreen.\nMy favorite use of Shiftscreen involves Markdown and a combination of apps. Stay with me, because this is a wild one.\nAs I detailed last year, I use Working Copy to save articles from iA Writer and share my drafts with the MacStories team. Working Copy comes with a Preview feature that lets you preview Markdown documents as HTML; in the Preview menu, you can also enable an External URL option that allows Working Copy to create a local web server where the preview will be displayed. This local web server runs at a specific address (something like 192.168.1.1), and you can copy its URL to paste it in Safari and see the preview generated by Working Copy. 
In the latest version of the app, there’s also the option to let this local web server run in the background, which I’ve unlocked so Working Copy can always keep its local web server active.\nI can use Shiftscreen to open a remote preview generated from Working Copy as a full-screen webpage on an external monitor.\nI think you know where this is going. After copying the local web server URL for a document’s preview in Working Copy, I can put Working Copy in Slide Over and dismiss it. Then, I can put iA Writer and Shiftscreen in Split View, with Shiftscreen on the left so it outputs full-screen content on the external monitor. Because Working Copy is keeping the preview server running in the background, I can paste the preview’s external URL in Shiftscreen, which will open a webpage in full-screen on the external display. This way, I can edit a document in iA Writer and simultaneously look at a styled, full-screen preview13 of it on my UltraFine 4K monitor, which I can scroll with Shiftscreen. I can do all of this with my iPad Pro, a single USB-C cable, and a combination of apps in Slide Over and Split View.\nAt this point in the evolution of the iPad and its operating system, I believe Apple is aware of the fact that certain users like to pair their iPads with external displays to get work done. As I noted above, the default UI mirroring system is limited by pillarboxing; at the same time, the API to present full-screen content on external monitors is old, poorly documented, and not integrated with the new reality of iPadOS multiwindow and pointer support. I may have found some ways to take advantage of apps that use the existing full-screen content API, but I look forward to Apple releasing a new, more powerful, fully multitasking- and pointer-enabled version of it in the future.\nIn the making of this story, the iPad Pro has been sitting (propped up at a slight angle) on the right side of my desk, connected to the UltraFine 4K display with a single USB-C cable. All my interactions with iPadOS took place from the MX Keys keyboard in front of me and the Magic Trackpad 2 in between the keyboard and the iPad. The only time I had to physically touch the iPad was to confirm a purchase from the App Store by double-clicking the iPad’s side button – that’s how comprehensive the new pointer system is in iPadOS 13.4.\nFor someone with a setup similar to mine – an iPad Pro, a keyboard, and an external monitor – the Magic Trackpad 2 is, right now, the single best accessory that can be paired with iPadOS. The combination of the trackpad, pointer system designed by Apple, and support from third-party developers makes working with the iPad Pro at a desk not only feasible, but fun even – with a level of precision and nimble interactions that were previously inconceivable for iPadOS.\nModular Future\nLooking ahead at the next decade of iPad, I believe Apple will continue to evolve the iPad Pro line with a focus on modularity. A modular computer enables a new kind of fluid, versatile user interaction – one that can scale across different contexts, input systems, and form factors. Take a look at the existing iPad Pro and its support for USB-C, keyboards, displays, and trackpads – as I did in this story – and you can already see this strategy at play today, right now.\nFor this ideal modular future to come to fruition, however, the iPadOS platform will need to grow in some key areas. 
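One of those areas is the external display story. For reference, here is a hedged sketch of the long-standing UIScreen-based "second screen" mechanism that MindNode, Shiftscreen, and Screens build on today; the class and method names are my own illustration, not code from any of those apps.

```swift
import UIKit

// A hedged sketch of the long-standing UIScreen "second screen" API; the
// class and method names here are my own illustration, not code from
// MindNode, Shiftscreen, or Screens.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    func startObserving() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil,
            queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.showContent(on: screen)
        }

        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            // Dropping the window removes the app's full-screen content.
            self?.externalWindow = nil
        }
    }

    private func showContent(on screen: UIScreen) {
        // Attaching a window to the external screen replaces the default
        // pillarboxed mirroring with true full-screen output. (Scene-based
        // apps receive the external display as a UIWindowScene instead.)
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIViewController() // the app's full-screen content
        window.isHidden = false
        externalWindow = window
    }
}
```

Because that window is ordinary UIKit content attached to another screen, it escapes pillarboxing; what nothing does today is route the system pointer onto it.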
As I explained above, the iPad’s support for external monitors needs proper integration with the pointer and multiple windows, allowing users to freely move windows across displays and interact with apps on an external display using a pointing device. The fact that Apple added a full-featured system-wide pointer this year makes me think this will happen sooner rather than later. On a similar note, while extensive, iPadOS’ trackpad options could learn a few things from macOS, where all trackpad settings are exposed in one preference panel with clear instructions; the Mac also supports Hot Corners, a feature that is not integrated with iPadOS’ native pointer yet, and which could further simplify iPad multitasking.\nMore broadly speaking, I expect Apple to continue opening up the iPad’s USB-C port to more types of devices. Whether it’s a scanner, printer, MIDI interface, game controller, or multitrack audio interface, iPad users should be able to plug any USB accessory into the iPad Pro and have it work just like on macOS. Apple has an opportunity to rethink how the necessary drivers for these accessories should be installed (could they be extensions delivered via the App Store?). Regardless of the company’s approach, it should happen; when it comes to USB-C, I don’t see why the iPad Pro shouldn’t be as flexible as the Mac.\nIf my journey with the iPad over the last eight years – and more specifically with the 2018 iPad Pro – has taught me anything, it’s this: I love working with a modular computer that can be a tablet, laptop, and desktop workstation at the appropriate time. The freedom to choose how I want to hold, touch, or type on my iPad Pro is unparalleled. No other computer lends itself to being used at the beach, at a desk, on the couch, or in my hands as seamlessly and elegantly as the iPad Pro does.\nAt its core, the iPad Pro is still a tablet; with the right additions, however, it’s also become the modular computer I didn’t know I needed – and now I can’t imagine using anything else. I can’t wait to see where this approach takes me over the next 10 years.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\n\n\nI bought the 1 TB model because it offered two extra GBs of RAM. In hindsight, that was a good call. ↩︎\n\n\nFor those times when I don’t want to take control of the living room TV because someone else is watching it. ↩︎\n\n\nThe Paperlike is, well, supposed to feel like real paper for users who write and sketch with the Apple Pencil a lot, but I’m not one of those people. ↩︎\n\n\nTwo of them, backlight illumination and adjustable viewing angles, will be fixed by the upcoming Magic Keyboard. ↩︎\n\n\nIsn’t it fun when you can just make up a nickname for something and slap the so-called qualifier on it? ↩︎\n\n\nRemember when we could freely go out and drive around? Good times. ↩︎\n\n\nWhere by work I also mean “business tasks” that are typically involved with running a company that go beyond “just typing in a text editor”. Some people seem to think that running MacStories only involves “being a blogger”; I wish they were right. ↩︎\n\n\nUntil a few weeks ago, I couldn’t get any third-party Bluetooth keyboards to be recognized by macOS’ login screen after a system shutdown. 
As it turns out, many third-party keyboards don’t work after a Mac has been shut down if you have FileVault enabled since the startup disk is encrypted and doesn’t have access to the necessary Bluetooth drivers to recognize a third-party keyboard. After disabling FileVault on my Mac mini, I can now type my password with the MX Keys at startup just fine. ↩︎\n\n\nYes, there is an Accessibility setting to enable button shapes, but that’s optional. ↩︎\n\n\nNote how, because of Mac Catalyst, the Twitter team achieved feature parity across its iPad and Mac apps for features such as context menus and keyboard shortcuts. ↩︎\n\n\nWhich is why, unfortunately, I can’t recommend the new Brydge Pro+ keyboard for now – its trackpad is limited to two-finger swipes. ↩︎\n\n\nBack in the days of iOS 9, the app on the left was considered the “primary” one. ↩︎\n\n\nIn Working Copy, you can put a file called md-styles.css in a repo, and all Markdown documents in that repo will use it as a default stylesheet when previewed. ↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-04-03T10:30:20-04:00", "date_modified": "2021-11-23T09:12:45-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "iPad", "iPad at 10", "iPad Pro", "stories" ] }, { "id": "https://www.macstories.net/?p=62830", "url": "https://www.macstories.net/stories/the-mighty-mini-adapting-apples-diminutive-tablet-to-work-and-play/", "title": "The Mighty mini: Adapting Apple\u2019s Diminutive Tablet to Work and Play", "content_html": "

Make no mistake, whether it’s a Mac, iPhone, or an iPad, I prefer big screens. I think most people do. A big, bright screen makes reading easier, and a larger canvas for the apps you use is rarely a downside.

Still, there’s a reason we carry mobile phones when a tablet, laptop, or desktop could accomplish the same tasks: portability. Smaller is often better, even as the compromises start to pile up when you shrink a device.

Portability is why foldable phones have captured the imaginations of so many people. They promise the portability of a traditional smartphone with a screen that’s closer to a tablet’s.

Just over one year ago, in March 2019, Apple released two new iPads: a 10.5-inch iPad Air and the first new iPad mini in over three years. The 5th-generation mini was a big surprise, largely because the mini hadn’t been updated in so long, leading many people to write it off as dead.

Perhaps an even bigger surprise, however, was the mini’s hardware. The design didn’t change, but the 5th-generation mini upgraded the device to an A12 processor, the same chip in the then-current iPhone XR and XS. The update also added a Retina laminated screen with True Tone, P3 color support, and the highest pixel density of any iPad. The mini doesn’t support ProMotion, only works with the first-generation Apple Pencil, and still relies on Touch ID for security. Still, in terms of raw horsepower, the mini is more similar to the 10.5-inch iPad Air than it is different, allowing it to hold its own in Apple’s iPad lineup despite its diminutive size.

The previous fall, I had ordered a 2018 12.9-inch iPad Pro and fell in love with it for writing and other tasks. As much as I enjoy the iPad Pro’s big display, though, it’s not suited for every task. For example, the size of the iPad Pro makes it awkward for reading in bed. Also, although I love to play games on my iPad Pro with a connected controller, that only works well if the iPad is sitting on a table.

When the mini was introduced, I immediately wondered whether Apple’s smallest tablet could be the perfect complement to its largest iPad Pro: a powerful but tiny device that could work well where the Pro doesn’t. I also figured the mini could be a great ‘downtime’ device for activities like games, reading, chatting with friends, and watching TV, movies, and other video content. So, I sold some old gear I no longer used and bought a mini with 256GB of storage, so I’d have plenty of space for games and locally-downloaded video.

The plan was for my new mini to serve almost exclusively as my downtime iPad. What’s happened in practice during the past year is very different than I anticipated originally. My use of the mini has expanded far beyond what I’d expected, despite the compromises that come along with its small size. The iPad Pro remains the device I rely on for most of my needs, but as we approached the iPad’s first decade, the time felt right to consider how far the mini has come and how this unassuming device fits so neatly into the spaces between the other devices I use.

Until a year ago, I’d never owned an iPad mini. Other members of my family had earlier models of the mini, but I’d never felt that the larger iPads were too big for most circumstances, until I moved up from 9.7-inch iPads to the 12.9-inch Pro.

The 2019 mini.

What drew me to the mini initially was the stark contrast in size between it and the iPad Pro. I wanted something I could hold in one hand to read whether I was lying on the couch or in bed at night. Sometimes that device is my iPhone, but it wasn’t a great solution for a couple of reasons.

First, for longer articles and books, the iPhone is good in a pinch, but the more book-like size of the mini is better. Second, my iPhone is where I have the most notifications turned on; it’s how friends, family, and colleagues get in touch with me all day long. With the mini, I wanted a downtime device that didn’t deluge me with notifications.

The mini (right) is a better portable reading experience than even the largest iPhone (left).

The beefy specs of the mini were the other thing that drew me in. You don’t need a super-fast processor for reading, but the A12 chip and graphics in the mini meant it would be excellent for games and a device I wouldn’t feel the need to replace anytime soon. The improved display was a draw for reading and video too.

Few plans turn out as expected, though. The mini has cemented itself as the downtime device I anticipated it could be. At the end of the day, it’s the device I usually grab for reading and web browsing. However, what I didn’t anticipate is how, over the course of 2019, the iPad mini has become my alternative work device too.

The mini is a surprisingly good iPad for writing.

I don’t use the mini every day for work, yet slowly but surely, the mini has won me over as a fantastic ultraportable writing device,1 research tool, and communications station. The mini isn’t good at the same things as the iPad Pro, but it doesn’t have to be. What the mini lacks in flexibility compared to the iPad Pro, it more than makes up for in portability.

The mini hasn’t changed the way I work or play to the degree the Pro has either. Instead, the mini plays an important supporting role, slotting neatly into contexts where a bigger iPad or a Mac would be cumbersome and an iPhone would feel cramped. That’s added flexibility that I didn’t have before, and over time, it has made a meaningful difference across all aspects of my daily computing life.

The Downtime iPad

My iPad Pro is primarily a work device. I do use it for games, checking the news, and wasting time on Twitter, but most of the time, I use it for writing, task management, research, email, and other MacStories work. When my workday is finished, I’m far more likely to grab the mini to watch YouTube videos I’ve saved links to throughout the day, play a game, or catch up on some reading.

When I initially set up the mini, I started fresh, skipping over work-related apps. That didn’t last long. It seemed like a good idea at the time, but it was almost immediately frustrating. If I was using my mini and wanted to do something work-related, it meant switching devices for no reason other than that I didn’t have the mini set up to handle work tasks.

On the one hand, that was the point. If I had declared I was finished working for the day, not having work apps at my fingertips helped enforce that separation. On the other hand, though, life isn’t that neat and tidy. I tested apps and read about technology for fun long before it was ‘work,’ and that hasn’t changed. Instead of fighting it, I’ve found other ways to set boundaries. Sometimes it works, sometimes it doesn’t, but by and large, I’ve found a balance that works for me.2

Do Not Disturb is turned on 24/7 and nearly all notifications are turned off.

The compromise I’ve arrived at with the mini is far simpler and less drastic than eliminating every work app from the device. I simply turned on Do Not Disturb months ago and have never switched it off. That alone has made it a device I can leave on my bedside table at night. There’s no chance a late-night notification will wake me. I’ve also turned off nearly all notifications of any kind, which minimizes interruptions. The system works because I have an iPhone and iPad Pro that I can rely on for notifications.

Having work apps on my mini requires self-control, but without notifications interrupting me, I haven’t found that to be an issue. Instead, when I’m finished working for the day, I leave my iPad Pro on my desk, set my iPhone on a Qi charger, and grab my mini to unwind.

Reeder is where my collected link reading happens on the mini.

Most of my downtime on the mini is spent reading. That’s usually in Reeder, where I follow a mix of tech, media, music, and video game feeds. Throughout the day, I send a variety of stories to Reeder’s built-in read-it-later service, and at the end of the day I like to sit in a comfortable chair and browse through what I’ve collected.

Reading my RSS feeds is an excellent example of the work-but-not-work sort of tasks for which I use the mini. Many of the links I save will end up in the Club MacStories weekly newsletter, but others are reviews of video games, longer news stories, and other topics that interest me. By having work apps like Trello available, I can deal with a link from my mini when I read the related story instead of having to remember to go back and do it later on my iPad Pro.

Catching up on Mac Madness at the end of the day.

Another category of apps that occupy a more prominent spot on my mini than my iPad Pro is video apps. Apple’s TV app, YouTube, Apple Developer, and Matt Comi’s upcoming TV Forecast app are all on my mini’s Home screen for when I want to watch something or check where I left off on a TV show.

As I’d hoped when I bought it, the mini is fantastic for gaming too. If you’ve read my game reviews on MacStories, you know I like to use a controller whenever possible with iOS and iPadOS games. That’s true with the mini too. However, instead of the PS4 controller I use with my iPad Pro, I typically use my mini with the Gamevice. The device, which I was sent for testing, splits a traditional controller into two halves connected by a rubber strap. The iPad mini fits between the two halves with the strap holding the contraption tight to the ends of the iPad.

Playing Dead Cells with the Gamevice.

Once set up, it’s a little like having a giant Nintendo Switch. For games like Dead Cells, it’s fantastic. Better yet, the Gamevice doesn’t need to be charged because it uses the iPad’s Lightning connector for power, and there’s no pairing required. By sending controller commands over the Lightning port, the controls are also very responsive.
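As a brief aside on how that works under the hood, here is a hedged sketch of the GameController framework calls an iPadOS game relies on to pick up an MFi controller like the Gamevice; the class name and print statements are mine, purely for illustration.

```swift
import GameController

// A hedged sketch of the GameController framework calls a game relies on;
// the class name and print statements are illustrative only.
final class ControllerObserver {
    func start() {
        // MFi controllers such as the Gamevice announce themselves with a
        // notification once attached, whether over Lightning or Bluetooth.
        NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect,
            object: nil,
            queue: .main
        ) { notification in
            guard let controller = notification.object as? GCController,
                  let gamepad = controller.extendedGamepad else { return }

            // A real game would feed these values into its input system.
            gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
                if pressed { print("A pressed") }
            }
        }

        // Controllers attached before this observer exists won't post the
        // notification again, so check the current list as well.
        for controller in GCController.controllers() {
            print("Already connected: \(controller.vendorName ?? "controller")")
        }
    }
}
```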

To be sure, the setup makes the iPad mini a little bulky, but for games that work best with a controller, the trade-off is worth it. With the emphasis on controller support that we’ve seen with Apple Arcade, a Gamevice is a terrific addition to an iPad mini setup.

mini Workstation

I love writing on the mini in iA Writer.

It’s remarkable to me that a decade after its introduction, I still hear people insist that the iPad is only good for consumption. As I covered in my iPad history story on Monday, part of that is Apple’s own doing. That was a big part of the way the original iPad was pitched.

However, the mini has always been up to the challenge of being used for creative endeavors. For instance, Federico was running MacStories on a mini in 2013. Today’s mini is even more capable. Not only does the latest mini use the A12 chip, but it runs the same version of iPadOS as other iPads, so it supports trackpad and mouse input along with its support for the first-generation Apple Pencil.

It’s time for the mini’s bezels and Touch ID to go.

There are hardware limitations, though. The ones that bother me most are the lack of Face ID and the mini’s large bezels. The design, which has hardly changed over the years, looks dated today. Worse, though, is that when I’m controlling the mini with an external keyboard, trackpad, or mouse, having to reach out to unlock it with the Touch ID sensor is an interruption in the flow of using the device that feels anachronistic in 2020. It’s time for the mini to extend edge-to-edge, which would look better and allow for a slightly bigger screen too.

The mini also lacks support for ProMotion and only works with the first-generation Apple Pencil. For my uses, though, both of those omissions are more tolerable. I’ve grown used to the lack of ProMotion, and most of my Apple Pencil use involves UI navigation and taking handwritten notes, neither of which is severely hampered by the lack of the latest Pencil. The bigger challenges of dealing with the iPad mini as a combination work/play device were adapting to the screen size and finding a workable keyboard solution.

Working on a Small Screen

I didn’t fully appreciate what working on the iPad mini’s small screen would be like until I tried it. If you place the iPad mini in its portrait orientation, it’s roughly the size of half the screen of the 12.9-inch iPad Pro in landscape mode. I work in Split View a lot on the Pro, so I figured that at most, the mini’s screen size would mean that I’d use Split View less. That’s turned out to be true, but what I didn’t think about was that the mini’s pixel density is higher than that of other iPads, which shrinks everything a little.

The dock icons are too small and close together in portrait mode.

The icons on the mini’s Home screen are closer together, and with a full dock, app icons are tiny and feel crammed together, especially in portrait mode. There are quirks in iPadOS too. For example, if you use Search to find an app, the software keyboard covers up more than half of the app icon for the first result, making it hard to select the app instead of the keyboard.

Searching for an app using the software keyboard in landscape mode is problematic on the mini.

A couple of things have mitigated these kinds of issues. First, Pencil support helps when I’m using the mini in handheld mode. The Pencil is far more precise than my finger, which makes selecting smaller targets easier. Second and most recently, the trackpad and mouse support in iPadOS 13.4 has been a game-changer, making placement of the cursor and aiming the pointer far easier than reaching up from a keyboard to poke at the screen.

Pairing a Magic Trackpad 2 with the mini is a great combination, but I really need to get one in space gray.

However, one place where the mini’s smaller size is a big advantage is thumb typing. In handheld mode, the mini is just small enough to make typing a message possible in a way that is virtually impossible on the iPad Pro despite the floating keyboard that was added with iPadOS 13.0.

The other difficulty of working on the mini stems from the fact that if you use it with a keyboard, the device is farther away than it would otherwise be. That’s required me to make adjustments to text sizes across the system to ensure that apps are readable, whether I’m holding the mini in my hands or it’s propped up on a table as I type.

It’s a process that has required a lot of trial and error. Apps I use primarily in handheld mode don’t need the text size bumped up, but my text editor absolutely does, for instance. The hardest are apps that get used both in handheld mode and with a keyboard.

Some apps handle text size better than others. I always appreciate an app that doesn’t just rely on Dynamic Type. That’s an important starting point, but the different distances at which I use the mini make in-app text settings a necessity. One of the best examples of an app that handles this well is Safari. Not only can I easily adjust the size of a website’s text from the toolbar, but there’s also a keyboard shortcut. I can also switch to Safari Reader View for the cleanest, most customizable reading experience of all.
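To illustrate the two layers involved, here is a small hypothetical sketch that pairs Dynamic Type with an app-specific size preference; the label class and the UserDefaults key are mine, not from Safari or any other app named here.

```swift
import UIKit

// Hypothetical sketch: Dynamic Type as the baseline, with an in-app
// multiplier layered on top for reading at keyboard distance.
// "readerTextScale" is an illustrative UserDefaults key, not from any app named here.
final class ReaderLabel: UILabel {
    func applyTextSettings() {
        // In-app setting: the user's extra multiplier (1.0 = no change).
        let stored = UserDefaults.standard.double(forKey: "readerTextScale")
        let scale = stored == 0 ? 1.0 : CGFloat(stored)

        // Baseline: scale a body-sized font through UIFontMetrics so it
        // still respects the system-wide Dynamic Type setting.
        let base = UIFont.systemFont(ofSize: 17 * scale)
        font = UIFontMetrics(forTextStyle: .body).scaledFont(for: base)
        adjustsFontForContentSizeCategory = true
    }
}
```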

The Keyboard Conundrum

Perhaps the greatest difficulty in adjusting to a mini, though, has been finding a keyboard for writing. I type a lot every day and, although I’m not nearly as picky about keyboards as a lot of people I know, I do appreciate a keyboard that’s comfortable.

I have plenty of full-sized keyboards I can use with the mini, but that defeats the purpose of using the device as an ultraportable setup. Instead, I wanted something small and light that I could throw in a bag without complicating or upsizing my mini setup significantly. That’s proven to be a tough combination to find. As a result, I have a few I want to mention, none of which are perfect, but each of which can work depending on your circumstances.

Logitech’s Keys to Go. Source: Logitech.

I started too small. The first keyboard I tried was the Logitech Keys To Go keyboard, which is roughly the width of the iPad mini in landscape mode. It’s a strange little keyboard with membrane-type bubble keys that are surprisingly hard to press accurately. The rechargeable battery in the Keys to Go lasts a long time, and the keyboard is splash resistant, but it’s just too small and uncomfortable to use.

The mini and Studio Neat Canopy combination.

Another option I tried was Apple’s Magic Keyboard with Studio Neat’s Canopy case. This comes close to what I want, but not quite. I like the Magic Keyboard a lot. It’s comfortable to type on, lightweight, and the Canopy protects it in a bag, but the mini sinks too far into the case when it’s used to prop the mini up while typing, making it hard to access the dock. A trackpad and keyboard shortcuts mitigate this issue, but it’s not ideal.

\n
\"Logitech's

Logitech’s K380 keyboard is sturdy and the old-school AAA batteries last two years.

\n

Most often, I’ve found myself turning to the Logitech K380 Multi-Device Bluetooth keyboard. The K380’s keys are round and a little stiff, requiring harder presses to type than the Magic Keyboard. The Logitech keyboard has a couple of interesting advantages, though.

\n

First, it’s solidly built. So far, it’s survived being tossed in my backpack many times and looks and feels the same as when I got it six months ago.

\n

Second, the K380 uses AAA batteries. I turned my nose up at alkaline batteries at first, but Logitech says that they allow the keyboard to run for two years before you need to change the batteries. I’ve had mine for around five months, and it’s still going strong. Battery life is aided by the fact that the keyboard isn’t backlit, which is a shame, but also understandable. For a device I don’t use every day, not having to wonder if it’s fully charged is a big advantage.

\n
\"The

The round keys take getting used to, but next to the Magic Keyboard, this has been the most comfortable keyboard I’ve tried with the mini.

\n

The K380 isn’t an everyday keyboard, and I typically type on it for shorter periods than other keyboards, so its shortcomings are tolerable. I also appreciate the keyboard’s inverted-T arrow key layout, the ability to pair it with three different devices, and the dedicated function key for loading the software keyboard that makes accessing emoji easier.

\n
\"The

The Brydge 7.9 can’t escape the limitations of the mini’s width, but I like it for light typing.

\n

The most recent keyboard I’ve been using with my mini is the Brydge 7.9, a brand-new Bluetooth keyboard designed specifically for the mini. Brydge made a similar keyboard for earlier models of the mini, which I’ve tried, and this new model improves on that one in a lot of ways, from its key layout to its build quality. The Brydge 7.9 is a mini-sized keyboard, so it’s cramped to type on, but in the few days since Brydge sent it to me to test, it has grown on me, and I expect it’s an option I’ll stick with for certain use cases.

\n

The Brydge 7.9 keyboard connects to the mini via Bluetooth, and like other Brydge keyboards, the mini slots into hinged clamps at the corners of the device. The keyboard is backlit and charges over Micro USB.

\n
\"\"

\n

Along the top edge is a row of function keys. The Home key on the far left takes you back to the iPad’s Home screen with a single press; double-pressing it opens the multitasking view, and pressing and holding triggers Siri. There are keys for locking the iPad’s screen, cycling through the keyboard’s three levels of backlighting, and toggling the software keyboard, along with a globe key that is handy for opening the keyboard picker, plus media playback and volume keys. There are also function keys for pairing the keyboard over Bluetooth and turning it on and off. It’s worth mentioning that the keyboard includes inverted-T arrow keys and a little divot that makes it easier to reach the dock with your finger, both of which I like.

\n

No keyboard that is the width of the iPad mini in landscape mode is going to be comfortable to type on for long periods, and the Brydge 7.9 is no different. For keyboards this size, though, this is one of the better ones I’ve tried. The keys are very close together, and many of the lesser-used keys along the edges are half-width keys, but after using it for several hours, I’ve grown used to it. I still make more mistakes than I would with a full-size keyboard, but I was pleasantly surprised after spending a solid day typing on it.

\n

Still, I prefer this keyboard for editing. Typing the first draft from scratch is too frustrating. The first draft of this story was written on the K380. I switched to the Brydge to write this section and edit the rest of the story. It has definitely slowed me down, but speed is less important with editing, so that has been fine. The Brydge keyboard is also fine for lighter typing tasks like email, messaging apps, and the like.

\n

One advantage of the Brydge keyboard over the K380 is that it transforms the mini into something I can type on when it’s sitting in my lap. I also appreciate that I can easily close the setup just like a laptop, protecting the iPad’s screen and reducing my kit to one paperback book-sized unit.

\n

Aside from the inherent limitation of using such a narrow keyboard, I’m not enamored with the backlighting. Unless I’m looking almost directly down on the keyboard, the LEDs under the keys leak light around the keys’ leading edges in a way that’s distracting in dark settings. Still, even though I’ve only had a few days to work with the Brydge 7.9, I expect to continue using it for editing and other light typing situations when I want to travel as light as possible.

\n

Other Accessories

\n
\"The

The Twelve South Compass 2 easel-style stand.

\n

The other accessories I use with the mini are Twelve South’s Compass 2 stand and the Moshi iVisor AG screen protector. The Compass is an easel-style stand that holds the mini just off a table or desk at a nice viewing angle. I usually write with the iPad mini in landscape mode propped up with the Compass and then switch to portrait mode for editing, which is close to the experience of editing in Split View on an iPad Pro. I appreciate that the Compass folds up very small and comes with a nylon pouch that tucks neatly into a side pocket of my backpack. I haven’t tried many portable iPad stands, but the Compass 2 is both sturdy and easy to pack, which makes it an excellent complement to the mini.

\n

The Moshi screen protector is an idea I stole from Federico. It does a tremendous job reducing glare, which has made writing at my kitchen table easier on a sunny day. I’m looking forward to trying it outside this summer when it gets warmer.

\n

Work Apps

\n
\"On

On the mini, I like to write in landscape and edit in portrait, where I can see more of my text.

\n

The work apps that I use on the mini are largely the same ones I use on the iPad Pro: Working Copy and iA Writer for writing, Trello and Reminders for organizing projects and tasks, and Slack and Messages for communicating with the MacStories team, where the ability to thumb type on the mini comes in handy.

\n
\"Split

Split View (left) can be cramped, so I resort to Slide Over (right) more often on the mini.

\n

One big difference, though, is that I don’t use the mini for taking screenshots for stories I’m writing, nor do I edit images on it very often, because the small screen makes those tasks harder. Another difference in my work use of the mini is that I turn to it for reading and researching far more than writing. The same process of sorting through links and reading that I described above when I’m reading for pleasure is something I do for MacStories too; the difference is that those links usually end up in Trello for MacStories Weekly or in Raindrop.io for a story I’m working on. The benefit of the mini is that after a long day sitting at my desk, I can process those links and read those stories in a more comfortable setting, which I enjoy.

\n
\"Raindrop.io

Raindrop.io on the mini is an excellent way to do research away from your desk.

\n

Another significant benefit I’ve gotten from the iPad mini is its role as a digital notebook. In this mode, the mini plays a supporting part while I work on my Mac or iPad Pro.

\n
\"The

The mini’s size makes it feel like a ‘real’ notebook when using GoodNotes.

\n

There are a lot of ways to capture ideas as they come to you to avoid interrupting the task at hand, but my favorite on the mini is GoodNotes. By using a separate device, I don’t have to switch away from whatever I’m already working on. Instead, I jot the thought down quickly and come back to it later.

\n

The iPad mini has become the complement to the iPad Pro that I’d hoped it would be, but in a completely different way than I anticipated. It’s also taught me a few things about what works for me when it comes to work-life balance. In the end, that’s made the mini a far more important device than I ever expected it would be.

\n

Limiting the mini to ‘downtime’ apps didn’t fit with the way I work. Once I recognized that and allowed work apps to be available, but silent, I was able to fully enjoy the mini’s advantages over the iPad Pro. That approach has also led to a mini setup that’s very different from my iPad Pro’s. Next to my iPhone, that makes the mini my most personal device. It reflects a wider array of my interests and, in doing so, serves a broader range of purposes than the iPad Pro, even if the contexts in which it serves those purposes are narrower.

\n

Over three years between the last two iPad mini revisions was too long. I don’t think the mini needs to be updated to match all the features of the iPad Pro. It ticks off the most important checkboxes, while remaining more affordable than the Pro. Still, I’d especially like to see Face ID and a more Pro-like design. Until that happens, the mini will continue to feel like a slightly oddball relic instead of the tiny powerhouse that it is.

\n

I certainly could get my work done and be entertained without the iPad mini. It isn’t a necessity by any stretch of the imagination. However, spending as much time as I do in front of big displays sitting at a desk, I look forward to grabbing my mini to head off to a coffee shop, or to read on the couch at the end of the day with little thought or planning. The mini is too limited to ever be my primary device, but it fills the gaps in my other computing so smoothly, I’d be mighty upset if I had to give it up.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n
\n
  1. In case anyone was wondering, yes, I did write this story on the mini. ↩︎
  2. It’s well beyond the scope of this story, but if you’re a Club MacStories member, the episode of MacStories Unplugged called ‘Everything Is Research’ is where Federico and I explore this topic in depth. ↩︎
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Make no mistake, whether it’s a Mac, iPhone, or an iPad, I prefer big screens. I think most people do. A big, bright screen makes reading easier, and a larger canvas for the apps you use is rarely a downside.\nStill, there’s a reason we carry mobile phones when a tablet, laptop, or desktop could accomplish the same tasks: portability. Smaller is often better, even as the compromises start to pile up when you shrink a device.\nPortability is why foldable phones have captured the imaginations of so many people. They promise the portability of a traditional smartphone with a screen that’s closer to a tablet’s.\nSupported By\nConcepts\n\nConcepts: Where ideas take shape\nJust over one year ago, in March 2019, Apple released two new iPads: a 10.5-inch iPad Air and the first new iPad mini in over three years. The 5th-generation mini was a big surprise, largely because the mini hadn’t been updated in so long, leading many people to write it off as dead.\nPerhaps an even bigger surprise, however, was the mini’s hardware. The design didn’t change, but the 5th-generation mini upgraded the device to an A12 processor, the same chip in the then-current iPhone XR and XS. The update also added a Retina laminated screen with True Tone, P3 color support, and the highest pixel density of any iPad. The mini doesn’t support ProMotion, it only supports the first-generation Apple Pencil, and still relies on Touch ID for security. Still, in terms of raw horsepower, the mini is more similar to the 10.5-inch iPad Air than it is different, allowing it to hold its own in Apple’s iPad lineup despite its diminutive size.\n[table_of_contents]\nThe previous fall, I had ordered a 2018 12.9-inch iPad Pro and fell in love with it for writing and other tasks. As much as I enjoy the iPad Pro’s big display, though, it’s not suited for every task. For example, the size of the iPad Pro makes it awkward for reading in bed. Also, although I love to play games on my iPad Pro with a connected controller, that only works well if the iPad is sitting on a table.\nWhen the mini was introduced, I immediately wondered whether Apple’s smallest tablet could be the perfect complement to its largest iPad Pro: a powerful but tiny device that could work well where the Pro doesn’t. I also figured the mini could be a great ‘downtime’ device for activities like games, reading, chatting with friends, and watching TV, movies, and other video content. So, I sold some old gear I no longer used and bought a mini with 256GB of storage, so I’d have plenty of space for games and locally-downloaded video.\nThe plan was for my new mini to serve almost exclusively as my downtime iPad. What’s happened in practice during the past year is very different than I anticipated originally. My use of the mini has expanded far beyond what I’d expected, despite the compromises that come along with its small size. The iPad Pro remains the device I rely on for most of my needs, but as we approached the iPad’s first decade, the time felt right to consider how far the mini has come and how this unassuming device fits so neatly into the spaces between the other devices I use.\n\nUntil a year ago, I’d never owned an iPad mini. Other members of my family had earlier models of the mini, but I’d never felt that the larger iPads were too big for most circumstances, until I moved up from 9.7-inch iPads to the 12.9-inch Pro.\nThe 2019 mini.\nWhat drew me to the mini initially was the stark contrast in size between it and the iPad Pro. 
I wanted something I could hold in one hand to read whether I was lying on the couch or in bed at night. Sometimes that device is my iPhone, but it wasn’t a great solution for a couple of reasons.\nFirst, for longer articles and books, the iPhone is good in a pinch, but the more book-like size of the mini is better. Second, my iPhone is where I have the most notifications turned on; it’s how friends, family, and colleagues get in touch with me all day long. With the mini, I wanted a downtime device that didn’t deluge me with notifications.\nThe mini (right) is a better portable reading experience than even the largest iPhone (left).\nThe beefy specs of the mini were the other thing that drew me in. You don’t need a super-fast processor for reading, but the A12 chip and graphics in the mini meant it would be excellent for games and a device I wouldn’t feel the need to replace anytime soon. The improved display was a draw for reading and video too.\nFew plans turn out as expected, though. The mini has cemented itself as the downtime device I anticipated it could be. At the end of the day, it’s the device I usually grab for reading and web browsing. However, what I didn’t anticipate is how, over the course of 2019, the iPad mini has become my alternative work device too.\nThe mini is a surprisingly good iPad for writing.\nI don’t use the mini every day for work, yet slowly but surely, the mini has won me over as a fantastic ultraportable writing device,1 research tool, and communications station. The mini isn’t good at the same things as the iPad Pro, but it doesn’t have to be. What the mini lacks in flexibility compared to the iPad Pro, it more than makes up for in portability.\nThe mini hasn’t changed the way I work or play to the degree the Pro has either. Instead, the mini plays an important supporting role, slotting neatly into contexts where a bigger iPad or a Mac would be cumbersome and an iPhone would feel cramped. That’s added flexibility that I didn’t have before, and over time, it has made a meaningful difference across all aspects of my daily computing life.\nThe Downtime iPad\nMy iPad Pro is primarily a work device. I do use it for games, checking the news, and wasting time on Twitter, but most of the time, I use it for writing, task management, research, email, and other MacStories work. When my workday is finished, I’m far more likely to grab the mini to watch YouTube videos I’ve saved links to throughout the day, play a game, or catch up on some reading.\nWhen I initially set up the mini, I started fresh, skipping over work-related apps. That didn’t last long. It seemed like a good idea at the time, but it was almost immediately frustrating. If I was using my mini and wanted to do something work-related, it meant switching devices for no reason other than that I didn’t have the mini set up to handle work tasks.\nOn the one hand, that was the point. If I had declared I was finished working for the day, not having work apps at my fingertips helped enforce that separation. On the other hand, though, life isn’t that neat and tidy. I tested apps and read about technology for fun long before it was ‘work,’ and that hasn’t changed. Instead of fighting it, I’ve found other ways to set boundaries. 
Sometimes it works, sometimes it doesn’t, but by and large, I’ve found a balance that works for me.2\nDo Not Disturb is turned on 24/7 and nearly all notifications are turned off.\nThe compromise I’ve arrived at with the mini is far simpler and less drastic than eliminating every work app from the device. I simply turned on Do Not Disturb months ago and have never switched it off. That alone has made it a device I can leave on my bedside table at night. There’s no chance a late-night notification will wake me. I’ve also turned off nearly all notifications of any kind, which minimizes interruptions. The system works because I have an iPhone and iPad Pro that I can rely on for notifications.\nHaving work apps on my mini requires self-control, but without notifications interrupting me, I haven’t found that to be an issue. Instead, when I’m finished working for the day, I leave my iPad Pro on my desk, set my iPhone on a Qi charger, and grab my mini to unwind.\nReeder is where my collected link reading happens on the mini.\nMost of my downtime on the mini is spent reading. That’s usually in Reeder, where I follow a mix of tech, media, music, and video game feeds. Throughout the day, I send a variety of stories to Reeder’s built-in read-it-later service, and at the end of the day I like to sit in a comfortable chair and browse through what I’ve collected.\nReading my RSS feeds is an excellent example of the work-but-not-work sort of tasks for which I use the mini. Many of the links I save will end up in the Club MacStories weekly newsletter, but others are reviews of video games, longer news stories, and other topics that interest me. By having work apps like Trello available, I can deal with a link from my mini when I read the related story instead of having to remember to go back and do it later on my iPad Pro.\nCatching up on Mac Madness at the end of the day.\nAnother category of apps that occupy a more prominent spot on my mini than my iPad Pro is video apps. Apple’s TV app, YouTube, Apple Developer, and Matt Comi’s upcoming TV Forecast app are all on my mini’s Home screen for when I want to watch something or check where I left off on a TV show.\nBut wait, there’s more\nMy iPad mini Home Screen\n\nBy subscribing to Club MacStories you’ll receive MacStories Weekly, our Club-exclusive email newsletter. This week, we’ve got a special iPad at 10 issue that expands on the stories we’re publishing this week including:\nA deep dive into my iPad mini Home screen and why it’s organized the way it is\nA special extension of Federico’s iPad story\nA collection of favorite iPad games\nand more\nYou can unlock all of these perks with a Club MacStories subscription, starting at $5/month. And in doing so, you’ll also get access to the complete archive of Club MacStories with over four years worth of exclusive content.\nAs I’d hoped when I bought it, the mini is fantastic for gaming too. If you’ve read my game reviews on MacStories, you know I like to use a controller whenever possible with iOS and iPadOS games. That’s true with the mini too. However, instead of the PS4 controller I use with my iPad Pro, I typically use my mini with the Gamevice. The device, which I was sent for testing, splits a traditional controller into two halves connected by a rubber strap. The iPad mini fits between the two halves with the strap holding the contraption tight to the ends of the iPad.\nPlaying Dead Cells with the Gamevice.\nOnce set up, it’s a little like having a giant Nintendo Switch. 
For games like Dead Cells, it’s fantastic. Better yet, the Gamevice doesn’t need to be charged because it uses the iPad’s Lightning connector for power, and there’s no pairing required. By sending controller commands over the Lightning port, the controls are also very responsive.\n\nTo be sure, the setup makes the iPad mini a little bulky, but for games that work best with a controller, the trade-off is worth it. With the emphasis on controller support that we’ve seen with Apple Arcade, a Gamevice is a terrific addition to an iPad mini setup.\nmini Workstation\nI love writing on the mini in iA Writer.\nIt’s remarkable to me that a decade after its introduction, I still hear people insist that the iPad is only good for consumption. As I covered in my iPad history story on Monday, part of that is Apple’s own doing. That was a big part of the way the original iPad was pitched.\nHowever, the mini has always been up to the challenge of being used for creative endeavors. For instance, Federico was running MacStories on a mini in 2013. Today’s mini is even more capable. Not only does the latest mini use the A12 chip, but it runs the same version of iPadOS as other iPads, so it supports trackpad and mouse input along with its support for the first-generation Apple Pencil.\nIt’s time for the mini’s bezels and Touch ID to go.\nThere are hardware limitations, though. The ones that bother me most are the lack of Face ID and the mini’s large bezels. The design, which has hardly changed over the years, looks dated today. Worse, though, is that when I’m controlling the mini with an external keyboard, trackpad, or mouse, having to reach out to unlock it with the Touch ID sensor is an interruption in the flow of using the device that feels anachronistic in 2020. It’s time for the mini to extend edge-to-edge, which would look better and allow for a slightly bigger screen too.\nThe mini also lacks support for ProMotion and only works with the first-generation Apple Pencil. For my uses, though, both of those omissions are more tolerable. I’ve grown used to the lack of ProMotion, and most of my Apple Pencil use involves UI navigation and taking handwritten notes, neither of which is severely hampered by the lack of the latest Pencil. The bigger challenges of dealing with the iPad mini as a combination work/play device were adapting to the screen size and finding a workable keyboard solution.\nWorking on a Small Screen\nI didn’t fully appreciate what working on the iPad mini’s small screen would be like until I tried it. If you place the iPad mini in its portrait orientation, it’s roughly the size of half the screen of the 12.9-inch iPad Pro in landscape mode. I work in Split View a lot on the Pro, so I figured that at most, the mini’s screen size would mean that I’d use Split View less. That’s turned out to be true, but what I didn’t think about was that the mini’s pixel density is higher than other iPads, which shrinks everything a little.\nThe dock icons are too small and close together in portrait mode.\nThe icons on the mini’s Home screen are closer together, and with a full dock, app icons are tiny and feel crammed together, especially in portrait mode. There are quirks in iPadOS too. 
For example, if you use Search to find an app, the software keyboard covers up more than half of the app icon for the first result, making it hard to select the app instead of the keyboard.\nSearching for an app using the software keyboard in landscape mode is problematic on the mini.\nA couple of things have mitigated these kinds of issues. First, Pencil support helps when I’m using the mini in handheld mode. The Pencil is far more precise than my finger, which makes selecting smaller targets easier. Second and most recently, the trackpad and mouse support in iPadOS 13.4 has been a game-changer, making placement of the cursor and aiming the pointer far easier than reaching up from a keyboard to poke at the screen.\nPairing a Magic Trackpad 2 with the mini is a great combination, but I really need to get one in space gray.\nHowever, one place where the mini’s smaller size is a big advantage is thumb typing. In handheld mode, the mini is just small enough to make typing a message possible in a way that is virtually impossible on the iPad Pro despite the floating keyboard that was added with iPadOS 13.0.\nThe other difficulty of working on the mini stems from the fact that if you use it with a keyboard, the device is farther away than it would otherwise be. That’s required me to make adjustments to text sizes across the system to ensure that apps are readable, whether I’m holding the mini in my hands or it’s propped up on a table as I type.\nIt’s a process that has required a lot of trial and error. Apps I use primarily in handheld mode don’t need the text size bumped up, but my text editor absolutely does, for instance. The hardest are apps that get used both in handheld mode and with a keyboard.\nSome apps handle text size better than others. I always appreciate an app that doesn’t just rely on Dynamic Type. That’s an important starting point, but the different distances at which I use the mini make in-app text settings a necessity. One of the best examples of an app that handles this well is Safari. Not only can I easily adjust the size of a website’s text from the toolbar, but there’s also a keyboard shortcut. I can also switch to Safari Reader View for the cleanest, most customizable reading experience of all.\nThe Keyboard Conundrum\nPerhaps the greatest difficulty in adjusting to a mini, though, has been finding a keyboard for writing. I type a lot every day and, although I’m not nearly as picky about keyboards as a lot of people I know, I do appreciate a keyboard that’s comfortable.\nI have plenty of full-sized keyboards I can use with the mini, but that defeats the purpose of using the device as an ultraportable setup. Instead, I wanted something small and light that I could throw in a bag without complicating or upsizing my mini setup significantly. That’s proven to be a tough combination to find. As a result, I have a few I want to mention, none of which are perfect, but each of which can work depending on your circumstances.\nLogitech’s Keys to Go. Source: Logitech.\nI started too small. The first keyboard I tried was the Logitech Keys To Go keyboard, which is roughly the width of the iPad mini in landscape mode. It’s a strange little keyboard with membrane-type bubble keys that are surprisingly hard to press accurately. 
The rechargeable battery in the Keys to Go lasts a long time, and the keyboard is splash resistant, but it’s just too small and uncomfortable to use.\nThe mini and Studio Neat Canopy combination.\nAnother option I tried was Apple’s Magic Keyboard with Studio Neat’s Canopy case. This comes close to what I want, but not quite. I like the Magic Keyboard a lot. It’s comfortable to type on, lightweight, and the Canopy protects it in a bag, but the mini sinks too far into the case when it’s used to prop the mini up while typing, making it hard to access the dock. A trackpad and keyboard shortcuts mitigate this issue, but it’s not ideal.\nLogitech’s K380 keyboard is sturdy and the old-school AAA batteries last two years.\nMost often, I’ve found myself turning to the Logitech K380 Multi-Device Bluetooth keyboard. The K380’s keys are round and a little stiff, requiring harder presses to type than the Magic Keyboard. The Logitech keyboard has a couple of interesting advantages, though.\nFirst, it’s solidly built. So far, it’s survived being tossed in my backpack many times and looks and feels the same as when I got it six months ago.\nSecond, the K380 uses AAA batteries. I turned my nose up at alkaline batteries at first, but Logitech says that they allow the keyboard to run for two years before you need to change the batteries. I’ve had mine for around five months, and it’s still going strong. Battery life is aided by the fact that the keyboard isn’t backlit, which is a shame, but also understandable. For a device I don’t use every day, not having to wonder if it’s fully charged is a big advantage.\nThe round keys take getting used to, but next to the Magic Keyboard, this has been the most comfortable keyboard I’ve tried with the mini.\nThe K380 isn’t an everyday keyboard, and I typically type on it for shorter periods than other keyboards, so its shortcomings are tolerable. I also appreciate the keyboard’s inverted-T arrow key layout, the ability to pair it with three different devices, and the dedicated function key for loading the software keyboard that makes accessing emoji easier.\nThe Brydge 7.9 can’t escape the limitations of the mini’s width, but I like it for light typing.\nThe most recent keyboard I’ve been using with my mini is the Brydge 7.9, a brand-new Bluetooth keyboard which is designed specifically for the mini. Brydge made a similar keyboard for earlier models of the mini, which I’ve tried, and this new model improves on that one in a lot of ways, from an improved key layout to its build quality. The Brydge 7.9 is a mini-sized keyboard, so it’s cramped to type on, but in the few days since Brydge sent it to me to test, it has grown on me, and I expect it’s an option I’ll stick with for certain use cases.\nThe Brydge 7.9 keyboard connects to the mini via Bluetooth, and like other Brydge keyboards, the mini slots into hinged clamps at the corners of the device. The keyboard is backlit and charges over Micro USB.\n\nAlong the top edge is a row of function keys. The Home key on the far left side takes you back to the iPad’s Home screen with a single press. Double-pressing the key opens the multitasking view, and pressing and holding triggers Siri. There are keys for locking the iPad’s screen, cycling through the keyboard’s three levels of backlighting, one that toggles the software keyboard, a globe key that is handy for opening the keyboard picker, plus media playback and volume keys. There are also function keys for pairing the keyboard using Bluetooth and turning it on and off. 
It’s worth mentioning that the keyboard also includes inverted arrow keys and a little divot that makes it easier to access the dock with your finger, both of which I like.\nNo keyboard that is the width of the iPad mini in landscape mode is going to be comfortable to type on for long periods, and the Brydge 7.9 is no different. For keyboards this size, though, this is one of the better ones I’ve tried. The keys are very close together, and many of the lesser-used keys along the edges are half-width keys, but after using it for several hours, I’ve grown used to it. I still make more mistakes than I would with a full-size keyboard, but I was pleasantly surprised after spending a solid day typing on it.\nStill, I prefer this keyboard for editing. Typing the first draft from scratch is too frustrating. The first draft of this story was written on the K380. I switched to the Brydge to write this section and edit the rest of the story. It has definitely slowed me down, but speed is less important with editing, so that has been fine. The Brydge keyboard is also fine for lighter typing tasks like email, messaging apps, and the like.\nOne advantage of the Brydge keyboard over the K380 is that it transforms the mini into something I can type on when it’s sitting in my lap. I also appreciate that I can easily close the setup just like a laptop, protecting the iPad’s screen and reducing my kit to one paperback book-sized unit.\nAside from the inherent limitation of using such a narrow keyboard, I’m not enamored with the backlighting. Unless I’m looking almost directly down on it, the LEDs under the keys leak light from around the lead edge of the keys in a way that’s distracting in dark settings. Still, even though I’ve only had a few days to work with the Brydge 7.9, I expect to continue using it for editing and other light typing situations when I want to travel as light as possible.\nOther Accessories\nThe Twelve South Compass 2 easel-style stand.\nThe other accessories I use with the mini are Twelve South’s Compass 2 stand and the Moshi iVisor AG screen protector. The Compass is an easel-style stand that holds the mini just off a table or desk at a nice viewing angle. I usually write with the iPad mini in landscape mode propped up with the Compass and then switch to portrait mode for editing, which is close to the experience of editing in Split View on an iPad Pro. I appreciate that the Compass folds up very small and comes with a nylon pouch that tucks neatly into a side pocket of my backpack. I haven’t tried many portable iPad stands, but the Compass 2 is both sturdy and easy to pack, which makes it an excellent complement to the mini.\nThe Moshi screen protector is an idea I stole from Federico. It does a tremendous job reducing glare, which has made writing at my kitchen table easier on a sunny day. I’m looking forward to trying it outside this summer when it gets warmer.\nWork Apps\nOn the mini, I like to write in landscape and edit in portrait, where I can see more of my text.\nThe work apps that I use on the mini are largely the same ones I use on the iPad Pro. 
Working Copy and iA Writer for writing, Trello and Reminders for organizing projects and tasks, and Slack and Messages for communicating with the MacStories team, where the ability to thumb type on the mini comes in handy.\nSplit View (left) can be cramped, so I resort to Slide Over (right) more often on the mini.\nOne big difference, though, is that I don’t use the mini for taking screenshots for stories I’m writing, nor do I edit images on the mini very often, where the small screen makes that task harder. Another difference in my work use of the mini is that I turn to it for reading and researching far more than writing. The same process of sorting through links and reading that I described above when I’m reading for pleasure is something I do for MacStories too. The difference is those links usually end up in Trello for MacStories Weekly or in Raindrop.io for a story I’m working on. The benefit of the mini is that after a long day sitting at my desk, I can process those links and read those stories in a more comfortable setting, which I enjoy.\nRaindrop.io on the mini is an excellent way to do research away from your desk.\nAnother significant benefit I’ve gotten from the iPad mini is as a digital notebook. In this mode, the mini serves in a supporting role as I work on my Mac or iPad Pro.\nThe mini’s size makes it feel like a ‘real’ notebook when using GoodNotes.\nThere are a lot of ways to capture ideas as they come to you to avoid interrupting the task at hand, but my favorite on the mini is GoodNotes. By using a separate device, I don’t have to switch away from whatever I’m already working on. Instead, I jot the thought down quickly and come back to it later.\nThe iPad mini has become the complement to the iPad Pro that I’d hoped, but in a completely different way than I anticipated. It’s also taught me a few things about what works for me when it comes to work-life balance. In the end, that’s made the mini a far more important device than I ever expected it would be.\nLimiting the mini to ‘downtime’ apps didn’t fit with the way I work. Once I recognized that and allowed work apps to be available, but silent, I was able to fully enjoy the benefits of the mini’s advantages over the iPad Pro. It’s also led to a mini setup that’s very different from my iPad Pro. Next to my iPhone, that makes the mini my most personal device. It reflects a wider array of my interests and, in doing so, serves a broader range of purposes than the iPad Pro, even if the contexts in which it serves those purposes are narrower.\nOver three years between the last two iPad mini revisions was too long. I don’t think the mini needs to be updated to match all the features of the iPad Pro. It ticks off the most important checkboxes, while remaining more affordable than the Pro. Still, I’d especially like to see Face ID and a more Pro-like design. Until that happens, the mini will continue to feel like a slightly oddball relic instead of the tiny powerhouse that it is.\nI certainly could get my work done and be entertained without the iPad mini. It isn’t a necessity by any stretch of the imagination. However, spending as much time as I do in front of big displays sitting at a desk, I look forward to grabbing my mini to head off to a coffee shop, or to read on the couch at the end of the day with little thought or planning. 
The mini is too limited to ever be my primary device, but it fills the gaps in my other computing so smoothly, I’d be mighty upset if I had to give it up.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\n\n\nIn case anyone was wondering, yes, I did write this story on the mini. ↩︎\n\n\nIt’s well beyond the scope of this story, but if you’re a Club MacStories member, the episode of MacStories Unplugged called ‘Everything Is Research’ is where Federico and I explore this topic in depth. ↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-04-02T12:17:49-04:00", "date_modified": "2020-04-20T17:28:58-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "iPad at 10", "ipad mini", "stories" ] }, { "id": "https://www.macstories.net/?p=62816", "url": "https://www.macstories.net/stories/exploring-the-most-impactful-ipad-apps-of-the-decade/", "title": "Exploring the Most Impactful iPad Apps of the Decade", "content_html": "
\"\"


John: It’s hard to overstate the importance of the iPad’s large screen. Early critics dismissed the device as a big iPhone, but that criticism revealed a fundamental misunderstanding of the product.

\n

By jumping from the iPhone’s small 3.5-inch display to one that approached 10 inches, the iPad delivered a canvas that allowed Apple and third-party developers to rethink not just the concept of mobile apps, but of apps altogether. The additional screen real estate allowed developers to flatten and spread UIs in a way that made new uses possible. That, in turn, led to richer, deeper experiences for everything from reading a comic book to managing complex projects and automating repetitive tasks, allowing users to interact directly with the software beneath their fingers.

\n

After years of using the very best apps developers have to offer on the iPad, it was remarkably easy for Federico, Ryan, and me to come up with a list of the iPad apps that have been the most impactful for us during the past decade. There are a lot of factors at play in arriving at these apps. Some forged a path by adopting the latest Apple technologies in a unique way that set an example for apps that followed. Others define a category that takes unique advantage of the iPad’s hardware. These are apps that work on the iPhone or Mac too, but are most at home on the iPad’s unique platform.

\n

Although there is no single formula for which iPad apps have been the most impactful, one thing each app in this collection shares is a rich, personal experience. These are apps inspired by and reflected in the image of Steve Jobs sitting onstage in a comfortable black leather chair swiping through photos. The iPad and the apps that run on it have come a long way since then, but the intimacy of directly manipulating apps that transform a slab of glass into anything a developer can imagine hasn’t changed, and remains what makes the iPad so special.

\n

\n

\n

Pythonista

\n

\n
\"\"

\n

Federico: Before Workflow revolutionized automation on iOS and before developer tools were an official category on the App Store, there was Pythonista. Created by indie developer Ole Zorn, Pythonista is a Python IDE that lets you write and run traditional Python scripts on your iPad, with a twist: in addition to the Python standard library, Pythonista offers custom modules that bridge native iOS APIs (such as the clipboard, Safari, UIKit, EventKit, etc.) with Python, allowing you to create scripts that control system functionality in an automated fashion.
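To make that bridging concrete, here’s a minimal sketch of the kind of script Pythonista enables, written against the clipboard, console, and webbrowser modules the app exposes; it assumes you’re running inside Pythonista itself, so it won’t work in a desktop Python install:

```python
# Pythonista-style sketch: grab a URL from the system clipboard,
# tidy it up, and hand it off to the browser.
import clipboard    # bridges the iOS pasteboard
import console      # bridges iOS HUD alerts
import webbrowser   # opens URLs from a script

url = clipboard.get().strip()
if not url:
    console.hud_alert('The clipboard is empty')
else:
    if not url.startswith(('http://', 'https://')):
        url = 'https://' + url
    clipboard.set(url)                    # write the cleaned-up URL back
    console.hud_alert('Opening ' + url)   # quick on-screen confirmation
    webbrowser.open(url)
```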

\n

This may not sound so groundbreaking today since apps like Shortcuts and Scriptable are based on the same principle. It’s important to note, however, that Pythonista did all of this in 2012, when the App Store was young and the idea of writing code on an iOS device was consistently frowned upon. Ole Zorn’s idea was bold and revolutionary in the iOS app community at the time; I’d go as far as to say Pythonista pioneered the concept of iOS automation, which was later perfected by Workflow and remixed by a slew of other apps.

\n

Pythonista was the app that convinced me that trying to get all my work done from an iPad wasn’t necessarily going to be a failed experiment. Sure, I had to learn the basics of Python scripting, but in return for a small investment of my time, I was able to create all sorts of scripts that could provide functionalities that were missing from iOS – a crucial factor as I was just beginning my transition from the Mac to iPad in late 2012. I used Pythonista to create a variety of enhancements to my iPad experience, from converting Markdown to HTML to uploading images to our CDN and manipulating plain text in ways that no iPad text editor allowed at the time.
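As an illustration of that Markdown-to-HTML use case, a script along these lines would have covered it; the markdown module is a third-party package Pythonista has historically bundled, so treat its availability as an assumption:

```python
# Convert the Markdown on the clipboard to HTML and copy the result back.
# Relies on Pythonista's clipboard bridge and its bundled markdown package.
import clipboard
import console
import markdown

md_source = clipboard.get()
html = markdown.markdown(md_source)   # plain python-markdown conversion
clipboard.set(html)
console.hud_alert('Copied {} characters of HTML'.format(len(html)))
```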

\n

Pythonista is still available today, and it recently received a major update with support for the latest features of iOS and iPadOS 13, plus a custom keyboard that lets you run Python scripts from any text field on your device. These days, all my automation happens in Shortcuts, but I have fond memories of Pythonista, and I believe it created a market that didn’t exist at the time, paving the way for Workflow and other iOS automation apps to follow.

\n

\n

Editorial

\n

\n
\"\"

\n

Federico: Following the launch of Pythonista, Ole Zorn had an idea: what if the underlying Python engine of Pythonista were used to power a Markdown text editor? And thus Editorial was born.

\n

Editorial is very close to my heart. I started beta testing the app alongside a small group of other people in late 2012, when I was undergoing my last rounds of radiotherapy treatments; in my mind, Editorial will forever be tied to those happy memories. Editorial was also in beta for a long time: I started testing the app in November 2012, and it eventually launched in August 2013. During those 10 months, I had the privilege of submitting feedback and, in a way, nudging Moritz toward adopting or tweaking certain features. By the time Editorial launched to the public, my iPad workflow had been rebuilt around it, and I knew so much about the app, I wrote my first really long review about it, which I eventually turned into an eBook.

\n

Besides my personal memories of it, however, Editorial stands out in the iPad app ecosystem on its merits alone. In hindsight, Ole Zorn did two things absolutely right with Editorial: first, he built the app atop the same Python foundation as Pythonista, complementing it with a visual workflow editor that can be considered a precursor to Workflow and its variable-based system; second, Zorn realized the potential of a Sublime Text-inspired editor for iOS, and for that reason allowed users to script and control fundamental aspects of Editorial such as its text editor and text selection engine.

\n
\"Editorial's

Editorial’s text editor.

\n

The combination of these aspects resulted in the most powerful Markdown text editor the iPad has ever seen, which, sadly, is no longer receiving new features. For a couple years, however, I was able to do some wild things in Editorial: I created hundreds of workflows to insert and edit Markdown, control the built-in web browser, fetch data from remote services, and even publish my articles to WordPress without leaving the app.
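For a sense of what those workflows looked like under the hood, here’s a rough sketch of a ‘Run Python Script’ step that wraps the current selection in Markdown bold; the editor module calls (get_selection, get_text, replace_text, set_selection) are based on Editorial’s scripting documentation from that era, so consider the exact names assumptions rather than something verified against the app today:

```python
# Editorial-style sketch: wrap the currently selected text in Markdown bold.
# Meant to run as a 'Run Python Script' workflow action inside Editorial;
# the editor module is Editorial's scripting bridge to its own text editor.
import editor

start, end = editor.get_selection()      # character offsets of the selection
selected = editor.get_text()[start:end]

if selected:
    editor.replace_text(start, end, '**' + selected + '**')
    # re-select the bolded text, accounting for the four added asterisks
    editor.set_selection(start, end + 4)
```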

\n

Before Workflow would show how to build an entire app around a visual automation editor, and before the iPad Pro even existed, Editorial showed that it was possible to create a professional app for iPad while playing by Apple’s rules. These days, iA Writer is my go-to text editor, but Editorial will always have a special place in my heart because it was the app that carried me through the transition from OS X to iOS.

\n

\n

Fantastical

\n

\n
\"\"

\n

Federico: When we first started planning our iPad at 10 special series with the rest of the MacStories team months ago, I thought long and hard about whether I wanted to propose Fantastical, Flexibits’ popular calendar app, as one of the outstanding iPad apps to feature in this collection. After all, the original iPad version, which came out in 2014, left much to be desired in terms of achieving feature parity with its desktop counterpart, and it never quite took advantage of the screen real estate of the 12.9” iPad Pro when it launched the following year. Ultimately, I decided it was worth highlighting Fantastical – the new version of the app – in this roundup because it symbolizes a new generation of iPad apps and encapsulates what modern iPad app development should be like in 2020.

\n

The new Fantastical for iPad takes what worked in the Mac version of the app and translates it to a modern iPad experience that is deeply integrated with iPadOS. Gone is the simplified calendar view of the original iPad app, replaced by a suite of views, which include a full monthly calendar reminiscent of the Mac app’s layout; the sidebar, customizable in size, takes advantage of the iPad’s large display to embed a mini calendar, and each item in the list can be tapped to open detail views as popovers. The difference between the old Fantastical for iPad and the new app is striking; the older version of the app was the product of a different era in iPad app design, when, before the iPad Pro existed, the notion of a “pro iPad app” was often derided and quickly dismissed.

\n

The new Fantastical for iPad represents a fresh trend in iPad app development – an effort from developers to bring powerful functionality from macOS to the nascent iPadOS platform while adapting it to the iPad’s different interactions and frameworks. Fantastical for iPad supports Split View, context menus, customizable icons, and advanced shortcuts based on parameters; the app can be used via touch, obviously, but it also integrates with keyboard shortcuts; and in an upcoming update, the app will gain deep pointer support to let you navigate its UI and interact with the calendar using a mouse or trackpad.

\n

For all these reasons, Fantastical deserves to be included in this collection: besides being an excellent calendar client, it is a shining example of what to expect from the future of iPad app development – apps that can blend desktop-class functionality with multiple input systems, extensive support for native APIs, and the ability to scale across different screen sizes.

\n

\n

Yoink

\n

\n
\"\"

\n

Federico: Back in May 2017, before Apple’s announcement of iOS 11 at WWDC, I shared a wish list and concept video where, among other things, I imagined a system-wide “shelf” for storing bits of data via drag and drop.

\n

Here’s how I described the idea:

\n

\n The idea behind the Shelf is to make it as effortless as possible to hold something for later without the cognitive load of deciding which app or extension should receive it right away. The Shelf would be heavily tied to the new drag & drop framework and it’d be inspired by previous implementations (such as the NeXTSTEP shelf) and older desktop apps (examples: DragThing and Yoink). Think of it as a transient dock for temporary clippings, or, even better, as a multi-slot clipboard that can hold a variety of items and be consistently available across apps.

\n

The Shelf would be local to each iPad, and it could hold an infinite number of items; the Shelf would be paginated and users could scroll between multiple pages to find the item they’re looking for. Anything could be dropped in the Shelf: from text selections and images to phone numbers and even songs; as long as an item supports drag & drop on iOS 11, the Shelf could hold it for later.\n

\n

While my vision for a deeply integrated shelf might have been overly ambitious, that didn’t stop indie developer Matthias Gansrigler from launching an iOS version of Yoink – an app that has allowed me to perform nearly all the tasks I envisioned in that story from 2017.

\n

Yoink stands out in the iPad app ecosystem for two reasons: it is perhaps the only so-called shelf app that is still updated on a regular basis two years after its original release; and it is the most powerful expression of iPad drag and drop, quite possibly the most advanced and underrated developer framework on iPadOS.

\n

In Yoink, you can drop anything that can be dragged from any iPad app, whether it’s a link, an image, an email message from Mail, or a block of plain or rich text. If you can drag it, you can drop it in Yoink; later, when you need that piece of data again, you can drag it out of Yoink again and drop it in another app, where it’ll be inserted in the original format. But there’s more: by taking advantage of the more advanced aspects of the drag and drop APIs for developers, Yoink stores multiple “flavors” of each item, and each version can be individually dragged out of the app. For example, a block of text can be exported as plain or formatted text; an email message can be dropped elsewhere either as a Mail link, an attachment, or the plain text of the original message’s subject line.

\n

It is this richness of formats and integration with drag and drop that has turned Yoink into a must-have companion utility for all those power users who use their iPads primarily in touch mode. In the years I’ve covered iOS for iPad, now called iPadOS, I’ve never seen any other app offer such deep integration with the drag and drop framework. Aside from this technical achievement, however, Yoink provides essential functionality that has been sadly ignored by Apple: the peace of mind of knowing you can drop something – anything – in a safe place, and take it back with you at a later stage. Yoink lets me enjoy drag and drop on my iPad more, and it’s unique in its category.

\n

\n

Workflow

\n

\n
\"\"

\n

Federico: Few apps have had such a profound impact on the entire iOS app ecosystem as Workflow. Originally showcased in 2014 by a young, independent development team and released later that year after a long review period, Workflow seemingly achieved the impossible: in an era where most “productivity hacks” on iOS revolved around the aforementioned Pythonista and URL schemes, Workflow managed to blend the power and visual scripting of Apple’s Automator with native integration with modern iOS technologies. Rather than forcing users to learn a scripting language or deal with the complexities of encoding URL schemes, Workflow turned automation on its head with a drag and drop-based programming environment that empowered everyone to connect multiple apps and actions together, creating automated workflows that could save them time or make them more productive each day.

\n

Workflow started making waves in the iPhone and iPad community, and it soon became a must-have for all those users who were seeking to live the post-PC life and only work from an iPad. Apple noticed, and in early 2017 they acquired Workflow, later relaunching it as Shortcuts – an app that is now pre-installed on millions of devices and sports deep Siri integration and an even more powerful app automation framework.

\n

Today’s Shortcuts app is both similar to and drastically different from the original Workflow app. Shortcuts no longer relies on URL schemes to let apps communicate with each other and features an all-new editor based on parameters and natural language. At the same time, however, the original vision behind Workflow is still very much alive within Shortcuts, and that is why Workflow deserves to be in this list.

\n

Six years ago, Ari Weinstein, Conrad Kramer, and the rest of the Workflow team came up with ideas and features that are still at the foundation of Shortcuts today, from variables and conditional blocks to the Content Graph engine and the ability to share workflows with other users. The visual automation model and interaction paradigm behind Shortcuts was conceived for the original Workflow app in 2014; today, the mix of automation and native system frameworks is still unmatched on other platforms. At least for me, Workflow was the most important iPad app of the decade – a utility that singlehandedly changed how I was able to get work done on my iPad.

\n

\n

Ulysses

\n

\n
\"\"

\n

Ryan: My favorite writing app, Ulysses, got very serious about the iPad following the iPad Pro’s 2015 debut. The Markdown editor has a long history on the Mac, and it existed on the iPad before 2015 too, but by the time the iPad Pro launched, the app had started to fall behind on advancements like Split View and Slide Over multitasking and support for a 12.9-inch display. When Ulysses’ next big update arrived, it represented not only the adoption of those modern features and the launch of an iPhone version, but also a fresh foundation that would see the iPad version become just as powerful as its Mac companion.

\n

Ulysses puts a unique twist on Markdown editing, offering full Markdown support but opting to hide certain syntax – most notably URLs – behind visual content blocks. This approach isn’t for everyone, but I absolutely love it. I have a hard time using traditional Markdown editors now because I’ve grown so spoiled by the way Ulysses hides links and displays image previews automatically, along with some of its other design choices. The editing interface is clean and minimal, and it enables customization of key details like font, font size, and text spacing. When you write for a living, the last thing you want to do is stare at a displeasing editor design, so this is very important.

\n

Another strength of Ulysses is its top-notch export features, several of which I use all the time. Exporting to PDF provides an array of beautiful style options, more of which can be downloaded online or even customized yourself on the Mac. I also export to plain text Markdown regularly so I can save my drafts in Working Copy when collaborating with Federico and John. The most crucial export option for me, however, is WordPress publishing. This feature works flawlessly, offering access to all the tools you’d want such as tags and categories, and it’s something you just won’t find in practically any other Markdown editor.

\n

\n

MindNode

\n

\n
\"\"

\n

Ryan: Mind mapping never felt like a natural fit for me on computers until the iPad. I’m not a big mind mapper in the first place, but I have historically used pen and paper to create free-form mind maps when I need to get my thoughts down in an unfiltered, unstructured way. This practice never translated well to the Mac, but when MindNode released version 5 a few years ago on iOS, redesigning its UI and adopting system drag and drop, something finally changed.

\n

For me, the appeal of mind mapping is its flexibility, and MindNode beautifully retains that. If you think in outlines, Quick Entry mode lets you type an outline and have it instantly converted to a new mind map. If your thought process needs a little less structure, you can add assorted ideas to your map as separate nodes, then form the connections and structure later. Drag and drop is a key aid in this, as you can easily dump ideas or content from other apps into MindNode with a simple gesture, and also connect nodes to each other via touch interactions.

\n

MindNode offers a rich feature set that not only makes it a great mind mapping tool, but a fantastic iPadOS citizen as well. Visual Tags are an elegant way to group related information, the delightful sticker set can beautify your map, Focus mode highlights what’s most important in a given moment, and export and theme options are extensive. In the realm of OS support, MindNode uses Files’ document browser, supports Split View, Slide Over, and multiwindowing, and offers some of the best external display integration on iPad.

\n

\n

Agenda

\n

\n
\"\"

\n

Ryan: The date-based notes app Agenda got its start on the Mac, which perhaps explains why its iPad app launched in such great shape. Many new apps begin on the iPhone; when an iPad version debuts, it’s often a slightly adapted version that draws inspiration primarily from iOS’ simplicity, not taking full advantage of the iPad’s large display or offering power user features like external keyboard control. Agenda, by contrast, nailed those things from the start.

\n

Agenda’s iPad layout is one of the best examples I can point to of an app scaling well from the 7.9-inch iPad mini’s display all the way up to the 12.9-inch iPad Pro’s. The app is organized into three different panes: all your projects are on the left, your notes are in the center, and the right contains a calendar agenda view, plus recently edited and related notes. On my iPad Pro, keeping all three panes on-screen at once is a fantastic way to fill out the display; it makes all the information I need visible at a glance. But the nice thing is that on smaller devices, or when using Split View, you can choose to keep just one or two panes visible at a time. More productivity apps could stand to learn from Agenda’s scalability, as too many simply ignore large-screened devices and build for the lowest common denominator.

\n

As I mentioned, Agenda’s external keyboard control is another strength. You can control almost everything in the app without taking your hands off the keyboard, giving Agenda easily one of the best keyboard implementations available on iPad. This is only one of the many pro features Agenda offers on both iPad and Mac – saved searches are another great one.

\n

\n

Notability

\n

\n
\"\"

\n

Ryan: Notability is a digital notebook app that has long been among the best on the App Store. Even prior to the days of the Apple Pencil, Notability excelled at merging the strengths of digital and analog notebooks in a single app. You can handwrite and insert typed text; do free-form highlighting of anything in your document; insert photos, GIFs, and document scans; record audio and save it in your notes; and more. The app lets you create documents containing a healthy mix of all of these different elements, or edit an existing PDF with the same toolset. And while it’s become more common these days for note apps to have a diverse set of tools, Notability was one of the first apps to popularize this trend on iPad, and it has remained a strength of the app through years of the device’s transformation.

\n

Aside from its versatility, Notability’s primary strength is how easy it is to use. The app has a clean, approachable design that’s a triumph of simplicity, particularly considering how much power lies under the hood. The main interface consists of standard column views of your notes and their folders (dubbed ‘Subjects’ in the app), and when you’ve opened a note, the various tool options are just as easy to navigate. Unlike many other digital notebooks, Notability has never confused me: from my first use, I’ve never wondered how to access a given tool or what a button would do when I pressed it. Because of its elegant, intuitive design, Notability’s a fantastic choice for people new to the iPad, yet somehow it’s just as excellent a choice for pro users.

\n

\n

Pixelmator

\n

\n
\"\"

\n

Ryan: I have a long history with Pixelmator, as the Mac version of the app was the first image editing software that really clicked with me. I dabbled in Photoshop a little in my teen years, but never cared for it largely because I didn’t want to invest the time into learning its full capabilities. Then Pixelmator came along and it was just what I needed: an intuitive and accessible, yet powerful layer-based editor. Over a decade later, Pixelmator’s still my go-to app, with the only change being that for most of that decade I’ve used it on iPad instead of Mac.

\n

Pixelmator took a few years to make its way to the iPad, first arriving in late 2014, but it’s always been a perfect fit for the device. Direct manipulation when working with image layers is far better than using a pointing device. Though my need for it has decreased in recent years thanks to the advent of automation tools in Shortcuts, there are still certain image editing tasks I always turn to Pixelmator for. My most common needs involve cutting backgrounds out of images, resizing images to better fit an article, using the retouch tool to erase unwanted objects from a photo, and bringing multiple images next to each other in a custom layout. Much of this functionality simply isn’t available in other apps, or if it is, it comes with a lot more complexity. Out of all the iPad apps I depend on, Pixelmator might be one I’d have the hardest time replacing.

\n

\n

Kindle

\n

\n
\"\"

\n

Ryan: In the early days of the iPad, one of Apple’s ambitions for the device was that it would be a great e-reader, hence the launch of iBooks (now Apple Books). As time has shown, however, the iPad’s success as an e-reader may owe more to the Kindle app than Apple Books. Amazon’s dominance in the books market is a story for another day, but that aside, the rising popularity of Kindle devices when the iPad launched, combined with Amazon’s growing role in shopping overall, has resulted in the Kindle app having a significant impact on the iPad’s last decade.

\n

While some users aren’t comfortable reading on an iPad, preferring either the e-ink technology of Kindle devices or an analog paperback, many others are perfectly happy using a multipurpose device like the iPad to read books. I’m one of those people, having shifted the entirety of my reading to the iPad. I prefer Apple Books over Kindle, but Amazon’s app and ecosystem have clearly had a bigger impact in the space of digital reading. More books are available on Kindle, and they’re available on both Apple and non-Apple devices alike. Though digital books haven’t killed paper ones, as some analysts originally predicted, they still represent a healthy chunk of the publishing market, and that’s largely thanks to the Kindle app and iPad.

\n

\n

Things

\n

\n
\"\"

\n

Ryan: Popular task manager Things has been an iPad mainstay since the beginning. The iPad launched with Things support, and 10 years later, the task manager is doing quite well thanks to a major makeover with Things 3 and regular updates ever since.

\n

In a crowded task management market, Things stands out for its beautiful design, approachability, and healthy straddling of the spectrum between overly complex task managers and overly simple ones. It doesn’t provide as many options as task management heavyweights, but it’s also much more robust than basic task or list apps.

\n

One of Things’ greatest achievements on iPad is helping popularize the recent trend of offering full app navigation via a connected keyboard. In May 2018 Things 3.6 launched with a remarkable new approach for an iPad app: enabling nearly everything that can be done via touch to also be possible from the keyboard. Until that point, while many productivity apps offered keyboard shortcut support, it was practically never on the scale of what can be done in productivity tools on the Mac, which have long offered deep keyboard integration. Things showed that even on a touch-first device, iPad users found great value in having the option of going keyboard-first instead.

\n

\n

Luna Display

\n

\n
\"Source:

Source: Luna Display

\n

Ryan: Imagine if your iPad Home screen had an app on it that opened macOS in its entirety. That’s what Luna Display made possible, and in a way that was significantly more reliable and better optimized for touch input than any remote desktop tools that had come before it.

\n

Luna Display’s app pairs with the Luna Display hardware that plugs into your Mac. This enables a level of connection stability and responsiveness that you simply can’t get with app-only solutions such as TeamViewer. Once everything’s set up, your iPad can become a secondary display for your Mac, either hosting an extra screen’s worth of content or else mirroring your primary display, in which case you can gain easy access to macOS from wherever you are via an iPad. You can also, with a little extra work, use Luna Display to make an iPad your only display for a headless Mac mini. I did this for several months last year, since my Mac mini was used only for podcast recording. Whichever setup you choose, one thing is clear: it’s pretty remarkable getting to turn macOS into simply another app on your iPad.

\n

\n

Ferrite Recording Studio

\n

\n
\"\"

\n

Ryan: If you do any podcast-related work on an iPad, Ferrite Recording Studio is a must-have app. Designed for both podcast recording and editing, Ferrite puts all kinds of advanced tools at your disposal to provide a first-class podcasting experience on iPad. It’s truly a unique product on the App Store: Ferrite isn’t a watered-down iPad port of a Mac app; rather, it started on the iPad and offers the kind of pro features you would expect from an equivalent Mac experience.

\n

One of Ferrite’s strengths is how versatile it is. Jason Snell edits podcasts in Ferrite using the Apple Pencil and touch input, taking advantage of the gesture customization options the app lets you configure. You can watch a video of him in action, and it’s truly impressive. Ferrite is flexible enough, however, that if you prefer a different mode of editing, that’s just as thoroughly supported. For example, I edit in Ferrite primarily from a connected keyboard. Last year I outlined my process, which has changed a little since then but still relies heavily on keyboard shortcuts. Ferrite’s one of the few iPad apps that lets you fully customize all shortcuts to your exact preferences.

\n

Audio recording and editing is a very specific professional niche, but it’s in such creative niches that the iPad has historically been lacking in quality tools. Apple created versions of GarageBand and iMovie to accompany the original iPad, and they were great in their time, but few developers took up the mantle from there; Apple, similarly, has kept its pro app efforts Mac-first. Five years after the iPad Pro debuted, however, Ferrite is a strong example of things beginning to change.

\n

\n

OmniFocus

\n

\n
\"\"

\n

Ryan: For all 10 years of the iPad’s life, OmniFocus has been a power user solution for getting things done on the platform. Though the app has undergone major changes over the years, it’s always retained a reputation for letting you manage tasks with as much power and precision as you need. Where some apps limit options so they can appeal to the widest set of users, OmniFocus knows its users want all the control they can get.

\n

The latest iteration of OmniFocus, version 3, does a remarkable job offering that rich feature set while nonetheless being more accessible than ever before. The customizable task inspector, replacement of contexts with tags, and even the more playful color scheme were all changes that went a long way toward helping those new to the app get comfortable with it. It’s good to offer a lot of options to users, but when those options clutter up the interface and make it harder to find the tools you actually care about, that’s a problem. In OmniFocus 3, the app took customization to a new level and, in the process, empowered users to essentially build their own personalized task manager using only the tools they cared about.

\n

I can’t talk about OmniFocus 3 without also mentioning the iPad specifically. In landscape, the app’s three panels – perspective view, project view, and inspector – can all stay visible at once for quicker access to the information you need. You can use drag and drop to easily create new tasks in the exact place you want, including in the Forecast to automatically assign a due date; dropping on the Forecast works with existing tasks too for assigning due dates. OmniFocus also supports multiwindowing, so you can keep separate lists and views open in different windows, and it offers plenty of keyboard shortcuts too. The Omni Group has shown by their work that they count the iPad as a first-class citizen in their development.

\n

\n

Comixology

\n

\n
\"\"

\n

John: When the iPad debuted, big-screen iPhones were still more than four years away. It’s hard to recall now, but without bigger iPhones and smaller iPads in the lineup, the jump from 3.5 inches to a 9.7-inch display felt absolutely enormous.

\n

With that increase in screen size came apps uniquely suited to take advantage of its expansive real estate. One of the earliest and most natural extensions of the new screen size was comic book readers, and none has proved as popular over the long haul as Comixology.

\n

The first iPad’s dimensions didn’t map perfectly to the size of a comic book page, but they were close enough and certainly much better than an iPhone’s. Its size was only part of the equation that made the iPad a perfect match for comics, though. The bright, colorful display brings the artwork of comic books to life. The iPad has also meant comic fans can carry enormous libraries of their favorite titles with them wherever they go and read them in new ways, like with Comixology’s Guided View feature.

\n

Not everyone is a fan of Guided View, which takes readers through cells on a page one-by-one, animating the transitions between each. Still, the feature improves the reading experience on smaller iPads, and I like it personally. For those who don’t, though, I’m glad there’s a full-screen option too.

\n

These days, the 11-inch iPad Pro is probably the closest in size to a physical comic book and one that I know comic fans often recommend. I’ve never owned the smaller iPad Pro and only dip into comics occasionally, but when I do, they look gorgeous on my 12.9-inch iPad Pro. Just as often, though, I read on my iPad mini using Guided View, because it’s just so comfortable to hold the mini in one hand, tapping from cell to cell.

\n

Regardless of the iPad you use, if you are a lapsed comic book fan or want to dip your toe in for the first time, give Comixology a try. The convenience and experience of reading comic books digitally on an iPad are truly delightful.

\n

\n

GoodNotes

\n

\n
\"\"

\n

John: I’ve written about GoodNotes often on MacStories and with good reason: it’s been one of my favorite iPad apps for a long time. GoodNotes is not alone in the note-taking category by any stretch of the imagination, but there’s a reason why it’s been featured in Apple ads and onstage at keynotes over and over again. What sets the app apart from the competition is its focus on providing users with the absolute best, most natural-feeling handwriting experience.

\n

Note-taking apps were around from the earliest days of the iPad, but it was the Apple Pencil that put them in the spotlight. Before then, third-party styluses were available, but they weren’t nearly as good as the Apple Pencil was when it was introduced. In the years since then, the Apple Pencil has trickled down to even the iPad mini. That’s opened the note-taking experience up to more and more people, which has been good to see since you don’t need the most expensive, largest screen to get a lot out of a note-taking app like GoodNotes.

\n

What’s unique about an app like GoodNotes is that it’s very clearly an iPad app first and an iPhone and Mac app second. That’s because, if you’re taking handwritten notes, an accessory like the Apple Pencil and a big touch screen like the iPad’s are essential. The iOS and Mac versions of GoodNotes are both excellent apps, too, but given a choice, I’ll grab any iPad to use it before I’ll pick up my iPhone or open the app on my Mac. Those apps serve as readers for the notes I take on my iPad and are used sparingly for note-taking themselves.

\n

Personally, I’ve found GoodNotes transformative in the way I take notes. For a long while, I’d carry a small notebook and pen with me. The reality, however, is that I’m rarely without an iPad nearby when I want to jot something down, and the benefit of having my notes synced almost instantly across all my devices far outweighs the occasional times I don’t have an iPad available. With the expansion of the Apple Pencil across the iPad lineup and the introduction of GoodNotes’ fantastic Mac Catalyst app last fall, I expect we’ll continue to see GoodNotes pushing at the forefront of Apple’s technologies, with the iPad app sitting squarely in the center.

\n

\n

Linea Sketch

\n

\n
\"\"

\n

John: One of the things that some observers get wrong when they look at iPad apps is assuming that simple-to-use apps are also unsophisticated. Linea Sketch by The Iconfactory is a perfect example of what I mean. It’s the kind of app that anyone can pick up and use. You don’t need a deep understanding of drawing app conventions built up over the years or an online class to get started. Instead, the app’s incredibly natural-feeling UI guides users, abstracting away complexity and making the way the app works obvious from the get-go.

\n

Linea Sketch isn’t the most feature-rich drawing app on the App Store, but self-imposed constraints, like one screen equaling one page of content, are a big part of what makes getting started with the app so easy. For many people interested in sketching, drawing quick diagrams, brainstorming, or mocking up UIs, Linea is all they’ll ever need. Over time, The Iconfactory has continued to add new features, like fill, blend, shape snapping, and versioning, which have extended the app’s power and flexibility without making it hard to use.

\n

One of the things I love about many of the art-creation apps on the iPad is their approachability. Part of that comes from the direct manipulation of the art you are creating. Everyone has picked up a pencil or pen and begun doodling at some point, and apps like Linea Sketch tap into that natural sort of activity with tools that aren’t intimidating to novices but are flexible enough to work for pros too.

\n

In December, The Iconfactory announced that Linea Sketch 3.0, which hasn’t been released yet, will be switching to a subscription business model. That’s a hard switch to make for an app that has been paid up front for the past three years, but The Iconfactory is doing it right by communicating early about their plans and why they are making the change. Not every user is likely to subscribe, but an app of Linea’s caliber requires regular maintenance and support, so I’m hopeful that it works out and Linea Sketch is developed long into the future.

\n

\n

Concepts

\n


\n
\"\"

\n

John: I want to note up front that Concepts is the sponsor of our entire iPad at 10 series this week. I normally wouldn’t include a sponsor’s app in a story like this, except not only was Concepts on my list of influential apps before we ever approached them about sponsoring, but it’s also an app that I’ve used since late 2018, long before they were ever a sponsor.

\n

I also wanted to cover Concepts immediately following Linea Sketch because the two apps show off the range of the iPad so well. Both are drawing apps, but unlike Linea, which feels like sitting down in front of a single sheet of paper with a set of art tools, Concepts is something different altogether.

\n

Concepts is the sort of tool that doesn’t have a good physical-world analog. It’s an infinite canvas app that accommodates an incredibly broad spectrum of creative uses. I’ve used it to mind map big projects, doodle while editing a podcast, and outline upcoming stories, but it can be used to storyboard video, design buildings, create art for art’s sake, and a whole lot more.

\n
\"\"

\n

At its core, Concepts is about drawing, but as the name suggests, that’s a means to an end, which is exploring ideas visually. Those ideas may never be more than rough sketches and handwritten notes, as they typically are for me, or they may lead to polished artwork like the concept sketches of Yarrow Cheney, the director of the animated version of The Grinch, whom Federico and I interviewed on AppStories.

\n

Cheney explained during our interview that he uses Concepts because it reduces the distance between the idea in his head and recording it for use in a project. That comment has stuck with me because it perfectly captures what the best iPad apps get right. Through direct manipulation and intuitive controls, they reduce friction and get out of the way of ideas, allowing the user to execute on them almost effortlessly.

\n

\n

Procreate

\n

\n
\"\"

\n

John: Procreate has a long history on the iPad. It, too, is a drawing app that rethinks what that means in the context of the iPad. The app has benefitted greatly from Apple’s emphasis on the Pencil and its extension to an ever-increasing number of iPad models.

\n

For a long time, big companies ignored the iPad or built simplified versions of their apps that omitted substantial portions of their functionality. That began to change with the introduction of the iPad Pro, but in the meantime it gave companies like Savage Interactive, the maker of Procreate, room to reimagine what it means to be a drawing app unburdened by the legacy of a desktop app.

\n

My favorite aspect of Procreate is how elegantly it hides a deep catalog of brushes and other tools behind a minimalistic UI. The app can be used as a free-hand drawing or painting tool with its myriad of brushes, or in conjunction with photos that you import onto its canvas. The latest update to the app also features a straightforward way to transform your drawings into animations that takes no time at all to learn.

\n

In addition to its large library of built-in brushes, Procreate allows users to design and share their own. The app’s Brush Studio could easily be an app of its own, but it is neatly tucked away where it doesn’t get in the way when you don’t need it, which is a common theme among the best iPad apps.

\n

I don’t consider myself much of an artist, but I enjoy doodling in Procreate. It’s a creative outlet that I find relaxing and an app that I don’t find frustrating. Simply starting to sketch and poking around the UI as needs arise gets you very far in Procreate. Plus, with an excellent series of ‘Learn to Procreate’ videos from Savage, and an online Handbook with all the nitty-gritty details, Procreate is a shining example of what pro apps can be on the iPad.

\n

\n

Pixelmator Photo

\n

\n
\"\"

\n

John: I’ve used Pixelmator’s apps since the original Pixelmator was released long ago. Pixelmator Photo is one of the newer apps in this collection, but no less important. Photo editing apps, like drawing apps, have a long history on the desktop that comes with a lot of baggage. The iPad provided the Pixelmator team an opportunity to rethink the approach to photo editors and bring the latest technologies like machine learning along to assist users.

\n

The app can access photos from either your iCloud Photo Library or the system document browser, which provides access to any photo in iCloud Drive and the other file providers available in the Files app. The app features an extensive selection of parameters that can be tweaked for each photo. What I usually do, though, is start with the magic wand tool, which is powered by machine learning. I typically wind up tweaking an image further by hand, but the magic wand gets me most of the way to what I want. There are similar machine learning-based adjustments that can be applied to individual categories of tools too.

\n

In addition to the many manual and automatic edits that can be made to pictures, Pixelmator Photo includes an excellent set of high-quality filters that can be applied to obtain a specific look. If you create a group of adjustments to an image that you like a lot, you can even save them as a custom, sharable filter.

\n

I use Pixelmator Photo to edit photos all the time. It’s not the only photo editor I use, but what I appreciate about it is that the app produces much better results than the Photos app’s built-in tools with only a little more effort. That makes it perfect for cleaning up a group of photos quickly when you want to share them right away. The machine learning isn’t always perfect, but by giving me a head start on edits, Pixelmator Photo saves time, allowing me to dial in the last 10% of whatever look I want.

\n

\n

Affinity Photo and Designer

\n

\n
\"\"

\n

John: Affinity’s apps are another excellent example of apps built without the burden of a long desktop history. Affinity makes desktop versions of its apps, but the fact that its iPad apps are developed in tandem with the desktop versions is part of what makes them special, ensuring that they work in concert and allowing users of one version to move easily to the other.

\n

Affinity’s apps share a unique UI design that makes moving between them feel familiar. Each app relies on narrow toolbars along the top and sides of the screen, with additional tool settings appearing along the bottom of the screen as needed.

\n

For many users, Affinity Photo can serve as a replacement for desktop apps like Adobe Photoshop. The app supports Adobe PSD files, unlimited layers, customizable brushes, and a wide array of tools for manipulating your images. Affinity Designer provides a similarly sophisticated set of tools for working with vector graphics. With tight integration between the apps and their desktop counterparts, the apps’ maker, Serif, has created an end-to-end workflow that serves a wide variety of needs.

\n

One of my favorite touches in both apps is the ability to customize keyboard shortcuts. This isn’t something you see very often on the iPad, and I hope it becomes more prevalent as pro apps multiply.

\n

Affinity’s suite of apps on the iPad, and their tight integration with their Mac versions, is where I hope a lot of pro Mac and iPad apps wind up. Today, that sort of integration is rare, but with Mac Catalyst and SwiftUI, this is the direction Apple seems to be heading and one I expect will benefit the users of pro creative apps more and more over time.

\n

\n

Adobe Photoshop

\n

\n
\"\"

\n

John: Ever since I started writing about apps, I’ve heard about this or that app that would be the next Photoshop. No doubt, Adobe heard that too and decided Photoshop should be the next Photoshop instead.

\n

I give Adobe a lot of credit. Photoshop is thirty years old on the Mac, and instead of trying to graft its complex Mac UI onto the iPad, Adobe started from scratch, rebuilding its core engine to work on both platforms and then implementing Photoshop’s tools in a manner that makes sense given the iPad’s hardware and input methods.

\n

Today, there’s a lot of desktop Photoshop that still isn’t available in Photoshop on the iPad, but Adobe’s update history and past announcements suggest that the company is fully committed to building out functionality rapidly. Replicating Photoshop in an iPad-centric way is a tall order, but the features Adobe has implemented so far, and the system-level features it supports – Split View, Slide Over, dark mode, keyboard shortcuts, and the system document browser – are excellent.

\n

Photoshop also represents a unique opportunity for users to learn Photoshop. The Mac version of the app, with its 30 years of features, is intimidating. The iPad version, however, is very approachable. I use Photoshop on the iPad for lightweight compositing work on MacStories projects all the time. I would never have thought to make those same edits in Photoshop on the Mac because it just seemed unapproachable and like too much tool for the task. With some basic Photoshop experience under my belt on the iPad, though, I’ve found myself experimenting with the desktop version too. It’s a virtuous cycle where the iPad and Mac versions of pro apps support each other and extend each other’s user base by working as a tight-knit unit, which I expect will be central to the next chapter of the iPad’s story.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "John: It’s hard to understate the importance of the iPad’s large screen. Early critics dismissed the device as a big iPhone, but that criticism revealed a fundamental misunderstanding of the product.\nBy jumping from the iPhone’s small 3.5-inch display to one that approached 10 inches, the iPad delivered a canvas that allowed Apple and third-party developers to rethink not just the concept of mobile apps, but of apps altogether. The additional screen real estate allowed developers to flatten and spread UIs in a way that made new uses possible. That, in turn, led to richer, deeper experiences for everything from reading a comic book to managing complex projects and automating repetitive tasks, allowing users to interact directly with the software beneath their fingers.\nAfter years of using the very best apps developers have to offer on the iPad, it was remarkably easy for Federico, Ryan, and I to come up with a list of the iPad apps that have been the most impactful for us during the past decade. There’s a lot of factors at play in arriving at these apps. Some forged a path by adopting the latest Apple technologies in a unique way that set an example for apps that followed. Others are apps that define a category that takes unique advantage of the iPad’s hardware. These are also apps that work on the iPhone or Mac too, but are most at home on the iPad’s unique platform.\nSupported By\nConcepts\n\nConcepts: Where ideas take shape\nAlthough there is no single formula for which iPad apps have been the most impactful, one thing each app in this collection shares is a rich, personal experience. These are apps inspired by and reflected in the image of Steve Jobs sitting onstage in a comfortable black leather chair swiping through photos. The iPad and the apps that run on it have come a long way since then, but the intimacy of directly manipulating apps that transform a slab of glass into anything a developer can imagine hasn’t changed, and remains what makes the iPad so special.\n\n\nPythonista\n\n\nFederico: Before Workflow revolutionized automation on iOS and developer tools were an official category on the App Store, there was Pythonista. Created by indie developer Ole Moritz, Pythonista is a Python IDE that lets you write and run traditional Python scripts on your iPad, with a twist: in addition to the Python standard library, Pythonista offers custom modules that bridge native iOS APIs (such as the clipboard, Safari, UIKit, EventKit, etc.) with Python, allowing you to create scripts that control system functionalities in an automated fashion.\nThis may not sound so groundbreaking today since apps like Shortcuts and Scriptable are based on the same principle. It’s important to note, however, that Pythonista did all of this in 2012, when the App Store was young and the idea of writing code on an iOS device was consistently frowned upon. Ole Moritz’s idea was bold and revolutionary in the iOS app community at the time; I’d go as far as to say Pythonista pioneered the concept of iOS automation, which was later perfected by Workflow and remixed by a slew of other apps.\nPythonista was the app that convinced me that trying to get all my work done from an iPad wasn’t necessarily going to be a failed experiment. 
Sure, I had to learn the basics of Python scripting, but in return for a small investment of my time, I was able to create all sorts of scripts that could provide functionalities that were missing from iOS – a crucial factor as I was just beginning my transition from the Mac to iPad in late 2012. I used Pythonista to create a variety of enhancements to my iPad experience, from converting Markdown to HTML to uploading images to our CDN and manipulating plain text in ways that no iPad text editor allowed at the time.\nPythonista is still available today, and it recently received a major update with support for the latest features of iOS and iPadOS 13, plus a custom keyboard that lets you run Python scripts from any text field on your device. These days, all my automation happens in Shortcuts, but I have fond memories of Pythonista, and I believe it created a market that didn’t exist at the time, paving the way for Workflow and other iOS automation apps to follow.\n\nEditorial\n\n\nFederico: Following the launch of Pythonista, Ole Moritz had an idea: what if the underlying Python engine of Pythonista was used for a Markdown text editor? And thus Editorial was born.\nEditorial is very close to my heart. I started beta testing the app alongside a small group of other people in late 2012, when I was undergoing my last rounds of radiotherapy treatments; in my mind, Editorial will forever be tied to those happy memories. Editorial was also in beta for a long time: I started testing the app in November 2012, and it eventually launched in August 2013. During those 10 months, I had the privilege of submitting feedback and, in a way, nudging Moritz toward adopting or tweaking certain features. By the time Editorial launched to the public, my iPad workflow had been rebuilt around it, and I knew so much about the app, I wrote my first really long review about it, which I eventually turned into an eBook.\nBesides my personal memories of it, however, Editorial stands out in the iPad app ecosystem on its merits alone. In hindsight, Ole Moritz did two things absolutely right with Editorial: first, he built the app atop the same Python foundation of Pythonista, but he complemented that with a visual workflow editor that can be considered a precursor to Workflow and its variable-based system; second, Moritz realized the potential of a Sublime Text-inspired editor for iOS, and for this reason allowed users to script and control fundamental aspects of Editorial such as its text editor and text selection engine.\nEditorial’s text editor.\nThe combination of these aspects resulted in the most powerful Markdown text editor the iPad has ever seen, which, sadly, is no longer receiving new features. For a couple years, however, I was able to do some wild things in Editorial: I created hundreds of workflows to insert and edit Markdown, control the built-in web browser, fetch data from remote services, and even publish my articles to WordPress without leaving the app.\nBefore Workflow would show how to build an entire app around a visual automation editor, and before the iPad Pro even existed, Editorial showed that it was possible to create a professional app for iPad while playing by Apple’s rules. 
These days, iA Writer is my go-to text editor, but Editorial will always have a special place in my heart because it was the app that carried me through the transition from OS X to iOS.\n\nFantastical\n\n\nFederico: When we first started planning our iPad at 10 special series with the rest of the MacStories team months ago, I thought long and hard about whether I wanted to propose Fantastical, Flexibits’ popular calendar app, as one of the outstanding iPad apps to feature in this collection. After all, the original iPad version, which came out in 2014, left much to be desired in terms of achieving feature parity with its desktop counterpart, and it never quite took advantage of the screen real estate of the 12.9” iPad Pro when it launched the following year. Ultimately, I decided it was worth highlighting Fantastical – the new version of the app – in this roundup because it symbolizes a new generation of iPad apps and encapsulates what modern iPad app development should be like in 2020.\nThe new Fantastical for iPad takes what worked in the Mac version of the app and translates it to a modern iPad experience that is deeply integrated with iPadOS. Gone is the simplified calendar view of the original iPad app, replaced by a suite of views, which include a full monthly calendar reminiscent of the Mac app’s layout; the sidebar, customizable in size, takes advantage of the iPad’s large display to embed a mini calendar, and each item in the list can be tapped to open detail views as popovers. The difference between the old Fantastical for iPad and the new app is striking; the older version of the app was the product of a different era in iPad app design, when the notion of a “pro iPad app”, before the iPad Pro, was often derided and quickly dismissed.\nThe new Fantastical for iPad represents a fresh trend in iPad app development – an effort from developers to bring powerful functionality from macOS to the nascent iPadOS platform while adapting it to the iPad’s different interactions and frameworks. Fantastical for iPad supports Split View, context menus, customizable icons, and advanced shortcuts based on parameters; the app can be used via touch, obviously, but it also integrates with keyboard shortcuts; in an upcoming update, which is coming out very soon, the app will also feature deep pointer support to let you navigate its UI and interact with the calendar using a mouse or trackpad.\nFor all these reasons, Fantastical deserves to be included in this collection: besides being an excellent calendar client, it is a shining example of what to expect from the future of iPad app development – apps that can blend desktop-class functionality with multiple input systems, extensive support for native APIs, and the ability to scale across different screen sizes.\n\nYoink\n\n\nFederico: Back in May 2017, before Apple’s announcement of iOS 11 at WWDC, I shared a wish list and concept video where, among other things, I imagined a system-wide “shelf” for storing bits of data via drag and drop.\nHere’s how I described the idea:\n\n The idea behind the Shelf is to make it as effortless as possible to hold something for later without the cognitive load of deciding which app or extension should receive it right away. The Shelf would be heavily tied to the new drag & drop framework and it’d be inspired by previous implementations (such as the NeXTSTEP shelf) and older desktop apps (examples: DragThing and Yoink). 
Think of it as a transient dock for temporary clippings, or, even better, as a multi-slot clipboard that can hold a variety of items and be consistently available across apps.\n The Shelf would be local to each iPad, and it could hold an infinite number of items; the Shelf would be paginated and users could scroll between multiple pages to find the item they’re looking for. Anything could be dropped in the Shelf: from text selections and images to phone numbers and even songs; as long as an item supports drag & drop on iOS 11, the Shelf could hold it for later.\n\nWhile my vision for a deeply integrated shelf might have been overly ambitious, that didn’t stop indie developer Matthias Gansrigler from launching an iOS version of Yoink – an app that has allowed me to perform nearly all the tasks I envisioned in that story from 2017.\nYoink stands out in the iPad app ecosystem for two reasons: it is perhaps the only so-called shelf app that is still updated on a regular basis two years after its original release; and it is the most powerful expression of iPad drag and drop, quite possibly the most advanced and underrated developer framework on iPadOS.\nIn Yoink, you can drop anything that can be dragged from any iPad app, whether it’s a link, an image, an email message from Mail, or a block of plain or rich text. If you can drag it, you can drop it in Yoink; later, when you need that piece of data again, you can drag it out of Yoink again and drop it in another app, where it’ll be inserted in the original format. But there’s more: by taking advantage of the more advanced aspects of the drag and drop APIs for developers, Yoink stores multiple “flavors” of each item, and each version can be individually dragged out of the app. For example, a block of text can be exported as plain or formatted text; an email message can be dropped elsewhere either as a Mail link, an attachment, or the plain text of the original message’s subject line.\nIt is this richness of formats and integration with drag and drop that has turned Yoink into a must-have companion utility for all those power users who use their iPads primarily in touch mode. In the years I’ve covered iOS for iPad, now called iPadOS, I’ve never seen any other app offer such deep integration with the drag and drop framework. Aside from this technical achievement, however, Yoink provides essential functionality that has been sadly ignored by Apple: the peace of mind of knowing you can drop something – anything – in a safe place, and take it back with you at a later stage. Yoink lets me enjoy drag and drop on my iPad more, and it’s unique in its category.\n\nWorkflow\n\n\nFederico: Few apps have had such a profound impact on the entire iOS app ecosystem as Workflow. Originally showcased in 2014 by a young, independent development team and released later that year after a long review period, Workflow seemingly achieved the impossible: in an era where most “productivity hacks” on iOS revolved around the aforementioned Pythonista and URL schemes, Workflow managed to blend the power and visual scripting of Apple’s Automator with native integration with modern iOS technologies. 
Rather than forcing users to learn a scripting language or deal with the complexities of encoding URL schemes, Workflow turned automation on its head with a drag and drop-based programming environment that empowered everyone to connect multiple apps and actions together, creating automated workflows that could save them time or make them more productive each day.\nWorkflow started making waves in the iPhone and iPad community, and it soon became a must-have for all those users who were seeking to live the post-PC life and only work from an iPad. Apple noticed, and in early 2017 they acquired Workflow, later relaunching it as Shortcuts – an app that is now pre-installed on millions of devices and sports deep Siri integration and an even more powerful app automation framework.\nToday’s Shortcuts app is both similar to and drastically different from the original Workflow app. Shortcuts no longer relies on URL schemes to let apps communicate with each other and features an all-new editor based on parameters and natural language. At the same time, however, the original vision behind Workflow is still very much alive within Shortcuts, and that is why Workflow deserves to be in this list.\nSix years ago, Ari Weinstein, Conrad Kramer, and the rest of the Workflow team came up with ideas and features that are still at the foundation of Shortcuts today, from variables and conditional blocks to the Content Graph engine and the ability to share workflows with other users. The visual automation model and interaction paradigm behind Shortcuts was conceived for the original Workflow app in 2014; today, the mix of automation and native system frameworks is still unmatched on other platforms. At least for me, Workflow was the most important iPad app of the decade – a utility that singlehandedly changed how I was able to get work done on my iPad.\n\nUlysses\n\n\nRyan: My favorite writing app, Ulysses, got very serious about the iPad following the iPad Pro’s 2015 debut. The Markdown editor has a long history on the Mac, and existed on the iPad too before 2015, but by the time the iPad Pro launched the app started to fall behind on advancements like Split View and Slide Over multitasking and working with a 12.9-inch display. When Ulysses’ next big update arrived, it represented not only the adoption of those modern features, and the launch of an iPhone version, but also a fresh foundation that would see the iPad version become just as powerful as its Mac companion.\nUlysses offers a unique twist on Markdown editing, offering full Markdown support but opting to hide certain syntax – most notably URLs – behind visual content blocks. This approach isn’t for everyone, but I absolutely love it. I have a hard time using traditional Markdown editors now because I’ve grown so spoiled by the way Ulysses hides links, displays image previews automatically, and by some of its other design choices. The editing interface is clean, minimal, and enables customization of key details like font, font size, and text spacing. When you write for a living, the last thing you want to do is stare at a displeasing editor design, so this is very important.\nAnother strength of Ulysses is its top-notch export features, several of which I use all the time. Exporting to PDF provides an array of beautiful style options, more of which can be downloaded online or even customized yourself on the Mac. I also export to plain text Markdown regularly so I can save my drafts in Working Copy when collaborating with Federico and John. 
The most crucial export option for me, however, is WordPress publishing. This feature works flawlessly, offering access to all the tools you’d want such as tags and categories, and it’s something you just won’t find in practically any other Markdown editor.\n\nMindNode\n\n\nRyan: Mind mapping never felt like a natural fit for me on computers until the iPad. I’m not a big mind mapper in the first place, but I have historically used pen and paper to create free-form mind maps when I need to get my thoughts down in an unfiltered, unstructured way. This practice never translated well to the Mac, but when MindNode released version 5 a few years ago on iOS, redesigning its UI and adopting system drag and drop, something finally changed.\nFor me, the appeal of mind mapping is its flexibility, and MindNode beautifully retains that. If you think in outlines, Quick Entry mode lets you type an outline and have it instantly converted to a new mind map. If your thought process needs a little less structure, you can add assorted ideas to your map as separate nodes, then form the connections and structure later. Drag and drop is a key aid in this, as you can easily dump ideas or content from other apps into MindNode with a simple gesture, and also connect nodes to each other via touch interactions.\nMindNode offers a rich feature set that not only make it a great mind mapping tool, but a fantastic iPadOS citizen as well. Visual Tags are an elegant way to group related information, the delightful sticker set can beautify your map, Focus mode highlights what’s most important in a given moment, and export and theme options are extensive. In the realm of OS support, MindNode uses Files’ document browser, supports Split View, Slide Over, and multiwindowing, and offers some of the best external display integration on iPad.\n\nAgenda\n\n\nRyan: The date-based notes app Agenda got its start on the Mac, which perhaps explains why its iPad app launched in such great shape. Many new apps begin on the iPhone, then when an iPad version debuts it’s a slightly adapted version that draws inspiration primarily from iOS’ simplicity, often not taking good advantage of the iPad’s large display or offering power user features like external keyboard control. Agenda, by contrast, nailed those things from the start.\nAgenda’s iPad layout is one of the best examples I can point to of an app scaling well from the 7.9-inch iPad mini’s display all the way up to the 12.9-inch iPad Pro’s. The app is organized into three different panes: all your projects are on the left, your notes are in the center, and the right contains a calendar agenda view, plus recently edited and related notes. On my iPad Pro, keeping all three panes on-screen at once is a fantastic way to fill out the display; it makes all the information I need visible at a glance. But the nice thing is that on smaller devices, or when using Split View, you can choose to keep just one or two panes visible at a time. More productivity apps could stand to learn from Agenda’s scalability, as too many simply ignore large-screened devices and build for the lowest common denominator.\nAs I mentioned, Agenda’s external keyboard control is another strength. You can control almost everything in the app without taking your hands off the keyboard, giving Agenda easily one of the best keyboard implementations available on iPad. 
This is only one of the many pro features Agenda offers on both iPad and Mac – saved searches are another great one.\n\nNotability\n\n\nRyan: Notability is a digital notebook app that has long been among the best of the App Store. Even prior to the days of the Apple Pencil, Notability excelled at merging the strengths of digital and analog notebooks in a single app. You can handwrite and insert typed text; do free-form highlighting of anything in your document; insert photos, GIFs, and document scans; audio recordings can be initiated and saved in your notes; and more. The app lets you create documents containing a healthy mix of all of these different elements, or edit an existing PDF with the same toolset. And while it’s become more common these days for note apps to have a diverse set of tools, Notability is one of the first apps to popularize this trend on iPad, and it’s continued being a strength of the app through years of the device’s transformation.\nAside from its versatility, Notability’s primary strength is in how easy to use it is. The app has a clean, approachable design that’s a triumph in simplicity, particularly considering how much power lies under the hood. The main interface consists of standard column views of your notes and their folders (dubbed ‘Subjects’ in the app), and when you’ve opened a note, the various tool options are just as easy to navigate. Unlike many other digital notebooks, from my first use Notability has never confused me, I’ve never wondered how to access a given tool, or what a button would do when I pressed it. Because of its elegant, intuitive design, Notability’s a fantastic choice for people new to the iPad, yet somehow it’s just as excellent a choice for pro users.\n\nPixelmator\n\n\nRyan: I have a long history with Pixelmator, as the Mac version of the app was the first image editing software that really clicked with me. I dabbled in Photoshop a little in my teen years, but never cared for it largely because I didn’t want to invest the time into learning its full capabilities. Then Pixelmator came along and it was just what I needed: an intuitive and accessible, yet powerful layer-based editor. Over a decade later, Pixelmator’s still my go-to app, with the only change being that for most of that decade I’ve used it on iPad instead of Mac.\nPixelmator took a few years to make its way to the iPad, first arriving in late 2014, but it’s always been a perfect fit for the device. Direct manipulation when working with image layers is far better than using a pointing device. Though my need for it has decreased in recent years thanks to the advent of automation tools in Shortcuts, there are still certain image editing tasks I always turn to Pixelmator for. My most common needs involve cutting backgrounds out of images, resizing images to better fit an article, using the retouch tool to erase unwanted objects from a photo, and bringing multiple images next to each other in a custom layout. Much of this functionality simply isn’t available in other apps, or if it is, it comes with a lot more complexity. Out of all the iPad apps I depend on, Pixelmator might be one I’d have the hardest time replacing.\n\nKindle\n\n\nRyan: In the early days of the iPad, one of Apple’s ambitions for the device was that it would be a great e-reader, hence the launch of iBooks (now Apple Books). As time has shown, however, the iPad’s success as an e-reader may owe more to the Kindle app than Apple Books. 
Amazon’s dominance in the books market is a story for another day, but that aside, the rising popularity of Kindle devices when the iPad launched, combined with Amazon’s growing role in shopping overall, has resulted in the Kindle app having a significant impact on the iPad’s last decade.\nWhile some users aren’t comfortable reading on an iPad, preferring either the e-ink technology of Kindle devices or an analog paperback, many others are perfectly happy using a multipurpose device like the iPad to read books. I’m one of those people, having shifted the entirety of my reading to the iPad. I prefer Apple Books over Kindle, but Amazon’s app and ecosystem have clearly had a bigger impact in the space of digital reading. More books are available on Kindle, and they’re available on both Apple and non-Apple devices alike. Though digital books haven’t killed paper ones, as some analysts originally predicted, they still represent a healthy chunk of the publishing market, and that’s largely thanks to the Kindle app and iPad.\n\nThings\n\n\nRyan: Popular task manager Things has been an iPad mainstay since the beginning. The iPad launched with Things support, and 10 years later, the task manager is doing quite well thanks to a major makeover with Things 3 and regular updates ever since.\nAmong a crowded task management market, Things stands out for its beautiful design, approachability, and healthy straddling of the spectrum between overly complex task managers and overly simple ones. It doesn’t provide as many options as task management heavyweights, but it’s also much more robust than basic task or list apps.\nOne of Things’ greatest achievements on iPad is helping popularize the recent trend of offering full app navigation via a connected keyboard. In May 2018 Things 3.6 launched with a remarkable new approach for an iPad app: enabling nearly everything that can be done via touch to also be possible from the keyboard. Until that point, while many productivity apps offered keyboard shortcut support, it was practically never on the scale of what can be done in productivity tools on the Mac, which have long offered deep keyboard integration. Things showed that even on a touch-first device, iPad users found great value in having the option of going keyboard-first instead.\n\nLuna Display\n\nSource: Luna Display\nRyan: Imagine if your iPad Home screen had an app on it that opened macOS in its entirety. That’s what Luna Display made possible, and in a way that was significantly more reliable and better optimized for touch input than any remote desktop tools that had come before it.\nLuna Display’s app pairs with the Luna Display hardware that plugs into your Mac. This enables a level of connection stability and responsiveness that you simply can’t get with app-only solutions such as Team Viewer. Once everything’s set up, your iPad can become a secondary display for your Mac, either hosting an extra screen’s worth of content or else mirroring your primary display, in which case you can gain easy access to macOS from wherever you are via an iPad. You can also, with a little extra work, use Luna Display to make an iPad your only display for a headless Mac mini. I did this for several months last year, since my Mac mini was used only for podcast recording. 
Whichever setup you choose, one thing is clear: it’s pretty remarkable getting to turn macOS into simply another app on your iPad.\n\nFerrite Recording Studio\n\n\nRyan: If you do any podcast-related work on an iPad, Ferrite Recording Studio is a must-have app. Designed for both podcast recording and editing, Ferrite puts all kinds of advanced tools at your disposal to provide a first-class podcasting experience on iPad. It’s truly a unique product on the App Store: Ferrite isn’t a watered-down iPad port of a Mac app; rather, it started on the iPad and offers the kind of pro features you would expect from an equivalent Mac experience.\nOne of Ferrite’s strengths is how versatile it is. Jason Snell edits podcasts in Ferrite using the Apple Pencil and touch input, taking advantage of the gesture customization options the app lets you configure. You can watch a video of him in action, and it’s truly impressive. Ferrite is flexible enough, however, that if you prefer a different mode of editing, that’s just as thoroughly supported. For example, I edit in Ferrite primarily from a connected keyboard. Last year I outlined my process, which has changed a little since then but still relies heavily on keyboard shortcuts. Ferrite’s one of the few iPad apps that lets you fully customize all shortcuts to your exact preferences.\nAudio recording and editing is a very specific professional niche, but it’s in such creative niches that the iPad has historically lacked quality tools. Apple created versions of GarageBand and iMovie to accompany the original iPad, and they were great in their time, but few developers took up the mantle from there; Apple, similarly, has kept its pro app efforts Mac-first. Five years after the iPad Pro debuted, however, Ferrite is a strong example of things beginning to change.\n\nOmniFocus\n\n\nRyan: For all 10 years of the iPad’s life, OmniFocus has been a power-user solution for getting things done on the platform. Though the app has undergone major changes over the years, it’s always retained a reputation for letting you manage tasks with as much power and precision as you need. Where some apps limit options so they can appeal to the widest set of users, OmniFocus knows its users want all the control they can get.\nThe latest iteration of OmniFocus, version 3, does a remarkable job offering that rich feature set while nonetheless being more accessible than ever before. The customizable task inspector, the replacement of contexts with tags, and even the more playful color scheme were all changes that went a long way toward helping those new to the app get comfortable with it. It’s good to offer a lot of options to users, but when those options clutter up the interface and make it harder to find the tools you actually care about, that’s a problem. In OmniFocus 3, the app took customization to a new level and, in the process, empowered users to essentially build their own personalized task manager using only the tools they cared about.\nI can’t talk about OmniFocus 3 without also mentioning the iPad specifically. In landscape, the app’s three panels – perspective view, project view, and inspector – can all stay visible at once for quicker access to the information you need. You can use drag and drop to easily create new tasks in the exact place you want, including in the Forecast view to automatically assign a due date; dropping on Forecast works with existing tasks too for assigning due dates.
OmniFocus also supports multiwindowing, so you can keep separate lists and views open in different windows, and it offers plenty of keyboard shortcuts too. The Omni Group has shown through its work that it counts the iPad as a first-class citizen in its development.\n\nComixology\n\n\nJohn: When the iPad debuted, big-screen iPhones were still more than four years away. It’s hard to recall now, but without bigger iPhones and smaller iPads in the lineup, the jump from 3.5 inches to a 9.7-inch display felt absolutely enormous.\nWith that increase in screen size came apps uniquely suited to take advantage of the expansive new real estate. One of the earliest and most natural extensions of the new screen size was comic book readers, and none has proved as popular over the long haul as Comixology.\nThe first iPad’s dimensions weren’t a perfect match for the size of a comic book page, but they were close enough and certainly much better than an iPhone’s. Its size was only part of the equation that made the iPad a perfect match for comics, though. The bright, colorful display brings the artwork of comic books to life. The iPad has also meant comic fans can carry enormous libraries of their favorite titles with them wherever they go and read them in new ways, like with Comixology’s Guided View feature.\nNot everyone is a fan of Guided View, which takes readers through cells on a page one-by-one, animating the transitions between each. Still, the feature improves the reading experience on smaller iPads, and I like it personally. For those who don’t, though, I’m glad there’s a full-screen option too.\nThese days, the 11-inch iPad Pro is probably the closest in size to a physical comic book and one that I know comic fans often recommend. I’ve never owned the smaller iPad Pro and only dip into comics occasionally, but when I do, they look gorgeous on my 12.9-inch iPad Pro. Just as often, though, I read on my iPad mini using Guided View, because it’s just so comfortable to hold the mini in one hand, tapping from cell to cell.\nRegardless of the iPad you use, if you are a lapsed comic book fan or want to dip your toe in for the first time, give Comixology a try. The convenience and experience of reading comic books digitally on an iPad are truly delightful.\n\nGoodNotes\n\n\nJohn: I’ve written about GoodNotes often on MacStories, and with good reason: it’s been one of my favorite iPad apps for a long time. GoodNotes is not alone in the note-taking category by any stretch of the imagination, but there’s a reason why it’s been featured in Apple ads and onstage at keynotes over and over again. What sets the app apart from the competition is its focus on providing users with the absolute best handwriting experience, one that feels incredibly natural.\nNote-taking apps were around from the earliest days of the iPad, but it was the Apple Pencil that put them in the spotlight. Before then, third-party styluses were available, but they weren’t nearly as good as the Apple Pencil was when it was introduced. In the years since then, the Apple Pencil has trickled down to even the iPad mini. That’s opened the note-taking experience up to more and more people, which has been good to see, since you don’t need the largest, most expensive screen to get a lot out of a note-taking app like GoodNotes.\nWhat’s unique about an app like GoodNotes is that it’s very clearly an iPad app first and an iPhone and Mac app second.
That’s because if you’re taking handwritten notes, an accessory like the Apple Pencil and a big touchscreen like the iPad’s are so important. The iOS and Mac versions of GoodNotes are both excellent apps too, but given a choice, I’ll grab any iPad to use it before I’ll pick up my iPhone or open the app on my Mac. Those apps serve as readers for the notes I take on my iPad and are used sparingly for note-taking themselves.\nPersonally, I’ve found GoodNotes transformative in the way I take notes. For a long while, I’d carry a small notebook and pen with me. However, the reality is that I’m rarely without an iPad nearby when I want to jot something down, and the benefit of having my notes synced almost instantly across all my devices far outweighs those times I don’t have an iPad available. With the expansion of the Apple Pencil across the iPad lineup and the introduction of GoodNotes’ fantastic Mac Catalyst app last fall, I expect we’ll continue to see GoodNotes pushing at the forefront of Apple’s technologies, with the iPad app sitting squarely in the center.\n\nLinea Sketch\n\n\nJohn: One of the things that some observers get wrong when they look at iPad apps is assuming that simple-to-use apps are also unsophisticated. Linea Sketch by The Iconfactory is a perfect example of what I mean. It’s the kind of app that anyone can pick up and use. You don’t need a deep understanding of drawing app conventions built up over the years or an online class to get started. Instead, the app’s incredibly natural-feeling UI guides users, abstracting away complexity and making the way the app works obvious from the get-go.\nLinea Sketch isn’t the most feature-rich drawing app on the App Store, but self-imposed constraints, like the fact that one screen equals one page of content, are a big part of what makes getting started with the app so easy. For many people interested in sketching, drawing quick diagrams, brainstorming, or mocking up UIs, Linea is all they’ll ever need. Over time, The Iconfactory has continued to add new features, like fill, blend, shape snapping, and versioning, which have extended the app’s power and flexibility without making it hard to use.\nOne of the things I love about many of the art-creation apps on the iPad is their approachability. Part of that comes from the direct manipulation of the art you are creating. Everyone has picked up a pencil or pen and begun doodling at some point, and apps like Linea Sketch tap into that natural sort of activity with tools that aren’t intimidating to novices but are flexible enough to work for pros too.\nIn December, The Iconfactory announced that Linea Sketch 3.0, which hasn’t been released yet, will be switching to a subscription business model. That’s a hard switch to make for an app that has been paid up front for the past three years, but The Iconfactory is doing it right by communicating early about their plans and why they are making the change. Not every user is likely to subscribe, but an app of Linea’s caliber requires regular maintenance and support, so I’m hopeful that it works out and Linea Sketch is developed long into the future.\n\nConcepts\n\n\nJohn: I want to note up front that Concepts is the sponsor of our entire iPad at 10 series this week.
I normally wouldn’t include a sponsor’s app in a story like this, except not only was Concepts on my list of influential apps before we ever approached them about sponsoring, but it’s also an app that I’ve used since late 2018, long before they were ever a sponsor.\nI also wanted to cover Concepts immediately following Linea Sketch because the two apps show off the range of the iPad so well. Both are drawing apps, but unlike Linea, which feels like sitting down in front of a single sheet of paper with a set of art tools, Concepts is something different altogether.\nConcepts is the sort of tool that doesn’t have a good physical-world analog. It’s an infinite canvas app that accommodates an incredibly broad spectrum of creative uses. I’ve used it to mind-map big projects, doodle while editing a podcast, and outline upcoming stories, but it can be used to storyboard video, design buildings, create art for art’s sake, and a whole lot more.\n\nAt its core, Concepts is about drawing, but as the name suggests, that’s a means to an end, which is exploring ideas visually. Those ideas may never be more than rough sketches and handwritten notes, as they typically are for me, or they may lead to polished artwork like the concept sketches of Yarrow Cheney, the director of the animated version of The Grinch, whom Federico and I interviewed on AppStories.\nCheney explained during our interview that he uses Concepts because it reduces the distance between the idea in his head and recording it for use in a project. That comment has stuck with me because it perfectly captures what the best iPad apps get right. Through direct manipulation and intuitive controls, they reduce friction and get out of the way of ideas, allowing the user to execute on them almost effortlessly.\n\nProcreate\n\n\nJohn: Procreate has a long history on the iPad. It, too, is a drawing app that rethinks what that means in the context of the iPad. The app has benefitted greatly from Apple’s emphasis on the Pencil and its extension to an ever-increasing number of iPad models.\nFor a long time, big companies ignored the iPad or built simplified versions of their apps that omitted substantial portions of their functionality. That began to change with the introduction of the iPad Pro, but the earlier neglect gave companies like Savage Interactive, the maker of Procreate, room to reimagine what it means to be a drawing app unburdened by the legacy of a desktop app.\nMy favorite aspect of Procreate is how elegantly it hides a deep catalog of brushes and other tools behind a minimalistic UI. The app can be used as a free-hand drawing or painting tool with its myriad of brushes, or in conjunction with photos that you import onto its canvas. The latest update to the app also features a straightforward way to transform your drawings into animations that takes no time at all to learn.\nDespite its large number of available brushes, Procreate also allows users to design and share their own. The app’s Brush Studio could easily be an app of its own, but it is neatly tucked away where it doesn’t get in the way when you don’t need it, which is a common theme among the best iPad apps.\nI don’t consider myself much of an artist, but I enjoy doodling in Procreate. It’s a creative outlet that I find relaxing and an app that I don’t find frustrating. Simply starting to sketch and poking around the UI as needs arise gets you very far in Procreate.
Plus, with an excellent series of ‘Learn to Procreate’ videos from Savage, and an online Handbook with all the nitty-gritty details, Procreate is a shining example of what pro apps can be on the iPad.\n\nPixelmator Photo\n\n\nJohn: I’ve used Pixelmator’s apps since the original Pixelmator was released long ago. Pixelmator Photo is one of the newer apps in this collection, but no less important. Photo editing apps, like drawing apps, have a long history on the desktop that comes with a lot of baggage. The iPad provided the Pixelmator team with an opportunity to rethink the approach to photo editors and bring along the latest technologies, like machine learning, to assist users.\nThe app can access photos from either your iCloud Photo Library or the system document browser, which provides access to any photo in iCloud Drive and the other file providers available in the Files app. The app features an extensive selection of parameters that can be tweaked for each photo. What I usually do, though, is start with the magic wand tool, which is powered by machine learning. I often wind up tweaking an image further by hand, but the magic wand tool usually gets me most of what I want. There are similar machine learning-based adjustments that can be applied to individual categories of tools too.\nIn addition to the many manual and automatic edits that can be made to pictures, Pixelmator Photo includes an excellent set of high-quality filters that can be applied to obtain a specific look. If you create a group of adjustments to an image that you like a lot, you can even save them as a custom, sharable filter.\nI use Pixelmator Photo to edit photos all the time. It’s not the only photo editor I use, but what I appreciate about it is that the app produces much better results than the Photos app’s built-in tools with only a little more effort. That makes it perfect for cleaning up a group of photos quickly when you want to share them right away. The machine learning isn’t always perfect, but by giving me a head start on edits, Pixelmator Photo saves time, allowing me to dial in the last 10% of whatever look I want.\n\nAffinity Photo and Designer\n\n\nJohn: Affinity’s apps are another excellent example of software built without the burden of a long desktop history. Affinity makes desktop versions of its apps, but the fact that its iPad apps are developed in tandem with the desktop versions is part of what makes them special, ensuring that they work in concert and allowing users of one version to move easily to the other.\nAffinity’s apps feature a unique UI design of their own that works well when moving between the company’s apps. Each app relies on narrow toolbars along the top and sides of the screen, with additional tool settings appearing along the bottom of the screen as needed.\nFor many users, Affinity Photo can serve as a replacement for desktop apps like Adobe Photoshop. The app supports Adobe PSD files, unlimited layers, customizable brushes, and a wide array of tools for manipulating your images. Affinity Designer provides a similarly sophisticated set of tools for working with vector graphics. With tight integration between the apps and their desktop counterparts, the apps’ maker, Serif, has created an end-to-end workflow that serves a wide variety of needs.\nOne of my favorite touches in both apps is the ability to customize keyboard shortcuts.
This isn’t something you see very often on the iPad, and I hope it becomes more prevalent as pro apps multiply.\nAffinity’s suite of apps on the iPad, and their tight integration with their Mac versions, is where I hope a lot of pro Mac and iPad apps wind up. Today, that sort of integration is rare, but with Mac Catalyst and SwiftUI, this is the direction Apple seems to be heading, and one I expect will benefit the users of pro creative apps more and more over time.\n\nAdobe Photoshop\n\n\nJohn: Ever since I started writing about apps, I’ve heard about this or that app that would be the next Photoshop. No doubt, Adobe heard that too and decided Photoshop should be the next Photoshop instead.\nI give Adobe a lot of credit. Photoshop is thirty years old on the Mac, and instead of trying to graft its complex Mac UI onto the iPad, Adobe started from scratch, rebuilding its core engine to work on both platforms, then implementing Photoshop’s tools in a manner that makes sense given the iPad’s hardware and input methods.\nToday, there’s a lot of desktop Photoshop that still isn’t available in Photoshop on the iPad, but Adobe’s update history and past announcements suggest that the company is fully committed to building out functionality rapidly. Replicating Photoshop in an iPad-centric way is a tall order, but the features Adobe has implemented so far, along with the supported system-level features like Split View, Slide Over, dark mode, keyboard shortcuts, and the system document browser, are excellent.\nPhotoshop on the iPad also represents a unique opportunity for users to learn Photoshop. The Mac version of the app, with its 30 years of features, is intimidating. The iPad version, however, is very approachable. I use Photoshop on the iPad for lightweight compositing work on MacStories projects all the time. I would never have thought to make those same edits in Photoshop on the Mac because it just seemed unapproachable and like too much tool for the task. With some basic Photoshop experience under my belt on the iPad, though, I’ve found myself experimenting with the desktop version too.
It’s a virtuous cycle where the iPad and Mac versions of pro apps support each other and extend each other’s user base by working as a tight-knit unit, which I expect will be central to the next chapter of the iPad’s story.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\nAccess Extra Content and Perks\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-04-01T12:30:08-04:00", "date_modified": "2021-11-23T09:39:29-05:00", "authors": [ { "name": "MacStories Team", "url": "https://www.macstories.net/author/macstoriestaff/", "avatar": "https://secure.gravatar.com/avatar/a5941d3461ac64d3d2687017df01306d?s=512&d=mm&r=g" } ], "tags": [ "iPad at 10", "stories" ] }, { "id": "https://www.macstories.net/?p=62802", "url": "https://www.macstories.net/stories/full-of-potential-developers-on-the-ipads-past-present-and-future/", "title": "Full of Potential: Developers on the iPad\u2019s Past, Present, and Future", "content_html": "
\"\"

\n

From the start, the iPad has always been rife with potential. This is partly because it launched as a new type of product category, with unexplored use cases prompting users towards a different computing experience. But it’s also because the device’s very nature – a slab of glass that becomes its software – evokes countless possibilities.

\n

To celebrate 10 years of iPad, I spoke to the developers of many of the device’s best apps across areas of productivity and creative work. They’re the people who make that slab of glass into something new, realizing the iPad’s potential but also showing, by their constant work of iteration and reinvention, that there’s always more that can be done.

\n

In sharing their stories from the last decade, the people I spoke with outlined some of the best and worst things about iPad development, memories of their reactions to the product’s introduction, and dreams for where its future might lead. All throughout, it’s clear how much excitement remains for the iPad’s potential even 10 years on.

\n

\n

Hopeful Beginnings

\n
\"The

The early days of the iPad’s App Store.

\n

Several of the developers I spoke with created iPad apps right when the device launched. Cultured Code (makers of Things) CEO Werner Jainek, for example, shared:

\n

\n “I remember the excitement we all felt when the iPad first came out. We were blown away. We put everything else on pause and worked straight for four weeks to get Things for iPad ready. It was a lot of fun!”\n

\n

Similarly from Alexander Griekspoor, Co-Founder at Momenta B.V., the team behind Agenda:

\n

\n “I still have fond memories of the initial iPad launch, and how keen we were to be in the App Store on launch day. We had the tools to build the app, but we didn’t have an iPad! Apple invited us to go to the labs and try it out on a device, but we are based in Europe, so it wasn’t really an option. Instead, we sent our colleague Charles, who happened to live in Silicon Valley, and he communicated problems back to us. The app launched in the App Store before we had ever touched an iPad, and Charles was made to stand in line to buy and FedEx one to us as soon as it was available.”\n

\n

For many, the iPad represented an opportunity to create a brand new property that wasn’t possible before. That’s the story of Procreate, according to Savage Interactive’s CEO & Co-Founder James Cuda:

\n

\n “It’s safe to say without iPad, we would not have developed Procreate. Before iPad, there wasn’t an accessible digital drawing platform you could recommend to your grandma or your children. There were a fair amount of desktop painting simulation applications, and of course there was Photoshop, but there was nothing around that had been designed from inception as a focused and natural digital drawing application. Ten years ago, no platform existed that was capable of supporting such an experience.”\n

\n

The Highs and Lows of iPad Development

\n
\"\"

\n

Bright beginnings quickly gave way to the expected mix of joys and frustrations found in long-term platform development.

\n

On the positive side, the iPad in many respects has lived up to its potential of enabling new, more accessible computing experiences for users. Canis, the Wooji Juice Lead Developer behind Ferrite Recording Studio, shared:

\n

\n “I hear a lot from people producing podcasts on iPad, who have either switched from desktop and are enjoying the way audio editing feels on iOS, or [those] for whom editing on desktop was too high of a wall to climb at all.

\n

Being able to play a chord on the touchscreen while also adjusting the dials of a synthesiser, for example (Gorillaz and The Flaming Lips have both produced albums using some of my software). Or moving an audio or video clip, which can feel a lot better when you just pick it up with a finger and place it where you want it to go, instead of using your finger on a trackpad to steer a virtual finger around the screen, to do the same thing, but indirectly.”\n

\n
\n

Momenta B.V.’s Griekspoor echoed the iPad’s distinctness from other platforms:

\n

\n “The best part of developing for iPad is still the magic of the device itself. There’s something very nice about seeing your app come alive on a lightweight piece of glass, and being able to directly interact with it using your fingers. It’s very different to running the app on your Mac.

\n

For our app, Agenda, the iPad is a great fit, particularly in meetings, where the device is less intrusive than a laptop.”\n

\n

For Ulysses’ team, it’s the iPad’s commonalities with other platforms, rather than its differences, that help it fill a key role in their development process. Founder and Executive Director Max Seelemann explains:

\n

\n “For our development, iPad is the bridge between the desktop and the mobile world. When working on new interfaces, we often start at either end of the scale – on iPhone or on the Mac. More often than not, we then conceptualize for the iPad before moving to the other end of the scale. The iPad is a great step in-between the two, because it resembles the available screen real estate of a Mac application but uses interaction models like on the iPhone.”\n

\n

Unsurprisingly, there is also plenty that developers wish were different about iPad development. The big common theme among those I spoke with centered on OS limitations that seem outdated a decade into the device’s life. Cultured Code’s Jainek illustrates this with an example where Things has been an iPad pioneer: keyboard navigation.

\n

\n “We have a very active user base on the iPad, and we’re keen to deliver an outstanding experience for them. Sometimes, the OS makes this harder than it should be. For example, when we set out to build powerful keyboard support for Things, we realized that we had to build it all from scratch. All of the keyboard navigation, selection logic, use of modifier keys – all of it. It’s important that the OS provides this kind of functionality to developers. It ensures consistency and leads to a much higher adoption rate.”\n

\n
\n

Ideas on Canvas’ Engineering Lead for MindNode, Matthias Tretter, picks up that thread:

\n

\n “Many of the things you see in modern iPad apps have to be implemented manually by each developer team, even across Apple’s own apps. This not only takes a lot of time, but the implementations are also all-so-slightly different, resulting in small inconsistencies across apps. Take the currently highly popular sheets presented from the bottom as an example. These sheets originated in Apple’s Maps app and a few others, and are now found everywhere across iOS. Sometimes you can swipe them up to make them bigger, sometimes you can swipe them down to move them to the bottom of the screen, sometimes you can swipe them down to dismiss them. If the developers sweat the details, the movement of the sheet follows the movement of your finger and has a nice spring-based bounce animation once you let it go. If not, movement might feel a bit unnatural or off.

\n

In the end this unfortunately often is a lose-lose situation: developers need to invest a lot of time to create these components that could easily be provided by the OS. Time that – especially in small teams – can’t be invested into the core experience of your product. The user loses by having to face inconsistencies. And if the user loses, Apple loses as well.”\n

\n

Much of the time, users are unaware of these OS-produced “losses” because they simply result in features or apps that can never be created. That’s what happened with one pro-focused project from the Pixelmator team. Here’s Tomas Andrijauskas, Lead Developer on Pixelmator Photo, with the story:

\n

\n “Even though in terms of its raw compute power, iPad competes with and even surpasses consumer desktop hardware, the current memory limitations constrain things quite a lot. So, with every decision we make in terms of features and updates, we have to keep memory in mind.

\n

The decision to create Pixelmator Photo was made pretty much on a whim – we had been working on Pixelmator Pro for iPad but the memory limitations meant we couldn’t bring the same nondestructive editing experience from the Mac to iPad. We persevered but, when it became obvious that there were too many technical hurdles to overcome, one day we decided to take a subset of the tools (the colors adjustments + repair tool + crop tool) from the app, refine the workflow for photo editing, and add as much machine learning magic as we could manage. We had never done anything like this and had no idea what to expect but, about six months later, we had won ourselves our second Apple Design Award. That was definitely pretty cool!”\n

\n

Although the story had a happy ending, I can’t help but think of what a full-fledged Pixelmator Pro on the iPad could have looked like if iPadOS made it possible.

\n
\"Pixelmator

Pixelmator Photo was originally intended to be a full iPad version of Pixelmator Pro.

\n

Canis of Wooji Juice summarizes well how the iPad and its OS can be both a blessing and a curse:

\n

\n “A platform that operates under a lot of constraints can be both limiting and freeing; the App Store can be both great, and immensely frustrating, as can the iOS APIs; iOS 13 brought many much-needed updates, but was also plagued with bugs, a number of which still haven’t been fixed. The APIs are higher quality than many other platforms I’ve developed for, but the documentation is often lacking and Apple itself is largely a black box.”\n

\n

The Next 10 Years

\n
\"\"

\n

Despite these drawbacks of iPad development, there remains a strong sense of enthusiasm for where the device might go next. As Savage’s James Cuda said, “The iPad was the catalyst for us, and as a platform it’s still as thrilling and packed with promise in 2020 as it was in 2010.”

\n

He’s not alone. The team behind GoodNotes shared:

\n

\n “Now that people are more comfortable leaving their laptop or desktop behind and truly relying on iPad as their main productivity tool, the best thing about developing for iPad for us is being part of this transformation, and having a chance to be creative and come up with ways to improve how people work and study.”\n

\n

Ideas on Canvas’ Tretter:

\n

\n “iOS and iPadOS are still wonderful platforms to develop for. Especially on the iPad there is so much potential to explore, experiment, and drive the platform forward as a community. While it’s not like in the early days anymore, there are still many novel ideas born on iPad, spreading across apps and even back into the OS. This often sparks joy – I love playing around with new paradigms, discovering hidden gems in apps, as well as adding them to MindNode.”\n

\n
\"MindNode

MindNode was the first iPad app to implement multiple modular panels.

\n

Cultured Code’s Jainek:

\n

\n “The first time I held [an iPad] in my hands it felt so natural, so perfectly adapted in size and weight to us humans, that it really felt like the computing device of the future. I still feel that way today. Despite its shortcomings, developing for the iPad is developing for the future.

\n

I think the iPad is about to enter a whole new phase. Apple voiced a strong commitment to the platform last summer by introducing iPadOS, and we’re beginning to see the first benefits: new keyboard APIs, mouse support, etc.”\n

\n

Not to say there aren’t new challenges created by the iPad’s recent advancements. Ole Zorn, Creator of Editorial, shared an insightful concern:

\n

\n “I think it has become a lot harder to justify making iPad-only apps, and that tends to limit some ideas that just wouldn’t work very well on iPhone (but would perhaps need the additional audience). iPad development used to be much more distinct from iPhone development, but if you want to support e.g. Split View, you basically have to build an iPhone app as well. In a lot of ways, that’s also a good thing of course, because the experience is consistent across platforms, but there’s a risk that the iPad platform loses a bit of its uniqueness that way.”\n

\n

It’s rare to see iPad-only experiences these days, and now that Mac Catalyst makes cross-platform development easier than ever, that trend is likely to continue. Hopefully as the iPad Pro’s market grows, that larger user base will enable more developers to pursue building experiences unique to the platform, such as the recent app Looom.

\n
\"The

The Apple Pencil is one of several catalysts in the iPad’s evolution.

\n

The first half of the iPad’s life was marked by massive success, but not much continued innovation; 2015’s iPad Pro debut started moving the device in a new direction, one that’s seen a lot more change. Developers have followed that shift, going all-in on helping chart new territory for what’s possible on iPad. Savage’s James Cuda shares:

\n

\n “In 2013 we started an initiative to develop Procreate for other platforms. We even went so far as to invest a considerable amount of capital and developed a prototype for one particular platform. It was exciting to see Procreate evolve, however by 2015 Apple released the incredible iPad Pro and Apple Pencil combination, which changed everything.

\n

Overnight Apple had created the single most compelling solution for creative content creation. This moment was a seismic shift for us. We threw everything out the window and embarked on an entirely new strategy, because it was so incredibly clear developing for alternate platforms was a monumental step backwards. This was the future. iPad Pro and Apple Pencil. A beautiful large multitouch surface coupled with the most accurate stylus ever.”\n

\n

The iPad Pro is crucial to understanding the iPad’s direction moving into its second decade, because every ounce of innovation from Apple in both iPad hardware and software is Pro-focused. It’s great that regular iPads still gain features like mouse and trackpad support, and can use accessories like the Apple Pencil now, but there’s no doubt that the most exciting iPad developments are happening on and optimized for the Pro line.

\n

Hardware like the Magic Keyboard and software like iPadOS 13.4 are significant investments in the iPad Pro’s future. The one area where we haven’t seen much effort from Apple is pro-focused apps. The iWork suite is strong, but there’s nothing that compares to Final Cut Pro or Logic Pro on the iPad. Which leaves the work of building pro-level experiences to the developer community.

\n

As Apple continues evolving the hardware and OS, developers will be empowered to do what they do best: build experiences that turn that slab of glass into something altogether new. Which, in turn, will empower users to go and make something wonderful themselves.

\n

10 years in, the iPad – and particularly the iPad Pro – is still full of potential.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "From the start, the iPad has always been rife with potential. This is partly because it launched as a new type of product category, with unexplored use cases prompting users towards a different computing experience. But it’s also because the device’s very nature – a slab of glass that becomes its software – evokes countless possibilities.\nTo celebrate 10 years of iPad, I spoke to the developers of many of the device’s best apps across areas of productivity and creative work. They’re the people who make that slab of glass into something new, realizing the iPad’s potential but also showing, by their constant work of iteration and reinvention, that there’s always more that can be done.\nSupported By\nConcepts\n\n\nConcepts: Where ideas take shape\nIn sharing their stories from the last decade, the people I spoke with outlined some of the best and worst things about iPad development, memories of their reactions to the product’s introduction, and dreams for where its future might lead. All throughout, it’s clear how much excitement remains for the iPad’s potential even 10 years on.\n\nHopeful Beginnings\nThe early days of the iPad’s App Store.\nSeveral of the developers I spoke with created iPad apps right when the device launched. Cultured Code (makers of Things) CEO Werner Jainek, for example, shared:\n\n “I remember the excitement we all felt when the iPad first came out. We were blown away. We put everything else on pause and worked straight for four weeks to get Things for iPad ready. It was a lot of fun!”\n\nSimilarly from Alexander Griekspoor, Co-Founder at Momenta B.V., the team behind Agenda:\n\n “I still have fond memories of the initial iPad launch, and how keen we were to be in the App Store on launch day. We had the tools to build the app, but we didn’t have an iPad! Apple invited us to go to the labs and try it out on a device, but we are based in Europe, so it wasn’t really an option. Instead, we sent our colleague Charles, who happened to live in Silicon Valley, and he communicated problems back to us. The app launched in the App Store before we had ever touched an iPad, and Charles was made to stand in line to buy and FedEx one to us as soon as it was available.”\n\nFor many, the iPad represented an opportunity to create a brand new property that wasn’t possible before. That’s the story of Procreate, according to Savage Interactive’s CEO & Co-Founder James Cuda:\n\n “It’s safe to say without iPad, we would not have developed Procreate. Before iPad, there wasn’t an accessible digital drawing platform you could recommend to your grandma or your children. There were a fair amount of desktop painting simulation applications, and of course there was Photoshop, but there was nothing around that had been designed from inception as a focused and natural digital drawing application. Ten years ago, no platform existed that was capable of supporting such an experience.”\n\nThe Highs and Lows of iPad Development\n\nBright beginnings quickly gave way to the expected mix of joys and frustrations found in long-term platform development.\nOn the positive side, the iPad in many respects has lived up to its potential of enabling new, more accessible computing experiences for users. 
Canis, the Wooji Juice Lead Developer behind Ferrite Recording Studio, shared:\n\n “I hear a lot from people producing podcasts on iPad, who have either switched from desktop and are enjoying the way audio editing feels on iOS, or [those] for whom editing on desktop was too high of a wall to climb at all.\n Being able to play a chord on the touchscreen while also adjusting the dials of a synthesiser, for example (Gorillaz and The Flaming Lips have both produced albums using some of my software). Or moving an audio or video clip, which can feel a lot better when you just pick it up with a finger and place it where you want it to go, instead of using your finger on a trackpad to steer a virtual finger around the screen, to do the same thing, but indirectly.”\n\n\nMomenta B.V.’s Griekspoor echoed the iPad’s distinctness from other platforms:\n\n “The best part of developing for iPad is still the magic of the device itself. There’s something very nice about seeing your app come alive on a lightweight piece of glass, and being able to directly interact with it using your fingers. It’s very different to running the app on your Mac.\n For our app, Agenda, the iPad is a great fit, particularly in meetings, where the device is less intrusive than a laptop.”\n\nFor Ulysses’ team, it’s the iPad’s commonalities with other platforms, rather than its differences, that help it fill a key role in their development process. Founder and Executive Director Max Seelemann explains:\n\n “For our development, iPad is the bridge between the desktop and the mobile world. When working on new interfaces, we often start at either end of the scale – on iPhone or on the Mac. More often than not, we then conceptualize for the iPad before moving to the other end of the scale. The iPad is a great step in-between the two, because it resembles the available screen real estate of a Mac application but uses interaction models like on the iPhone.”\n\nUnsurprisingly, there is also plenty that developers wish was different about iPad development. The big common theme among those I spoke with surrounded OS limitations that seem outdated a decade into the device’s life. Cultured Code’s Jainek illustrates this with an example where Things has been an iPad pioneer: keyboard navigation.\n\n “We have a very active user base on the iPad, and we’re keen to deliver an outstanding experience for them. Sometimes, the OS makes this harder than it should be. For example, when we set out to build powerful keyboard support for Things, we realized that we had to build it all from scratch. All of the keyboard navigation, selection logic, use of modifier keys – all of it. It’s important that the OS provides this kind of functionality to developers. It ensures consistency and leads to a much higher adoption rate.”\n\n\nIdeas on Canvas’ Engineering Lead for MindNode, Matthias Tretter, picks up that thread:\n\n “Many of the things you see in modern iPad apps have to be implemented manually by each developer team, even across Apple’s own apps. This not only takes a lot of time, but the implementations are also all-so-slightly different, resulting in small inconsistencies across apps. Take the currently highly popular sheets presented from the bottom as an example. These sheets originated in Apple’s Maps app and a few others, and are now found everywhere across iOS. Sometimes you can swipe them up to make them bigger, sometimes you can swipe them down to move them to the bottom of the screen, sometimes you can swipe them down to dismiss them. 
If the developers sweat the details, the movement of the sheet follows the movement of your finger and has a nice spring-based bounce animation once you let it go. If not, movement might feel a bit unnatural or off.\n In the end this unfortunately often is a lose-lose situation: developers need to invest a lot of time to create these components that could easily be provided by the OS. Time that – especially in small teams – can’t be invested into the core experience of your product. The user loses by having to face inconsistencies. And if the user loses, Apple loses as well.”\n\nMuch of the time, users are unaware of these OS-produced “losses” because they simply result in features or apps that can never be created. That’s what happened with one pro-focused project from the Pixelmator team. Here’s Tomas Andrijauskas, Lead Developer on Pixelmator Photo, with the story:\n\n “Even though in terms of its raw compute power, iPad competes with and even surpasses consumer desktop hardware, the current memory limitations constrain things quite a lot. So, with every decision we make in terms of features and updates, we have to keep memory in mind.\n The decision to create Pixelmator Photo was made pretty much on a whim – we had been working on Pixelmator Pro for iPad but the memory limitations meant we couldn’t bring the same nondestructive editing experience from the Mac to iPad. We persevered but, when it became obvious that there were too many technical hurdles to overcome, one day we decided to take a subset of the tools (the colors adjustments + repair tool + crop tool) from the app, refine the workflow for photo editing, and add as much machine learning magic as we could manage. We had never done anything like this and had no idea what to expect but, about six months later, we had won ourselves our second Apple Design Award. That was definitely pretty cool!”\n\nAlthough the story had a happy ending, I can’t help but think of what a full-fledged Pixelmator Pro on the iPad could have looked like if iPadOS made it possible.\nPixelmator Photo was originally intended to be a full iPad version of Pixelmator Pro.\nCanis of Wooji Juice summarizes well how the iPad and its OS can be both a blessing and a curse:\n\n “A platform that operates under a lot of constraints can be both limiting and freeing; the App Store can be both great, and immensely frustrating, as can the iOS APIs; iOS 13 brought many much-needed updates, but was also plagued with bugs, a number of which still haven’t been fixed. The APIs are higher quality than many other platforms I’ve developed for, but the documentation is often lacking and Apple itself is largely a black box.”\n\nThe Next 10 Years\n\nDespite these drawbacks of iPad development, there remains a strong sense of enthusiasm for where the device might go next. As Savage’s James Cuda said, “The iPad was the catalyst for us, and as a platform it’s still as thrilling and packed with promise in 2020 as it was in 2010.”\nHe’s not alone. The team behind GoodNotes shared:\n\n “Now that people are more comfortable leaving their laptop or desktop behind and truly relying on iPad as their main productivity tool, the best thing about developing for iPad for us is being part of this transformation, and having a chance to be creative and come up with ways to improve how people work and study.”\n\nIdeas on Canvas’ Tretter:\n\n “iOS and iPadOS are still wonderful platforms to develop for. 
Especially on the iPad there is so much potential to explore, experiment, and drive the platform forward as a community. While it’s not like in the early days anymore, there are still many novel ideas born on iPad, spreading across apps and even back into the OS. This often sparks joy – I love playing around with new paradigms, discovering hidden gems in apps, as well as adding them to MindNode.”\n\nMindNode was the first iPad app to implement multiple modular panels.\nCultured Code’s Jainek:\n\n “The first time I held [an iPad] in my hands it felt so natural, so perfectly adapted in size and weight to us humans, that it really felt like the computing device of the future. I still feel that way today. Despite its shortcomings, developing for the iPad is developing for the future.\n I think the iPad is about to enter a whole new phase. Apple voiced a strong commitment to the platform last summer by introducing iPadOS, and we’re beginning to see the first benefits: new keyboard APIs, mouse support, etc.”\n\nNot to say there aren’t new challenges created by the iPad’s recent advancements. Ole Zorn, Creator of Editorial, shared an insightful concern:\n\n “I think it has become a lot harder to justify making iPad-only apps, and that tends to limit some ideas that just wouldn’t work very well on iPhone (but would perhaps need the additional audience). iPad development used to be much more distinct from iPhone development, but if you want to support e.g. Split View, you basically have to build an iPhone app as well. In a lot of ways, that’s also a good thing of course, because the experience is consistent across platforms, but there’s a risk that the iPad platform loses a bit of its uniqueness that way.”\n\nIt’s rare to see iPad-only experiences these days, and now that Mac Catalyst makes cross-platform development easier than ever, that trend is likely to continue. Hopefully as the iPad Pro’s market grows, that larger user base will enable more developers to pursue building experiences unique to the platform, such as the recent app Looom.\nThe Apple Pencil is one of several catalysts in the iPad’s evolution.\nThe first half of the iPad’s life was marked by massive success, but not much continued innovation; 2015’s iPad Pro debut started moving the device in a new direction, one that’s seen a lot more change. Developers have followed that shift, going all-in on helping chart new territory for what’s possible on iPad. Savage’s James Cuda shares:\n\n “In 2013 we started an initiative to develop Procreate for other platforms. We even went so far as to invest a considerable amount of capital and developed a prototype for one particular platform. It was exciting to see Procreate evolve, however by 2015 Apple released the incredible iPad Pro and Apple Pencil combination, which changed everything.\n Overnight Apple had created the single most compelling solution for creative content creation. This moment was a seismic shift for us. We threw everything out the window and embarked on an entirely new strategy, because it was so incredibly clear developing for alternate platforms was a monumental step backwards. This was the future. iPad Pro and Apple Pencil. A beautiful large multitouch surface coupled with the most accurate stylus ever.”\n\nThe iPad Pro is crucial to understanding the iPad’s direction moving into its second decade, because every ounce of innovation from Apple in both iPad hardware and software is Pro-focused. 
It’s great that regular iPads still gain features like mouse and trackpad support, and can use accessories like the Apple Pencil now, but there’s no doubt that the most exciting iPad developments are happening on and optimized for the Pro line.\nHardware like the Magic Keyboard and software like iPadOS 13.4 are significant investments in the iPad Pro’s future. The one area we haven’t seen much effort from Apple is in pro-focused apps. The iWork suite is strong, but there’s nothing that compares to Final Cut Pro or Logic Pro on the iPad. Which leaves the work of building pro-level experiences to the developer community.\nAs Apple continues evolving the hardware and OS, developers will be empowered to do what they do best: build experiences that turn that slab of glass into something altogether new. Which, in turn, will empower users to go and make something wonderful themselves.\n10 years in, the iPad – and particularly the iPad Pro – is still full of potential.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-03-31T11:00:13-04:00", "date_modified": "2020-04-01T13:07:42-04:00", "authors": [ { "name": "Ryan Christoffel", "url": "https://www.macstories.net/author/ryanchristoffel/", "avatar": "https://secure.gravatar.com/avatar/6f92854b21cbef25629d7efb809a9de7?s=512&d=mm&r=g" } ], "tags": [ "developers", "iPad", "iPad at 10", "iPad Pro", "stories" ] }, { "id": "https://www.macstories.net/?p=62785", "url": "https://www.macstories.net/stories/for-ipad-accessibility-gives-its-just-a-big-iphone-new-meaning/", "title": "For iPad, Accessibility Gives \u2018It\u2019s Just a Big iPhone\u2019 New Meaning", "content_html": "
\"\"

\n

Perhaps the most common complaint hurled against the iPad over its first decade of life is that it‘s little more than a bigger iPhone. At a fundamental level, the criticism is certainly valid: by and large, the iPad runs the same software as the iPhone. The penchant for bemoaning this bigness emanates from discontentment over the fact that substantial improvements to the iPad’s software have come at a glacially slow pace. Until last year, meaningful upgrades tailored to the tablet were few and far between.1 As much as Apple has extolled the iPad for being “unlike any computer,” the truth is the product stagnated for quite a while in terms of software.2 For better or worse, the company has been preoccupied with savoring every last drop of mother’s milk from the cash cow that is the iPhone. The iPad was left to wither thirstily when it came to its own growth, and it suffered for some time as a result.

\n

In actuality, the iPad being more or less a scaled-up iPhone isn’t necessarily an entirely bad thing. The reason is iOS; familiarity breeds comfort – Apple shrewdly created the iPad’s user interface (and, to lesser extents, the Apple Watch’s and Apple TV’s) to largely resemble the iPhone’s. Especially for less nerdy users, the consistency across devices makes for a seamless, less intimidating experience. From icons to text to features to the touchscreen, the iPad being so similar to the iPhone means acclimating to the device takes minimal time and effort. From an accessibility standpoint, easy acclimation sets the tone for an enjoyable user experience. The foremost reason this is important is that the easier it is to acclimate to a device, the easier it is to find and configure mission-critical accessibility features.

\n

Thus, it’s not at all unreasonable to look at what was heretofore a pejorative assessment – the iPad is nothing but a big iPhone – and turn it into a positive. One of the unheralded aspects of the device’s success is how its approachable, intuitive nature has made it a hit in accessibility-centric contexts such as special education classrooms and as a communicative aid. Such advances get right at the heart of the oft-cited Steve Jobs quote on the so-called intersection of technology and the liberal arts, when he said, “It’s in Apple’s DNA that technology alone is not enough.” Assistive technology obviously caters to the humanities part of the liberal arts, and it’s not hard to see how the iPad’s roots as ostensibly a bigger iPhone can be an asset rather than a liability. You just have to be willing to keep an open mind.

\n

\n

“In many ways, education and accessibility beautifully overlap,” Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, told me in a 2018 interview for TechCrunch. “For us, the concept of differentiated learning and how the accessibility tools that we build in [products] help make that [learning] possible is really important to us.”

\n

Form Follows Function

\n

In order to completely understand why iPad has been so successful for students, parents, and educators, it’s crucial to first examine the iPad at the product’s most basic level. This means the object itself and its interaction model.

\n

In written retrospectives and on podcasts, many in the Apple commentariat have pointed to the chair Jobs sat in as he demonstrated the original iPad on January 27, 2010 as a metaphor for Apple’s goal with the device. The chair helped telegraph what the iPad meant to Apple; to wit, that the product’s form factor (and accompanying software) represented the company’s conception for the future of personal computing. That Jobs was casually sitting back in that cushy recliner, browsing photos, writing email, and reading the (Flash-free) New York Times, sent the message that the iPad was highly unlike any conventional computer. To paraphrase Jony Ive, the iPad’s physical nature meant it conformed to the user rather than forcing the user to conform to it – which it continues to do with aplomb to this day. It was primarily for this reason Jobs described the product as “magical and revolutionary.”

\n

In terms of accessibility, the idea that the iPad is nothing but a slab of metal and glass is the main attraction. For users with certain cognitive delays, for example, the iPad is a near perfect device. To use it is to literally put your hands on the screen. iPadOS is far less conceptually complex than macOS; there are no required pointing devices or windows to fiddle with.3 Likewise, visually impaired users can hold the tablet as close to their face as necessary to see. In both scenarios, this flexibility is made possible because Apple rethought the computing experience down to its essence. And that’s saying nothing about the deeply-integrated system accessibility features designed to aid and enhance the user experience for everyone, regardless of ability.

\n

What this ultimately means is that the advent of the iPad ushered in a new era, an era in which the device is tailor-made to enable learning through technology. Particularly for those in special education settings, the iPad was (and remains) very much the Platonic ideal in terms of modern, accessible,4 and engaging teaching tools. And it’s because one of the iPad’s greatest strengths is that it’s essentially a big iPhone.

\n

Between Two Worlds

\n
\"\"

\n

Before getting into tech journalism in May 2013, I spent more than a decade as a paraeducator (the more professional name for a teacher’s aide) working in middle school and preschool special education classrooms.5 The last nine years of my career were spent with children aged 3 to 5, where we used technology like the iPad (and iPod touch) in small-group activities. Having spent most of my time with students on the autism spectrum, I was trained in several teaching methodologies designed for autistic children. I found the iPad to be great not only for gauging expressive and receptive language skills,6 but also for introducing and reinforcing pre-academic concepts such as identifying colors, letters, and shapes. In fact, my first-ever bylined piece, titled “Re-Enabled,” centered on how iOS and the iPad made life more accessible for my students, as well as myself.7

\n

During my teaching years, I also studied early childhood development part-time at my then-local community college. The combination of the real-world, hands-on experience of my job and the theoretical, textbook learning of my schooling helped me synthesize the disparate experiences I had at work and as a student. This proved especially beneficial as I was learning how children typically develop from infancy through early elementary age while at the same time working daily in a decidedly atypical environment, developmentally speaking. I was very much “between worlds,” so to speak.

\n

What I wrote seven years ago in The Magazine remains true today. While there is no substitute for analog, real-life, play-based learning experiences for young children, there can be no denying the realities of the modern world. The iPad as a teaching tool not only bridges the analog and digital worlds, it allows for more immersive and innovative opportunities for students than ever before. For early childhood special education, where learning and teaching oftentimes take new meaning, the iPad can be utterly transformative. Not only for the child, but for the adult too.

\n

I cannot emphasize enough how astonishing it was to watch my students, some of whom had fairly severe cognitive delays, take to the iPad so naturally and with such immediacy. Launching apps, swiping through pages, even the advanced multitasking gestures were all, pun intended, child’s play. This speaks volumes not only about the allure of things that light up and make noise, but also about the genius of Apple building the iPad’s UI on top of the iPhone’s operating system. Like the decision to base iOS on many of OS X’s underpinnings, the choice to base the iPad on iOS was an immensely prudent one. We reap the benefits of it every day.

\n

The bottom line is that if technology like the iPad is embraced and used with purpose,8 then it absolutely can augment the traditional materials and teaching methodologies that have been around schools forever. In my time in classrooms, I’ve seen firsthand how old and new can go meaningfully together.

\n

Putting the i in iPad

\n
\"\"

\n

It’s no secret Apple believes strongly in the iPad. Although the iPad’s software story has looked weak until recently, the hardware has never been better – arguably the company’s best ever. The current iPad Pro design language is breathtakingly beautiful, so much so that many in the Apple community (myself included) hope Apple carries it over to the iPhone as soon as possible. Accessory-wise, the Smart Keyboard and Apple Pencil are first-class peripherals. The second-generation Pencil is, in my estimation, every bit as delightful and magical as AirPods. Like the earbuds, it’s not apparent when using the Pencil that it’s effectively a teeny-tiny computer with sensors and other tech inside. Using it is uncannily like using a regular pencil, albeit one that never smears or needs sharpening.

\n

Apple has gone on the record about how iPads can be used in classrooms, most notably with its Everyone Can Code initiative and Swift Playgrounds on iOS and the Mac. Herrlinger has said to me many times over the years that when Apple says everyone can code, they really do mean everyone; staying true to their ethos, Apple built Everyone Can Code with accessibility top of mind. Swift Playgrounds is fully accessible to disabled users, offering robust integration with features like VoiceOver and Switch Control. There are even downloadable tactile puzzles in Braille and coding videos presented in American Sign Language, for aspiring blind and visually impaired coders and deaf and hard-of-hearing coders, respectively.

\n

“We believe in focusing on ability rather than disability. We believe coding is a language – a language that should be accessible to everyone,” Apple CEO Tim Cook said to me at an Everyone Can Code kickoff event in 2018.

\n

Apple’s approach to the iPad in the classroom is broad in scope; they don’t focus on providing esoteric, specialized tools for teachers and other professionals in special education. The company instead leverages its massive App Store ecosystem by allowing third-party developers to create apps for learning and communication. That said, the Apple-designed Classroom and Schoolwork apps are relevant to accessibility as well. This approach marches in lockstep with Apple’s overarching philosophy surrounding the iPad. The act of individualizing what one needs or wants from the device gets to the core of what has made the iPad so unique to so many.

\n

Meg Wilson, a former special education teacher who now works at Apple, told me in a 2018 interview that Schoolwork in particular presents “an amazing opportunity for collaboration amongst service providers” when it comes to continually updating colleagues on students’ progress throughout the school year; this perpetual stream of updates also opens the door to faster adjustment of a student’s goals and objectives. Likewise, Wilson told me a teacher or other support staff could add their entire caseload into Schoolwork and have progress reports at the ready anytime, anywhere. And with Classroom, educators can “lock” a group of students (or just one) into an app so that they stay on task and complete their activity.9 For students who don’t need much scaffolding or prompting, it’s easy to share work with their teacher(s) with a couple taps using the system share sheet.

\n

“When you are creative with technology, you change people’s lives,” Wilson said.

\n

Being creative with technology is what the iPad was made for. As Apple sees it, the impact the device has on education is a microcosm of what makes the iPad so special. Students and teachers alike can turn the iPad into whatever they need it to be in order to enhance the classroom experience. Tim Cook often opens events with his boilerplate statement about how Apple’s North Star is building products that enrich and empower lives, and it isn’t an empty bromide. The iPad does precisely that for special education professionals on a daily basis. To say the iPad has been a revolution in special education settings isn’t an exaggeration – it’s exactly right.

\n

“Education and accessibility are inextricably tied in many ways,” Herrlinger said in a recent interview. “What the iPad has done for classrooms to empower and enrich the lives of students is what we strive for when making these products.”

\n

Using Your Words

\n
\"\"

\n

One of the most prominent places where the iPad excels as a teaching tool is in speech therapy sessions. Speech and language therapy plays an integral role in special education settings,10 as SLPs (speech and language pathologists) support the work happening in the classroom. These specialists extend the experience by tirelessly working to facilitate language and pro-social behavior between children and staff and, most importantly, between children, their peers, and their parents. This work goes a long way toward providing enriching, developmentally appropriate programs.

\n

Kirsty Maksel, a speech therapist based in the Bay Area who works at an elementary school, has used iPads with her students over the past 4–5 years. A self-professed “late adopter,” her iPad is an older model, one with the 30-pin iPod connector of yore.11 For students who have limited verbal skills, or who are totally non-verbal, Maksel says the iPad “has made voice output communication software much more accessible than a $15,000 to $20,000 dollar dedicated device.” She said her iPad is used in both group (“push-in”) and individual (“pull-out”) sessions.12

\n

For Maksel, the iPad is used to encourage communication. She mainly uses voice-output communication apps, telling me Proloquo2go and Go Talk are mainstays for her. “They [her students] are learning to request, comment, and command. Some are learning to ask structured questions. I [also] model core vocabulary13 as they are learning new vocabulary words,” she said. Maksel added she uses YouTube to play songs that connect thematically to books she uses, noting that she expects students to sing along and mimic motor movements as songs go on.

\n

In this environment, the iPad is not a toy – it’s a teaching tool. Thus, Maksel has tried to curtail some of the device’s irresistibility by removing irrelevant apps and putting it in a case. An external speaker also helps, she said: it provides better sound, and as a byproduct it directs audio away from the device itself, which can help with attention.

\n

Like Maksel, Courtney Caviggia is a Bay Area speech therapist who utilizes the iPad. Working with preschoolers, Caviggia has spent the last eight years using “a few different generations” of the iPad mini. For her purposes, like Maksel, Caviggia uses the iPad as a means to promote communication during sessions with her students. Apps are obviously the centerpiece of the tablet experience, and Caviggia has many favorites, including the aforementioned Proloquo2go, Go Talk, and YouTube.

\n

“I love interactive apps, music and dance apps for children, children’s apps, learning apps, and ones that give positive reinforcement for correct answers provided. I enjoy apps with visual supports for students,” she said. “I have also used several AAC apps and systems that are used on the iPad and they have been extremely easy to use and adapt to student needs.”

\n

Caviggia told me she’s been using the iPad for so long that she’s “found it to be an asset in therapy.” It does everything from modeling turn-taking to serving as a discrete communication device to being a motivator and reward for hard work and good behavior. But the benefits aren’t limited to her students; the iPad helps her just as much. Take data-tracking, for instance. Data collection is critically important to special educators, who rely on the information when writing progress reports and participating in annual IEP meetings. What Meg Wilson said is a truism for special educators, one practically as essential to their work as food and water is to life: “What we need in special ed is data – we need data.” In Caviggia’s case, she said she likes how many of the apps she uses include score-taking, which allows her to focus all her energy on the student rather than interrupt the flow to jot something down for later.

\n

For all the iPad’s strengths, Caviggia has a few gripes. The iPad’s appeal is a double-edged sword, because the device can be as distracting as it is motivating. “Some students have a hard time taking turns with such a highly motivating object,” she said. Caviggia further lamented how “frustrating” it can be that the iPad has a finite lifespan in terms of software updates. As the iPad ages, access to app updates, new apps, and new OS features can be limited if a hardware upgrade isn’t feasible.14

\n

Dillan Barmache is a non-verbal autistic man and self-described “advocate for autistic voices.” Apple told his story in April 2016 as part of recognizing Autism Acceptance Month. Then 16, he explained in a poignant, now-private video called “Dillan’s Voice” how the iPad “changed everything in my life” because the device allowed him to finally have a voice with which he could communicate with people.

\n

“All my life I wanted so badly to connect with people,” Barmache said in the video. “Without a voice, people only see my autism and not the real me.”

\n

Barmache’s mother, Tami, was awestruck by her son’s voice. “As his mom, it’s just the most incredible thing ever to suddenly start to hear your child’s voice,” she said in the video. “Not being able to speak is not the same as not having something to say.”

\n

In an interview with Mashable that coincided with the premiere of his film, Dillan went into greater detail about what it feels like to be autistic, how the iPad empowered him, and how amplifying his voice affected those around him. He put it succinctly in one sentence, saying: “The iPad allows me to be seen.”

\n

Developing Education

\n
\"\"

\n

In my interviews with Maksel and Caviggia, both mentioned that one of their most heavily used apps is AssistiveWare’s communication board app, Proloquo2go. This is noteworthy not only because the app is professional (and, at $250, professionally priced), but also because the coming of the iPad allowed the app to thrive.

\n

The Amsterdam-based AssistiveWare has been a presence on the iOS App Store since the beginning; work on Proloquo2go began in May 2008, with the first version shipping nearly a year later in April 2009.15 Company founder and CEO David Niemeijer said to me via email that Proloquo2go was optimized for Apple’s tablet right from the get-go, and has been in the store “from day one.” To date, the app has been downloaded over 250,000 times worldwide. The advent of the App Store and the iPad not only helped hundreds of thousands of users over the past decade, but it helped AssistiveWare too. What started as a one-man operation, Niemeijer said, has grown to a team of over 30 members – one that has even been paid a visit by Tim Cook.

\n

Proloquo2go, previously available only on the smaller iPhone and iPod touch, was “propelled into a whole other league” upon the iPad’s introduction, Niemeijer told me. “It was immediately clear to me that this would be the perfect device [for the app],” he said. Furthermore, he and his team saw “two key benefits” of the tablet’s large screen. “Firstly, it allowed us to offer more words on the screen at once,” he said. “Secondly, many people with speech challenges also have fine-motor or visual challenges, making the larger screen much easier to work with.”

\n

Niemeijer wrote in 2018 about Proloquo2go’s origin story, as well as about how the App Store’s opening in 2008 and the iPad’s debut two years later kickstarted what he called the “democratization of communication.” Like Maksel, he observed how the iPad was a game-changer for families and professionals. At $500, the iPad was far more affordable and readily available than dedicated AAC devices, most of which are exorbitantly priced and available only at the mercy of a slow approval process. “Professionals were no longer, through a formal assessment process, the gatekeepers to communication devices,” Niemeijer wrote on Medium. “Parents and individuals with communication challenges could now make their own decisions.”

\n

Regarding the iPad in general, Niemeijer thinks it’s “an amazing device that is only getting better and better.” He said he’s happy the entry-level model continues to become more affordable, and is “super happy” the iPad mini is still around for younger children. As for the Pro models, Niemeijer told me their considerably better sound makes using communication apps like his better, and he wishes the speakers in the less expensive iPads had better audio fidelity. He’s also impressed by iOS and iPadOS 13’s addition of pointer support, saying the newfound compatibility with joysticks, trackballs, and mice has further expanded access to the tablet.

\n

Mark Coppin is Director of Disability Services at North Dakota State University. An assistive technology professional for over three decades, Coppin focuses at the school on supporting people with disabilities by providing suitable accommodations. Prior to the iPad, he used the iPod touch with students, but said he realized the tablet would “profoundly change special education and education in general.” He currently uses both an 11-inch and a 12.9-inch iPad Pro so he can “evaluate the best size and fit for those I work with,” Coppin said.

\n

Yet it was the iPod that sold him on technology’s capacity for facilitating learning.

\n

“It was a device that my students gravitated to,” he said. “I was working a lot with students on the autism spectrum at the time, and the device seemed to make sense for many of them. It was visual and they were able to control the device by interacting directly with the screen. It was also very engaging and new content was being developed everyday.”

\n

The iPad’s arrival was a watershed moment of sorts for Coppin, as it exponentially built upon what was on the iPod touch in a literally bigger way. “I saw this huge screen, a portable device, and an affordable device priced at a consumer level. I knew it would be a powerful solution for my students and clients,” he said. “This device replaced many expensive assistive technology solutions; it could be personalized and customized to meet the needs of my students and clients.”

\n

He continued: “I have seen a huge shift in our field because of the iPad,” he said. “Tools and solutions that were once considered assistive technology or specialized solutions are now a part of the OS. That means they are available to everyone – they are free and powerful solutions that level the playing field.”

\n

As for the iPad at his workplace, Coppin told me students at the university love the iPad for how engaging it is and how it invites them to explore their creativity. He told me “the beautiful thing” about the iPad is that every modification is transparent to anyone. Moreover, Coppin appreciates how the iPad’s flexibility allows him to individualize the device so that it caters to a student’s needs and desires while adhering to Universal Design for Learning best practices. He also noted how NDSU professors have embraced the iPad, saying the device allows for myriad ways of learning, networking, and staying engaged with the curriculum.

\n

“It gives me the opportunity to optimize their [students’] learning,” he said. “For a student who struggles with fine-motor control, I can make accommodations to compensate for their fine-motor limitations. For a student who cannot see, I can show them VoiceOver. For a struggling reader or non-reader, I can show them features where they can have the text read to them. For a student who cannot touch the screen, I can turn on Switch Control and give them independent access to the device.” Coppin said his personal favorite iPad features all involve accessibility, with Switch Control topping the list. He loves how it levels the field by allowing someone to play a game such as Angry Birds like anyone else. He said he knew several students with cerebral palsy who were “desperate” to try Switch Control on the iPad after struggling to use the device’s touchscreen. “If they could play Angry Birds, they could do practically anything everyone else could,” he said.

\n

Coppin was effusive overall in his praise of the iPad. He’s an ardent supporter of not only iPadOS as a platform, but also of how technology can unleash new ways of learning and connecting for students and teachers. “I love it when new accessibility features come out because, when I explore these features, I have a particular person in mind,” he said. “I know that that one feature can make a profound difference in that person’s life. One small feature can make so much difference.”

\n

On the flip side, Coppin has no nits to pick with the iPad itself. What bothers him is how some educators perceive the iPad and how it’s utilized in some classrooms.

\n

“There needs to be some intentionality in its use. But some educators take it too far and they think that the iPad will solve everything,” he said. “They think that if you give a student an iPad, they will automatically learn. In the end, you still have to teach and be pedagogically sound. The iPad is just another educational tool but it can be a powerful tool in the hands of a good teacher.”

\n

Coincidentally, Coppin revealed his friendship with AssistiveWare’s Niemeijer. Coppin is yet another Proloquo2go devotee, telling me that he knew “it would completely revolutionize the industry” the first time he used the application.

\n

Pondering the iPad’s Potential

\n
\"\"

\n

As I wrote earlier, a slew of think pieces from various voices in the Apple community were published in January as the iPad’s tenth birthday approached.16 One of those pieces was written by Stratechery’s Ben Thompson, who posited the tablet as a tragic figure because, phenomenal in myriad ways yet so hindered in others, it has never quite lived up to the stratospheric hype surrounding it over the past decade. He wrote, in part: “the tragedy of the iPad is not that it flopped; it is that it never did, and likely never will, reach that potential so clearly seen ten years ago.”

\n

However niche relative to the iPad’s overall standing, one could make a legitimate case that, in an accessibility context, the device has more than lived up to its potential. Consider the testimonies from people like Maksel and Niemeijer on using iPads as communication boards. This is a non-trivial, literally life-altering use case. Using the iPad for “real work” is not solely the domain of power users, the iPad aficionados who routinely push iPadOS to the limits by using complex shortcuts and by editing podcasts with Apple Pencil. There’s a tendency in the tech community at large to associate “real work” with producing in creative mediums like photography, music, art, and coding. Apple is guilty of falling into this trap, given their proclivity for showcasing such work in ads and at events. While there’s nothing inherently wrong with this approach, what is missed is the idea that “real work” needn’t be cerebral or showy to be meaningful. Real work is also about using iPads in speech therapy sessions and to teach foundational skills like colors. It’s not as flashy and nerdy, and doesn’t make many headlines, but is nonetheless serious in its own right.17

\n

Consider also the iPad’s form factor. Much of the consternation from certain factions of the Apple community surrounding the iPad, particularly since the iPad Pro debuted in 2015, has stemmed from whether it can (or should) replace Mac laptops. It already has for many people, due to the versatile nature of the device and the software enhancements that have come over the last few iOS releases. But there’s another reason to prefer an iPad over a MacBook: as ever, it’s accessibility.

\n

I’ve literally called the iPad “the most accessible computer Apple has ever built.” High praise, but the declaration is justified in two key ways. First, iOS is simpler and more streamlined than macOS. For someone with certain cognitive impairments, for instance, the default one-app-at-a-time windowing model does a lot to reduce cognitive load and increase concentration. Second, someone who’s visually impaired shouldn’t have to chase the mouse pointer around the screen, nor be as precise when clicking smaller menu bar items and other UI buttons. Most importantly, because an iPad’s screen is more flexible than a laptop’s, people like me don’t necessarily have to press their noses against the glass in order to see well.

\n

Niemeijer echoed my thoughts on the advantages of the iPad’s form factor.

\n

“Thanks to the wide range of accessibility features Apple built into iOS, they have achieved a true universal design. iPads are truly for everyone and people with disabilities can use an iPad for much the same things as everyone else,” he said. “If you think about it, an all-glass user interface is not the first thing you think of when providing access to someone who cannot see the screen or cannot touch it – yet if you cannot see or cannot use your hands, you can still use an iPad thanks to all the built-in accessibility features.”

\n

Like all Apple products, Macs are fully accessible and have been for several years. Nevertheless, the ergonomic differences between iPads and MacBooks are very real; that an iPad can be much better is again a testament to Apple hitting the bullseye when it conjured up the original product’s hardware design. Put another way, pitting one against the other isn’t always about lazy tropes regarding real work, or about software preferences and limitations. Sometimes, as is the case here, one can be clearly better due to a multitude of factors that include accessibility. It doesn’t mean either is categorically worse, or that we dislike what we don’t use. It means that, for many, “better” is judged by more than an arbitrary getting-things-done metric.

\n

Although Thompson makes a compelling argument that the iPad has missed its potential, it’s equally compelling to counter his assertion by highlighting accessibility’s role. In this sense, the iPad has clearly been triumphant. As a tool for assistive technology, it has reached self-actualization and then some.

\n

Putting the iPad in Perspective

\n

To appreciate the breakthroughs the iPad has afforded in areas like special education and communication, one must take a holistic view of history. The iPad is not perfect; it can (and will) always improve. As time marches on, so too will technology. For accessibility, however, it comes closest to perfection, to fully realizing Apple’s ambitions with the device. This is not insignificant and cannot be overstated; accessibility has been just as instrumental in shaping the iPad’s journey as the much-ballyhooed Mac-like productivity capabilities. This aspect of the tablet’s story is an angle equally deserving of such recognition.

\n

The iPad has been so prosperous for accessibility, and for the mainstream user too, precisely because it’s “just a big iPhone.” That former mockery is myopic; it’s unwarranted and mistakenly conflates elaboration with progress. It’s an interesting dichotomy – Apple clearly has evolved the device, as they should, but I think it’s easy to get caught up in all the hustle and bustle. People don’t stop and smell the roses, so to speak, and appreciate the simpler things. Although watching iPadOS mature with wonderful tools is great in the grand scheme, I think people sometimes forget about the iPad’s soul, the elemental parts of it that make it so beloved. This hearkens back to the founding design: it was like the iPhone. There’s something to be said for its simplicity enduring amidst the incessant want for more.

\n

To me, the iPad is neither an abject failure nor a victim of some tragedy. The iPad has had a fruitful life, period. Judging whether something is good or bad, technologically speaking, can be subjective. This is especially true for accessibility. Being a big iPhone can indeed be a good thing; it just depends on who’s using it. A grid of icons and a single-app paradigm are better suited to a variety of accessibility-oriented situations. This deserves to be celebrated more often.

\n

Given how extensive the iPad’s makeover has been, that its defining characteristic – where the app is the screen and the screen is the app – persists so prominently is no small feat. The iPad’s essential simplicity remains endearing to legions of accessibility-minded people, folks like Caviggia and Coppin. Illustrating how the iPad works well as “just a big iPhone” is profound; it casts a powerful new light on a notion that has plagued the device since its earliest days, with Jobs in that chair.

\n

There’s an adage popular with those in special education circles, which is that people who work in special education are special. That’s undoubtedly true. I can’t help but think there’s a charming synergy between the people who work in the field and technological marvels like the iPad which empowers them to change the lives of children and their families. As Herrlinger and Wilson have said to me on multiple occasions, that’s Apple’s entire modus operandi in a nutshell.

\n

The Bottom Line

\n

If this story were one of Aesop’s Fables, the moral probably would be that the measure of the iPad’s success does not hinge solely on loads of multi-modal, multi-windowed, whiz-bang productivity features. That they’re optional is immaterial to the broader point here because, in the end, they tell only one part of the story. The contributions the iPad has made to further accessibility matter too, which is why it’s important to point out that being just a big iPhone is okay, enlightening even.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n
\n
  1. \nGiven the release of the updated iPad Pro and last year’s introduction of iPadOS, it feels safe to say the days of lackadaisical iPad upgrade cycles are officially over. ↩︎\n
  2. \nMost companies would kill for a business like the iPad. It made $6 billion in revenue during the last holiday quarter, according to Jason Snell of Six Colors. ↩︎\n
  3. \nThe issues surrounding the intricacies of the iPadOS multitasking system go beyond the scope of this article. ↩︎\n
  4. \nAccessible both ways: in the availability sense and in the disability sense. ↩︎\n
  5. \nThis is on top of the fact that, as a person with disabilities, I received special education services throughout my academic career growing up. ↩︎\n
  6. \nExpressive language is saying, “My iPad is on the table.” Receptive language is asking your friend at the table to “give me the iPad” and have them do it. ↩︎\n
  7. \nThe piece ran a few months before I quit my job in order to try my hand at being a full-time writer. ↩︎\n
  8. \nIn other words, avoiding the temptation to use the thing as a glorified, really expensive pacifier. ↩︎\n
  9. \nThis feature takes its cue from Guided Access, first introduced – on stage, by Scott Forstall at WWDC 2012 eight years ago – in iOS 6. ↩︎\n
  10. \nAnd of course, in clinical settings outside of education, like private practice. ↩︎\n
  11. \nAlthough these iPads are technically considered obsolete, the fact that they’re still put to good use means obsolescence is relative. They may not support the latest OS, but the OS they do run is perfectly fine for what’s needed. ↩︎\n
  12. \nGroup sessions can occur when the therapist comes into a class to work with the entire class, or when she/he takes two or more out of the classroom for therapy. To push in describes the former, whereas pull out describes the latter. ↩︎\n
  13. \nCore vocabulary words include no, mine, go, and done, amongst others. ↩︎\n
  14. \nSchools have criminally low budgets, which is why you hear tales about teachers taking money from their own pocket to buy supplies. Unless it’s a grant or some other pot of gold, big-ticket items like new iPads aren’t high on the priority list. ↩︎\n
  15. \nThey’ve been part of the Mac development scene since 2002. ↩︎\n
  16. \nIf you want to be pedantic, Apple announced the first iPad on January 27. But it wasn’t officially put on sale until April 3. ↩︎\n
  17. \nThis is why accessibility coverage in tech journalism matters so much. Not only does it improve diversity and inclusion by amplifying the voices of an underrepresented and marginalized group, it amplifies unique and deserving perspectives on the tools so indispensable to humanity. They can be a welcome reprieve to the seemingly-endless stream of dystopian Facebook horror stories. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Perhaps the most common complaint hurled against the iPad over its first decade of life is that it‘s little more than a bigger iPhone. At a fundamental level, the criticism is certainly valid: by and large, the iPad runs the same software as the iPhone. The penchant for bemoaning this bigness emanates from discontentment over the fact that substantial improvements to the iPad’s software have come at a glacially slow pace. Until last year, meaningful upgrades tailored to the tablet were few and far between.1 As much as Apple has extolled the iPad for being “unlike any computer,” the truth is the product stagnated for quite a while in terms of software.2 For better or worse, the company has been preoccupied with savoring every last drop of mother’s milk from the cash cow that is the iPhone. The iPad was left to wither thirstily when it came to its own growth, and it suffered for some time as a result.\nIn actuality, the iPad being more or less a scaled-up iPhone isn’t necessarily an entirely bad thing. The reason is iOS; familiarity breeds comfort – Apple shrewdly created the iPad’s user interface (and to lesser extents, Apple Watch and Apple TV) to largely resemble the iPhone. Especially for less nerdy users, the consistency across devices makes for a seamless, less intimidating experience. From icons to text to features to the touchscreen, the iPad being so similar to the iPhone means acclimating to the device takes minimal time and effort. From an accessibility standpoint, easy acclimation sets the tone for an enjoyable user experience. The foremost reason this is important is that the easier it is to acclimate to a device, the easier it is to find and configure mission-critical accessibility features.\nSupported By\nConcepts\n\n\nConcepts: Where ideas take shape\nThus, it’s not at all unreasonable to look at what was heretofore a pejorative assessment – the iPad is nothing but a big iPhone – and turn it into a positive. One of the unheralded aspects of the device’s success is how its approachable, intuitive nature has made it a hit in accessibility-centric contexts such as special education classrooms and as a communicative aid. Such advances get right at the heart of the oft-cited Steve Jobs quote on the so-called intersection of technology and the liberal arts, when he said, “It’s in Apple’s DNA that technology alone is not enough.” Assistive technology obviously caters to the humanities part of the liberal arts, and it’s not hard to see how the iPad’s roots as ostensibly a bigger iPhone can be an asset rather than a liability. You just have to be willing to keep an open mind.\n\n“In many ways, education and accessibility beautifully overlap,” Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, told me in a 2018 interview for TechCrunch. “For us, the concept of differentiated learning and how the accessibility tools that we build in [products] help make that [learning] possible is really important to us.”\nForm Follows Function\nIn order to completely understand why iPad has been so successful for students, parents, and educators, it’s crucial to first examine the iPad at the product’s most basic level. This means the object itself and its interaction model.\nIn written retrospectives and on podcasts, many in the Apple commentariat have pointed to the chair Jobs sat in as he demonstrated the original iPad on January 27, 2010 as a metaphor for Apple’s goal with the device. 
The chair helped telegraph what the iPad meant to Apple; to wit, that the product’s form factor (and accompanying software) represented the company’s conception for the future of personal computing. That Jobs was casually sitting back in that cushy recliner, browsing photos, writing email, and reading the (Flash-free) New York Times, sent the message that the iPad was highly unlike any conventional computer. To paraphrase Jony Ive, the iPad’s physical nature meant it conformed to the user rather than forcing the user to conform to it – which it continues to do with aplomb to this day. It was primarily for this reason Jobs described the product as “magical and revolutionary.”\nIn terms of accessibility, the idea that the iPad is nothing but a slab of metal and glass is the main attraction. For users with certain cognitive delays, for example, the iPad is a near perfect device. To use it is to literally put your hands on the screen. iPadOS is far less conceptually complex than macOS; there are no required pointing devices or windows to fiddle with.3 Likewise, visually impaired users can hold the tablet as close to their face as necessary to see. In both scenarios, this flexibility is made possible because Apple rethought the computing experience down to its essence. And that’s saying nothing about the deeply-integrated system accessibility features designed to aid and enhance the user experience for everyone, regardless of ability.\nWhat this ultimately means is that the advent of the iPad ushered in a new era, an era in which the device is tailor-made to enable learning through technology. Particularly for those in special education settings, the iPad was (and remains) very much the Platonic ideal in terms of modern, accessible,4 and engaging teaching tools. And it’s because one of the iPad’s greatest strengths is that it’s essentially a big iPhone.\nBetween Two Worlds\n\nBefore getting into tech journalism in May 2013, I spent more than a decade as a paraeducator (the more professional name for a teacher’s aide) working in middle school and preschool special education classrooms.5 The last nine years of my career were spent with children aged 3 to 5, where we used technology like the iPad (and iPod touch) in small-group activities. Having spent most of my time with students on the autism spectrum, I was trained in several teaching methodologies designed for autistic children. I found the iPad to be great in not only gauging expressive and receptive language skills,6 it also was great in introducing and reinforcing pre-academic concepts such as identifying colors, letters, and shapes. In fact, my first-ever bylined piece, titled “Re-Enabled,” centered around how iOS and the iPad made life more accessible for my students, as well as myself.7\nDuring my teaching years, I also studied early childhood development part-time at my then-local community college. The combination of real world, hands-on experience of my job and the textbook theoretical learning in my schooling helped me synthesize the disparate experiences I had at work and as a student. This proved especially beneficial as I was learning how children typically develop from infancy through early elementary age while at the same time working daily in a decidedly atypical environment, developmentally speaking. I was very much “between worlds,” so to speak.\nWhat I wrote seven years ago in The Magazine remains true today. 
While there is no substitute for analog, real-life, play-based learning experiences for young children, there can be no denying the realities of the modern world. The iPad as a teaching tool not only bridges the analog and digital worlds, it allows for more immersive and innovative opportunities for students than ever before. For early childhood special education, where learning and teaching oftentimes take new meaning, the iPad can be utterly transformative. Not only for the child, but for the adult too.\nI cannot emphasize enough how astonishing it was to watch my students, some of whom had fairly severe cognitive delays, take to the iPad so naturally and with such immediacy. Launching apps, swiping through pages, even the advanced multitasking gestures were all, pun intended, child’s play. This speaks volumes of not only the allure of things that light up and make noise, but of the genius of Apple building the iPad’s UI on top of the iPhone’s operating system. Like the decision to base iOS on many of OS X’s underpinnings, the choice to base the iPad on iOS was immensely prudent by Apple. We reap the benefits of it every day.\nThe bottom line is that if technology like the iPad is embraced and used with purpose,8 then it absolutely can augment the traditional materials and teaching methodologies that have been around schools forever. In my time in classrooms, I’ve seen firsthand how old and new can go meaningfully together.\nPutting the i in iPad\n\nIt’s no secret Apple believes strongly in the iPad. Although the iPad’s software story has looked weak until recently, the hardware has never been better – arguably the company’s best, maybe ever. The current iPad Pro design language is breathtakingly beautiful, so much so that many in the Apple community (myself included) hope Apple carries it over to the iPhone as soon as possible. Accessory-wise, the Smart Keyboard and Apple Pencil are first-class peripherals. The second-generation Pencil is, in my estimation, every bit as delightful and magical as AirPods. Like the earbuds, it’s not apparent in using the Pencil that it’s effectively a teeny-tiny computer with sensors and other tech inside. Using it is uncannily like using a regular pencil, albeit one that never smears or needs sharpening.\nApple has gone on the record about how iPads can be used in classrooms, most notably with its Everyone Can Code initiative and Swift Playgrounds on iOS and the Mac. Herrlinger has said to me many times over the years that when Apple says everyone can code, they really do mean everyone; staying true to their ethos, Apple built Everyone Can Code with accessibility top of mind. Swift Playgrounds is fully accessible to disabled users, offering robust integration with features like VoiceOver and Switch Control. There’s even downloadable tactile puzzles in Braille and coding videos presented in American Sign Language for aspiring blind and visually impaired and hard-of-hearing/deaf coders, respectively.\n“We believe in focusing on ability rather than disability. We believe coding is a language – a language that should be accessible to everyone,” Apple CEO Tim Cook said to me at an Everyone Can Code kickoff event in 2018.\nApple’s approach to the iPad in the classroom is broad in scope; they don’t focus on providing esoteric, specialized tools for teachers and other professionals in special education. The company instead leverages its massive App Store ecosystem by allowing third-party developers to create apps for learning and communication. 
That said, the Apple-designed Classroom and Schoolwork apps can hold relevance to accessibility. This marches in lockstep with Apple’s overarching philosophy surrounding the iPad. The act of individualizing what one needs or wants from the device gets to the core of what has made the iPad so unique to so many.\nMeg Wilson, a former special education teacher who now works at Apple, told me in a 2018 interview that Schoolwork in particular presents “an amazing opportunity for collaboration amongst service providers” when it comes to continually updating colleagues on students’ progress throughout the school year; this perpetual stream of updates also opens the door to faster adjustment of a student’s goals and objectives. Likewise, Wilson told me a teacher or other support staff could add their entire caseload into Schoolwork and have progress reports at the ready anytime, anywhere. And for Classroom, educators can “lock” a group of students (or just one) into an app so that she or he stays on task and completes their activity.9 For students who don’t need much scaffolding or prompting, it’s easy to share work with their teacher(s) with a couple taps using the system share sheet.\n“When you are creative with technology, you change people’s lives,” Wilson said.\nBeing creative with technology is what the iPad was made for. As Apple sees it, the impact the device has on education is a microcosm of what makes the iPad so special. Students and teachers alike can turn the iPad into whatever they need it to be in order to enhance the classroom experience. Tim Cook often opens events with his boilerplate statement about how Apple’s North Star is building products that enrich and empower lives, and it isn’t an empty bromide. The iPad does precisely that for special education professionals on a daily basis. To say the iPad has been a revolution in special education settings isn’t an exaggeration – it’s exactly right.\n“Education and accessibility are inextricably tied in many ways,” Herrlinger said in a recent interview. “What the iPad has done for classrooms to empower and enrich the lives of students is what we strive for when making these products.”\nUsing Your Words\n\nOne of the most prominent places where the iPad excels as a teaching tool is in speech therapy sessions. Speech and language therapy plays an integral role in special education settings,10 as SLPs (speech and language pathologists) support the work happening in the classroom. These specialists extend the experience by tirelessly working to facilitate language and pro-social behavior between children and staff, but most importantly, between their peers and parents. This work goes a long way in providing enriching, developmentally appropriate programs.\nKirsty Maksel, a speech therapist based in the Bay Area who works at an elementary school, has used iPads with her students over the past 4–5 years. A self-professed “late adopter,” her iPad is an older model, one with the 30-pin iPod connector of yore.11 For students who have limited verbal skills, or who are totally non-verbal, Maksel says the iPad “has made voice output communication software much more accessible than a $15,000 to $20,000 dollar dedicated device.” She said her iPad is used in both group (“push-in”) and individual (“pull-out”) sessions.12\nFor Maksel, the iPad is used to encourage communication. She mainly uses voice-output communication apps, telling me Proloquo2go and Go Talk are mainstays for her. 
“They [her students] are learning to request, comment, and command. Some are learning to ask structured questions. I [also] model core vocabulary13 as they are learning new vocabulary words,” she said. Maksel added she uses YouTube to play songs that connect thematically to books she uses, noting that she expects students to sing along and mimic motor movements as songs go on.\nIn this environment, the iPad is not a toy – it’s a teaching tool. Thus, Maksel has tried to curtail some of the device’s irresistibility by removing irrelevant apps from the device, as well as putting it in a case. An external speaker, she said, also helps in this regard because it’s needed for better sound. A byproduct of this is that it also directs the sound away from the device itself, which can help with attention.\nLike Maksel, Courtney Caviggia is a Bay Area speech therapist who utilizes the iPad. Working with preschoolers, Caviggia has spent the last eight years using “a few different generations” of the iPad mini. For her purposes, like Maksel, Caviggia uses the iPad as a means to promote communication during sessions with her students. Apps are obviously the centerpiece of the tablet experience, and Caviggia has many favorites, including the aforementioned Proloquo2go, Go Talk, and YouTube.\n“I love interactive apps, music and dance apps for children, children’s apps, learning apps, and ones that give positive reinforcement for correct answers provided. I enjoy apps with visual supports for students,” she said. “I have also used several AAC apps and systems that are used on the iPad and they have been extremely easy to use and adapt to student needs.”\nCaviggia told me she’s been using the iPad for so long that she’s “found it to be an asset in therapy.” It does everything from model turn-taking to being a discrete communication device to being a motivator and reward for hard work and good behavior. But the assets lie not only with her students; they benefit her just as much. Take data-tracking, for instance. Data collection is critically important to special educators, who rely on the information when writing progress reports and participating in annual IEP meetings. What Meg Wilson said is a truism for special educators that’s practically as essential to life as food and water: “What we need in special ed is data – we need data.” In Caviggia’s case, she said she likes how many apps she uses include score-taking, which allows her to focus all her energy on the student rather than interrupt the flow to jot something down for later.\nFor all the iPad’s strengths, Caviggia has a few gripes. The iPad’s appeal is a double-edged sword, because the device can be as distracting as it is motivating. “Some students have a hard time taking turns with such a highly motivating object,” she said. Caviggia further lamented how “frustrating” it can be that the iPad has a finite lifespan in terms of software updates. As the iPad ages, access to app updates, new apps, and new OS features can be limited if a hardware upgrade isn’t feasible.14\nDillan Barmache is a non-verbal autistic man and self-described “advocate for autistic voices.” Apple told his story in April 2016 as part of recognizing Autism Acceptance Month. Then 16, he explained in a poignant, now-private video called “Dillan’s Voice” how the iPad “changed everything in my life” because the device allowed him to finally have a voice with which he could communicate with people.\n“All my life I wanted so badly to connect with people,” Barmache said in the video. 
“Without a voice, people only see my autism and not the real me.”\nBarmache’s mother, Tami, was awestruck at her son’s voice. She stated in the video: “As his mom, it’s just the most incredible thing ever to suddenly start to hear your child’s voice,” she said. “Not being able to speak is not the same as not having something to say.”\nIn an interview with Mashable that coincided with the premiere of his film, Dillan went into greater detail about what it feels like to be autistic, how the iPad empowered him, and how amplifying his voice affected those around him. He put it succinctly in one sentence, saying: “The iPad allows me to be seen.”\nDeveloping Education\n\nIn my interviews with Maksel and Caviggia, both mentioned one of their most heavily-used apps is AssistiveWare’s communication board app, Proloquo2go. This is noteworthy because not only is the app professional (and, at $250, professionally priced) but because the coming of the iPad made it possible for the app to exist.\nThe Amsterdam-based AssistiveWare has been a presence on the iOS App Store since the beginning; work on Proloquo2go began in May 2008, with the first version shipping nearly a year later in April 2009.15 Company founder and CEO David Niemeijer said to me via email that Proloquo2go was optimized for Apple’s tablet right from the get-go, and has been in the store “from day one.” To date, the app has been downloaded over 250,000 times worldwide. The advent of the App Store and the iPad not only helped hundreds of thousands of users over the past decade, but it helped AssistiveWare too. What started as a one-man operation, Niemeijer said, has grown to a team of over 30 members who were paid a visit by Tim Cook.\nProloquo2go was “propelled into a whole other league” upon the iPad’s introduction, Niemeijer told me. “It was immediately clear to me that this would be the perfect device [for the app],” he said. It was previously available on the smaller iPhone and iPod touch. Furthermore, he and his team saw “two key benefits” of the tablet’s large screen. “Firstly, it allowed us to offer more words on the screen at once,” he said. “Secondly, many people with speech challenges also have fine-motor or visual challenges, making the larger screen much easier to work with.”\nNiemeijer wrote in 2018 about Proloquo2go’s origin story, as well as about how the App Store’s opening in 2008 and the iPad’s debut two years later kickstarted what he called the “democratization of communication.” Like Maksel, he observed how the iPad was a game-changer for families and professionals. At $500, iPad was eminently more affordable and readily available than dedicated AAC devices, most of which are exorbitantly priced and only available on the whims of a slow approval process. “Professionals were no longer, through a formal assessment process, the gatekeepers to communication devices,” Niemeijer wrote on Medium. “Parents and individuals with communication challenges could now make their own decisions.”\nRegarding the iPad in general, Niemeijer thinks it’s “an amazing device that is only getting better and better.” He said he’s happy the entry-level model continues to become more affordable, and is “super happy” the iPad mini is still around for younger children. As for the Pro models, Niemeijer told me their considerably better sound make using communication apps like his better, and wishes the speakers in the less expensive iPads had better audio fidelity. 
He’s also impressed by iOS and iPadOS 13’s addition of pointer support, saying the newfound compatibility with joysticks, trackballs, and mice have further expanded access to the tablet.\nMark Coppin is Director of Disability Services at North Dakota State University. An assistive technology professional for over three decades, Coppin’s responsibilities at the school focus on supporting people with disabilities by providing suitable accommodations. Prior to the iPad, he used the iPod touch with students, but said he realized the tablet would “profoundly change special education and education in general.” He currently uses both an 11-inch and a 12.9-inch iPad Pro; he uses both sizes so he can “evaluate the best size and fit for those I work with,” Coppin said.\nYet it was the iPod that sold him on technology’s capacity for facilitating learning.\n“It was a device that my students gravitated to,” he said. “I was working a lot with students on the autism spectrum at the time, and the device seemed to make sense for many of them. It was visual and they were able to control the device by interacting directly with the screen. It was also very engaging and new content was being developed everyday.”\nThe iPad’s arrival was a watershed moment of sorts for Coppin, as it exponentially built upon what was on the iPod touch in a literally bigger way. “I saw this huge screen, a portable device, and an affordable device priced at a consumer level. I knew it would be a powerful solution for my students and clients,” he said. “This device replaced many expensive assistive technology solutions; it could be personalized and customized to meet the needs of my students and clients.”\nHe continued: “I have seen a huge shift in our field because of the iPad,” he said. “Tools and solutions that were once considered assistive technology or specialized solutions are now a part of the OS. That means they are available to everyone – they are free and powerful solutions that level the playing field.”\nAs for the iPad at his workplace, Coppin told me students at the university love the iPad for how engaging it is and how it invites them to explore their creativity. He told me “the beautiful thing” about the iPad is that every modification is transparent to anyone. Moreover, Coppin appreciates how the iPad’s flexibility allows him to individualize the device such that it caters to a student’s needs and desires, as well as being a great adherent to Universal Design for Learning best practices. He also noted how NDSU professors have embraced the iPad, saying the device allows for myriad ways of learning, networking, and staying engaged with the curriculum.\n“It gives me the opportunity to optimize their [students’] learning,” he said. “For a student who struggles with fine-motor control, I can make accommodations to compensate for their fine-motor limitations. For a student who cannot see, I can show them VoiceOver. For a struggling reader or non-reader, I can show them features where they can have the text read to them. For a student who cannot touch the screen, I can turn on Switch Control and give them independent access to the device.” Coppin said his personal favorite iPad features all involve accessibility, but his most favorite is Switch Control. He loves how it levels the field by allowing someone to play a game such as Angry Birds like anyone else. He said he knew several students with cerebral palsy who were “desperate” to try Switch Control on the iPad after struggling to use the device’s touchscreen. 
“If they could play Angry Birds, they could do practically anything everyone else could,” he said.\nCoppin was effusive overall in his praise of the iPad. He’s an ardent supporter of not only iPadOS as a platform, but also of how technology can unleash new ways of learning and connecting for students and teachers. “I love it when new accessibility features come out because, when I explore these features, I have a particular person in mind,” he said. “I know that that one feature can make a profound difference in that person’s life. One small feature can make so much difference.”\nOn the flip side, Coppin has no nits to pick with the iPad itself. What bothers him is how some educators perceive the iPad and how it’s utilized in some classrooms.\n“There needs to be some intentionality in its use. But some educators take it too far and they think that the iPad will solve everything,” he said. “They think that if you give a student an iPad, they will automatically learn. In the end, you still have to teach and be pedagogically sound. The iPad is just another educational tool but it can be a powerful tool in the hands of a good teacher.”\nCoincidentally, Coppin revealed his friendship with AssistiveWare’s Niemeijer. Coppin is yet another Proloquo2go devotee, telling me that he knew “it would completely revolutionize the industry” the first time he used the application.\nPondering the iPad’s Potential\n\nAs I wrote earlier, there was a slew of think pieces from various voices in the Apple community published in January as the iPad’s tenth birthday approached.16 One of those pieces was written by Stratechery’s Ben Thompson, who posited the tablet as a tragic figure because, phenomenal in myriad ways yet so hindered in others, it has never quite lived up to the stratospheric hype surrounding it over the past decade. He wrote, in part: “the tragedy of the iPad is not that it flopped; it is that it never did, and likely never will, reach that potential so clearly seen ten years ago.”\nHowever niche it may seem relative to the iPad’s overall standing, one could make a legitimate case that, in an accessibility context, the device has more than lived up to its potential. Consider the testimonies from people like Maksel and Niemeijer on using iPads as communication boards. This is a non-trivial, literally life-altering use case. Using the iPad for “real work” is not solely the domain of power users, the iPad aficionados who routinely push iPadOS to the limits by using complex shortcuts and by editing podcasts with Apple Pencil. There’s a tendency in the tech community at large to associate “real work” with producing in creative mediums like photography, music, art, and coding. Apple is guilty of falling into this trap, given their proclivity for showcasing such work in ads and at events. While there’s nothing inherently wrong with this approach, what is missed is the idea that “real work” needn’t be cerebral or showy to be meaningful. Real work is also about using iPads in speech therapy sessions and to teach foundational skills like colors. It’s not as flashy or nerdy, and doesn’t make many headlines, but it is nonetheless serious in its own right.17\nConsider also the iPad’s form factor. Much of the consternation from certain factions of the Apple community surrounding the iPad, particularly since the iPad Pro debuted in 2015, has stemmed from whether it can (or should) replace Mac laptops. 
It already has for many people, due to the versatile nature of the device and the software enhancements that have come over the last few iOS releases. But there’s another reason to prefer an iPad over a MacBook: as ever, it’s accessibility.\nI’ve literally called the iPad “the most accessible computer Apple has ever built.” High praise, but the declaration is justified in two key ways. First, iOS is simpler and more streamlined than macOS. For someone with certain cognitive impairments, for instance, the default one-app-at-a-time windowing model does a lot to reduce cognitive load and increase concentration. Second, someone who’s visually impaired shouldn’t have to chase the mouse pointer around the screen, nor be as precise with clicking smaller menu bar items and other UI buttons. Most importantly, because an iPad’s screen is more flexible than a laptop’s, people like me don’t necessarily have to press their noses against the glass in order to see well.\nNiemeijer echoed my thoughts on the advantages of the iPad’s form factor.\n“Thanks to the wide range of accessibility features Apple built into iOS, they have achieved a true universal design. iPads are truly for everyone and people with disabilities can use an iPad for much the same things as everyone else,” he said. “If you think about it, an all-glass user interface is not the first thing you think of when providing access to someone who cannot see the screen or cannot touch it – yet if you cannot see or cannot use your hands, you can still use an iPad thanks to all the built-in accessibility features.”\nLike all Apple products, Macs are fully accessible and have been for several years. Nevertheless, the ergonomic differences between iPads and MacBooks are very real; that an iPad can be so much better in this regard is again a testament to Apple hitting the bullseye when it conjured up the original product’s hardware design. Put another way, pitting one against the other isn’t always about lazy tropes regarding real work, or about software preferences and limitations. Sometimes, as is the case here, one can be clearly better due to a multitude of factors that include accessibility. It doesn’t mean either is categorically worse or that we dislike what we don’t use. It means, for many, “better” is judged by more than an arbitrary getting-things-done metric.\nAlthough Thompson makes a compelling argument that the iPad has missed its potential, it’s equally compelling to counter his assertion by highlighting accessibility’s role. In this sense, the iPad has clearly been triumphant. As a tool for assistive technology, it has reached self-actualization and then some.\nPutting the iPad in Perspective\nTo appreciate the breakthroughs the iPad has afforded in areas like special education and communication, one must take a holistic view of history. The iPad is not perfect; it can (and will) always improve. As time marches on, so too will technology. For accessibility, however, it comes closest to perfection, to fully realizing Apple’s ambitions with the device. This is not insignificant and cannot be overstated; accessibility has been just as instrumental in shaping the iPad’s journey as the much-ballyhooed Mac-like productivity capabilities. This aspect of the tablet’s story is equally deserving of recognition.\nThe iPad has been so successful for accessibility, and for the mainstream user too, because it’s “just a big iPhone.” This former mockery is myopic; it’s unwarranted and mistakenly conflates elaboration with progress. 
It’s an interesting dichotomy – Apple clearly has evolved the device, as they should, but I think it’s easy to get caught up in all the hustle and bustle. People don’t stop and smell the roses, so to speak, and appreciate the simpler parts of life. Although watching iPadOS mature with wonderful tools is great in the grand scheme, I think people sometimes forget about the iPad’s soul, the elemental parts of it that make it so beloved. This hearkens back to the founding design: it was like the iPhone. There’s something to be said for its simplicity enduring amidst the incessant want for more.\nTo me, the iPad is neither an abject failure nor a victim of some tragedy. The iPad has had a fruitful life, period. Judging whether something is good or bad, technologically speaking, can be subjective. This is especially true for accessibility. Being a big iPhone can indeed be a good thing; it just depends on who’s using it. A grid of icons and a single-app paradigm are a better fit for a variety of accessibility-oriented situations. This needs to be celebrated more often.\nGiven how complex the makeover has been, that the iPad’s defining characteristic – where the app is the screen and the screen is the app – persists so prominently is no small feat. The iPad’s essential simplicity remains endearing to legions of accessibility-minded people, folks like Caviggia and Coppin. Illustrating how the iPad works well as “just a big iPhone” is profound; it offers a powerful perspective on a notion that has plagued the iPad since its earliest days with Jobs in that chair.\nThere’s an adage popular with those in special education circles, which is that people who work in special education are special. That’s undoubtedly true. I can’t help but think there’s a charming synergy between the people who work in the field and technological marvels like the iPad, which empower them to change the lives of children and their families. As Herrlinger and Wilson have said to me on multiple occasions, that’s Apple’s entire modus operandi in a nutshell.\nThe Bottom Line\nIf this story were one of Aesop’s Fables, the moral probably would be that the measure of the iPad’s success does not hinge solely on loads of multi-modal, multi-windowed, whiz-bang productivity features. That they’re optional is immaterial to the broader point here because, in the end, they tell only one part of the story. The contributions the iPad has made to furthering accessibility matter too, which is why it’s important to point out that being just a big iPhone is okay, enlightening even.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\n\n\nGiven the release of the updated iPad Pro and last year’s introduction of iPadOS, it feels safe to say the days of lackadaisical iPad upgrade cycles are officially over. ↩︎\n\n\nMost companies would kill for a business like the iPad. It made $6 billion in revenue during the last holiday quarter, according to Jason Snell of Six Colors. ↩︎\n\n\nThe issues surrounding the intricacies of the iPadOS multitasking system go beyond the scope of this article. ↩︎\n\n\nAccessible both ways: in the availability sense and in the disability sense. ↩︎\n\n\nThis is on top of the fact that, as a person with disabilities, I received special education services throughout my academic career growing up. 
↩︎\n\n\nExpressive language is saying, “My iPad is on the table.” Receptive language is asking your friend at the table to “give me the iPad” and have them do it. ↩︎\n\n\nThe piece ran a few months before I quit my job in order to try my hand at being a full-time writer. ↩︎\n\n\nIn other words, avoiding the temptation to use the thing as a glorified, really expensive pacifier. ↩︎\n\n\nThis feature takes its cue from Guided Access, first introduced – on stage, by Scott Forstall at WWDC 2012 eight years ago – in iOS 6. ↩︎\n\n\nAnd of course, in clinical settings outside of education, like private practice. ↩︎\n\n\nAlthough these iPads are technically considered obsolete, the fact that they’re still put to good use means obsolescence is relative. They may not support the latest OS, but the OS they do run is perfectly fine for what’s needed. ↩︎\n\n\nGroup sessions can occur when the therapist comes into a class to work with the entire class, or when she/he takes two or more out of the classroom for therapy. To push in describes the former, whereas pull out describes the latter. ↩︎\n\n\nCore vocabulary words include no, mine, go, and done, amongst others. ↩︎\n\n\nSchools have criminally low budgets, which is why you hear tales about teachers taking money from their own pocket to buy supplies. Unless it’s a grant or some other pot of gold, big-ticket items like new iPads aren’t high on the priority list. ↩︎\n\n\nThey’ve been part of the Mac development scene since 2002. ↩︎\n\n\nIf you want to be pedantic, Apple announced the first iPad on January 27. But it wasn’t officially put on sale until April 3. ↩︎\n\n\nThis is why accessibility coverage in tech journalism matters so much. Not only does it improve diversity and inclusion by amplifying the voices of an underrepresented and marginalized group, it amplifies unique and deserving perspectives on the tools so indispensable to humanity. They can be a welcome reprieve to the seemingly-endless stream of dystopian Facebook horror stories. ↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-03-30T12:00:02-04:00", "date_modified": "2020-03-30T13:55:41-04:00", "authors": [ { "name": "Steven Aquino", "url": "https://www.macstories.net/author/stevenaquino/", "avatar": "https://secure.gravatar.com/avatar/c9f2417ddd309914c3883fca633738b8?s=512&d=mm&r=g" } ], "tags": [ "accessibility", "iPad", "iPad at 10", "stories" ] }, { "id": "https://www.macstories.net/?p=62779", "url": "https://www.macstories.net/stories/the-ipad-at-10-emerging-from-the-shadow-of-the-iphone/", "title": "The iPad at 10: Emerging from the Shadow of the iPhone", "content_html": "
\"\"

\n

The trouble with looking back over a long period is that time has a way of compressing history. The clarity of hindsight makes it easy to look back at almost anything and be disappointed in some way with how it turned out years later.

\n

We certainly saw that with the anniversary of the introduction of the iPad. Considered in isolation a decade later, it’s easy to find shortcomings with the iPad. However, the endpoints of the iPad’s timeline don’t tell the full story.

\n

It’s not that the device is short on ways it could be improved; of course, it isn’t. However, the path of the iPad over the past decade isn’t a straight line from point A to point B. The iPad’s course has been influenced by countless decisions along the way bearing consequences that were good, bad, and sometimes unintended.

\n

The 10th anniversary of the iPad isn’t a destination, it’s just an arbitrary point from which to take stock of where things have been and consider where they are going. To do that, it’s instructive to look at more than the endpoints of the iPad’s history and consider what has happened in between. Viewed from that perspective, the state of the iPad ten years later, while at times frustrating, also holds reason for optimism. No single product in Apple’s lineup has more room to grow or potential to change the computing landscape than the iPad does today.

\n

\n

Just a Big iPhone

\n
\"The

The original iPad.

\n

No product is launched into a vacuum, and the origin of the iPad is no different. It’s influenced by what came before it. In the ten years since April 3, 2010, that context has profoundly affected the iPad’s adoption and trajectory.

\n

In hindsight, it seems obvious that the iPad would follow in the footsteps of the iPhone’s design, but that was far from clear in 2010. Take a look at the gallery of mockups leading up to the iPad’s introduction that I collected for a story in January. The designs were all over the place. More than two years after the iPhone’s launch, some people still expected the iPad to be a Mac tablet based on OS X.

\n
\"An

An early Apple tablet mockup (Source: digitaltrends.com).

\n

However, after Apple had spent over two years acclimating consumers to a touch-based UI and an app-centric approach to computing, building on the iPhone’s success was only natural. Of course, that led some commentators to criticize the iPad as ‘just a big iPhone/iPod touch,’ but that missed one of the most interesting and overlooked parts of the iPad’s introduction. Sure, Apple leaned into the similarities between the iPhone and the iPad, but even that initial introduction hinted at uses for the iPad that went far beyond what the iPhone could do.

\n

Apple walked a careful line during that introductory keynote in January 2010. Steve Jobs’ presentation and most of the demos showed off familiar interactions while explaining how the iPad was better for certain activities than the iPhone or the Mac. That went hand-in-hand with UI changes on the iPad that let apps spread out, flattening view hierarchies. The company didn’t stop there, though.

\n
\"\"

\n

Apple also provided a glimpse of the future. Toward the end of the iPad’s debut keynote, Phil Schiller came out for a demo of iWork. It was a final unexpected twist. Up to that point, the demos had focused primarily on consumption use cases, including web and photo browsing, reading in iBooks, and watching videos.

\n

The iWork demo was a stark departure from the rest of the presentation. Schiller took the audience on a tour of Pages, Numbers, and Keynote. They weren’t feature-for-feature copies of their Mac counterparts. Instead, each app featured novel, touch-first adaptations and innovative design elements like a custom software keyboard for entering formulas in Numbers. Apple also introduced a somewhat odd combination keyboard and docking station accessory for typing on the iPad in portrait mode.

\n

What stands out most about this last segment of the iPad keynote isn’t that the iPad could be used as a productivity tool, but the contrast between how little time was spent on it and how advanced the iWork suite was from day one. As I explained in January, I suspect that to a degree, Apple was hedging its bets. Perhaps, like the original Apple Watch, the company knew it had built something special, but it wasn’t sure which of the device’s uses would stick with consumers.

\n

The iPhone was still a young product in 2010. It was a hit, but consumers were still new to its interaction model. The App Store was relatively new, too, and filled with apps, many of which would be considered simple by today’s standards.

\n
\n

By tying the iPad closely to the iPhone, Apple provided users with a clear path to learning how to use its new tablet. Spotlighting familiar uses from the iPhone and demonstrating how they were better on the iPad tapped into a built-in market for the company’s new device. But the more I look at that first iPad keynote in the context of the past decade, the more I’m convinced that there was something else happening during that last segment.

\n

Apple knew it was on to something special that went beyond finding better ways to do the same things you could already do on an iPhone. If that was the extent of the iPad’s potential, the company wouldn’t have invested the considerable time it must have taken to adapt the iWork suite to the iPad, and later update the Mac versions to align better with their iOS counterparts. iWork was a harbinger of the more complex, fully-featured pro apps on the iPad that have only recently begun to find a foothold on the platform.

\n

However, there’s no doubt that it was the emphasis on consumption apps that set the stage for the iPad’s early years.

\n

The First Five Years

\n

Initial sales of the iPad were strong. The first day it was available, Apple sold 300,000 iPads. After four weeks, that number climbed to one million. Adoption was faster than the iPhone’s. In fact, it was faster than any non-phone consumer electronics product in history, surpassing the DVD player.

\n

The sales results were a vindication of the iPad’s launch formula. So when it was time to roll out the iPad 2, Apple sought to capture lightning in a bottle again by using the same playbook, right down to the black leather chair used for onstage demos in the first keynote.

\n
\"A

A GarageBand for iPad billboard.

\n

The structure of the iPad 2 presentation was the same too, closing with a demo of new, sophisticated apps from Apple that was reminiscent of Schiller’s iWork demo. This time, the company showed off iMovie and GarageBand. iWork had been impressive the year before, but something about the creative possibilities of making movies and music on a device you held in your hands recaptured the excitement and promise the iLife suite had ignited on the Mac years before. In 2010, Apple showed that you could get work done on an iPad, but in 2011, the company showed customers they could make art.

\n
\n

In the wake of the iPad 2 keynote, the iPad seemed poised to jump-start a computing revolution, but instead, Apple’s iPad efforts appeared to stall. Steve Jobs’ passing in late 2011 may have been a contributing factor, but paradoxically, it was Apple’s successful strategy of tying the iPad so closely to the iPhone that also held it back.

\n

iPad sales continued to climb in 2011 and 2012. Apple responded by rapidly iterating on the device and expanding the lineup. In 2012 alone, Apple released the third and fourth generation iPads, plus the iPad mini. By the time the ultra-thin iPad Air debuted in late 2013, though, things had begun to take a turn for the worse.

\n
\"The

The original iPad Air.

\n

After those first two keynotes, Apple’s push into productivity and creativity apps on the iPad slowed. At the same time, iPads were built to last. Users didn’t feel the same sense of urgency to replace them every couple of years as they did with iPhones. Couple that with the fact that few apps were pushing the hardware, and sales began to decline.

\n

The picture wasn’t completely bleak, though. Third parties took up the cause and pushed the capabilities of the platform with apps like Procreate, Affinity Photo, Editorial, and MindNode, and for some uses, the iPad was a worthy substitute for a Mac. However, with the iPad still moving in lockstep with the iPhone, developers and users began to run up against limitations that made advancing the iPad as a productivity and creativity platform difficult.

\n\n

Chief among the stumbling blocks were the lack of file system access and multitasking. The app-centric nature of iOS worked well for the iPhone, but as apps and user workflows became more sophisticated, the inflexibility of tying files to particular apps held the iPad back. The apps-first model was simple, but processing a file in multiple apps meant leaving a trail of partially completed copies in the wake of a project, wasting space, and risking confusion about which file was the latest version.

\n

The lack of multitasking became a speed bump in the iPad’s path too. The larger screen of the device made multitasking a natural fit. However, even though the hardware practically begged to run multiple apps onscreen, the OS wasn’t built for it, and when it finally was, the first iterations were limited in functionality.

\n

We don’t know what the internal dynamic was at Apple during this time, but the result was that, despite early optimism fueled by the iWork apps, GarageBand, and iMovie, the iPad failed to evolve into a widely-used device for more sophisticated productivity and creative work. It was still a capable device, and increasing numbers of users were turning to it as their primary computer. However, that only added to the building frustration of users who sensed the platform’s progress leveling off just as they were pushing it harder.

\n

Pro Hardware

\n

As the first five years of the iPad concluded, sales continued to decline. However, 2015 marked the first signs of a new direction for the iPad with the introduction of the first-generation iPad Pro, a product that began to push the iPad into new territory.

\n
\"The

The original iPad Pro.

\n

The original iPad Pro was a significant departure from prior iPads. Not only was it Apple’s most powerful iPad, but it introduced the Apple Pencil and Smart Keyboard cover.

\n

The Pencil added a level of precision and control that wasn’t possible using your finger or a third-party stylus to draw or take notes. With tilt and pressure sensitivity and low latency, the Pencil opened up significantly more sophisticated productivity and creative uses. The Smart Keyboard did the same for productivity apps, turning the iPad into a typing-focused workstation.

\n
\"The

The Apple Pencil.

\n

The hardware capabilities of the iPad Pro have continued to advance steadily, outpacing desktop PCs in many cases. However, sophisticated hardware alone wasn’t enough to propel the iPad forward. It was certainly a necessary first step, but Apple still had to convince developers to build more complex pro apps that made use of the iPad’s capabilities.

\n

Pro Apps and Business Models

\n

As the first decade of the iPad comes to a close, Apple continues to grapple with fostering a healthy ecosystem of pro-level apps. For most of the past decade, the iPad app ecosystem has been treated the same as the iPhone’s. The company, however, seems to have recognized the differences and has made adjustments that have substantially improved the situation.

\n
\"As

As of the last time Apple reported unit sales, iPad sales were roughly double those of the Mac, but far smaller than the iPhone.

\n

The challenge of the iPad market is that it’s much smaller than the iPhone market. At the same time, pro-level apps are inherently more complex, requiring a substantially greater investment than many iPhone apps. That makes embarking on these kinds of iPad-focused pro app projects far riskier for developers.

\n

However, Apple’s recent moves have helped. The first step was to implement subscriptions. Although they aren’t limited to the iPad, subscription business models have provided developers with the recurring revenue necessary to maintain sophisticated apps long term. Subscriptions haven’t been popular with some consumers, especially when a traditionally paid-up-front app transitions to a subscription. Still, the developers I’ve spoken with who have made it through the transition are doing better than before.

\n
\"Mac

Mac Catalyst’s introduction.

\n

The second and less obvious move Apple made was the introduction of Mac Catalyst apps, which allow iPad developers to bring their apps to macOS. To be sure, Mac Catalyst is designed to breathe new life into the stagnant Mac App Store, but it also benefits the iPad. As I wrote last summer, Mac Catalyst aligns the iPad more closely with the Mac than ever before. By facilitating simultaneous development for both platforms, Mac Catalyst significantly increases the potential market of users for those apps and encourages the use of the latest iPadOS technologies, both of which promise to advance the platform.
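For readers curious what ‘simultaneous development for both platforms’ looks like in practice, here is a minimal, hypothetical Swift sketch. The scene delegate and view controller names are invented for illustration; the only real pieces are UIKit itself and the targetEnvironment(macCatalyst) compilation condition, which lets a shared iPad codebase branch only where the Mac build needs different behavior.

```swift
import UIKit

// One scene delegate shared by the iPad app and its Mac Catalyst build.
final class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }

        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UINavigationController(rootViewController: ReaderViewController())
        window.makeKeyAndVisible()
        self.window = window

        #if targetEnvironment(macCatalyst)
        // Mac-only refinement: hide the title bar text so the window feels more at home on macOS.
        windowScene.titlebar?.titleVisibility = .hidden
        #endif
    }
}

// A placeholder view controller standing in for an app's real UI.
final class ReaderViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        title = "Reader"

        #if !targetEnvironment(macCatalyst)
        // iPad-only behavior: large navigation titles suit a touch-first layout.
        navigationController?.navigationBar.prefersLargeTitles = true
        #endif
    }
}
```

Everything outside the two conditional blocks compiles and runs unchanged on both platforms, which is the economic point: most of the work a developer puts into an iPad app carries straight over to the Mac version.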

\n
\"The

The keyboard shortcuts in Twitter’s Mac Catalyst app migrated back to its iPad app.

\n

A good example of this in action is Twitter’s Mac and iPad apps. Twitter’s iPad app languished for years; however, last fall Twitter launched a Mac Catalyst version of its app. Although the app had some rough edges, it included features you’d expect on the Mac like extensive keyboard shortcuts. Less than three weeks later, the iPad app was updated with extensive keyboard support too.

\n\n

The third software piece of the puzzle Apple has addressed is iPadOS. Going into WWDC 2019, significant iPad-specific updates to iOS were anticipated, but no one saw a separate OS coming. Currently, the overlap between iPadOS and iOS is extensive, but the introduction of iPadOS sent a message to developers that Apple was ready to start differentiating the feature sets of its mobile OSes to suit the unique characteristics of each platform. Over time, I expect we’ll see even more differences between the platforms emerge, such as the cursor support added in iPadOS 13.4.
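As a small illustration of the kind of platform-specific feature this differentiation produces, the sketch below uses UIPointerInteraction, the UIKit API that arrived alongside cursor support in iPadOS 13.4. The controller and button names are hypothetical; the interaction, delegate method, and pointer style are the real API.

```swift
import UIKit

// A hypothetical view controller whose button adapts the iPadOS 13.4 pointer on hover.
final class PointerDemoViewController: UIViewController, UIPointerInteractionDelegate {
    private let newDocumentButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        newDocumentButton.setTitle("New Document", for: .normal)
        newDocumentButton.frame = CGRect(x: 40, y: 80, width: 180, height: 44)
        view.addSubview(newDocumentButton)

        // Attach a pointer interaction so the cursor morphs when it hovers over the button.
        newDocumentButton.addInteraction(UIPointerInteraction(delegate: self))
    }

    // UIKit asks how the pointer should look while it is over this view.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        guard let hoveredView = interaction.view else { return nil }
        // The lift effect scales the button up slightly as the pointer disappears into it.
        return UIPointerStyle(effect: .lift(UITargetedPreview(view: hoveredView)))
    }
}
```

Pointer interactions only take effect on an iPad with a trackpad or mouse attached, which makes this exactly the sort of divergence between Apple’s two mobile OSes the paragraph above anticipates.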

\n

It’s still too early to judge whether the software-side changes to the iPad will succeed in encouraging widespread adoption of the latest hardware capabilities. However, early signs are promising, such as Adobe’s commitment to the platform with Photoshop and the announced addition of Illustrator. My hope is that with time, Apple’s substantial investment in both hardware and software will create a healthy cycle of hardware enabling more sophisticated apps and apps pushing hardware advancements.

\n

The trajectory of the iPad is one of the most interesting stories of modern Apple. Taking off as strongly as the device did in the aftermath of the iPhone made it look, for a time, like Apple had repeated the unbounded success of the iPhone.

\n

Although the company managed to jump-start sales of the iPad by tying it to the iPhone, it became clear after a couple of years that the iPad wasn’t another iPhone. That doesn’t make the iPad a failure. It’s still the dominant tablet on the market, but it hasn’t transformed society and culture to the same degree as the iPhone.

\n

With the introduction of advanced hardware, new accessories, iPadOS, Mac Catalyst, and new business models, the iPad has begun striking out on its own, emerging from the shadow of the iPhone. That’s been a long time coming. Hitching the iPad to the iPhone’s star was an undeniably successful strategy, but Apple played out that thread too long.

\n

Fortunately, the company has begun to correct the iPad’s course, and I couldn’t be more excited. The hardware is fast and reliable, and soon, we’ll have an all-new, more robust keyboard that can double as a way to charge the device. Perhaps most intriguing, though, is the introduction of trackpad and mouse support in iPadOS 13.4. Although it was rumored, most people didn’t expect to see trackpad support until iPadOS 14. I’m hoping that means there is lots more to show off in the next major release of iPadOS.

\n

Ten years in, the iPad is still a relatively young device with plenty of room to improve. Especially when you consider that the iPad Pro with its Apple Pencil and Smart Keyboard is only five years old, there is lots of room for optimism for the future. The iPad may have been pitched as a big iPhone at first, but its future as a modular computing platform that is neither a mobile phone nor traditional computer is bright.

\n

You can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "The trouble with looking back over a long period is that time has a way of compressing history. The clarity of hindsight makes it easy to look back at almost anything and be disappointed in some way with how it turned out years later.\nWe certainly saw that with the anniversary of the introduction of the iPad. Considered in isolation a decade later, it’s easy to find shortcomings with the iPad. However, the endpoints of the iPad’s timeline don’t tell the full story.\nIt’s not that the device is short on ways it could be improved; of course, it isn’t. However, the path of the iPad over the past decade isn’t a straight line from point A to point B. The iPad’s course has been influenced by countless decisions along the way bearing consequences that were good, bad, and sometimes unintended.\nSupported By\nConcepts\n\n\nConcepts: Where ideas take shape\nThe 10th anniversary of the iPad isn’t a destination, it’s just an arbitrary point from which to take stock of where things have been and consider where they are going. To do that, it’s instructive to look at more than the endpoints of the iPad’s history and consider what has happened in between. Viewed from that perspective, the state of the iPad ten years later, while at times frustrating, also holds reason for optimism. No single product in Apple’s lineup has more room to grow or potential to change the computing landscape than the iPad does today.\n\nJust a Big iPhone\nThe original iPad.\nNo product is launched into a vacuum, and the origin of the iPad is no different. It’s influenced by what came before it. In the ten years since April 3, 2010, that context has profoundly affected the iPad’s adoption and trajectory.\nIn hindsight, it seems obvious that the iPad would follow in the footsteps of the iPhone’s design, but that was far from clear in 2010. Take a look at the gallery of mockups leading up to the iPad’s introduction that I collected for a story in January. The designs were all over the place. More than two years after the iPhone’s launch, some people still expected the iPad to be a Mac tablet based on OS X.\nAn early Apple tablet mockup (Source: digitaltrends.com).\nHowever, after spending over two years acclimating consumers to a touch-based UI and app-centric approach to computing, building on the iPhone’s success was only natural. Of course, that led to criticisms that the iPad was ‘just a big iPhone/iPod touch’ from some commentators, but that missed one of the most interesting and overlooked part of the iPad’s introduction. Sure, Apple leaned into the similarities between the iPhone and the iPad, but even that initial introduction hinted at uses for the iPad that went far beyond what the iPhone could do.\nApple walked a careful line during that introductory keynote in January 2010. Steve Jobs’ presentation and most of the demos were a careful mix of showing off familiar interactions while simultaneously explaining how the iPad was better for certain activities than the iPhone or the Mac. That went hand-in-hand with UI changes on the iPad that let apps spread out, flattening view hierarchies. The company didn’t stop there, though.\n\nApple also provided a glimpse of the future. Toward the end of the iPad’s debut keynote, Phil Schiller came out for a demo of iWork. It was a final unexpected twist. 
Up to this point, the demos had focused primarily on consumption use cases, including web and photo browsing, reading in iBooks, and watching videos.\nThe iWork demo was a stark departure from the rest of the presentation. Schiller took the audience on a tour of Pages, Numbers, and Keynote. They weren’t feature-for-feature copies of their Mac counterparts. Instead, each app included novel, touch-first adaptations that included innovative design elements like a custom software keyboard for entering formulas in Numbers. Apple also introduced a somewhat odd combination keyboard and docking station accessory for typing on the iPad in portrait mode.\nWhat stands out most about this last segment of the iPad keynote isn’t that the iPad could be used as a productivity tool, but the contrast between how little time was spent on it compared to how advanced the iWork suite was from day one. As I explained in January, I suspect that to a degree, Apple was hedging its bets. Perhaps, like the original Apple Watch, the company knew it had built something special, but it wasn’t sure which of the device’s uses would stick with consumers.\nThe iPhone was still a young product in 2010. It was a hit, but consumers were still new to its interaction model. The App Store was relatively new, too, and filled with apps, many of which would be considered simple by today’s standards.\n\nBy tying the iPad closely to the iPhone, Apple provided users with a clear path to learning how to use its new tablet. Spotlighting familiar uses from the iPhone and demonstrating how they were better on the iPad tapped into a built-in market for the company’s new device. But the more I look at that first iPad keynote in the context of the past decade, the more I’m convinced that there was something else happening during that last segment.\nApple knew it was on to something special that went beyond finding better ways to do the same things you could already do on an iPhone. If that was the extent of the iPad’s potential, the company wouldn’t have invested the considerable time it must have taken to adapt the iWork suite to the iPad, and later update the Mac versions to align better with their iOS counterparts. iWork was a harbinger of the more complex, fully-featured pro apps on the iPad that have only recently begun to find a foothold on the platform.\nHowever, there’s no doubt that it was the emphasis on consumption apps that set the stage for the iPad’s early years.\nThe First Five Years\nInitial sales of the iPad were strong. The first day it was available, Apple sold 300,000 iPads. After four weeks, that number climbed to one million. Adoption was faster than the iPhone. In fact, it was faster than any non-phone consumer electronic product in history, surpassing the DVD player.\nThe sales results were a vindication of the iPad’s launch formula. So when it was time to roll out the iPad 2, Apple sought to capture lightning in a bottle again by using the same playbook, right down to the black leather chair used for onstage demos in the first keynote.\nA GarageBand for iPad billboard.\nThe structure of the iPad 2 presentation was the same too, closing with a demo of new, sophisticated apps from Apple that was reminiscent of Schiller’s iWork demo. This time, the company showed off iMovie and GarageBand. 
iWork had been impressive the year before, but something about the creative possibilities of making movies and music on a device you held in your hands recaptured the excitement and promise the iLife suite had ignited on the Mac years before. In 2010, Apple showed that you could get work done on an iPad, but in 2011, the company showed customers they could make art.\n\nIn the wake of the iPad 2 keynote, the iPad seemed poised to jump-start a computing revolution, but instead, Apple’s iPad efforts appeared to stall. Steve Jobs’ passing in late 2011 may have been a contributing factor, but paradoxically, it was Apple’s successful strategy of tying the iPad so closely to the iPhone that also held it back.\niPad sales continued to climb in 2011 and 2012. Apple responded by rapidly iterating on the device and expanding the lineup. In 2012 alone, Apple released the third and fourth generation iPads, plus the iPad mini. By the time the ultra-thin iPad Air debuted in late 2013, though, things had begun to take a turn for the worse.\nThe original iPad Air.\nAfter those first two keynotes, Apple’s push into productivity and creativity apps on the iPad slowed. At the same time, iPads were built to last. Users didn’t feel the same sense of urgency to replace them every couple years as they did with iPhones. Coupled with the fact that few apps were pushing the hardware, sales began to decline.\nThe picture wasn’t completely bleak, though. Third parties took up the cause and pushed the capabilities of the platform with apps like Procreate, Affinity Photo, Editorial, and MindNode, and for some uses, the iPad was a worthy substitute for a Mac. However, with the iPad still moving in lockstep with the iPhone, developers and users began to run up against limitations that made advancing the iPad as a productivity and creativity platform difficult.\n\nChief among the stumbling blocks were the lack of file system access and multitasking. The app-centric nature of iOS worked well for the iPhone, but as apps and user workflows became more sophisticated, the inflexibility of tying files to particular apps held the iPad back. The apps-first model was simple, but processing a file in multiple apps meant leaving a trail of partially completed copies in the wake of a project, wasting space, and risking confusion about which file was the latest version.\nThe lack of multitasking became a speed bump in the iPad’s path too. The larger screen of the device made multitasking a natural fit. However, even though the hardware practically begged to run multiple apps onscreen, the OS wasn’t built for it, and when it finally was, the first iterations were limited in functionality.\nWe don’t know what the internal dynamic was at Apple during this time, but the result was that, despite early optimism fueled by the iWork apps, GarageBand, and iMovie, the iPad failed to evolve into a widely-used device for more sophisticated productivity and creative work. It was still a capable device, and increasing numbers of users were turning to it as their primary computer. However, that only added to the building frustration of users who sensed the platform’s progress leveling off just as they were pushing it harder.\nPro Hardware\nAs the first five years of the iPad concluded, sales continued to decline. 
However, 2015 marked the first signs of a new direction for the iPad with the introduction of the first-generation iPad Pro, a product that began to push the iPad into new territory.\nThe original iPad Pro.\nThe original iPad Pro was a significant departure from prior iPads. Not only was it Apple’s most powerful iPad, but it introduced the Apple Pencil and Smart Keyboard cover.\nThe Pencil added a level of precision and control that wasn’t possible using your finger or a third-party stylus to draw or take notes. With tilt and pressure sensitivity and low latency, the Pencil opened up significantly more sophisticated productivity and creative uses. The Smart Keyboard did the same for productivity apps, turning the iPad into a typing-focused workstation.\nThe Apple Pencil.\nThe hardware capabilities of the iPad Pro have continued to advance steadily, outpacing desktop PCs in many cases. However, sophisticated hardware alone wasn’t enough to propel the iPad forward. It was certainly a necessary first stop, but Apple still had to convince developers to build more complex pro apps that made use of the iPad’s capabilities.\nPro Apps and Business Models\nAs the first decade of the iPad comes to a close, Apple continues to grapple with fostering a healthy ecosystem of pro-level apps. For most of the past decade, the iPad app ecosystem has been treated the same as the iPhone’s. The company, however, seems to have recognized the differences and has made adjustments that have substantially improved the situation.\nAs of the last time Apple reported unit sales, iPad sales were roughly double those of the Mac, but far smaller than the iPhone.\nThe challenge of the iPad market is that it’s much smaller than the iPhone market. At the same time, pro-level apps are inherently more complex, requiring a substantially greater investment than many iPhone apps. That makes embarking on these kinds of iPad-focused pro app projects far riskier for developers.\nHowever, Apple’s recent moves have significantly improved the situation. The first step was to implement subscriptions. Although they aren’t limited to the iPad, subscription business models have provided developers with the recurring revenue necessary to maintain sophisticated apps long term. Subscriptions haven’t been popular with some consumers, especially when a traditionally paid up front app transitions to a subscription. Still, the developers I’ve spoken with who have made it through the transition are doing better than before.\nMac Catalyst’s introduction.\nThe second and less obvious move Apple made was the introduction of Mac Catalyst apps, which allow iPad developers to bring their apps to macOS. To be sure, Mac Catalyst is designed to breathe new life into the stagnant Mac App Store, but it also benefits the iPad. As I wrote last summer, Mac Catalyst aligns the iPad more closely with the Mac than ever before. By facilitating simultaneous development for both platforms, Mac Catalyst significantly increases the potential market of users for those apps and encourages the use of the latest iPadOS technologies, both of which promise to advance the platform.\nThe keyboard shortcuts in Twitter’s Mac Catalyst app migrated back to its iPad app.\nA good example of this in action is Twitter’s Mac and iPad apps. Twitter’s iPad app languished for years; however, last fall Twitter launched a Mac Catalyst version of its app. Although the app had some rough edges, it included features you’d expect on the Mac like extensive keyboard shortcuts. 
Less than three weeks later, the iPad app was updated with extensive keyboard support too.\n\nThe third software piece of the puzzle Apple has addressed is iPadOS. Going into WWDC 2019, significant iPad-specific updates to iOS were anticipated, but no one saw a separate OS coming. Currently, the overlap between iPadOS and iOS is extensive, but the introduction of iPadOS sent a message to developers that Apple was ready to start differentiating the feature sets of its mobile OSes to suit the unique characteristics of each platform. Over time, I expect we’ll see even more differences between the platforms emerge, such as the cursor support added to iPadOS with iPadOS 13.4.\nIt’s still too early to judge whether the software-side changes to the iPad will succeed in encouraging widespread adoption of the latest hardware capabilities. However, early signs are encouraging, such as Adobe’s commitment to the platform with Photoshop and the announced addition of Illustrator. My hope is that with time, Apple’s substantial investment in both hardware and software will create a healthy cycle of hardware enabling more sophisticated apps and apps pushing hardware advancements.\nThe trajectory of the iPad is one of the most interesting stories of modern Apple. Taking off as strongly as the device did in the aftermath of the iPhone made it look, for a time, like Apple had repeated the unbounded success of the iPhone.\nAlthough the company managed to jump-start sales of the iPad by tying it to the iPhone, it became clear after a couple of years that the iPad wasn’t another iPhone. That doesn’t make the iPad a failure. It’s still the dominant tablet on the market, but it hasn’t transformed society and culture to the same degree as the iPhone.\nWith the introduction of advanced hardware, new accessories, iPadOS, Mac Catalyst, and new business models, the iPad has begun striking out on its own, emerging from the shadow of the iPhone. That’s been a long time coming. Hitching the iPad to the iPhone’s star was an undeniably successful strategy, but Apple played out that thread too long.\nFortunately, the company has begun to correct the iPad’s course, and I couldn’t be more excited. The hardware is fast and reliable, and soon, we’ll have an all-new, more robust keyboard that can double as a way to charge the device. Perhaps most intriguing, though, is the introduction of trackpad and mouse support in iPadOS 13.4. Although it was rumored, most people didn’t expect to see trackpad support until iPadOS 14. I’m hoping that means there is lots more to show off in the next major release of iPadOS.\nTen years in, the iPad is still a relatively young device with plenty of room to improve. Especially when you consider that the iPad Pro with its Apple Pencil and Smart Keyboard is only five years old, there is lots of room for optimism for the future. 
The iPad may have been pitched as a big iPhone at first, but its future as a modular computing platform that is neither a mobile phone nor traditional computer is bright.\nYou can also follow all of our iPad at 10 coverage through our iPad at 10 hub, or subscribe to the dedicated iPad at 10 RSS feed.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-03-30T10:31:48-04:00", "date_modified": "2020-03-30T13:54:50-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "iPad at 10", "stories" ] }, { "id": "https://www.macstories.net/?p=62246", "url": "https://www.macstories.net/linked/reflections-from-the-ipads-original-development-team/", "title": "Reflections from the iPad\u2019s Original Development Team", "content_html": "

Ryan Houlihan at Input has published a new interview with two key members of the team that birthed the iPad 10 years ago. Married couple Imran Chaudhri and Bethany Bongiorno, former Director of Design for the HI team and Software Engineering Director, respectively, reflect widely on the development process behind Apple’s tablet. Two of the most interesting answers had to do with envisioning the future of the iPad, and regrets about its past.

\n

Chaudhri on the device’s future:

\n

\n I think it’ll be interesting for all of us to watch and see how Apple evolves the iPad. But, you know, I think one of the struggles that customers have with the iPad right now is really trying to figure out what role it plays in terms of a portable class computer. You have a traditional desktop computer or a traditional laptop computer — and where does the iPad fit in? You know, I would hope and I think they would continue to evolve it to a point where the iPad does end up doing a lot more that the Mac [currently] does and that the Mac redefines itself as more of a professional tool and the iPad defines itself as more of a mass consumer computing platform. I think that would be almost like a natural progression.\n

\n

Bongiorno on a regret:

\n

\n I would say one regret is that it became really hard after we shipped the iPad to continue to push it forward in the way that I think Imran and myself and others at the company really wanted to. The gravity of the phone was so big — and it still is so big, right? It makes it really hard.\n

\n

This week, as the iPad’s 10 years are celebrated, it’s a great time to reflect on where the device has come from and where it’s going. I use my iPad all day, every day for work, and love it. However, for the device to realize its fullest potential as a mass-market computer replacement, I think there’s still plenty of work for Apple to accomplish.

\n

\u2192 Source: inputmag.com

", "content_text": "Ryan Houlihan at Input has published a new interview with two key members of the team that birthed the iPad 10 years ago. Married couple Imran Chaudhri and Bethany Bongiorno, former Director of Design for the HI team and Software Engineering Director, respectively, reflect widely on the development process behind Apple’s tablet. Two of the most interesting answers had to do with envisioning the future of the iPad, and regrets about its past.\nChaudhri on the device’s future:\n\n I think it’ll be interesting for all of us to watch and see how Apple evolves the iPad. But, you know, I think one of the struggles that customers have with the iPad right now is really trying to figure out what role it plays in terms of a portable class computer. You have a traditional desktop computer or a traditional laptop computer — and where does the iPad fit in? You know, I would hope and I think they would continue to evolve it to a point where the iPad does end up doing a lot more that the Mac [currently] does and that the Mac redefines itself as more of a professional tool and the iPad defines itself as more of a mass consumer computing platform. I think that would be almost like a natural progression.\n\nBongiorno on a regret:\n\n I would say one regret is that it became really hard after we shipped the iPad to continue to push it forward in the way that I think Imran and myself and others at the company really wanted to. The gravity of the phone was so big — and it still is so big, right? It makes it really hard.\n\nThis week as the iPad’s 10 years are celebrated, it’s become a great time to reflect on where the device has come from and where it’s going. I use my iPad all day, every day for work, and love it. However, for the device to realize its fullest potential as a mass-market computer replacement, I think there’s still plenty of work yet for Apple to accomplish.\n\u2192 Source: inputmag.com", "date_published": "2020-01-28T08:15:57-05:00", "date_modified": "2020-01-28T08:15:57-05:00", "authors": [ { "name": "Ryan Christoffel", "url": "https://www.macstories.net/author/ryanchristoffel/", "avatar": "https://secure.gravatar.com/avatar/6f92854b21cbef25629d7efb809a9de7?s=512&d=mm&r=g" } ], "tags": [ "iPad", "iPad at 10", "Linked" ] }, { "id": "https://www.macstories.net/?p=62220", "url": "https://www.macstories.net/stories/the-ipad-at-10-a-new-product-category-defined-by-apps/", "title": "The iPad at 10: A New Product Category Defined by Apps", "content_html": "
\"\"

\n

When Steve Jobs strode onto the stage at the Yerba Buena Center on January 27, 2010, he carried with him the answers to years of speculation and rumors about an Apple tablet. Everyone at the event that day knew why they were there and what would be announced. Jobs acknowledged as much up front, saying that he had a ‘truly magical and revolutionary product’ to announce.

\n

Thanks to the iPhone, everyone at the Yerba Buena Center also had a vague notion of what Apple’s tablet would probably look like. Mockups and phony leaks were all over the web, and tablets weren’t new. Everyone expected a big slab of glass. Beyond that, though, few rumors were in agreement about what the tablet’s hardware specs would be.

\n
\"Source:

Source: The Verge.

\n

It was correctly assumed that Apple’s tablet would fit somewhere in between an iPhone and a Mac both physically and functionally, but where exactly was a mystery. That made the OS and the apps the stars of the keynote and critical to the way Apple’s tablet would be used and how it would be perceived for years to come.

\n

Before Steve Jobs revealed Apple’s new tablet to the world, though, he paused – as is still customary during most Apple keynotes – to set the stage and provide context, which is where I will start too. Ten years ago, the tech world was a very different place, and Apple was a very different company. Not only is it fun to remember what those days were like, but it helps explain the trajectory of the iPad in the decade that followed.

\n

\n

The Rumor Mill

\n

Rumors that Apple was working on a tablet device circulated for years before the iPad was released in 2010. According to an Engadget story published the day before the iPad was revealed, tablet rumors stretched back to at least the early 2000s.

\n

It wasn’t until the iPhone was released in 2007 that the rumors really picked up in earnest, though. At the time, a small army of bloggers covering Apple competed for scoops by combing through patent filings, domain registrations, and any other scrap they could get their hands on, looking for evidence of a tablet. It was the waning days of the ‘golden age’ of Apple rumors, before Apple ‘doubled down on secrecy.’ The same competition that fueled the rumor mill led to a cottage industry in device mockups that sometimes got passed off as ‘spy shots’ of real hardware.

\n

It was an environment that fed on itself, spawning crazy speculation. The rumors and mockups may seem like unimportant historical relics now, but they’re still instructive in understanding the expectations going into the iPad’s launch and a lot of fun to revisit. Here is a collection of some of my favorites:1

\n\n

Looking back at these mockups, what strikes me is how many imagined a tablet that would run OS X. Over and over, the mockups envisioned a windowed environment with a Mac-like UI. Even though the iPhone had been out for over two years, surprisingly few mockups approached the design with the iPhone as their starting point. Instead, it was assumed that a tablet with a screen closer to the size of a Mac would naturally inherit the Mac’s OS too. Surely a device with room for windows would run something more than just iPhone OS.

\n

The assumption that an Apple tablet would run OS X may be part of what drove rumors that it would carry a Mac-like price tag of around $1,000 too. It’s long been suspected that Apple planted the notion that the iPad would cost that much with a story that appeared in The Wall Street Journal just over three weeks before the iPad was announced. I’m not sure that’s the case, though.

\n

The Wall Street Journal was quoting an analyst report from a month earlier. Also, sifting through the many rumors leading up to the event, I found plenty of guesses of a price in the neighborhood of $1,000 that were made long before the announcement. Not all guesses were as high as The Wall Street Journal’s report, but none were as low as the $499 price point Apple announced either. Whatever the explanation, attendees that winter day in San Francisco expected an expensive slab of glass.

\n

Beyond the walls of Apple’s Cupertino headquarters, the tech trend that hung over the iPad event more than anything else was the rise of the netbook, an inexpensive laptop that relied primarily on web apps. In 2010, the US and much of the rest of the world was just emerging from a severe recession, and at an average price of $300-$500, netbooks were attractive to a lot of consumers.

\n

On an earnings call the year before the iPad was announced, Tim Cook, who was serving as acting CEO at the time, said in response to an analyst question about when Apple would have an answer to netbooks:

\n

\n We’ve got some ideas, but right now we think the products there are inferior and will not provide the experience to customers that they’re happy with.\n

\n

Jobs had delivered essentially the same message in the fall of 2008 and suggested that, for the time being, the iPhone was Apple’s answer to netbooks.

\n

Warming Up the Crowd

\n

Steve Jobs loped onto the stage on January 27th, 2010 with Apple’s answer to netbooks. However, he began the event by launching into one of his signature updates on Apple’s recent accomplishments. He touted the 250 millionth iPod sold, and mentioned the still relatively new iPhone 3GS, but most relevant to the day’s announcement was the 18-month-old App Store.

\n
\"After

After only 18 months, there were 140,000 apps on the App Store that customers had downloaded over three billion times.

\n

At the time, the App Store already offered 140,000 iPhone apps. That number seems tiny compared to today when there are over two million apps available, but even at that level, the Store had become a legitimate cultural phenomenon that had captured the imagination of people around the world, as demonstrated by the fact that a total of over three billion apps had been downloaded in just 18 months.

\n

As Jobs wrapped up his overview of Apple’s achievements, he acknowledged the incredible expectations and pressure on the company that day by referencing a recent story from The Wall Street Journal:

\n

\n Last time there was this much excitement about a tablet, it had some commandments written on it.\n

\n
\"\"

\n

Behind Jobs was a slide of Moses returning from Mount Sinai with the Ten Commandments. The joke landed well, simultaneously acknowledging the importance of the event for Apple and warming up the crowd. I only wish Jobs had used this image I ran across in my research instead:

\n
\"Source:

Source: macdailynews.com.

\n

With the crowd ready to start, Jobs launched into his pitch, asking rhetorically whether there was room for a device that sits between the mobile phone and laptop. Of course, the answer was ‘yes,’ but only if, as Jobs explained, the device was ‘far better at some key things.’ Those things were:
Web browsing
Email
Photos
Video
Music
Games
eBooks

\n
\"Jobs

Jobs explained that a third category of device had to be ‘far better at some key things.’

\n

Before revealing Apple’s answer to what was suited to these tasks, Jobs set up and knocked down the netbook, declaring that ‘the problem is, netbooks aren’t better at anything…they’re just cheap laptops.’

\n
\"'...and

‘…and we call it the iPad.’

\n

As the crowd was still chuckling over Jobs’ swipe at netbooks, he explained that Apple had an answer to the third category of device: the iPad. The device took over the screen behind Jobs as he walked over to a small table sitting next to a stylish black leather chair with a chrome frame.2 Underneath a cloth on the table was an iPad. Jobs picked the iPad up and proudly displayed it to the assembled crowd.

\n
\"\"

\n

Jobs paced the stage quickly, running through each of the categories of tasks he’d outlined previously with a short overview of the core iPad apps from Apple running on the familiar iPhone OS. With the preamble and reveal out of the way just 12 minutes into the one hour and 45-minute keynote, Jobs walked over to the chair waiting for him onstage, sat down, and unlocked his iPad.

\n

It was demo time.

\n

More Intimate Than a Laptop and More Capable Than a Smartphone

\n
\"The

The Internet ‘right in the palm of your hand.’

\n

The remainder of the keynote was dedicated almost exclusively to demos. Demos are a staple of Apple events, but the iPad’s introduction was unique for the amount of time that Jobs, Scott Forstall, and Phil Schiller spent walking the crowd through app after app. Jobs made the case that out of the box, an iPad justified the creation of a third category of device because it was better at the core experiences he had listed at the top of the presentation. Forstall’s job was to get developers fired up and show off early third-party apps. Schiller wrapped up with the new iWork suite for the iPad, which suggested what the future might hold for the device.

\n

When Jobs sat down, he immediately launched into a demo of Safari. Suddenly, the ‘Safari Pad’ rumors from 2007 made sense as Jobs repeatedly emphasized the power of holding the Internet in your hands as he browsed The New York Times3 and other websites. It was the perfect way to lead off the demos too, combining the intimacy of the iPhone’s touch UI with a big, bright display reminiscent of a laptop.

\n
\"Apps

Apps defined the iPad.

\n

As I rewatched Jobs make his way through apps like Safari, Mail, Videos, YouTube, and iPod, a couple of things struck me. One was that he rotated the iPad a lot. Jobs showed off reading an email message in portrait orientation then rotated the iPad to see his list of messages on the left with a preview on the right. He took similar approaches with Safari, Photos, the iTunes Store, and iBooks.

\n

The message was that the iPad could become whatever users wanted it to be through its apps. Jony Ive reinforced that in the event’s closing video, declaring that the iPad was defined by its single slab of multitouch glass and lack of an input device or prescribed orientation. Nobody rotates their iPad as much as was demonstrated onstage, but Apple was sending an unmistakable message that the iPad was designed to disappear beneath users’ fingertips the same way an iPhone does, but deliver the computing power to drive a big display.

\n
\"Scott

Scott Forstall in the chair.

\n

That all three presenters conducted their demos leaning back in that black leather chair was part of the message too. John Gruber put his finger on one aspect of the chair’s import following the introduction of the iPad 2 in 2011:

\n

\n The on-stage demos of the iPad aren’t conducted at a table or a lectern. They’re conducted sitting in an armchair. That conveys something about the feel of the iPad before its screen is even turned on. Comfortable, emotional, simple, elegant. How it feels is the entirety of the iPad’s appeal.\n

\n

It’s that unique and very personal feel that sets the iPad apart from a laptop, where users are separated from the apps they use by an intermediary pointing device.

\n

But there’s more to the chair than that. With the exception of email, the activities that Steve Jobs said the iPad had to be better at are largely content consumption activities. Like watching TV, those sorts of activities lend themselves to a lean-back experience: sitting in a comfortable chair to browse webpages and photos, watch videos, listen to music, play games, or read a book. The iPad is undeniably an excellent choice for those activities, but by focusing on them up front and largely to the exclusion of more creation-based tasks, Apple defined the iPad as a content consumption device, a reputation that stuck for years thereafter. Jobs’ comfortable black chair may have worked to convey the intimacy of interacting with an iPad, but it also held the iPad back.

\n

Leading up to the iPad keynote, there were as many fanciful mockups of an Apple tablet that looked like a Mac as there were ones that looked like an iPhone. With the emphasis on media consumption apps and the tight coupling of iPad and iPhone app development, those early notions of a more laptop-like productivity tablet were largely swept away. They weren’t completely abandoned, as Phil Schiller’s introduction of iWork for the iPad would demonstrate, but the emphasis was clear: in the space between the iPhone and Mac, the iPad sat closer to the iPhone’s end of the spectrum than the Mac’s.

\n
\"Scott

Scott Forstall was tasked with selling third-party developers on making apps for the iPad.

\n

That message was reinforced by Scott Forstall’s segment of the presentation. He bounded onto the stage full of excitement, clearly tasked with getting developers on board to make iPad apps in the short two months between the keynote and launch of the iPad. That was a tall order, especially since developers would have to start building apps without the hardware on which to test them.4

\n
\"Forstall

Forstall predicted an all-new app gold rush.

\n

In hindsight, it’s clear that Apple was concerned about whether developers would make apps for the iPad. Forstall laid it on thick in his presentation, promising a ‘second gold rush’ on the App Store driven by iPad apps, flashing a slide of a gold prospector on the screen behind him. That was followed by a parade of third-party developers who showed off games5 like N.O.V.A. and Need for Speed, and apps like Brushes, The New York Times,6 and MLB at Bat.

\n
\"Forstall

Forstall using Facebook in compatibility mode and pixel doubled.

\n

Apple added iPad-specific development tools as part of the iPhone SDK, but it also hedged its bets with compatibility mode, allowing apps to run at their native size surrounded by a sea of black or blown up to twice their normal size. The experience wasn’t great, but it filled the gap until iPad versions of popular apps were ready.7

\n
\"Jobs

Jobs unveiling iBooks.

\n

The final segment of the keynote was reserved for all-new Apple apps. Jobs returned to the stage to introduce iBooks. It’s fun to rewatch this part of the keynote because it’s a great example of the skeuomorphic design that was taking an ever-increasing hold on Apple’s mobile devices. The app featured a wooden bookshelf that spun around when you tapped the ‘Store’ button in the toolbar, which Jobs described as ‘kind of like a secret passageway.’

\n
\"Schiller

Schiller introduced iWork, showing off Numbers’ charting tools.

\n

To wrap things up, Phil Schiller demoed iWork for the iPad, which was an interesting departure from the consumption-heavy apps spotlighted during the rest of the keynote. Schiller walked the audience through Pages, Keynote, and Numbers, showing off impressive adaptations of the apps that, until then, were only available on the Mac. There wasn’t complete feature parity between the Mac and iPad versions of the iWork apps, but Schiller’s demo hinted at the possibility of a more productivity-oriented future for the iPad.

\n

The inclusion of iWork at the end of the presentation feels like another hedged bet in hindsight. The excitement of the participants in the keynote was unmistakable. Apple knew it had created something special, but it wasn’t entirely sure how people would use it. The company seemed to be placing most of its bets on consumption apps, but with iWork and the somewhat odd portrait-mode keyboard dock, Apple wanted to be ready if customers gravitated toward more productivity-oriented uses too. Having emphasized the consumption aspects of the iPad so strongly, though, the identity of the iPad’s first era was all but inevitable.

\n

The final surprise of the iPad event was its price. The conventional wisdom was that the device would cost in the neighborhood of $1,000. When the 16GB WiFi model was announced at $499, the crowd was shocked. With more storage and 3G connectivity, you could pay as much as $829, but even that was significantly less than what many people expected.

\n
\"Apple

Apple shocked the crowd announcing a $499 starting price for the iPad.

\n

The price was important to driving early sales. The iPad was by no means inexpensive, but the price point Apple chose put the device within reach of the curious who were already familiar with the iPhone and the exploding app economy.

\n

It’s hard to argue with the approach. Early sales of the iPad were brisk, with over three million sold in the first 80 days and 15 million by the time the iPad 2 was launched.

\n

Brand new product categories don’t normally take off like that, but the iPad was different. By leveraging the App Store and familiarity with iOS from the iPhone, Apple had a built-in audience that knew how to use an iPad before they had ever seen one.

\n

However, looking back, the same factors that led to that initial success were the very things that ultimately held the iPad back. That first keynote cemented the iPad as a consumption device in the minds of users and the press. There were people like Federico who found ways to use the iPad as a desktop replacement, but stories like his were largely an exception to the rule, until the introduction of the iPad Pro in 2015.

\n
\"Today,

Today, the iPad Pro emphasizes creativity and there are multiple input options.

\n

iPad keynotes are very different today. The black leather chair is gone, and demos emphasize creative apps like Adobe Photoshop. The ‘single slab of multitouch glass’ is gone too. Sure, the iPad can still be used on its own and can transform into whichever app you’re using as it has always done, but there are many more layers to the interaction now with Split View, Slide Over, and multiwindowing, along with accessories like the Smart Keyboard Folio and Apple Pencil.

\n

That’s quite a transformation for a product that has only been around for 10 years, and with the more recent introduction of the iPad Pro and iPadOS, in many ways it feels as though the iPad is just getting started. Aside from the Apple Watch, there’s no other product in Apple’s lineup that seems poised for greater change and carries more promise than the iPad. It may have started as a big iPhone, but the iPad has finally begun to establish an identity of its own that still delivers on Steve Jobs’ original promise of a new category of device that balances the intimacy of the iPhone and the capabilities of a Mac.

\n
\n
  1. \nI’ve tried my best to track down the original source for each mockup and note them in the captions of the gallery, although with the amount of liberal reposting of images that was happening ten years ago, some undoubtedly originated elsewhere. ↩︎\n
  2. \nThe same leather chair would make another appearance for the iPad 2 launch. ↩︎\n
  3. \nWho knows whether it was intentional, but there was a humorous moment as Jobs scrolled down The New York Times’ homepage past a block of Adobe Flash content that didn’t load because it wasn’t supported by the OS. Jobs’ famous essay, Thoughts on Flash, was published just three months later. ↩︎\n
  4. \nWithout hardware to test their apps on, companies like The Omni Group, which had decided to go all-in on the iPad, resorted to mocking up apps on faux iPads they constructed with a 3D printer, while Marco Arment turned to cardboard mockups. ↩︎\n
  5. \nWith the wild success of games on the iPhone and the big, bright display of the iPad, the potential for the iPad to become a gaming device was clearly on the minds of Apple executives, as evidenced by the somewhat unusual invitation to the keynote that was extended to gaming site Kotaku. ↩︎\n
  6. \nIncluding The New York Times app in the presentation was a curious choice given that it also played a prominent role in the demonstration of web browsing at the beginning of the keynote. The New York Times’ app made for a nice demo too, but muddied the web browsing story that was so clearly an important component of the presentation. ↩︎\n
  7. \nRemarkably, compatibility mode is still with us a decade later. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "When Steve Jobs strode onto the stage at the Yerba Buena Center on January 27, 2010, he carried with him the answers to years of speculation and rumors about an Apple tablet. Everyone at the event that day knew why they were there and what would be announced. Jobs acknowledged as much up front, saying that he had a ‘truly magical and revolutionary product’ to announce.\nThanks to the iPhone, everyone at the Yerba Buena Center also had a vague notion of what Apple’s tablet would probably look like. Mockups and phony leaks were all over the web, and tablets weren’t new. Everyone expected a big slab of glass. Beyond that, though, few rumors were in agreement about what the tablet’s hardware specs would be.\nSource: The Verge.\nIt was correctly assumed that Apple’s tablet would fit somewhere in between an iPhone and a Mac both physically and functionally, but where exactly was a mystery. That made the OS and the apps the stars of the keynote and critical to the way Apple’s tablet would be used and how it would be perceived for years to come.\nBefore Steve Jobs revealed Apple’s new tablet to the world, though, he paused – as is still customary during most Apple keynotes – to set the stage and provide context, which is where I will start too. Ten years ago, the tech world was a very different place, and Apple was a very different company. Not only is it fun to remember what those days were like, but it helps explain the trajectory of the iPad in the decade that followed.\n\nThe Rumor Mill\nRumors that Apple was working on a tablet device circulated for years before the iPad was released in 2010. According to an Engadget story published the day before the iPad was revealed, tablet rumors stretched back to at least the early 2000s.\nIt wasn’t until the iPhone was released in 2007 that the rumors really picked up in earnest, though. At the time, a small army of bloggers covering Apple competed for scoops by combing through patent filings, domain registrations, and any other scrap they could get their hands on, looking for evidence of a tablet. It was the waning days of the ‘golden age’ of Apple rumors, before Apple ‘doubled down on secrecy.’ The same competition that fueled the rumor mill led to a cottage industry in device mockups that sometimes got passed off as ‘spy shots’ of real hardware.\nMore on this week’s episode of AppStories\nFor more on the iPad’s app-oriented introduction, listen to this week’s episode of AppStories:\n\n\n\n\n\n\n\nIt was an environment that fed on itself, spawning crazy speculation. The rumors and mockups may seem like unimportant historical relics now, but they’re still instructive in understanding the expectations going into the iPad’s launch and a lot of fun to revisit. Here is a collection of some of my favorites:1\n\nLooking back at these mockups, what strikes me is how many imagined a tablet that would run OS X. Over and over, the mockups envisioned a windowed environment with a Mac-like UI. Even though the iPhone had been out for over two years, surprisingly few mockups approached the design with the iPhone as their starting point. Instead, it was assumed that a tablet with a screen closer to the size of a Mac would naturally inherit the Mac’s OS too. Surely a device with room for windows would run something more than just iPhone OS.\nWhat’s in a Name\nJust as entertaining as the rumors and mockups leading up to the announcement of the iPad were the names people came up with for Apple’s tablet. 
Combing through dozens of blog posts, I came across a long list of guesses:\niSlate\niTablet\niNetbook\niGuide\nTabletMac\nCanvas\nTaplet\nApple Tablet\nMacSlate\nMacBook Slate\nMacPad\nPalette\niBook\nThe assumption that an Apple tablet would run OS X may be part of what drove rumors that it would carry a Mac-like price tag of around $1,000 too. It’s long been suspected that Apple planted the notion that the iPad would cost that much with a story that appeared in The Wall Street Journal just over three weeks before the iPad was announced. I’m not sure that’s the case, though.\nThe Wall Street Journal was quoting an analyst report from a month earlier. Also, sifting through the many rumors leading up the event, there were many guesses of a price in the neighborhood of $1,000 that were made long before the announcement. Not all guesses were as high as The Wall Street Journal’s report, but none were as low as the $499 price point Apple announced either. Whatever the explanation, attendees that winter day in San Francisco expected an expensive slab of glass.\nBeyond the walls of Apple’s Cupertino headquarters, the tech trend that hung over the iPad event more than anything else was the rise of the netbook, an inexpensive laptop that relied primarily on web apps. In 2010, the US and much of the rest of the world was just emerging from a severe recession, and at an average price of $300-$500, netbooks were attractive to a lot of consumers.\nOn an earnings call the year before the iPad was announced, Tim Cook, who was serving as acting-CEO at the time, said in response to an analyst question about when Apple would have an answer to netbooks:\n\n We’ve got some ideas, but right now we think the products there are inferior and will not provide the experience to customers that they’re happy with.\n\nJobs had delivered essentially the same message in the fall of 2008 and suggested that, for the time being, the iPhone was Apple’s answer to netbooks.\nWarming Up the Crowd\nSteve Jobs loped onto the stage on January 27th, 2010 with Apple’s answer to netbooks. However, he began the event by launching into one of his signature updates on Apple’s recent accomplishments. He touted the 250 millionth iPod sold, and mentioned the still relatively new iPhone 3GS, but most relevant to the day’s announcement was the 18-month-old App Store.\nAfter only 18 months, there were 140,000 apps on the App Store that customers had downloaded over three billion times.\nAt the time, the App Store already offered 140,000 iPhone apps. That number seems tiny compared to today when there are over two million apps available, but even at that level, the Store had become a legitimate cultural phenomenon that had captured the imagination of people around the world, as demonstrated by the fact that a total of over three billion apps had been downloaded in just 18 months.\nAs Jobs wrapped up his overview of Apple’s achievements, he acknowledged the incredible expectations and pressure on the company that day by referencing a recent story from The Wall Street Journal:\n\n Last time there was this much excitement about a tablet, it had some commandments written on it.\n\n\nBehind Jobs was a slide of Moses returning from Mount Sinai with the Ten Commandments. The joke landed well, simultaneously acknowledging the importance of the event for Apple and warming up the crowd. 
I only wish Jobs had used this image I ran across in my research instead:\nSource: macdailynews.com.\nWith the crowd ready to start, Jobs launched into his pitch, asking rhetorically whether there was room for a device that sits between the mobile phone and laptop. Of course, the answer was ‘yes,’ but only if, as Jobs explained, the device was ‘far better at some key things.’ Those things were:\nWeb browsing\nEmail\nPhotos\nVideo\nMusic\nGames\neBooks\nJobs explained that a third category of device had to be ‘far better at some key things.’\nBefore revealing Apple’s answer to what was suited to these tasks, Jobs set up and knocked down the netbook, declaring that ‘the problem is, netbooks aren’t better at anything…they’re just cheap laptops.’\n‘…and we call it the iPad.’\nAs the crowd was still chuckling over Jobs’ swipe at netbooks, he explained that Apple had an answer to the third category of device: the iPad. The device took over the screen behind Jobs as he walked over to a small table sitting next to a stylish black leather chair with a chrome frame.2 Underneath a cloth on the table was an iPad. Jobs picked the iPad up and proudly displayed it to the assembled crowd.\n\nJobs paced the stage quickly, running through each of the categories of tasks he’d outlined previously with a short overview of the core iPad apps from Apple running on the familiar iPhone OS. With the preamble and reveal out of the way just 12 minutes into the one hour and 45-minute keynote, Jobs walked over to the chair waiting for him onstage, sat down, and unlocked his iPad.\nIt was demo time.\nMore Intimate Than a Laptop and More Capable Than a Smartphone\nThe Internet ‘right in the palm of your hand.’\nThe remainder of the keynote was dedicated almost exclusively to demos. Demos are a staple of Apple events, but the iPad’s introduction was unique for the amount of time that Jobs, Scott Forstall, and Phil Schiller spent walking the crowd through app after app. Jobs made the case that out of the box, an iPad justified the creation of a third category of device because it was better at the core experiences he had listed at the top of the presentation. Forstall’s job was to get developers fired up and show off early third-party apps. Schiller wrapped up with the new iWork suite for the iPad, which suggested what the future might hold for the device.\nWhen Jobs sat down, he immediately launched into a demo of Safari. Suddenly, the ‘Safari Pad’ rumors from 2007 made sense as Jobs repeatedly emphasized the power of holding the Internet in your hands as he browsed The New York Times3 and other websites. It was the perfect way to lead off the demos too, combining the intimacy of the iPhone’s touch UI with a big, bright display reminiscent of a laptop.\nApps defined the iPad.\nAs I rewatched Jobs make his way through apps like Safari, Mail, Videos, YouTube, and iPod, a couple of things struck me. One was that he rotated the iPad a lot. Jobs showed off reading an email message in portrait orientation then rotated the iPad to see his list of messages on the left with a preview on the right. He took similar approaches with Safari, Photos, the iTunes Store, and iBooks.\nThe message was that the iPad could become whatever users wanted it to be through its apps. Jony Ive reinforced that in the event’s closing video, declaring that the iPad was defined by its single slab of multitouch glass and lack of an input device or prescribed orientation. 
Nobody rotates their iPad as much as was demonstrated onstage, but Apple was sending an unmistakable message that the iPad was designed to disappear beneath users’ fingertips the same way an iPhone does, but deliver the computing power to drive a big display.\nScott Forstall in the chair.\nThat all three presenters conducted their demos leaning back in that black leather chair was part of the message too. John Gruber put his finger on one aspect of the chair’s import following the introduction of the iPad 2 in 2011:\n\n The on-stage demos of the iPad aren’t conducted at a table or a lectern. They’re conducted sitting in an armchair. That conveys something about the feel of the iPad before its screen is even turned on. Comfortable, emotional, simple, elegant. How it feels is the entirety of the iPad’s appeal.\n\nIt’s that unique and very personal feel that sets the iPad apart from a laptop, where users are separated from the apps they use by an intermediary pointing device.\nBut there’s more to the chair than that. With the exception of email, the list of activities that Steve Jobs said the iPad had to be better at are largely content consumption activities. Like watching TV, those sorts of activities lend themselves to a lean-back experience sitting in a comfortable chair to browse webpages and photos, watch videos, listen to music, play games, or read a book. The iPad is undeniably an excellent choice for those activities, but by focusing on them up front and largely to the exclusion of more creation-based tasks, Apple defined the iPad as a content consumption device, a reputation that stuck for years thereafter. Jobs’ comfortable black chair may have worked to convey the intimacy of interacting with an iPad, but it also held the iPad back.\nLeading up to the iPad keynote, there were as many fanciful mockups of an Apple tablet that looked like a Mac as there were ones that looked like an iPhone. With the emphasis on media consumption apps and the tight coupling of iPad and iPhone app development, those early notions of a more laptop-like productivity tablet were largely swept away. They weren’t completely abandoned, as Phil Schiller’s introduction of iWork for the iPad would demonstrate, but the emphasis was clear: in the space between the iPhone and Mac, the iPad sat closer to the iPhone’s end of the spectrum than the Mac’s.\nScott Forstall was tasked with selling third-party developers on making apps for the iPad.\nThat message was reinforced by Scott Forstall’s segment of the presentation. He bounded onto the stage full of excitement, clearly tasked with getting developers on board to make iPad apps in the short two months between the keynote and launch of the iPad. That was a tall order, especially since developers would have to start building apps without the hardware on which to test them.4\nForstall predicted an all-new app gold rush.\nIn hindsight, it’s clear that Apple was concerned about whether developers would make apps for the iPad. Forstall laid it on thick in his presentation, promising a ‘second gold rush’ on the App Store driven by iPad apps, flashing a slide of a gold prospector on the screen behind him. That was followed by a parade of third-party developers who showed off games5 like N.O.V.A. 
and Need for Speed, and apps like Brushes, The New York Times,6 and MLB at Bat.\nForstall using Facebook in compatibility mode and pixel doubled.\nApple added iPad-specific development tools as part of the iPhone SDK, but it also hedged its bets with compatibility mode, allowing apps to run at their native size surrounded by a sea of black or blown up to twice their normal size. The experience wasn’t great, but it filled the gap until iPad versions of popular apps were ready.7\nJobs unveiling iBooks.\nThe final segment of the keynote was reserved for all-new Apple apps. Jobs returned to the stage to introduce iBooks. It’s fun to rewatch this part of the keynote because it’s a great example of the skeuomorphic design that was taking an ever-increasing hold on Apple’s mobile devices. The app featured a wooden bookshelf that spun around when you tapped the ‘Store’ button in the toolbar, which Jobs described as ‘kind of like a secret passageway.’\nSchiller introduced iWork, showing off Numbers’ charting tools.\nTo wrap things up, Phil Schiller demoed iWork for the iPad, which was an interesting departure from the consumption-heavy apps spotlighted during the rest of the keynote. Schiller walked the audience through Pages, Keynote, and Numbers, showing off impressive adaptations of the apps that, until then, were only available on the Mac. There wasn’t complete feature parity between the Mac and iPad versions of the iWork apps, but Schiller’s demo hinted at the possibility of a more productivity-oriented future for the iPad.\nThe inclusion of iWork at the end of the presentation feels like another hedged bet in hindsight. The excitement of the participants in the keynote was unmistakable. Apple knew it had created something special, but it wasn’t entirely sure how people would use it. The company seemed to be placing most of its bets on consumption apps, but with iWork and the somewhat odd portrait-mode keyboard dock, Apple wanted to be ready if customers gravitated toward more productivity-oriented uses too. Having emphasized the consumption aspects of the iPad so strongly, though, the identity of the iPad’s first era was all but inevitable.\nThe final surprise of the iPad event was its price. The conventional wisdom was that the device would cost in the neighborhood of $1,000. When the 16GB WiFi model was announced at $499, the crowd was shocked. With more storage and 3G connectivity, you could pay as much as $829, but even that was significantly less than what many people expected.\nApple shocked the crowd announcing a $499 starting price for the iPad.\nThe price was important to driving early sales. The iPad was by no means inexpensive, but the price point Apple chose put the device within reach of the curious who were already familiar with the iPhone and the exploding app economy.\nIt’s hard to argue with the approach. Early sales of the iPad were brisk, with over three million sold in the first 80 days and 15 million by the time the iPad 2 was launched.\nBrand new product categories don’t normally take off like that, but the iPad was different. By leveraging the App Store and familiarity with iOS from the iPhone, Apple had a built-in audience that knew how to use an iPad before they had ever seen one.\nHowever, looking back, the same factors that led to that initial success were the very things that ultimately held the iPad back. That first keynote cemented the iPad as a consumption device in the minds of users and the press. 
There were people like Federico who found ways to use the iPad as a desktop replacement, but stories like his were largely an exception to the rule, until the introduction of the iPad Pro in 2015.\nToday, the iPad Pro emphasizes creativity and there are multiple input options.\niPad keynotes are very different today. The black leather chair is gone, and demos emphasize creative apps like Adobe Photoshop. The ‘single slab of multitouch glass’ is gone too. Sure, the iPad can still be used on its own and can transform into whichever app you’re using as it has always done, but there are many more layers to the interaction now with Split View, Slide Over, and multiwindowing, along with accessories like the Smart Keyboard Folio and Apple Pencil.\nThat’s quite a transformation for a product that has only been around for 10 years, and with the more recent introduction of the iPad Pro and iPadOS, in many ways it feels as though the iPad is just getting started. Aside from the Apple Watch, there’s no other product in Apple’s lineup that seems poised for greater change and carries more promise than the iPad. It may have started as a big iPhone, but the iPad has finally begun to establish an identity of its own that still delivers on Steve Jobs’ original promise of a new category of device that balances the intimacy of the iPhone and the capabilities of a Mac.\n\n\nI’ve tried my best to track down the original source for each mockup and note them in the captions of the gallery, although with the amount of liberal reposting of images that was happening ten years ago, some undoubtedly originated elsewhere. ↩︎\n\n\nThe same leather chair would make another appearance for the iPad 2 launch. ↩︎\n\n\nWho knows whether it was intentional, but there was a humorous moment as Jobs scrolled down The New York Times’ homepage past a block of Adobe Flash content that didn’t load because it wasn’t supported by the OS. Jobs’ famous essay, Thoughts on Flash, was published just three months later. ↩︎\n\n\nWithout hardware to test their apps on, companies like The Omni Group, which had decided to go all-in on the iPad, resorted to mocking up apps on faux iPads they constructed with a 3D printer, while Marco Arment turned to cardboard mockups. ↩︎\n\n\nWith the wild success of games on the iPhone and the big, bright display of the iPad, the potential for the iPad to become a gaming device was clearly on the minds of Apple executives, as evidenced by the somewhat unusual invitation to the keynote that was extended to gaming site Kotaku. ↩︎\n\n\nIncluding The New York Times app in the presentation was a curious choice given that it also played a prominent role in the demonstration of web browsing at the beginning of the keynote. The New York Times’ app made for a nice demo too, but muddied the web browsing story that was so clearly an important component of the presentation. ↩︎\n\n\nRemarkably, compatibility mode is still with us a decade later. 
↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2020-01-27T11:04:07-05:00", "date_modified": "2021-11-23T10:14:00-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "Apple history", "iPad", "iPad at 10", "stories" ] } ] }