{ "version": "https://jsonfeed.org/version/1.1", "user_comment": "This feed allows you to read the posts from this site in any feed reader that supports the JSON Feed format. To add this feed to your reader, copy the following URL -- https://www.macstories.net/tag/automation/feed/json/ -- and add it to your reader.", "home_page_url": "https://www.macstories.net/tag/automation/", "feed_url": "https://www.macstories.net/tag/automation/feed/json/", "language": "en-US", "title": "automation – MacStories", "description": "Apple news, app reviews, and stories by Federico Viticci and friends.", "items": [ { "id": "https://www.macstories.net/?p=77681", "url": "https://www.macstories.net/reviews/bangcase-push-button-iphone-automation/", "title": "BANG!CASE: Push-Button iPhone Automation", "content_html": "

\n

I’ve been intrigued by the BANG!CASE ever since it was introduced by Bitmo Lab as a Kickstarter campaign about a year ago. The case includes a programmable button that can be used to automate actions using your iPhone’s accessibility features. However, because I don’t normally use a case with my iPhone, I never followed through on buying the BANG!CASE.

\n

Fast forward to early January at CES when I visited the booth for JSAUX, an affiliate of Bitmo Lab. In addition to JSAUX’s portable displays and gaming accessories, the company was showing off the BANG!CASE and GAMEBABY. (More on that on NPC soon.)

\n

It just so happens that since the holidays, I’ve continued my quest to refine how I collect and process information throughout my day. That’s led me to test a dozen or so apps, build new shortcuts, and explore other new setups. As a result, I was primed to give the BANG!CASE a try when Bitmo offered me a review unit at their booth, and I’ve been using it for a couple of weeks.

\n
\n

The case has a couple of minor drawbacks that I’ll get to, but by and large, it’s the most unique and useful case I’ve ever put on an iPhone. After enjoying my iPhone without a case for nearly two years, I’ve found that the utility of the BANG!CASE is significant enough that I’ve decided to keep using it, which I didn’t expect. So today, I thought I’d lay out why I like the BANG!CASE so much and how I’m using it.

\n

\n

At first blush, the BANG!CASE is an ordinary case made of a soft-touch plastic. It feels good to hold, includes a cutout for the Camera Control, and has hard clicky buttons that make pressing the iPhone’s standard buttons easy. However, aside from the case’s programmable button, the part of the BANG!CASE I like the most is the design of the back, which shows off its electronics and adds some character to my iPhone.

\n

Aesthetics aside, what really sets the BANG!CASE apart is an extra button that sits midway between the side button and the Camera Control. Bitmo calls it the BANG!BUTTON, and it can be programmed to perform three different actions with a single press, double press, or long press.

\n

The BANG!BUTTON works via Bluetooth as an accessibility device, a very clever solution that has a couple of important implications worth keeping in mind. The first is that the BANG!CASE’s Bluetooth radio is powered by a rechargeable battery, not your iPhone. That means you’ll need to charge your case periodically. The case comes with a charging cable that has a USB-A plug on one end and a special connector on the other end that uses magnets and two pogo pin connectors. In my experience, the BANG!CASE doesn’t need to be charged often, but relying on a proprietary cable to do so isn’t ideal.

\n

Setting up actions for the BANG!BUTTON.

\n

The other somewhat fiddly implication of the BANG!CASE’s design is that you’ll need to dig fairly deep into iOS’s accessibility settings to set up the BANG!BUTTON’s actions. The first step is to hold the BANG!BUTTON until the light on the case is blinking to pair the case with your iPhone under Settings → Bluetooth. Once it’s paired, you can go to Settings → Accessibility → Touch → AssistiveTouch → Devices, where you’ll see your case listed. There, you can assign up to three actions, including a long list of system and accessibility actions, along with any shortcuts you’ve created in the Shortcuts app.

\n

None of that is as bad as it may sound since it’s a one-time setup unless you decide to change the assigned actions. Also, I’ve been using the BANG!CASE for a couple of weeks and have yet to run out of battery, although I have topped it off a couple of times. That said, running out of juice would be a bummer because you’d lose the use of the BANG!BUTTON; having yet another thing to charge in my life isn’t great, either.

\n

Still, I’ve enjoyed the BANG!CASE a lot – so much so that I’ve been using it daily since I got home from CES. You can get more out of Apple’s Action button using Shortcuts, as Federico has shared with his ActionMode shortcut for Club MacStories members, but it’s always nice to have more automation options, which is exactly what the BANG!CASE provides. Moreover, I find the Action button a little hard to reach on the iPhone 16 Pro Max, whereas the BANG!BUTTON is near the middle of the iPhone’s vertical side, making it easier to press.

\n

\n

For the time being, I’ve settled on the following for my Action and BANG!BUTTON setup:

\n

Action Button: I use Quick Capture for Obsidian to quickly save thoughts to a scratchpad note in my Obsidian vault.

\n

BANG!BUTTON Single Press: A single press of the BANG!BUTTON triggers a shortcut that starts a new recording in superwhisper, an app that uses OpenAI’s Whisper speech-to-text model to transcribe spoken audio.

\n

BANG!BUTTON Double Press: When I hit the BANG!BUTTON twice, it opens Control Center, giving me quick access to a variety of media playback, HomeKit, and other controls.

\n

BANG!BUTTON Long Press: I have several shortcuts for saving URLs from specific apps, but for those contexts I haven’t created an automation for, I copy the link and then long-press the BANG!BUTTON to save it as a task in Godspeed using its API.

\n

So far, I’ve enjoyed this setup a lot. Having both text and voice capture just a single button press away has been perfect for saving tasks, ideas, and snippets of text. Thanks to superwhisper’s share sheet integration, it’s simple to send its transcriptions to a to-do, email, note-taking, or other app too.

\n

The winter season is my time to try new things. I’ve burned through task managers, email services, automation and AI services, new audio and video hardware, Macs, and more. It’s an eclectic mix, but the apps and services that are sticking all have one thing in common: easy access no matter what the context is. The BANG!CASE offers that, giving me access to a larger set of button actions at my fingertips, which I’m loving so far. The iPhone is a great capture device, and it’s even better with the BANG!CASE.

\n

The BANG!CASE is available from Bitmo Lab for $49.99.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "I’ve been intrigued by the BANG!CASE ever since it was introduced by Bitmo Lab as a Kickstarter campaign about a year ago. The case includes a programmable button that can be used to automate actions using your iPhone’s accessibility features. However, because I don’t normally use a case with my iPhone, I never followed through on buying the BANG!CASE.\nFast forward to early January at CES when I visited the booth for JSAUX, an affiliate of Bitmo Lab. In addition to JSAUX’s portable displays and gaming accessories, the company was showing off the BANG!CASE and GAMEBABY. (More on that on NPC soon.)\nIt just so happens that since the holidays, I’ve continued my quest to refine how I collect and process information throughout my day. That’s led me to test a dozen or so apps, build new shortcuts, and explore other new setups. As a result, I was primed to give the BANG!CASE a try when Bitmo offered me a review unit at their booth, and I’ve been using it for a couple of weeks.\n\nThe case has a couple of minor drawbacks that I’ll get to, but by and large, it’s the most unique and useful case I’ve ever put on an iPhone. After enjoying my iPhone without a case for nearly two years, I’ve found that the utility of the BANG!CASE is significant enough that I’ve decided to keep using it, which I didn’t expect. So today, I thought I’d lay out why I like the BANG!CASE so much and how I’m using it.\n\nAt first blush, the BANG!CASE is an ordinary case made of a soft-touch plastic. It feels good to hold, includes a cutout for the Camera Control, and has hard clicky buttons that make pressing the iPhone’s standard buttons easy. However, aside from the case’s programmable button, the part of the BANG!CASE I like the most is the design of the back, which shows off its electronics and adds some character to my iPhone.\nAesthetics aside, what really sets the BANG!CASE apart is an extra button that sits midway between the side button and the Camera Control. 
Bitmo calls it the BANG!BUTTON, and it can be programmed to perform three different actions with a single-press, double-press, or long-press.\nThe BANG!BUTTON works via Bluetooth as an accessibility device, a very clever solution that has a couple of important implications worth keeping in mind. The first is that the BANG!CASE’s Bluetooth radio is powered by a rechargeable battery, not your iPhone. That means you’ll need to charge your case periodically. The case comes with a charging cable that has a USB-A plug on one end and a special connector on the other end that uses magnets and two pogo pin connectors. In my experience, the BANG!CASE doesn’t need to be charged often, but relying on a proprietary cable to do so isn’t ideal.\nSetting up actions for the BANG!BUTTON.\nThe other somewhat fiddly implication of the BANG!CASE’s design is that you’ll need to dig fairly deep into iOS’s accessibility settings to set up the BANG!BUTTON’s actions. The first step is to hold the BANG!BUTTON until the light on the case is blinking to pair the case with your iPhone under Settings → Bluetooth. Once it’s paired, you can go to Settings → Accessibility → Touch → AssistiveTouch → Devices, where you’ll see your case listed. There, you can assign up to three actions, including a long list of system and accessibility actions, along with any shortcuts you’ve created in the Shortcuts app.\nNone of that is as bad as it may sound since it’s a one-time setup unless you decide to change the assigned actions. Also, I’ve been using the BANG!CASE for a couple of weeks and have yet to run out of battery, although I have topped it off a couple of times. That said, running out of juice would be a bummer because you’d lose the use of the BANG!BUTTON; having yet another thing to charge in my life isn’t great, either.\nStill, I’ve enjoyed the BANG!CASE a lot – so much so that I’ve been using it daily since I got home from CES. 
You can get more out of Apple’s Action button using Shortcuts, as Federico has shared with his ActionMode shortcut for Club MacStories members, but it’s always nice to have more automation options, which is exactly what the BANG!CASE provides. Moreover, I find the Action button a little hard to reach on the iPhone 16 Pro Max, whereas the BANG!BUTTON is near the middle of the iPhone’s vertical side, making it easier to press.\n\nFor the time being, I’ve settled on the following for my Action and BANG!BUTTON setup:\nAction Button: I use Quick Capture for Obsidian to quickly save thoughts to a scratchpad note in my Obsidian vault.\nBANG!BUTTON Single Press: A single press of the BANG!BUTTON triggers a shortcut that starts a new recording in superwhisper, an app that uses OpenAI’s Whisper LLM to transcribe spoken audio.\nBANG!BUTTON Double Press: When I hit the BANG!BUTTON twice, it opens Control Center, giving me quick access to a variety of media playback, HomeKit, and other controls.\nBANG!BUTTON Long Press: I have several shortcuts for saving URLs from specific apps, but for those contexts I haven’t created an automation for, I copy the link and then long-press the BANG!BUTTON to save it as a task in Godspeed using its API.\nSo far, I’ve enjoyed this setup a lot. Having both text and voice capture just a single button press away has been perfect for saving tasks, ideas, and snippets of text. Thanks to superwhisper’s share sheet integration, it’s simple to send its transcriptions to a to-do, email, note-taking, or other app too.\nThe winter season is my time to try new things. I’ve burned through task managers, email services, automation and AI services, new audio and video hardware, Macs, and more. It’s an eclectic mix, but the apps and services that are sticking all have one thing in common: easy access no matter what the context is. The BANG!CASE offers that, giving me access to a larger set of button actions at my fingertips, which I’m loving so far. 
The iPhone is a great capture device, and it’s even better with the BANG!CASE.\nThe BANG!CASE is available from Bitmo Lab for $49.99.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2025-01-22T12:18:59-05:00", "date_modified": "2025-01-22T13:53:47-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "accessories", "automation", "iPhone", "shortcuts", "reviews" ] }, { "id": "https://www.macstories.net/?p=77337", "url": "https://www.macstories.net/ios/apple-frames-3-3-adds-support-for-iphone-16-and-16-pro-m4-ipad-pro-and-apple-watch-series-10-feat-an-unexpected-technical-detour/", "title": "Apple Frames 3.3 Adds Support for iPhone 16 and 16 Pro, M4 iPad Pro, and Apple Watch Series 10 (feat. An Unexpected Technical Detour)", "content_html": "

Apple Frames 3.3 supports all the new devices released by Apple in 2024.

\n

Well, this certainly took longer than expected.

\n

Today, I’m happy to finally release version 3.3 of Apple Frames, my shortcut to put screenshots inside physical frames of Apple devices. In this new version, which is a free update for everyone, you’ll find support for all the new devices Apple released in 2024:

\n

11” and 13” M4 iPad Pro

\n

iPhone 16 and iPhone 16 Pro lineup

\n

42mm and 46mm Apple Watch Series 10

\n

To get started with Apple Frames, simply head to the end of this post (or search for Apple Frames in the MacStories Shortcuts Archive), download the updated shortcut, and replace any older version you may have installed with it. The first time you run the shortcut, you’ll be asked to redownload the file assets necessary for Apple Frames, which is a one-time operation. Once that’s done, you can resume framing your screenshots like you’ve always done, either using the native Apple Frames menu or the advanced API that I introduced last year.

\n

So what took this update so long? Well, if you want to know the backstory, keep on reading.

\n

\n

A Tale of Two Types of Screenshots

\n

I was busy with my Not an iPad Pro Review story back in May when the new iPads came out, then WWDC happened, so I didn’t get to work on an updated version of Apple Frames with support for the M4 iPad Pros until after the conference had wrapped up. I quickly put together a version with support for the new iPad frames and tried the shortcut with a screenshot, and it didn’t work. Not in the sense that the shortcut was crashing, though; instead, when the screenshot was overlaid on top of the iPad frame, the alpha transparency around the iPad would turn into a solid black color.

\n

I thought that was weird, but initially, I just wrote it off as an early iPadOS 18 beta issue. I figured it’d get fixed in the near future during the beta cycle.

\n

I started getting concerned when months passed and not only was the issue never fixed, but MacStories readers kept asking me for updates to the shortcut. To make matters worse, I got to the point where I was seeing the problem with some screenshots but not with others. The worst kind of bug is one you cannot reliably reproduce. I tried again. I asked Silvia to put together different versions of the frame assets and even tested different techniques for overlaying images; nothing was working. For some screenshots, the Shortcuts app would turn the transparency around a frame into a black color, and I didn’t know how to explain it.

\n

The situation got even worse when new iPhones and Apple Watches were released, and I still couldn’t figure out how to make Apple Frames work with them. This is when I tried to submit feedback and reached out to folks who work on Shortcuts privately, passing along what I was seeing. That also didn’t work.

\n

I was ready to give up on Apple Frames, but I decided to at least try to post about my issues publicly first, which I did on Bluesky.

\n

\n
\n

So the reason I’ve been unable to update my Apple Frames shortcut for the latest devices is a bug in iOS/iPadOS 18’s Shortcuts app that hasn’t been fixed yet.

\n

For the past few months, the Overlay Image action has always removed the alpha transparency of a PNG.

\n

I have no idea how to work around it.

\n


\n

— Federico Viticci (@viticci.macstories.net) Nov 19, 2024 at 1:17 PM

\n

It worked. Within 24 hours, MacStories readers Douglas and Antonio got in touch with me with details about the potential culprit, which they independently identified: iOS 18 was capturing some screenshots in 16-bit Display P3 instead of 8-bit sRGB.

\n

As soon as I read Douglas’ email and later read Antonio’s post, I had one of those “of course, I should have thought about this” moments. Why would a PNG with alpha transparency lose its transparency after an image is overlaid on it? Because maybe there’s a metadata mismatch between the two images, and one is being “forced” behind the scenes to be converted to a format that loses the alpha transparency.

\n

Here are the details of the issue: occasionally – seemingly with no clear pattern – iOS and iPadOS 18 capture screenshots in 16-bit Display P3, which means they support a wide color gamut and higher dynamic range. Sometimes, however, screenshots are still captured in the old format, 8-bit sRGB. There is no way to tell these different types of screenshots apart since the Photos app lumps them all together as PNG files in the same Screenshots collection. To confirm my theory, I had to use the excellent Metapho app to inspect the metadata of my screenshots. As you can see below, some of them are captured in 16-bit Display P3, while others are in good old 8-bit sRGB.
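Metapho works well for spot checks, but the bit depth is also easy to read programmatically: a PNG stores its per-channel bit depth as a single byte in the IHDR chunk that immediately follows the file signature. Here's a minimal, standalone Python sketch (not part of the shortcut itself; screenshots would have to be exported from Photos to run it against them):

```python
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def png_bit_depth(data: bytes) -> int:
    """Return the per-channel bit depth (e.g. 8 or 16) of a PNG file's bytes."""
    if not data.startswith(PNG_SIGNATURE):
        raise ValueError("not a PNG file")
    # The IHDR chunk must come first: after the 8-byte signature there is a
    # 4-byte chunk length and the 4-byte chunk type, then width (4 bytes),
    # height (4 bytes), and the 1-byte bit depth at absolute offset 24.
    if data[12:16] != b"IHDR":
        raise ValueError("malformed PNG: IHDR is not the first chunk")
    return data[24]
```

For example, `png_bit_depth(open("screenshot.png", "rb").read())` (the filename is hypothetical) returns 16 for one of the new Display P3 screenshots and 8 for a classic sRGB one.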

\n

Two screenshots taken on my iPhone 16 Plus, two different bit depths.

\n

I’m a bit mystified by this approach, and I would love to know how and why the system decides to capture screenshots in one format over the other.1 Regardless, that explained why I couldn’t reproduce the bug consistently or figure out what the underlying issue was: the frame assets (which are based on Apple’s official files) were 8-bit sRGB PNGs; when the shortcut tried to overlay a similar screenshot, everything worked, but if the screenshot was one of the new “fancy” images with a 16-bit Display P3 profile, I’d get the black border around the image.

\n

Apple has never publicly documented this, nor is there any information in Shortcuts that explains how the Overlay Image and Mask Image actions work with conflicting color profiles in images. But I still had to come up with a solution now that I knew what the problem was.

\n

Initially, Antonio Bueno proposed a workaround that used JavaScript to redraw every screenshot passed to the shortcut with a different RGB profile. That happened locally, on-device, thanks to Shortcuts’ ability to execute arbitrary JS code in a URL action. It worked, but it added a lot of latency to the shortcut due to increased memory consumption. The performance of the JavaScript-based approach was so bad that the beta version of Apple Frames crashed if I tried to frame more than three screenshots at once. I couldn’t use it for the final version.

\n

I then realized I was thinking about the issue the wrong way. I was convinced I had to fix the screenshots; what if, instead, I simply updated all the frame assets to be 16-bit?

\n

My theory was that, with a 16-bit PNG frame, pasting either an 8-bit or 16-bit screenshot on top of it would cause no trouble. I tested this by asking Silvia to re-export a single frame in 16-bit and, sure enough, it worked. But that led to another problem: should I ask Silvia to manually re-export 68 more frame assets, some of which were older Apple devices that are still supported by Apple Frames but no longer available as PSDs on Apple’s website?

\n

And that, friends, is where One True John comes in. As he will detail later this week in MacStories Weekly for Club members, John found a way to upscale 8-bit PNGs to 16-bit files with no color degradation or bloated file sizes in an automated fashion. Stay tuned for the story on Saturday.
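John’s exact pipeline is the subject of the upcoming MacStories Weekly piece, so treat this as a generic sketch rather than his method: the standard lossless way to widen an 8-bit channel to 16 bits is to multiply each value by 257, which replicates the byte (0xAB becomes 0xABAB), so black maps to black, white to white, and every value in between scales exactly with no color degradation.

```python
def widen_channel(v8: int) -> int:
    """Losslessly map an 8-bit channel value (0-255) to 16 bits (0-65535).

    65535 / 255 = 257, and multiplying by 257 replicates the byte
    (0xAB -> 0xABAB), so no rounding or color shift is introduced.
    """
    if not 0 <= v8 <= 255:
        raise ValueError("8-bit channel value out of range")
    return v8 * 257

def widen_pixel(rgba8):
    """Widen an (R, G, B, A) tuple of 8-bit values to 16-bit values."""
    return tuple(widen_channel(v) for v in rgba8)
```

Applied to every pixel of a frame asset, this mapping produces a 16-bit PNG that composites cleanly under either kind of screenshot.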

\n

Apple Frames 3.3 in action. We all love this “imyk” guy.

\n

To wrap up, what you should know is this: Apple Frames is now fully compatible with 8-bit and 16-bit screenshots, and all frame assets downloaded and used by the shortcut are 16-bit PNGs. As a result, Apple Frames is just as efficient as ever; in fact, thanks to some improved logic for overlaying screenshots, it should even be slightly faster than before.

\n

Like I said, I wish I’d thought of this sooner instead of having to wait months for a bug fix that, at this point, will likely never come. But such is the journey with automation sometimes. I’m glad we eventually figured this out.

\n

Download Apple Frames 3.3

\n

Well, that was a lot of words about color profiles in screenshots. I apologize, but it feels good to finally wrap up this saga.

\n

As I mentioned above, you can download Apple Frames 3.3, completely ignore its backstory, and keep using the shortcut like you’ve always done. I’m thrilled to have an up-to-date version of Apple Frames again, and I hope you like it as much as I do.

\n

You can download Apple Frames 3.3 below and find it in the MacStories Shortcuts Archive.

\n
\n
\n
\n

Apple Frames

Add device frames to screenshots for iPhones (8/SE, 11, 12, 13, 14, 15, 16 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11”, 12.9”, 13” 2018-2024 models), iPad Air (10.9”, 2020-2024 models), iPad mini (2021/2024 models), Apple Watch S4/5/6/7/8/9/10/Ultra, iMac (24” model, 2021/2024), MacBook Air (2020-2022 models), and MacBook Pro (2021-2024 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
  1. \nMy theory is that screenshots that feature lots of different colors are captured in 16-bit Display P3 to make them “pop” more, whereas screenshots of mostly white UIs are still captured in 8-bit sRGB. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Apple Frames 3.3 supports all the new devices released by Apple in 2024.\nWell, this certainly took longer than expected.\nToday, I’m happy to finally release version 3.3 of Apple Frames, my shortcut to put screenshots inside physical frames of Apple devices. In this new version, which is a free update for everyone, you’ll find support for all the new devices Apple released in 2024:\n11” and 13” M4 iPad Pro\niPhone 16 and iPhone 16 Pro lineup\n42mm and 46mm Apple Watch Series 10\nTo get started with Apple Frames, simply head to the end of this post (or search for Apple Frames in the MacStories Shortcuts Archive), download the updated shortcut, and replace any older version you may have installed with it. The first time you run the shortcut, you’ll be asked to redownload the file assets necessary for Apple Frames, which is a one-time operation. Once that’s done, you can resume framing your screenshots like you’ve always done, either using the native Apple Frames menu or the advanced API that I introduced last year.\nSo what took this update so long? Well, if you want to know the backstory, keep on reading.\n\nA Tale of Two Types of Screenshots\nI was busy with my Not an iPad Pro Review story back in May when the new iPads came out, then WWDC happened, so I didn’t get to work on an updated version of Apple Frames with support for the M4 iPad Pros until after the conference had wrapped up. I quickly put together a version with support for the new iPad frames and tried the shortcut with a screenshot, and it didn’t work. Not in the sense that the shortcut was crashing, though; instead, when the screenshot was overlaid on top of the iPad frame, the alpha transparency around the iPad would turn into a solid black color.\nI thought that was weird, but initially, I just wrote it off as an early iPadOS 18 beta issue. 
I figured it’d get fixed in the near future during the beta cycle.\nI started getting concerned when months passed and not only was the issue never fixed, but MacStories readers kept asking me for updates to the shortcut. To make matters worse, I got to the point where I was seeing the problem with some screenshots but not with others. The worst kind of bug is one you cannot reliably reproduce. I tried again. I asked Silvia to put together different versions of the frame assets and even tested different techniques for overlaying images; nothing was working. For some screenshots, the Shortcuts app would turn the transparency around a frame into a black color, and I didn’t know how to explain it.\nThe situation got even worse when new iPhones and Apple Watches were released, and I still couldn’t figure out how to make Apple Frames work with them. This is when I tried to submit feedback and reached out to folks who work on Shortcuts privately, passing along what I was seeing. That also didn’t work.\nI was ready to give up on Apple Frames, but I decided to at least try to post about my issues publicly first, which I did on Bluesky.\n\n\nSo the reason I’ve been unable to update my Apple Frames shortcut for the latest devices is a bug in iOS/iPadOS 18’s Shortcuts app that hasn’t been fixed yet.\nFor the past few months, the Overlay Image action has always removed the alpha transparency of a PNG.\nI have no idea how to work around it.\n[image or embed]\n— Federico Viticci (@viticci.macstories.net) Nov 19, 2024 at 1:17 PM\nIt worked. Within 24 hours, MacStories readers Douglas and Antonio got in touch with me with details about the potential culprit, which they independently identified: iOS 18 was capturing some screenshots in 16-bit Display P3 instead of 8-bit sRGB.\nAs soon as I read Douglas’ email and later read Antonio’s post, I had one of those “of course, I should have thought about this” moments. 
Why would a PNG with alpha transparency lose its transparency after an image is overlaid on it? Because maybe there’s a metadata mismatch between the two images, and one is being “forced” behind the scenes to be converted to a format that loses the alpha transparency.\nHere are the details of the issue: occasionally – seemingly with no clear pattern – iOS and iPadOS 18 capture screenshots in 16-bit Display P3, which means they support a wide color gamut and higher dynamic range. Sometimes, however, screenshots are still captured in the old format, 8-bit sRGB. There is no way to tell these different types of screenshots apart since the Photos app lumps them all together as PNG files in the same Screenshots collection. To confirm my theory, I had to use the excellent Metapho app to inspect the metadata of my screenshots. As you can see below, some of them are captured in 16-bit Display P3, while others are in good old 8-bit sRGB.\nTwo screenshots taken on my iPhone 16 Plus, two different bit depths.\nI’m a bit mystified by this approach, and I would love to know how and why the system decides to capture screenshots in one format over the other.1 Regardless, that explained why I couldn’t reproduce the bug consistently or figure out what the underlying issue was: the frame assets (which are based on Apple’s official files) were 8-bit sRGB PNGs; when the shortcut tried to overlay a similar screenshot, everything worked, but if the screenshot was one of the new “fancy” images with a 16-bit Display P3 profile, I’d get the black border around the image.\nApple has never publicly documented this, nor is there any information in Shortcuts that explains how the Overlay Image and Mask Image actions work with conflicting color profiles in images. But I still had to come up with a solution now that I knew what the problem was.\nInitially, Antonio Bueno proposed a workaround that used JavaScript to redraw every screenshot passed to the shortcut with a different RGB profile. 
That happened locally, on-device, thanks to Shortcuts’ ability to execute arbitrary JS code in a URL action. It worked, but it added a lot of latency to the shortcut due to increased memory consumption. The performance of the JavaScript-based approach was so bad, the beta version of Apple Frames crashed if I tried to frame more than three screenshots at once. I couldn’t use it for the final version.\nI then realized I was thinking about the issue the wrong way. I was convinced I had to fix the screenshots; what if, instead, I simply updated all the frame assets to be 16-bit?\nMy theory was that, with a 16-bit PNG frame, pasting either an 8-bit or 16-bit screenshot on top of it would cause no trouble. I tested this by asking Silvia to re-export a single frame in 16-bit and, surely enough, it worked. But that led to another problem: should I ask Silvia to manually re-export 68 more frame assets, some of which were older Apple devices that are still supported by Apple Frames but no longer available as PSDs on Apple’s website?\nAnd that, friends, is where One True John comes in. As he will detail later this week in MacStories Weekly for Club members, John found a way to upscale 8-bit PNGs to 16-bit files with no color degradation or bloated file sizes in an automated fashion. Stay tuned for the story on Saturday.\nApple Frames 3.3 in action. We all love this “imyk” guy.\nTo wrap up, what you should know is this: Apple Frames is now fully compatible with 8-bit and 16-bit screenshots, and all frame assets downloaded and used by the shortcut are 16-bit PNGs. As a result, Apple Frames is just as efficient as ever; in fact, thanks to some improved logic for overlaying screenshots, it should even be slightly faster than before.\nLike I said, I wish I’d thought of this sooner instead of having to wait months for a bug fix that, at this point, will likely never come. But such is the journey with automation sometimes. 
I’m glad we eventually figured this out.\nDownload Apple Frames 3.3\nWell, that was a lot of words about color profiles in screenshots. I apologize, but it feels good to finally wrap up this saga.\nAs I mentioned above, you can download Apple Frames 3.3, completely ignore its backstory, and keep using the shortcut like you’ve always done. I’m thrilled to have an up-to-date version of Apple Frames again, and I hope you like it as much as I do.\nYou can download Apple Frames 3.3 below and find it in the MacStories Shortcuts Archive.\n\n \n \n Apple FramesAdd device frames to screenshots for iPhones (8/SE, 11, 12, 13, 14, 15, 16 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11”, 12.9”, 13” 2018-2024 models), iPad Air (10.9”, 2020-2024 models), iPad mini (2021/2024 models), Apple Watch S4/5/6/7/8/9/10/Ultra, iMac (24” model, 2021/2024), MacBook Air (2020-2022 models), and MacBook Pro (2021-2024 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.\nGet the shortcut here.\n\n \n \n\n\n\nMy theory is that screenshots that feature lots of different colors are captured in 16-bit Display P3 to make them “pop” more, whereas screenshots of mostly white UIs are still captured in 8-bit sRGB. 
↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-11-25T12:08:04-05:00", "date_modified": "2024-11-27T04:33:40-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Apple Frames", "automation", "iOS", "iPadOS", "macOS", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=77279", "url": "https://www.macstories.net/ios/a-feature-from-10-years-ago-is-back-with-a-twist-in-my-favorite-rss-client/", "title": "A Feature from 10 Years Ago Is Back \u2013 with a Twist \u2013 in My Favorite RSS Client", "content_html": "
\"Unread's

Unread’s new custom shortcuts.

\n

When it comes to productivity apps, especially those that have to work within the constraints of iOS and iPadOS, it’s rare these days to stumble upon a new idea that has never been tried before. With the exception of objectively new technologies such as LLMs, or unless there’s a new framework that Apple is opening up to developers, it can often feel like most ideas have been attempted before and we’re simply retreading old ground.

\n

Let me be clear: I don’t think there’s anything inherently wrong with that. I’ve been writing about iPhone and iPad apps for over a decade now, and I believe there are dozens of design patterns and features that have undeservedly fallen out of fashion. But such is life.

\n

Today marks the return of a very MacStories-y feature in one of my longtime favorite apps, which – thanks to this new functionality – is gaining a permanent spot on my Home Screen. Namely, the RSS client Unread now lets you create custom article actions powered by the Shortcuts app.

\n

\n

To understand why this feature is a big deal to me, we need to travel back in time to 2013, when an incredible RSS client known as Mr. Reader1 pioneered the idea of sending parts of an article to other apps via custom actions you could pin to the app’s context menu. Here’s what I wrote at the time:

\n

\n Mr. Reader’s developer, Oliver Fürniß, supported a lot of apps in previous versions of his Google Reader client. Since the very first updates, Mr. Reader became well known for allowing users to open an article’s link in an alternative browser, or sending a URL to OmniFocus to create a new task. All these actions, which spanned browsers, to-do managers, note-taking apps, and more, were hard-coded by Oliver. It means he had to manually insert them in the code of the app, without offering his users the possibility to customize them or create new ones entirely. Mr. Reader was versatile, but as URL schemes started becoming more popular, there was always going to be an app that wasn’t supported, which required Oliver to go back and hard-code it again into the app. Oliver tells me he received “hundreds of requests” to add support for a specific app that had been updated with a URL scheme capable of receiving URLs or text. It was getting out of hand.

\n

The new generic solution allows you to build as many actions as you want, using the parameters you want, using either URL schemes from sample actions or by entering your own. In terms of iOS automation, this is the DIY version of Services: actions will appear in standard menus, but they will launch an app – they won’t display a part of an app inline.\n

\n
\"You

You wouldn’t last a day in the asylum where they raised me.

\n

The idea was simple: Mr. Reader’s developer had been inundated with feature requests to support specific app integrations, so at some point, they just decided to let people build their own actions using URL schemes. The technology made sense for its time. Workflow (which would later become Shortcuts) didn’t exist yet, so we only had apps like Pythonista, Editorial, Drafts, and Launch Center Pro to automate our devices.

\n

As it turns out, that idea – letting people create their own enhancements for an RSS reader – is still sound today. This is what developer John Brayton is doing with the latest version of Unread, the elegant RSS client for Apple platforms that we have covered several times on MacStories. In version 4.3, you can create custom actions to send articles from Unread to any app you want. In 2024, though, you no longer do that with URL schemes; you do it with Shortcuts.

\n

I have to imagine that, just like developer Oliver Fürniß 11 years ago, John Brayton must have gotten all kinds of requests to support third-party apps for saving links from Unread. Case in point: this version also adds built-in integrations for Anybox, Flyleaf, Matter, and Wallabag. This approach works, but it isn’t sustainable long-term, and, more importantly, it doesn’t scale to power users who want the ability to do whatever they want with their RSS client without having to wait for its developer to support their ideas. Letting power users create their own enhancements is a safer investment; the developer saves time and makes their most loyal users happier and more productive. It’s a win-win, especially when you consider the fact that these power user actions require a premium Unread subscription.

\n

But back to the feature itself. It’s 2024, and URL schemes have largely been abstracted from iOS automation. What Unread does is clever: it includes a menu in the app’s preferences where you can define a list of custom shortcuts you want to run for selected articles. To add a shortcut, all you have to do is enter its name as it appears in the Shortcuts app. Then, these shortcuts will show up in Unread’s context menu when you swipe inside the article viewer or long-press an article in a list:

\n
\"Setting

Setting up custom actions for shortcuts in Unread.

\n

It gets even better, though. On devices with a hardware keyboard, Unread 4.3 lets you define custom keyboard shortcuts to immediately trigger specific article actions as well as these new custom shortcuts. This option is glorious. I was able to program Unread to save an article to a specific Reminders list by pressing ⌃ + U, which opens the Shortcuts app, runs a shortcut, and automatically returns to Unread.2

\n
\"Assigning

Assigning a custom hotkey to an action in Unread.

\n

So how does Unread do it? There’s an entire support page about this, but the gist is that Unread sends a custom JSON object to the Shortcuts app that contains multiple variables for the selected article, including its URL, summary, and title, as well as the name of the feed it comes from. In Shortcuts, you can then decide what to do with each of these variables by parsing the JSON input as a dictionary. Here’s what it looks like:

\n
{\"url\":\"https:\\/\\/9to5mac.com\\/2024\\/11\\/18\\/iphone-16-magsafe-battery-pack-memories\\/\",\"summary\":\"Following the introduction of MagSafe charging on the iPhone 12, Apple unveiled a MagSafe Battery Pack accessory.\",\"title\":\"Apple’s MagSafe Battery Pack for iPhone shouldn’t have been a one-and-done experiment\",\"article_url\":\"https:\\/\\/9to5mac.com\\/2024\\/11\\/18\\/iphone-16-magsafe-battery-pack-memories\\/\",\"feed_name\":\"9to5Mac\",\"type\":\"article\"}\n
\n

And here’s all you need to do in Shortcuts to get the input from Unread and extract some of its variables:

\n
\"This

This is the JSON object that Unread passes to Shortcuts.

\n

If you’re the type of person who’s fascinated by a feature like this, I think you can see why this is a definite improvement over how we used to do this kind of thing in 2013. We don’t need to worry about percent-encoding and decoding URL schemes anymore; we can just send some input data to Shortcuts, parse it using visual actions, and work with those variables to connect them to whatever service or app we want. Want to publish an article from Unread on your blog as a linked post? Thinking of ways to pair Unread with your task manager? Looking to use ChatGPT’s actions with input from your RSS reader? All of this is possible thanks to this new integration between Unread and Shortcuts.
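For anyone who wants to prototype this logic outside of Shortcuts first, the step performed by Shortcuts’ dictionary parsing is plain JSON decoding. Here is a hypothetical Python sketch using the same fields as the sample payload above (Unread itself only hands this object to Shortcuts):

```python
import json

# The kind of JSON object Unread passes to a custom shortcut
# (fields copied from the sample payload: url, summary, title,
# article_url, feed_name, and type).
payload = (
    '{"url": "https://9to5mac.com/2024/11/18/iphone-16-magsafe-battery-pack-memories/",'
    ' "summary": "Following the introduction of MagSafe charging on the iPhone 12,'
    ' Apple unveiled a MagSafe Battery Pack accessory.",'
    ' "title": "Apple’s MagSafe Battery Pack for iPhone shouldn’t have been'
    ' a one-and-done experiment",'
    ' "article_url": "https://9to5mac.com/2024/11/18/iphone-16-magsafe-battery-pack-memories/",'
    ' "feed_name": "9to5Mac", "type": "article"}'
)

# Conceptually, this is the dictionary-parsing step in Shortcuts.
article = json.loads(payload)

# From here, each key is a variable you can route to any other app or service.
print(article["feed_name"])  # 9to5Mac
print(article["title"])
print(article["url"])
```

Once the payload is a dictionary, wiring any key to a task manager, blogging workflow, or ChatGPT action is just a matter of passing the variable along.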

\n

As you can tell, I love this feature. However, there are two aspects I would like to see improve. I should be able to design a custom icon for an action in Unread by picking a color and SF Symbol that match the icon of a shortcut in the Shortcuts app, for consistency’s sake. Furthermore, I’d like to see an expansion of the variables that Unread passes to Shortcuts: publication date, selected text, and author names would be nice to have for automation purposes.

\n

If you told me in 2013 that in 2024, I’d still be writing about running custom actions in my RSS reader…I mean, let’s face it, I would have totally believed you. This feature has always been a great idea, and I’m glad developer John Brayton put a new spin on it by embracing the Shortcuts app and its immense potential for power users. Everything old is new again.

\n

Unread 4.3 is available now on the App Store. A premium subscription, which costs $4.99/month or $29.99/year, is required for custom article actions.

\n
\n
  1. \nAlas, Mr. Reader was removed from the App Store years ago, and its website is no longer online. I would have loved to see what a post-Google Reader, post-Twitter Mr. Reader would have looked like. ↩︎\n
  2. \nAs I explained when we released Obsidian Shortcut Launcher, there is no way on iOS to trigger a shortcut in the background without launching the Shortcuts app. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Unread’s new custom shortcuts.\nWhen it comes to productivity apps, especially those that have to work within the constraints of iOS and iPadOS, it’s rare these days to stumble upon a new idea that has never been tried before. With the exception of objectively new technologies such as LLMs, or unless there’s a new framework that Apple is opening up to developers, it can often feel like most ideas have been attempted before and we’re simply retreading old ground.\nLet me be clear: I don’t think there’s anything inherently wrong with that. I’ve been writing about iPhone and iPad apps for over a decade now, and I believe there are dozens of design patterns and features that have undeservedly fallen out of fashion. But such is life.\nToday marks the return of a very MacStories-y feature in one of my longtime favorite apps, which – thanks to this new functionality – is gaining a permanent spot on my Home Screen. Namely, the RSS client Unread now lets you create custom article actions powered by the Shortcuts app.\n\nTo understand why this feature is a big deal to me, we need to travel back in time to 2013, when an incredible RSS client known as Mr. Reader1 pioneered the idea of sending parts of an article to other apps via custom actions you could pin to the app’s context menu. Here’s what I wrote at the time:\n\n Mr. Reader’s developer, Oliver Fürniß, supported a lot of apps in previous versions of his Google Reader client. Since the very first updates, Mr. Reader became well known for allowing users to open an article’s link in an alternative browser, or sending a URL to OmniFocus to create a new task. All these actions, which spanned browsers, to-do managers, note-taking apps, and more, were hard-coded by Oliver. It means he had to manually insert them in the code of the app, without offering his users the possibility to customize them or create new ones entirely. Mr. 
Reader was versatile, but as URL schemes started becoming more popular, there was always going to be an app that wasn’t supported, which required Oliver to go back and hard-code it again into the app. Oliver tells me he received “hundreds of requests” to add support for a specific app that had been updated with a URL scheme capable of receiving URLs or text. It was getting out of hand.\n The new generic solution allows you to build as many actions as you want, using the parameters you want, using either URL schemes from sample actions or by entering your own. In terms of iOS automation, this is the DIY version of Services: actions will appear in standard menus, but they will launch an app – they won’t display a part of an app inline.\n\nYou wouldn’t last a day in the asylum where they raised me.\nThe idea was simple: Mr. Reader’s developer had been inundated with feature requests to support specific app integrations, so at some point, they just decided to let people build their own actions using URL schemes. The technology made sense for its time. Workflow (which would later become Shortcuts) didn’t exist yet, so we only had apps like Pythonista, Editorial, Drafts, and Launch Center Pro to automate our devices.\nAs it turns out, that idea – letting people create their own enhancements for an RSS reader – is still sound today. This is what developer John Brayton is doing with the latest version of Unread, the elegant RSS client for Apple platforms that we have covered several times on MacStories. In version 4.3, you can create custom actions to send articles from Unread to any app you want. In 2024, though, you no longer do that with URL schemes; you do it with Shortcuts.\nI have to imagine that, just like developer Oliver Fürniß 11 years ago, John Brayton must have gotten all kinds of requests to support third-party apps for saving links from Unread. Case in point: this version also adds built-in integrations for Anybox, Flyleaf, Matter, and Wallabag. 
This approach works, but it isn’t sustainable long-term, and, more importantly, it doesn’t scale to power users who want the ability to do whatever they want with their RSS client without having to wait for its developer to support their ideas. Letting power users create their own enhancements is a safer investment; the developer saves time and makes their most loyal users happier and more productive. It’s a win-win, especially when you consider the fact that these power user actions require a premium Unread subscription.\nBut back to the feature itself. It’s 2024, and URL schemes have largely been abstracted from iOS automation. What Unread does is clever: it includes a menu in the app’s preferences where you can define a list of custom shortcuts you want to run for selected articles. To add a shortcut, all you have to do is enter its name as it appears in the Shortcuts app. Then, these shortcuts will show up in Unread’s context menu when you swipe inside the article viewer or long-press an article in a list:\nSetting up custom actions for shortcuts in Unread.\nIt gets even better, though. On devices with a hardware keyboard, Unread 4.3 lets you define custom keyboard shortcuts to immediately trigger specific article actions as well as these new custom shortcuts. This option is glorious. I was able to program Unread to save an article to a specific Reminders list by pressing ⌃ + U, which opens the Shortcuts app, runs a shortcut, and automatically returns to Unread.2\nAssigning a custom hotkey to an action in Unread.\nSo how does Unread do it? There’s an entire support page about this, but the gist is that Unread sends a custom JSON object to the Shortcuts app that contains multiple variables for the selected article, including its URL, summary, and title, as well as the name of the feed it comes from. In Shortcuts, you can then decide what to do with each of these variables by parsing the JSON input as a dictionary. 
Here’s what it looks like:\n{\"url\":\"https:\\/\\/9to5mac.com\\/2024\\/11\\/18\\/iphone-16-magsafe-battery-pack-memories\\/\",\"summary\":\"Following the introduction of MagSafe charging on the iPhone 12, Apple unveiled a MagSafe Battery Pack accessory.\",\"title\":\"Apple’s MagSafe Battery Pack for iPhone shouldn’t have been a one-and-done experiment\",\"article_url\":\"https:\\/\\/9to5mac.com\\/2024\\/11\\/18\\/iphone-16-magsafe-battery-pack-memories\\/\",\"feed_name\":\"9to5Mac\",\"type\":\"article\"}\n\nAnd here’s all you need to do in Shortcuts to get the input from Unread and extract some of its variables:\nThis is the JSON object that Unread passes to Shortcuts.\nIf you’re the type of person who’s fascinated by a feature like this, I think you can see why this is a definite improvement over how we used to do this kind of thing in 2013. We don’t need to worry about percent-encoding and decoding URL schemes anymore; we can just send some input data to Shortcuts, parse it using visual actions, and work with those variables to connect them to whatever service or app we want. Want to publish an article from Unread on your blog as a linked post? Thinking of ways to pair Unread with your task manager? Looking to use ChatGPT’s actions with input from your RSS reader? All of this is possible thanks to this new integration between Unread and Shortcuts.\nAs you can tell, I love this feature. However, there are two aspects I would like to see improve. I should be able to design a custom icon for an action in Unread by picking a color and SF Symbol that match the icon of a shortcut in the Shortcuts app, for consistency’s sake. 
Furthermore, I’d like to see an expansion of the variables that Unread passes to Shortcuts: publication date, selected text, and author names would be nice to have for automation purposes.\nIf you told me in 2013 that in 2024, I’d still be writing about running custom actions in my RSS reader…I mean, let’s face it, I would have totally believed you. This feature has always been a great idea, and I’m glad developer John Brayton put a new spin on it by embracing the Shortcuts app and its immense potential for power users. Everything old is new again.\nUnread 4.3 is available now on the App Store. A premium subscription, which costs $4.99/month or $29.99/year, is required for custom article actions.\n\n\nAlas, Mr. Reader was removed from the App Store years ago, and its website is no longer online. I would have loved to see what a post-Google Reader, post-Twitter Mr. Reader would have looked like. ↩︎\n\n\nAs I explained when we released Obsidian Shortcut Launcher, there is no way on iOS to trigger a shortcut in the background, without launching the Shortcuts app. 
↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-11-19T09:21:37-05:00", "date_modified": "2024-11-19T09:32:37-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "iPadOS", "Mr. Reader", "RSS", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=76813", "url": "https://www.macstories.net/news/pop-icon-keys-logitech-brings-automation-to-a-budget-friendly-keyboard/", "title": "POP Icon Keys: Logitech Brings Automation to a Budget-Friendly Keyboard", "content_html": "
\"\"

\n

A couple of weeks ago, I wrote about and showed off Logitech’s MX Creative Console, a two-piece device made up of a keypad and a dialpad that takes the Elgato Stream Deck head-on. Well, today, Logitech is back with a slightly different approach in the form of its POP Icon Keys keyboard, which borrows some tricks from the Creative Console.

\n
\n

The $49.99 keyboard, shipping later this month, is solidly built and low-profile. It weighs 530g and has four big rubber pads on the bottom corners, giving it a sturdy, stable feel on my desk. The keys use scissor switches and feature aggressively rounded corners. They’re quiet, with more throw and resistance than an Apple Magic Keyboard, but they’re easy to adapt to if you’re used to Apple’s keyboards. I particularly like the texture of the keys – which may be partly because I’ve been using a worn-down Magic Keyboard – and they have a nice feel and don’t show fingerprints.

\n
\"The

The POP Icon Keys comes in four color options.

\n

The body of the keyboard is made of a similar plastic, and the keys are surrounded by a strip of glossy, transparent plastic that adds a little flair to the entire package. The color options available for the POP Icon Keys are fun, too. I’ve been testing a black keyboard with neon yellow accents for about a week, and I like it a lot, but there are other color combinations available, including pink, orange and white, and a purplish-blue color scheme. Also, the POP Icon Keys runs on two AAA batteries, which Logitech says can provide 36 months of operation thanks to the keyboard’s onboard power management.

\n

If that’s where the story ended for the POP Icon Keys, I’d recommend it because it’s a very good keyboard for the price. What sets the POP Icon Keys apart, though, is that it goes a step further, adding automation features similar to those found on the more expensive MX Creative Console.

\n
\"\"

\n

Logitech has designated the Home, End, Page Up, Page Down, F4-F12, and brightness keys as programmable via its Logi Options+ app. Among other things, you can use these keys to control system settings, execute keyboard shortcuts, and run multiple actions combined into macros. The keys’ original functionality remains available, too, if you hold down the function button. The POP Icon Keys also shares the MX Creative Console’s ability to set up app-specific profiles, meaning you can program keys to perform different tasks depending on which app is active.

\n
\"\"

\n

For example, you could use the Home, End, Page Up, and Page Down buttons to open different sets of apps for work, a special project, or relaxing with a game. Or you could use the function keys to trigger keyboard shortcuts in your favorite apps or Shortcuts automations.

\n

There are a couple of things I love about this functionality. First, the flexibility is fantastic, especially since you can access the programmable keys without taking your hands off the keyboard, which is an advantage over the MX Creative Console. Second, for just $50, the POP Icon Keys is a great entry point into the world of push-button automation. If it turns out that keyboard-driven automation isn’t your thing, you still have an excellent keyboard, but if it is, you can go a long way with the POP Icon Keys’ options before you graduate to the MX Creative Console or another similar device.

\n

All in all, I like the POP Icon Keys a lot. It’s nicely built and a great way to get started with keyboard automation or supplement other automation workflows you already use. The device is available directly from Logitech and Amazon.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "A couple of weeks ago, I wrote about and showed off Logitech’s MX Creative Console, a two-piece device made up of a keypad and dialpad, that takes the Elgato Stream Deck head-on. Well, today, Logitech is back with a slightly different approach in the form of its POP Icon Keys keyboard, which borrows some tricks from the Creative Console.\n\nThe $49.99 keyboard, shipping later this month, is solidly built and low-profile. It weighs 530g and has four big rubber pads on the bottom corners, giving it a sturdy, stable feel on my desk. The keys use scissor switches and feature aggressively rounded corners, and they’re quiet and have more throw and resistance than an Apple Magic Keyboard, but are easy to adapt to if you’re used to Apple’s keyboards. I particularly like the texture of the keys – which could be partially due to the fact that I’ve been using a worn-down Magic Keyboard – but the keys have a nice feel and don’t show fingerprints.\nThe POP Icon Keys comes in four color options.\nThe body of the keyboard is made of a similar plastic, and the keys are surrounded by a strip of glossy, transparent plastic that adds a little flair to the entire package. The color options available for the POP Icon Keys are fun, too. I’ve been testing a black keyboard with neon yellow accents for about a week, and I like it a lot, but there are other color combinations available, including pink, orange and white, and a purpleish-blue color scheme. Also, the POP Icon Keys runs on two AAA batteries, which Logitech says can provide 36 months of operation thanks to the keyboard’s onboard power management.\nIf that’s where the story ended for the POP Icon Keys, I’d recommend it because it’s a very good keyboard for the price. 
What sets the POP Icon Keys apart, though, is that it goes a step further, adding automation features similar to those found on the more expensive MX Creative Console.\n\nLogitech has designated the Home, End, Page Up, Page Down, F4-F12, and brightness keys as programmable via its Logi Options+ app. Among other things, you can use these keys to control system settings, execute keyboard shortcuts, and run multiple actions combined into macros. The keys’ original functionality remains available, too, if you hold down the function button. The POP Icon Keys also shares the MX Creative Console’s ability to set up app-specific profiles, meaning you can program keys to perform different tasks depending on which app is active.\n\nFor example, you could use the Home, End, Page Up, and Page Down buttons to open different sets of apps for work, a special project, or relaxing with a game. Or you could use the function keys to trigger keyboard shortcuts in your favorite apps or Shortcuts automations.\nThere are a couple of things I love about this functionality. First, the flexibility is fantastic, especially since you can access the programmable keys without taking your hands off the keyboard, which is an advantage over the MX Creative Console. Second, for just $50, the POP Icon Keys is a great entry point into the world of push-button automation. If it turns out that keyboard-driven automation isn’t your thing, you still have an excellent keyboard, but if it is, you can go a long way with the POP Icon Keys’ options before you graduate to the MX Creative Console or another similar device.\nAll in all, I like the POP Icon Keys a lot. It’s nicely built and a great way to get started with keyboard automation or supplement other automation workflows you already use. 
The device is available directly from Logitech and Amazon.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-10-08T09:04:06-04:00", "date_modified": "2024-10-11T12:28:54-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "accessories", "automation", "keyboard", "Logitech", "news", "reviews" ] }, { "id": "https://www.macstories.net/?p=76689", "url": "https://www.macstories.net/stories/first-look-logitechs-mx-creative-console-is-poised-to-compete-with-elgatos-stream-deck-lineup/", "title": "First Look: Logitech\u2019s MX Creative Console Is Poised to Compete with Elgato\u2019s Stream Deck Lineup", "content_html": "
\"Source:

Source: Logitech.

\n

Today, Logitech revealed the MX Creative Console, the company’s first product that takes advantage of technology from Loupedeck, a company it acquired in July 2023.

\n
\n

I’ve been a user of Loupedeck products since 2019. When I heard about the acquisition last summer, I was intrigued. Loupedeck positioned itself as a premium accessory for creatives. The company’s early products were dedicated keyboard-like accessories for apps like Adobe Lightroom Classic. With the Loupedeck Live and later, the Live S, Loupedeck’s focus expanded to encompass the needs of streamers and automation more generally.

\n

Suddenly, Loupedeck was competing head-to-head with Elgato and its line of Stream Deck peripherals. I’ve always preferred Loupedeck’s more premium hardware to the Stream Deck, but that came at a higher cost, which I expect made it hard to compete.

\n
\"The

The Logitech MX Creative Console slots nicely into my existing setup.

\n

Fast forward to today, and the first Logitech product featuring Loupedeck’s know-how has been announced: the MX Creative Console. It’s a new direction for the hardware, coupled with familiar software. I’ve had Logitech’s new device for a couple of weeks, and I like it a lot.

\n

The MX Creative Console is first and foremost built for Adobe users. That’s clear from the three-month free trial of Creative Cloud that comes with the $199.99 device. Logitech has not only partnered with Adobe for the free trial, but it has also worked with Adobe to create a series of plugins specifically for Adobe’s most popular apps, although plugins for other apps are available, too.

\n

\n

I use Adobe apps, but my interest in the MX Creative Console is its ability to run keyboard shortcuts, trigger various system events, and string together multiple actions as macros. For example, I’m using the MX Creative Console to navigate RSS, add files to Dropover, manage my windows, and take screenshots. Those are things you can do with a Stream Deck, too, but Logitech’s MX Creative Console has a few special things going for it that I love.

\n
\"Up

Up close with the MX Creative Console’s keypad.

\n

First, there’s the fact that the MX Creative Console comes in two parts. The first is a wireless dialpad with a big knob, a scroll wheel, and four programmable buttons; the dialpad is wireless because it has no screens, allowing it to run on AAA batteries. The second part is a keypad with nine customizable buttons plus two buttons for paging among multiple sets of the nine buttons. The two devices can work together, allowing, for example, a press of something like a brightness button on the keypad to control brightness via the dialpad’s knob.

\n

The keypad’s design is closer to that of a Stream Deck than a Loupedeck, which sacrifices some of the Loupedeck’s premium feel, but I still prefer it to a Stream Deck. The keys have a similar but perhaps slightly shallower throw and aren’t as concave as the Stream Deck’s. That means the icons assigned to each key’s little display aren’t as distorted by the shape of the keys as they are on a Stream Deck. There’s also a subtle lip on the edge of each key and a bump on the center key that makes it easy to orient your hand on the MX Creative Console’s keypad without looking at it.

\n
\"Source:

Source: Logitech.

\n

As for the dialpad, it connects to your computer wirelessly via Bluetooth or Logitech’s proprietary Bolt dongle. Either way, the dialpad can be paired with up to three devices, just like many of the company’s keyboards – something I wish Apple would do with its own input devices. In addition to a knob that’s excellent for adjusting sliders or scrolling horizontally, it includes a scroll wheel for navigating long vertical pages and four programmable buttons. The dialpad is even compatible with the iPad, too, connecting via Bluetooth and operating the same way a third-party mouse does for scrolling and clicking.

\n

Overall, my first impressions of the MX Creative Console’s hardware have been positive. By separating the device into two parts, it’s far more portable than many similar devices. I wouldn’t hesitate to throw one or the other or both into my bag because of their compact size and minimal weight. When I’m at my desk, the keypad includes a stand that holds the device at a little more than a 45° angle, too.

\n
\"Programming

Programming the MX Creative Console’s keypad with Logi Options+.

\n

I’m less excited about the MX Creative Console’s software. I’ve been using a beta version, so I’ll reserve judgment until its final release in October, but so far, programming the device isn’t great. That’s true of the Stream Deck too. Like it, Logitech uses a cross-platform app, Logitech Options+, that appears to be built with web technologies and just isn’t very good. Loupedeck users will recognize elements of Loupedeck’s software when they dig into Options+ to program the dialpad or keypad. But that familiarity isn’t an advantage because Loupedeck’s software was one of its weakest points as well.

\n

Logitech has done an admirable job of competing on hardware, but at least in its beta form, Options+ feels like it’s trying to steal Stream Deck’s crown for janky setup software. The only silver lining is that anyone who has used a Stream Deck or Loupedeck before won’t be surprised by Options+’s limitations.

\n

Still, the Logitech MX Creative Console is excellent overall. I’d prefer better software support, but again, it’s worth noting that the version of Options+ I’ve been using is a beta, and it does get the job done. Although the hardware isn’t as nice as the Loupedeck Live S, I prefer it to a standard Stream Deck and appreciate that it’s been split into two components, which allows for a variety of desk setups and easier portability. I can’t wait to see where Logitech takes the MX Creative Console next and how Elgato responds.

\n

The Logitech MX Creative Console is available in black and light gray and can be pre-ordered today on Amazon or Logitech’s website. According to Amazon’s listing, pre-orders will be delivered on October 16th.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Source: Logitech.\nToday, Logitech revealed the MX Creative Console, the company’s first product that takes advantage of technology from Loupedeck, a company it acquired in July 2023.\n\nI’ve been a user of Loupedeck products since 2019. When I heard about the acquisition last summer, I was intrigued. Loupedeck positioned itself as a premium accessory for creatives. The company’s early products were dedicated keyboard-like accessories for apps like Adobe Lightroom Classic. With the Loupedeck Live and later, the Live S, Loupedeck’s focus expanded to encompass the needs of streamers and automation more generally.\nSuddenly, Loupedeck was competing head-to-head with Elgato and its line of Stream Deck peripherals. I’ve always preferred Loupedeck’s more premium hardware to the Stream Deck, but that came at a higher cost, which I expect made it hard to compete.\nThe Logitech MX Creative Console slots nicely into my existing setup.\nFast forward to today, and the first Logitech product featuring Loupedeck’s know-how has been announced: the MX Creative Console. It’s a new direction for the hardware, coupled with familiar software. I’ve had Logitech’s new device for a couple of weeks, and I like it a lot.\nThe MX Creative Console is first and foremost built for Adobe users. That’s clear from the three-month free trial of Creative Cloud that comes with the $199.99 device. Logitech has not only partnered with Adobe for the free trial, but it has also worked with Adobe to create a series of plugins specifically for Adobe’s most popular apps, although plugins for other apps are available, too.\n\nI use Adobe apps, but my interest in the MX Creative Console is its ability to run keyboard shortcuts, trigger various system events, and string together multiple actions as macros. For example, I’m using the MX Creative Console to navigate RSS, add files to Dropover, manage my windows, and take screenshots. 
Those are things you can do with a Stream Deck, too, but Logitech’s MX Creative Console has a few special things going for it that I love.\nUp close with the MX Creative Console’s keypad.\nFirst, there’s the fact that the MX Creative Console comes in two parts. The first is a wireless dialpad with a big knob, a scroll wheel, and four programmable buttons; the dialpad is wireless because it has no screens, allowing it to run on AAA batteries. The second part is a keypad with nine customizable buttons plus two buttons for paging among multiple sets of the nine buttons. The two devices can work together, allowing, for example, a press of something like a brightness button on the keypad to control brightness via the dialpad’s knob.\nThe keypad’s design is closer to that of a Stream Deck than a Loupedeck, which sacrifices some of the Loupedeck’s premium feel, but I still prefer it to a Stream Deck. The keys have a similar but perhaps slightly shallower throw and aren’t as concave as the Stream Deck’s. That means the icons assigned to each key’s little display aren’t as distorted by the shape of the keys as they are on a Stream Deck. There’s also a subtle lip on the edge of each key and a bump on the center key that makes it easy to orient your hand on the MX Creative Console’s keypad without looking at it.\nSource: Logitech.\nAs for the dialpad, it connects to your computer wirelessly via Bluetooth or Logitech’s proprietary Bolt dongle. Either way, the dialpad can be paired with up to three devices, just like many of the company’s keyboards – something I wish Apple would do with its own input devices. In addition to a knob that’s excellent for adjusting sliders or scrolling horizontally, it includes a scroll wheel for navigating long vertical pages and four programmable buttons. 
The dialpad is even compatible with the iPad, too, connecting via Bluetooth and operating the same way a third-party mouse does for scrolling and clicking.\nOverall, my first impressions of the MX Creative Console’s hardware have been positive. By separating the device into two parts, it’s far more portable than many similar devices. I wouldn’t hesitate to throw one or the other or both into my bag because of their compact size and minimal weight. When I’m at my desk, the keypad includes a stand that holds the device at a little more than a 45° angle, too.\nProgramming the MX Creative Console’s keypad with Logi Options+.\nI’m less excited about the MX Creative Console’s software. I’ve been using a beta version, so I’ll reserve judgment until its final release in October, but so far, programming the device isn’t great. That’s true of the Stream Deck too. Like it, Logitech uses a cross-platform app, Logitech Options+, that appears to be built with web technologies and just isn’t very good. Loupedeck users will recognize elements of Loupedeck’s software when they dig into Options+ to program the dialpad or keypad. But that familiarity isn’t an advantage because Loupedeck’s software was one of its weakest points as well.\nLogitech has done an admirable job of competing on hardware, but at least in its beta form, Options+ feels like it’s trying to steal Stream Deck’s crown for janky setup software. The only silver lining is that anyone who has used a Stream Deck or Loupedeck before won’t be surprised by Options+’s limitations.\nStill, the Logitech MX Creative Console is excellent overall. I’d prefer better software support, but again, it’s worth noting that the version of Options+ I’ve been using is a beta, and it does get the job done. Although the hardware isn’t as nice as the Loupedeck Live S, I prefer it to a standard Stream Deck and appreciate that it’s been split into two components which allows for a variety of desk setups and easier portability. 
I can’t wait to see where Logitech takes the MX Creative Console next and how Elgato responds.\nThe Logitech MX Creative Console is available in black and light gray and can be pre-ordered today on Amazon or Logitech’s website. According to Amazon’s listing, pre-orders will be delivered on October 16th.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-09-24T09:28:12-04:00", "date_modified": "2024-09-24T12:41:31-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "keyboard", "Logitech", "mac", "shortcuts", "stories" ] }, { "id": "https://www.macstories.net/?p=75404", "url": "https://www.macstories.net/news/apple-marks-global-accessibility-awareness-day-with-a-preview-of-os-features-coming-later-this-year/", "title": "Apple Marks Global Accessibility Awareness Day with a Preview of OS Features Coming Later This Year", "content_html": "
\"Source:

Source: Apple.

\n

Thursday is Global Accessibility Awareness Day, and to mark the occasion, Apple has previewed several new accessibility features coming to its OSes later this year. Although this accessibility preview has become an annual affair, this year’s preview is more packed than most years, with a wide variety of features for navigating UIs, automating tasks, interacting with Siri and CarPlay, enabling live captions in visionOS, and more. Apple hasn’t announced when these features will debut, but if past years are any indication, most should be released in the fall as part of the annual OS release cycle.

\n

Eye Tracking

\n

Often, Apple’s work in one area lends itself to new accessibility features in another. With Eye Tracking in iOS and iPadOS, the connection to the company’s work on visionOS is clear. The feature will allow users to look at UI elements on the iPhone and iPad, and the front-facing camera – combined with a machine learning model – will follow their gaze, moving the selection as what they look at changes. No additional hardware is necessary.

\n

Eye Tracking also works with Dwell, meaning that when a user pauses their gaze on an interface element, it will be clicked. The feature, which requires a one-time calibration setup process, will work with Apple’s apps, as well as third-party apps, on iPhones and iPads with an A12 Bionic chip or newer.

\n

Vocal Shortcuts

\n
\"Source:

Source: Apple.

\n

Vocal Shortcuts provide a way to define custom utterances that launch shortcuts and other tasks. The phrases are defined on-device for maximum privacy using a process similar to Personal Voice. The feature is like triggering shortcuts with Siri, but it doesn’t require an assistant trigger word or phrase.

\n

\n

Music Haptics

\n

For deaf and hard-of-hearing iPhone customers, Apple has implemented Music Haptics, which use haptic feedback to allow users to feel songs streamed via Apple Music without any additional devices. Apple has also created an API for third-party developers who want to incorporate Music Haptics into their streaming services or other music apps.

\n

CarPlay and Vehicle Motion Cues

\n
\"Source:

Source: Apple.

\n

CarPlay is getting a complete accessibility makeover later this year with several new features. The system will gain sound recognition for things like car horns and sirens. Users will be able to control the UI of CarPlay with their voice. There will be a color filter setting, allowing color-blind users to see the CarPlay interface better. Plus, options to make text bold and larger will be added.

\n

Apple is also introducing a solution for passengers who experience motion sickness when using a device in the car. Vehicle Motion Cues are a series of dots that sit on the edge of your device and animate as the vehicle you’re in moves. The dots shift left and right as the car turns and up and down as it accelerates and brakes, providing your brain with contextual cues that counteract the conflict between what you see and feel, which causes motion sickness. The feature can be set to turn on automatically when your device senses that you’re in a car or activated via Control Center.

\n

Vision Pro Live Captioning

\n
\"Source:

Source: Apple.

\n

The Vision Pro is gaining Live Captions throughout visionOS. You’ll have the option to see captions onscreen in various contexts including when someone speaks to you and while enjoying immersive video. The Vision Pro will also add support for additional Made for iPhone hearing devices and cochlear hearing processors. And other accessibility options are coming to visionOS as well:

\n

\n Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.\n

\n

Everything Else

\n
\"Source:

Source: Apple.

\n

Apple’s press release previews several other accessibility-related updates as well:

\n
\n
\n

In addition, throughout the month of May, Apple Stores are offering free classes to help customers learn about accessibility features. Apple has added a ‘Calming Sounds’ shortcut to its Shortcuts Gallery that plays soothing soundscapes. Plus, the App Store, Apple TV app, Books, Fitness+, and Apple Support are joining in with content that focuses on accessibility.

\n

The week of Global Accessibility Awareness Day is the perfect time to spotlight these upcoming OS features. It raises awareness of GAAD and highlights Apple’s efforts to use technology to make its products available to as many people as possible. I’m looking forward to testing everything announced today as the summer beta cycle begins.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Source: Apple.\nThursday is Global Accessibility Awareness Day, and to mark the occasion, Apple has previewed several new accessibility features coming to its OSes later this year. Although this accessibility preview has become an annual affair, this year’s preview is more packed than most years, with a wide variety of features for navigating UIs, automating tasks, interacting with Siri and CarPlay, enabling live captions in visionOS, and more. Apple hasn’t announced when these features will debut, but if past years are any indication, most should be released in the fall as part of the annual OS release cycle.\nEye Tracking\nOften, Apple’s work in one area lends itself to new accessibility features in another. With Eye Tracking in iOS and iPadOS, the connection to the company’s work on visionOS is clear. The feature will allow users to look at UI elements on the iPhone and iPad, and the front-facing camera – combined with a machine learning model – will follow their gaze, moving the selection as what they look at changes. No additional hardware is necessary.\nEye Tracking also works with Dwell, meaning that when a user pauses their gaze on an interface element, it will be clicked. The feature, which requires a one-time calibration setup process, will work with Apple’s apps, as well as third-party apps, on iPhones and iPads with an A12 Bionic chip or newer.\nVocal Shortcuts\nSource: Apple.\nVocal Shortcuts provide a way to define custom utterances that launch shortcuts and other tasks. The phrases are defined on-device for maximum privacy using a process similar to Personal Voice. The feature is like triggering shortcuts with Siri, but it doesn’t require an assistant trigger word or phrase.\n\nMusic Haptics\nFor deaf and hard-of-hearing iPhone customers, Apple has implemented Music Haptics, which use haptic feedback to allow users to feel songs streamed via Apple Music without any additional devices. 
Apple has also created an API for third-party developers who want to incorporate Music Haptics into their streaming services or other music apps.\nCarPlay and Vehicle Motion Cues\nSource: Apple.\nCarPlay is getting a complete accessibility makeover later this year with several new features. The system will gain sound recognition for things like car horns and sirens. Users will be able to control the UI of CarPlay with their voice. There will be a color filter setting, allowing color-blind users to see the CarPlay interface better. Plus, options to make text bold and larger will be added.\nApple is also introducing a solution for passengers who experience motion sickness when using a device in the car. Vehicle Motion Cues are a series of dots that sit on the edge of your device and animate as the vehicle you’re in moves. The dots shift left and right as the car turns and up and down as it accelerates and brakes, providing your brain with contextual cues that counteract the conflict between what you see and feel, which causes motion sickness. The feature can be set to turn on automatically when your device senses that you’re in a car or activated via Control Center.\nVision Pro Live Captioning\nSource: Apple.\nThe Vision Pro is gaining Live Captions throughout visionOS. You’ll have the option to see captions onscreen in various contexts including when someone speaks to you and while enjoying immersive video. The Vision Pro will also add support for additional Made for iPhone hearing devices and cochlear hearing processors. 
And other accessibility options are coming to visionOS as well:\n\n Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.\n\nEverything Else\nSource: Apple.\nApple’s press release previews several other accessibility-related updates as well:\n\nFor users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.\nMagnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.\nBraille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.\nFor users with low vision, Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color.\nFor users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases. \nFor users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.\nFor users with physical disabilities, Virtual Trackpad for AssistiveTouch allows users to control their device using a small region of the screen as a resizable trackpad.\nSwitch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.\nVoice Control will offer support for custom vocabularies and complex words.\n\nIn addition, throughout the month of May, Apple Stores are offering free classes to help customers learn about accessibility features. 
Apple has added a ‘Calming Sounds’ shortcut to its Shortcuts Gallery that plays soothing soundscapes. Plus, the App Store, Apple TV app, Books, Fitness+, and Apple Support are joining in with content that focuses on accessibility.\nThe week of Global Accessibility Awareness Day is the perfect time to spotlight these upcoming OS features. It raises awareness of GAAD and highlights Apple’s efforts to use technology to make its products available to as many people as possible. I’m looking forward to testing everything announced today as the summer beta cycle begins.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-05-15T10:00:04-04:00", "date_modified": "2024-05-15T10:02:13-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "accessibility", "automation", "iOS", "iPadOS", "shortcuts", "visionOS", "news" ] }, { "id": "https://www.macstories.net/?p=75162", "url": "https://www.macstories.net/linked/the-joy-of-shortcuts/", "title": "The Joy of 
Shortcuts", "content_html": "

I read this post by Jarrod Blundy a few weeks ago and forgot to link it on MacStories. I think Jarrod did a great job explaining why Apple’s Shortcuts app resonates so strongly with a specific type of person:

\n

\n But mostly, it just lights up my brain in a way that few other things do.

\n

[…]

\n

But when there’s a little burr in my computing life that I think could be sanded down with Shortcuts, my wheels get turning and it’s hard to pull myself away from refining, adding features, and solving down to an ideal answer. I’m sure if I learned traditional coding, I’d feel the same. Or if I had a workshop to craft furniture or pound metal into useful shapes. But since I don’t know that much about programming languages nor have the desire to craft physical products, Shortcuts is my IDE, my workshop.\n

\n

For me, despite the (many) issues of the Shortcuts app on all platforms, the reason I can’t pull myself away from it is that there’s nothing else like it on any modern computing platform (yes, I have tried Tasker and Power Automate and, no, I did not like them). Shortcuts appeals to that part of my brain that loves it when a plan comes together and different things happen in succession. If you’re a gamer, it’s similar to the satisfaction of watching Final Fantasy XII’s Gambits play out in real time, and it’s why I need to check out Unicorn Overlord as soon as possible.

\n

I love software that lets me design a plan and watch it execute automatically. I’ve shared hundreds of shortcuts over the years, and I’m still chasing that high.

\n

\u2192 Source: heydingus.net

", "content_text": "I read this post by Jarrod Blundy a few weeks ago and forgot to link it on MacStories. I think Jarrod did a great job explaining why Apple’s Shortcuts app resonates so strongly with a specific type of person:\n\n But mostly, it just lights up my brain in a way that few other things do.\n […]\n But when there’s a little burr in my computing life that I think could be sanded down with Shortcuts, my wheels get turning and it’s hard to pull myself away from refining, adding features, and solving down to an ideal answer. I’m sure if I learned traditional coding, I’d feel the same. Or if I had a workshop to craft furniture or pound metal into useful shapes. But since I don’t know that much about programming languages nor have the desire to craft physical products, Shortcuts is my IDE, my workshop.\n\nFor me, despite the (many) issues of the Shortcuts app on all platforms, the reason I can’t pull myself away from it is that there’s nothing else like it on any modern computing platform (yes, I have tried Tasker and Power Automate and, no, I did not like them). Shortcuts appeals to that part of my brain that loves it when a plan comes together and different things happen in succession. If you’re a gamer, it’s similar to the satisfaction of watching Final Fantasy XII’s Gambits play out in real time, and it’s why I need to check out Unicorn Overlord as soon as possible.\nI love software that lets me design a plan and watch it execute automatically. 
I’ve shared hundreds of shortcuts over the years, and I’m still chasing that high.\n\u2192 Source: heydingus.net", "date_published": "2024-04-24T12:48:49-04:00", "date_modified": "2024-04-24T12:48:49-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "shortcuts", "Linked" ] }, { "id": "https://www.macstories.net/?p=74954", "url": "https://www.macstories.net/news/obsidian-shortcut-launcher-1-1-brings-support-for-file-properties-and-backlinks/", "title": "Obsidian Shortcut Launcher 1.1 Brings Support for File Properties and Backlinks", "content_html": "
\"The

The updated Obsidian Shortcut Launcher with support for passing document backlinks to Shortcuts.

\n

Two years ago, we released Obsidian Shortcut Launcher, a free plugin to trigger shortcuts from Obsidian with the ability to pass input text to Apple’s automation app. In case you missed it in January 2022, here’s how I described the plugin:

\n

\n With Obsidian Shortcut Launcher (or ‘OSL’), you’ll be able to trigger any shortcut you want from Obsidian, passing along values such as the text of the document you’re working on, its name, text selection, and more. Obsidian Shortcut Launcher is free to use and works on iOS, iPadOS, and macOS.

\n

Obsidian Shortcut Launcher is the result of weeks of planning and work from me and Finn Voorhees, and it has created an entirely new dimension in how I use Obsidian and Shortcuts on a daily basis.\n

\n

I’ve been using Obsidian Shortcut Launcher every day for the past two years, and I couldn’t imagine a better way to integrate my favorite text editor and note-taking app with Shortcuts. I’ve built launchers to publish articles to WordPress, upload images, perform backups of my iOS reviews, and a lot more. You can read more about my examples and find a usage guide for the plugin in the original story.

\n

Today, I’m pleased to announce that we’re releasing version 1.1 of Obsidian Shortcut Launcher with two new integrations: properties and backlinks.

\n

\n

Released last year, properties are Obsidian’s more visual and user-friendly take on YAML metadata, which had long been supported in the app prior to the launch of native properties. As the Obsidian team explains in their documentation:

\n

\n Properties allow you to organize information about a note. Properties contain structured data such as text, links, dates, checkboxes, and numbers. Properties can also be used in combination with Community plugins that can do useful things with your structured data.\n

\n

In the updated Obsidian Shortcut Launcher, you can now choose a ‘Properties’ input type upon creating a new launcher that will allow you to pass the file’s current properties (and their values) to Shortcuts. Thanks to Finn’s excellent work, properties for the current document are passed to Shortcuts with the same JSON dictionary structure followed by Obsidian’s underlying YAML metadata. Any type of property is supported, too: dates, checkboxes, text, and lists can all be sent from Obsidian to Shortcuts using the plugin.

\n
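
To picture what that payload looks like on the Shortcuts side, here's a small Python sketch. The frontmatter values are made up for illustration; the real dictionary follows whatever properties your own note defines.

```python
import json

# Hypothetical example of the JSON dictionary OSL passes to Shortcuts for a
# note's properties; keys and values here are invented for illustration.
payload = '{"title": "My Note", "done": false, "tags": ["automation", "shortcuts"], "created": "2024-04-04"}'

properties = json.loads(payload)

# In Shortcuts, a 'Get Dictionary Value' action reads keys the same way:
tags = properties["tags"]
is_done = properties["done"]
```

Because the structure mirrors Obsidian's underlying YAML, dates, checkboxes, text, and lists all come through as their natural JSON types.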
\"Setting

Setting up the new integrations in Obsidian Shortcut Launcher 1.1.

\n

Here’s what a document with some properties looks like in Obsidian:

\n
\"Document

Document properties in Obsidian.

\n

And here’s what the input text passed to Shortcuts via the plugin looks like:

\n
\"Here's

Here’s what properties look like when passed to Shortcuts.

\n

If you use properties to provide metadata and additional organization to your notes, I think you’re really going to like the possibilities opened up by this integration. That is especially true if you try this on macOS, where Obsidian Shortcut Launcher can run shortcuts in the background thanks to Apple’s native API for the Shortcuts app.

\n

Obsidian Shortcut Launcher can now also retrieve backlinks for the current document and pass them to Shortcuts. If you’re in a document and want to see which other documents in Obsidian are linking to it, you can, of course, show the backlinks tab in the Obsidian UI:

\n
\"The

The Backlinks tab in Obsidian.

\n

However, if you want to do something with those backlinks using Shortcuts, you now can thanks to Obsidian Shortcut Launcher. Backlinks for the active document will be passed to Shortcuts with their absolute file paths in the Obsidian vault, as shown below:

\n
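
As an illustration, suppose the backlinks reach Shortcuts as a newline-separated list of vault paths (the delimiter here is an assumption for the sketch); deriving note titles from them is then trivial:

```python
# Invented sample input: vault paths for the active note's backlinks,
# one per line, as they might arrive in Shortcuts.
backlinks_text = "Projects/2024 Review.md\nDaily/2024-04-04.md"

# Strip folders and the .md extension to get note titles, e.g. to feed
# a 'Choose from List' action.
note_titles = [
    path.rsplit("/", 1)[-1].removesuffix(".md")
    for path in backlinks_text.splitlines()
    if path.strip()
]
```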
\"Showing

Showing backlinks for the active Obsidian document with Shortcuts.

\n

To demonstrate the capabilities offered by the integration with backlinks, I created a shortcut called Backlinks Navigator that presents you with a simple interface to perform actions on the active note’s backlinks. First, make sure you’re running version 1.1 of Obsidian Shortcut Launcher, and set up a launcher like this:

\n
\"\"

\n

Then, download my shortcut:

\n
\n
\n \"\"
\n

A shortcut to navigate backlinks passed by the Obsidian Shortcut Launcher plugin. This shortcut requires a free plugin in Obsidian and the Actions for Obsidian app.

\n

Get the shortcut here.

\n\n
\n
\n
\n

The shortcut integrates with the outstanding utility Actions for Obsidian, which I plan to cover more frequently now that I’m working with macOS again thanks to my MacPad.

\n

Once everything is set up, trigger the shortcut Backlinks Navigator from Obsidian, and, if you’re on a Mac, you’ll be immediately shown a menu inside Obsidian with options for your backlinks. For instance, you can choose to open all backlinks in separate Obsidian tabs at once or manually select which backlinks you want to open.

\n

This is just an example of what you can achieve thanks to the deeper integration between Obsidian and Shortcuts granted by the updated plugin. I can’t wait to see what folks make with it.

\n

Get Version 1.1, and an Advanced Shortcut Coming Later This Week

\n

Later this week in MacStories Weekly for Club members, I’ll share an advanced shortcut that takes advantage of the new features in Obsidian Shortcut Launcher. With the shortcut, you’ll be able to turn an Obsidian document that contains a specific property into a task in the Things app that is deep-linked back to Obsidian. I’ve been using this shortcut myself for the past several weeks as part of my collection of Things shortcuts, and I love how tightly integrated with Obsidian it is.

\n
\"I've

I’ve been using a shortcut to turn Obsidian documents with a specific property into a deep-linked task in Things.

\n

You can get started with Club MacStories at just $5/month using the buttons below, or you can check out other plans with more perks here.

\n
\nJoin Annually (starts at $50/year) | Join Monthly (starts at $5/month)\n
\n

You can find Obsidian Shortcut Launcher 1.1 in Obsidian’s Community Plugins section. If you already have the plugin installed, just search for updates and download the new version.

\n

If you put together automations with the plugin and Shortcuts, I’d love to hear from you.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "The updated Obsidian Shortcut Launcher with support for passing document backlinks to Shortcuts.\nTwo years ago, we released Obsidian Shortcut Launcher, a free plugin to trigger shortcuts from Obsidian with the ability to pass input text to Apple’s automation app. In case you missed it in January 2022, here’s how I described the plugin:\n\n With Obsidian Shortcut Launcher (or ‘OSL’), you’ll be able to trigger any shortcut you want from Obsidian, passing along values such as the text of the document you’re working on, its name, text selection, and more. Obsidian Shortcut Launcher is free to use and works on iOS, iPadOS, and macOS.\n Obsidian Shortcut Launcher is the result of weeks of planning and work from me and Finn Voorhees, and it has created an entirely new dimension in how I use Obsidian and Shortcuts on a daily basis.\n\nI’ve been using Obsidian Shortcut Launcher every day for the past two years, and I couldn’t imagine a better way to integrate my favorite text editor and note-taking app with Shortcuts. I’ve built launchers to publish articles to WordPress, upload images, perform backups of my iOS reviews, and a lot more. You can read more about my examples and find a usage guide for the plugin in the original story.\nToday, I’m pleased to announce that we’re releasing version 1.1 of Obsidian Shortcut Launcher with two new integrations: properties and backlinks.\n\nReleased last year, properties are Obsidian’s more visual and user-friendly take on YAML metadata, which had long been supported in the app prior to the launch of native properties. As the Obsidian team explains in their documentation:\n\n Properties allow you to organize information about a note. Properties contain structured data such as text, links, dates, checkboxes, and numbers. 
Properties can also be used in combination with Community plugins that can do useful things with your structured data.\n\nIn the updated Obsidian Shortcut Launcher, you can now choose a ‘Properties’ input type upon creating a new launcher that will allow you to pass the file’s current properties (and their values) to Shortcuts. Thanks to Finn’s excellent work, properties for the current document are passed to Shortcuts with the same JSON dictionary structure followed by Obsidian’s underlying YAML metadata. Any type of property is supported, too: dates, checkboxes, text, and lists can all be sent from Obsidian to Shortcuts using the plugin.\nSetting up the new integrations in Obsidian Shortcut Launcher 1.1.\nHere’s what a document with some properties looks like in Obsidian:\nDocument properties in Obsidian.\nAnd here’s what the input text passed to Shortcuts via the plugin looks like:\nHere’s what properties look like when passed to Shortcuts.\nIf you use properties to provide metadata and additional organization to your notes, I think you’re really going to like the possibilities opened up by this integration. That is especially true if you try this on macOS, where Obsidian Shortcut Launcher can run shortcuts in the background thanks to Apple’s native API for the Shortcuts app.\nObsidian Shortcut Launcher can now also retrieve backlinks for the current document and pass them to Shortcuts. If you’re in a document and want to see which other documents in Obsidian are linking to it, you can, of course, show the backlinks tab in the Obsidian UI:\nThe Backlinks tab in Obsidian.\nHowever, if you want to do something with those backlinks using Shortcuts, you now can thanks to Obsidian Shortcut Launcher. 
Backlinks for the active document will be passed to Shortcuts with their absolute file paths in the Obsidian vault, as shown below:\nShowing backlinks for the active Obsidian document with Shortcuts.\nTo demonstrate the capabilities offered by the integration with backlinks, I created a shortcut called Backlinks Navigator that presents you with a simple interface to perform actions on the active note’s backlinks. First, make sure you’re running version 1.1 of Obsidian Shortcut Launcher, and set up a launcher like this:\n\nThen, download my shortcut:\n\n \n \n Backlinks NavigatorA shortcut to navigate backlinks passed by the Obsidian Shortcut Launcher plugin. This note requires a free plugin in Obsidian and the Actions for Obsidian app.\nGet the shortcut here.\n\n \n \n\nThe shortcut integrates with the outstanding utility Actions for Obsidian, which I plan to cover more frequently now that I’m working with macOS again thanks to my MacPad.\nOnce everything is set up, trigger the shortcut Backlinks Navigator from Obsidian, and, if you’re on a Mac, you’ll be immediately shown a menu inside Obsidian with options for your backlinks. For instance, you can choose to open all backlinks in separate Obsidian tabs at once or manually select which backlinks you want to open.\nThis is just an example of what you can achieve thanks to the deeper integration between Obsidian and Shortcuts granted by the updated plugin. I can’t wait to see what folks make with it.\nGet Version 1.1, and an Advanced Shortcut Coming Later This Week\nLater this week in MacStories Weekly for Club members, I’ll share an advanced shortcut that takes advantage of the new features in Obsidian Shortcut Launcher. With the shortcut, you’ll be able to turn an Obsidian document that contains a specific property into a task in the Things app that is deep-linked back to Obsidian. 
I’ve been using this shortcut myself for the past several weeks as part of my collection of Things shortcuts, and I love how tightly integrated with Obsidian it is.\nI’ve been using a shortcut to turn Obsidian documents with a specific property into a deep-linked task in Things.\nYou can get started with Club MacStories at just $5/month using the buttons below, or you can check out out other plans with more perks here.\n\nJoin AnnuallyStarts at $50/yearJoin MonthlyStarts at $5/month\n\nYou can find Obsidian Shortcut Launcher 1.1 in Obsidian’s Community Plugins section. If you already have the plugin installed, just search for updates and download the new version.\nIf you put together automations with the plugin and Shortcuts, I’d love to hear from you.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-04-04T10:30:17-04:00", "date_modified": "2024-04-05T07:21:34-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "mac", 
"Obsidian", "shortcuts", "news" ] }, { "id": "https://www.macstories.net/?p=74798", "url": "https://www.macstories.net/ios/apple-frames-3-2-brings-iphone-15-pro-frames-files-picker-and-adjustable-spacing/", "title": "Apple Frames 3.2 Brings iPhone 15 Pro Frames, Files Picker, and Adjustable Spacing", "content_html": "
\"Apple

Apple Frames 3.2.

\n

Today, I’m releasing version 3.2 of Apple Frames, my shortcut to put screenshots taken on Apple devices into physical device templates. If you want to skip ahead, you can download Apple Frames 3.2 at the end of this story or find it in the MacStories Shortcuts Archive.

\n

Version 3.2 is a major update that introduces brand new frames for the iPhone 15 Pro line, improves the reliability of framing screenshots from devices with the Dynamic Island, and, perhaps most importantly, extends the Frames API with new configuration options to give you even more control over framed images.

\n

Let’s dive in.

\n

\n

iPhone 15 Pro Frames and Improved Dynamic Island Support

\n

You may not notice this since the differences between the iPhone 14 Pro and 15 Pro frames are minor, but Apple Frames 3.2 comes with brand new image assets for screenshots taken on an iPhone 15 Pro or 15 Pro Max.

\n
\"The

The updated iPhone 15 Pro Max frames.

\n

Because of this change, installing this updated shortcut will require you to re-download a Frames.zip file from the MacStories CDN. This is a one-time-only operation that will download and install a folder in iCloud Drive ⇾ Shortcuts containing the graphical assets used by the shortcut. You’ll be prompted to do this the first time you run Apple Frames 3.2 and won’t be asked again in the future. Just tap ‘Always Allow’ to continue and wait for the download to finish.

\n
\"The

The update process for Apple Frames 3.2.

\n

Speaking of modern iPhones, I’ve also improved the logic inside the shortcut that deals with framing screenshots from devices with the Dynamic Island. Without getting too deep into the technicalities of what I did here, you should know that the new actions in Apple Frames 3.2 will make for more robust and easy-to-update image overlaying in the future.

\n

Due to a new bug in Shortcuts, it’s possible you may have to pick the source and destination folders from Files/Finder twice after setting up Apple Frames on an iPhone and then using it on a Mac, and vice versa. This is a Shortcuts issue that is, unfortunately, out of my control. If you see this error message, pick the folders pictured in the linked screenshot again and run Apple Frames. I hope Apple fixes this soon.

\n

Frames API: Files Picker, Spacing Controls, and Sorting Options

\n

I introduced the Frames API last year as a way to script the behavior of Apple Frames using other shortcuts, launchers, or automation apps for Mac such as Raycast and Alfred. As I wrote in the original post:

\n

\n The big change in Apple Frames 3.1 is the availability of a lightweight API that lets you control the shortcut’s behavior with simple text commands. It may seem silly to make an “API” for a shortcut running on your iPhone or Mac, but this is, after all, a little programming interface for Apple Frames, so I think it’s only fair to call it that.

\n

Here’s the gist: you can now script Apple Frames with commands that tell it where to take images from (input commands) and where to save the framed images (output commands). You can still run Apple Frames manually like you’ve always done; however, if you want to save even more time, you can also program Apple Frames 3.1 to get screenshots from a specific source and perform a specific action with the output without having to manually pick images or options from a list.\n

\n

If you haven’t played around with the Frames API yet, I highly recommend that you go back and read the documentation and instructions here. The story also includes a few downloadable examples of the input and output commands supported by the Frames API, which allow you to, effectively, turn Apple Frames into a system-wide utility that you can invoke from any app or existing automated workflow.

\n

Apple Frames 3.2 extends the Frames API with three new features.

\n

The first one is an update to the pick input command, which now supports either picking screenshots from the Photos app (as was the case before) or – new in 3.2 – selecting them from a Files/Finder document browser.

\n
\"Running

Running Apple Frames with the Files picker.

\n

With this update, instead of just pick, you now have to pass either pick(Photos) or pick(Files) as an input command to Apple Frames via the API. Of course, all the other input commands introduced in Apple Frames 3.1 are still supported.

\n
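
If you're curious how a text command like this can be interpreted, here's a rough, hypothetical Python sketch of the parsing idea. The actual shortcut is built from Shortcuts actions, not Python, so this is only a model of the behavior:

```python
import re

def parse_pick(command: str) -> str:
    """Return the picker source ('Photos' or 'Files') from a pick(...) command."""
    match = re.fullmatch(r"pick\((Photos|Files)\)", command.strip())
    if match is None:
        raise ValueError(f"Unsupported input command: {command!r}")
    return match.group(1)
```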

If you want to run Apple Frames with a Files picker by default, this is all you have to do thanks to version 3.2 and the updated Frames API:

\n
\"How

How to run Apple Frames with the Files/Finder picker in version 3.2.

\n

The second addition to the Frames API is the ability to control spacing between images. By default, if you pass multiple screenshots to Apple Frames, the shortcut will frame and combine them into a single image, like this:

\n
\"The

The default spacing between images in Apple Frames.

\n

As I explained last year, there is an API override to disable merging, but I was recently asked by Jonathan to add another option: a way to make framed screenshots look “less tight” when merged into a single composite image. Thus the +mergeSpacing(n) override was born.

\n

By default, Apple Frames uses 60 pixels as the spacing between images. If you find this value too low or too high, you can now override it with the +mergeSpacing(n) input flag, where n stands for any numeric value you want. Here’s the same image I used above, but this time with the +mergeSpacing(420) override to make framed screenshots more spaced out:

\n
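
The effect of the spacing value on a merged composite is simple arithmetic. Assuming screenshots are combined horizontally (the pixel widths below are made up for illustration), a quick sketch:

```python
def composite_width(widths, spacing=60):
    """Total width of framed screenshots merged with `spacing` px between them."""
    if not widths:
        return 0
    # n images have n - 1 gaps between them.
    return sum(widths) + spacing * (len(widths) - 1)

default_width = composite_width([1290, 1290, 1290])              # built-in 60 px gaps
roomier_width = composite_width([1290, 1290, 1290], spacing=420)  # +mergeSpacing(420)
```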
\"Increased

Increased spacing in Apple Frames 3.2.

\n

And this is all you have to do to run Apple Frames with this setting:

\n
\"Remember:

Remember: when using the Frames API, you always have to include an input command for the source of images, such as pick. In this image, I ran Apple Frames with the Photos picker and increased spacing.

\n

If you want, you can also change the default value of 60 by editing the following action inside the Shortcuts editor:

\n
\"The

The variable that controls spacing.

\n

The third new feature is something I should have done years ago: you can now choose whether you want to see your latest screenshots first when running Apple Frames in manual picking mode, or if you want to see your oldest screenshots first instead.

\n

The default behavior of Apple Frames has always been to show your latest screenshots first, meaning the image grid in the Photos picker is sorted in reverse chronological order. However, I (and a small percentage of Apple Frames users) like to see our oldest screenshots first. The reasons for this are twofold:

\n

We prune our screenshots album frequently, so scrolling from the oldest screenshots to the latest ones isn’t a huge deal since the list is short.

\n

Selecting screenshots from left to right (oldest to newest) will also combine them from left to right in the shortcut, matching their order in the picker’s grid.

\n
\"Oldest

Oldest screenshots first (left) vs. latest screenshots first.

\n

Since I understand the value of both options, I’ve made this behavior configurable in Apple Frames 3.2. You will be asked at setup to pick your preferred sorting order, but you can always change the Oldest First variable inside the shortcut later. By default, it’s set to False, meaning that your most recent screenshots are shown first; if you set it to True, your oldest ones will be shown first in the photo picker instead.

\n
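
The toggle boils down to a sort direction. Here's a hypothetical Python sketch of the two orders (filenames and dates are invented for illustration):

```python
from datetime import date

# Invented sample screenshots with capture dates.
screenshots = [
    ("IMG_0003.png", date(2024, 3, 20)),
    ("IMG_0001.png", date(2024, 3, 1)),
    ("IMG_0002.png", date(2024, 3, 10)),
]

def picker_order(shots, oldest_first=False):
    """Sort ascending by date when oldest_first is True, descending otherwise."""
    return sorted(shots, key=lambda shot: shot[1], reverse=not oldest_first)

oldest = [name for name, _ in picker_order(screenshots, oldest_first=True)]
newest = [name for name, _ in picker_order(screenshots)]
```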
\"You

You can set the ‘Oldest First’ variable during the shortcut’s setup flow.

\n
\"To

To change the photo picker’s sorting, tweak this variable.

\n

The Frames API offers a pretty large selection of options and flags now, which is why I’m going to dive deeper into its more advanced functionalities in this week’s issue of MacStories Weekly for Club members. In the story, I will share more advanced examples of the Frames API and explain how you can chain input and output commands with overrides to turn Apple Frames into the ultimate screenshot-framing utility for all Apple platforms.

\n

To get MacStories Weekly, you just need to sign up for a basic Club MacStories plan at $5/month or $50/year here or by using the buttons below:

\n
\nJoin Annually (starts at $50/year) | Join Monthly (starts at $5/month)\n
\n

Download Apple Frames 3.2

\n

Apple Frames continues to be a labor of love that I build primarily for myself, but which I know is also used by thousands of designers, developers, and Apple users who want a nicer way to share screenshots. I’m especially happy with the latest automation capabilities of the Frames API, so I hope you’ll find this shortcut as useful as I do on a daily basis.

\n

You can download Apple Frames 3.2 below and find it in the MacStories Shortcuts Archive.

\n
\n
\n \"\"
\n

Apple Frames

Add device frames to screenshots for iPhones (8/SE, 11, 12, 13, 14, 15 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image.
\nThe shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Apple Frames 3.2.\nToday, I’m releasing version 3.2 of Apple Frames, my shortcut to put screenshots taken on Apple devices into physical device templates. If you want to skip ahead, you can download Apple Frames 3.2 at the end of this story or find it in the MacStories Shortcuts Archive.\nVersion 3.2 is a major update that introduces brand new frames for the iPhone 15 Pro line, improves the reliability of framing screenshots from devices with the Dynamic Island, and, perhaps most importantly, extends the Frames API with new configuration options to give you even more control over framed images.\nLet’s dive in.\n\niPhone 15 Pro Frames and Improved Dynamic Island Support\nYou may not notice this since the differences between the iPhone 14 Pro and 15 Pro frames are minor, but Apple Frames 3.2 comes with brand new image assets for screenshots taken on an iPhone 15 Pro or 15 Pro Max.\nThe updated iPhone 15 Pro Max frames.\nBecause of this change, installing this updated shortcut will require you to re-download a Frames.zip file from the MacStories CDN. This is a one-time-only operation that will download and install a folder in iCloud Drive ⇾ Shortcuts containing the graphical assets used by the shortcut. You’ll be prompted to do this the first time you run Apple Frames 3.2 and won’t be asked again in the future. Just tap ‘Always Allow’ to continue and wait for the download to finish.\nThe update process for Apple Frames 3.2.\nSpeaking of modern iPhones, I’ve also improved the logic inside the shortcut that deals with framing screenshots from devices with the Dynamic Island. 
Without getting too deep into the technicalities of what I did here, you should know that the new actions in Apple Frames 3.2 will make for more robust and easy-to-update image overlaying in the future.\nDue to a new bug in Shortcuts, it’s possible you may have to pick the source and destination folders from Files/Finder twice after setting up Apple Frames on an iPhone and then using it on a Mac, and vice versa. This is a Shortcuts issue that is, unfortunately, out of my control.\nIf you see this error message, pick the folders pictured in the linked screenshot again and run Apple Frames. I hope Apple fixes this soon.\n\nFrames API: Files Picker, Spacing Controls, and Sorting Options\nI introduced the Frames API last year as a way to script the behavior of Apple Frames using other shortcuts, launchers, or automation apps for Mac such as Raycast and Alfred. As I wrote in the original post:\n\n The big change in Apple Frames 3.1 is the availability of a lightweight API that lets you control the shortcut’s behavior with simple text commands. It may seem silly to make an “API” for a shortcut running on your iPhone or Mac, but this is, after all, a little programming interface for Apple Frames, so I think it’s only fair to call it that.\n Here’s the gist: you can now script Apple Frames with commands that tell it where to take images from (input commands) and where to save the framed images (output commands). You can still run Apple Frames manually like you’ve always done; however, if you want to save even more time, you can also program Apple Frames 3.1 to get screenshots from a specific source and perform a specific action with the output without having to manually pick images or options from a list.\n\nIf you haven’t played around with the Frames API yet, I highly recommend that you go back and read the documentation and instructions here. 
The story also includes a few downloadable examples of the input and output commands supported by the Frames API, which allow you to, effectively, turn Apple Frames into a system-wide utility that you can invoke from any app or existing automated workflow.\nApple Frames 3.2 extends the Frames API with three new features.\nThe first one is an update to the pick input command, which now supports either picking screenshots from the Photos app (as was the case before) or – new in 3.2 – selecting them from a Files/Finder document browser.\nRunning Apple Frames with the Files picker.\nWith this update, instead of just pick, you now have to pass either pick(Photos) or pick(Files) as an input command to Apple Frames via the API. Of course, all the other input commands introduced in Apple Frames 3.1 are still supported.\nIf you want to run Apple Frames with a Files picker by default, this is all you have to do thanks to version 3.2 and the updated Frames API:\nHow to run Apple Frames with the Files/Finder picker in version 3.2.\nThe second addition to the Frames API is the ability to control spacing between images. By default, if you pass multiple screenshots to Apple Frames, the shortcut will frame and combine them into a single image, like this:\nThe default spacing between images in Apple Frames.\nAs I explained last year, there is an API override to disable merging, but I was recently asked by Jonathan to add another option: a way to make framed screenshots look “less tight” when merged into a single composite image. Thus the +mergeSpacing(n) override was born.\nBy default, Apple Frames uses a default value of 60 pixels as the spacing between images. If you find this value too low or high, you can now override it with the +mergeSpacing(n) input flag, where n stands for any numeric value you want. 
Here’s the same image I used above, but this time with the +mergeSpacing(420) override to make framed screenshots more spaced out:\nIncreased spacing in Apple Frames 3.2.\nAnd this is all you have to do to run Apple Frames with this setting:\nRemember: when using the Frames API, you always have to include an input command for the source of images, such as pick. In this image, I ran Apple Frames with the Photos picker and increased spacing.\nIf you want, you can also change the default value of 60 by editing the following action inside the Shortcuts editor:\nThe variable that controls spacing.\nThe third new feature is something I should have done years ago: you can now choose whether you want to see your latest screenshots first when running Apple Frames in manual picking mode, or if you want to see your oldest screenshots first instead.\nThe default behavior of Apple Frames has always been to show your latest screenshots first, meaning the image grid in the Photos picker is sorted in reverse chronological order. However, I (and a small percentage of Apple Frames users) like to see our oldest screenshots first. The reasons for this are twofold:\nWe prune our screenshots album frequently, so scrolling from the oldest screenshots to the latest ones isn’t a huge deal since the list is short.\nSelecting screenshots from left to right (oldest to newest) will also combine them from left to right in the shortcut, matching their order in the picker’s grid.\nOldest screenshots first (left) Vs. latest screenshots first.\nSince I understand the value of both options, I’ve made this behavior configurable in Apple Frames 3.2. You will be asked at setup to pick your preferred sorting order, but you can always change the Oldest First variable inside the shortcut later. 
By default, it’s set to False, meaning that your most recent screenshots are shown first; if you set it to True, your oldest ones will be shown first in the photo picker instead.\nYou can set the ‘Oldest First’ variable during the shortcut’s setup flow.\nTo change the photo picker’s sorting, tweak this variable.\nThe Frames API offers a pretty large selection of options and flags now, which is why I’m going to dive deeper into its more advanced functionalities in this week’s issue of MacStories Weekly for Club members. In the story, I will share more advanced examples of the Frames API and explain how you can chain input and output commands with overrides to turn Apple Frames into the ultimate screenshot-framing utility for all Apple platforms.\nTo get MacStories Weekly, you just need to sign up for a basic Club MacStories plan at $5/month or $50/year here or by using the buttons below:\n\nJoin AnnuallyStarts at $50/yearJoin MonthlyStarts at $5/month\n\nDownload Apple Frames 3.2\nApple Frames continues to be a labor of love that I build primarily for myself, but which I know is also used by thousands of designers, developers, and Apple users who want a nicer way to share screenshots. I’m especially happy with the latest automation capabilities of the Frames API, so I hope you’ll find this shortcut as useful as I do on a daily basis.\nYou can download Apple Frames 3.2 below and find it in the MacStories Shortcuts Archive.\n\n \n \n Apple FramesAdd device frames to screenshots for iPhones (8/SE, 11, 12, 13, 14, 15 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. 
If multiple screenshots are passed as input, they will be combined in a single image.\nThe shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-03-21T12:09:23-04:00", "date_modified": "2024-03-21T16:38:51-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Apple Frames", "automation", "iOS", "iPadOS", "macOS", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=74701", "url": "https://www.macstories.net/ios/multibutton-1-1-makes-the-action-button-change-its-behavior-based-on-the-currently-open-app/", "title": "MultiButton 1.1 Makes the Action Button Change Its Behavior Based on the Currently Open App", "content_html": "
\"The

The same Action button; multiple behaviors for different apps.

\n

What if the Action button could change its behavior depending on which app is currently open?

\n

That’s what I started wondering months ago after I released MultiButton, my shortcut for toggling between two commands assigned to the Action button rather than being limited to only one. Having the choice between two distinct commands is great, but can you imagine if MultiButton could become a truly contextual shortcut system that adapted to whatever app is currently on-screen?

\n

I’ve spent the past few months working on this idea, and I’m happy to report that I was able to get it to work. In the process, I realized that what I’d designed was a comprehensive, advanced automation system that can be extended beyond MultiButton to a variety of use cases.

\n

Later this week in MacStories Weekly and exclusively for Club MacStories members (of all tiers), I will release my latest creation that makes contextual app automation possible. It’s called CAPS, which stands for Contextual Apps Plugin System.

\n

CAPS comprises three standalone shortcuts that allow you to define rules for which shortcuts should be run when the Action button is pressed while using a particular app. CAPS supports creating an unlimited number of rules for as many apps as you want; best of all, it’s based on an open file format that can be integrated with all kinds of shortcuts.

\n
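The article doesn’t show what that open file format looks like, so purely as an illustration, a rules file for this kind of contextual system might be structured like the sketch below. Every field name here is invented for the example; it is not CAPS’s actual schema:

```json
{
  "rules": [
    { "app": "Things", "shortcuts": ["Schedule for Evening", "Pin Task"] },
    { "app": "GoodLinks", "shortcuts": ["Share Article"] },
    { "app": "Timery", "shortcuts": ["Stop Timer", "Start New Timer"] }
  ],
  "fallback": "MultiButton"
}
```

The point of an open, text-based format like this is that any shortcut (not just MultiButton) could read it and decide what to run based on the frontmost app.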

\n

CAPS is an advanced level of user automation, which is why I’ve decided to release it for Club MacStories members. It will come with an in-depth walkthrough on how to set up CAPS, configure its integration with MultiButton, and manage the app-based triggers it relies on.

\n

I can’t emphasize this enough: since I made CAPS for myself, it’s been an automation dream come true, and something that I hope Apple implements natively in iOS 18.

\n

For example, thanks to MultiButton 1.1 and CAPS, when I press the Action button inside the Things app on my iPhone, I can choose between two task management-related shortcuts. However, if I’m using GoodLinks, the Action button changes its behavior and lets me share the article I’m reading instead. Using Timery? The Action button becomes a physical shortcut to stop the current timer, or start a new one. Halide is open? The Action button is now a camera button. And when I’m back on the Home or Lock Screen, the Action button automatically goes back to its default behavior, based on MultiButton.

\n

The flexibility granted by the combination of MultiButton and the Contextual Apps Plugin System is unprecedented on iOS. It lets the Action button know which app is currently open and adapt its selection of shortcuts accordingly. This kind of integration between hardware and software is only possible on Apple’s platforms, and I’m excited to explain everything in more detail later this week in MacStories Weekly.

\n
\"A

A sneak peek at CAPS.

\n

You can download the latest version of MultiButton with CAPS support below and in the MacStories Shortcuts Archive, but you’ll have to wait until Saturday to get access to CAPS.

\n
\n
\n \"\"
\n

MultiButton

Toggle between two shortcuts from the Action button. MultiButton will run a secondary shortcut if you press the Action button within a few seconds of your first press.
\nMultiButton 1.1 introduces support for CAPS (Contextual Apps Plugin System) automation; CAPS lets MultiButton run a different set of shortcuts when specific apps are open. CAPS is available exclusively for Club MacStories members and was released in Issue 409 of MacStories Weekly.

\n

Get the shortcut here.

\n\n
\n
\n
\n

You can join the Club at just $5/month and get access to nine years (!) of archives and over 500 issues of weekly and monthly newsletters. You can join the base tier of Club MacStories (which will get you access to CAPS) using the buttons below, or you can find out more about our higher tiers with more perks here.

\n
\nJoin Annual$50/yearJoin Monthly$5/month\n
\n

And if you want to get even more out of Club MacStories, you can check out our other tiers and additional members-only perks here.

\n

See you on Saturday in MacStories Weekly. It’s going to be a fun one.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "The same Action button; multiple behaviors for different apps.\nWhat if the Action button could change its behavior depending on which app is currently open?\nThat’s what I started wondering months ago after I released MultiButton, my shortcut for toggling between two commands assigned to the Action button rather than being limited to only one. Having the choice between two distinct commands is great, but can you imagine if MultiButton could become a truly contextual shortcut system that adapted to whatever app is currently on-screen?\nI’ve spent the past few months working on this idea, and I’m happy to report that I was able to get it to work. In the process, I realized that what I’d designed was a comprehensive, advanced automation system that can be extended beyond MultiButton to a variety of use cases.\nLater this week in MacStories Weekly and exclusively for Club MacStories members (of all tiers), I will release my latest creation that makes contextual app automation possible. It’s called CAPS, which stands for Contextual Apps Plugin System.\nCAPS is comprised of three standalone shortcuts that allow you to define rules for which shortcuts should be run when the Action button is pressed while using a particular app. CAPS supports creating an unlimited number of rules for as many apps as you want; best of all, it’s based on an open file format that can be integrated with all kinds of shortcuts.\n\nCAPS is an advanced level of user automation, which is why I’ve decided to release it for Club MacStories members. 
It will come with an in-depth walkthrough on how to set up CAPS, configure its integration with MultiButton, and manage the app-based triggers it relies on.\nI can’t emphasize this enough: since I made CAPS for myself, it’s been an automation dream come true, and something that I hope Apple implements natively in iOS 18.\nFor example, thanks to MultiButton 1.1 and CAPS, when I press the Action button inside the Things app on my iPhone, I can choose between two task management-related shortcuts. However, if I’m using GoodLinks, the Action button changes its behavior and lets me share the article I’m reading instead. Using Timery? The Action button becomes a physical shortcut to stop the current timer, or start a new one. Halide is open? The Action button is now a camera button. And when I’m back on the Home or Lock Screen, the Action button automatically goes back to its default behavior, based on MultiButton.\nThe flexibility granted by the combination of MultiButton and the Contextual Apps Plugin System is unprecedented on iOS. It lets the Action button know which app is currently open and adapt its selection of shortcuts accordingly. This kind of integration between hardware and software is only possible on Apple’s platforms, and I’m excited to explain everything in more detail later this week in MacStories Weekly.\nA sneak peek at CAPS.\nYou can download the latest version of MultiButton with CAPS support below and in the MacStories Shortcuts Archive, but you’ll have to wait until Saturday to get access to CAPS.\n\n \n \n MultiButtonToggle between two shortcuts from the Action button. MultiButton will run a secondary shortcut if you press the Action button within a few seconds of your first press.\nMultiButton 1.1 introduces support for CAPS (Contextual Apps Plugin System) automation; CAPS lets MultiButton run a different set of shortcuts when specific apps are open. 
CAPS is available exclusively for Club MacStories members and was released in Issue 409 of MacStories Weekly.\nGet the shortcut here.\n\n \n \n\nYou can join the Club at just $5/month and get access to nine years (!) of archives and over 500 issues of weekly and monthly newsletters. You can join the base tier of Club MacStories (which will get you access to CAPS) using the buttons below, or you can find out more about our higher tiers with more perks here.\n\nJoin Annual$50/yearJoin Monthly$5/month\n\nAnd if you want to get even more out of Club MacStories, you can check out our other tiers and additional members-only perks here.\nSee you on Saturday in MacStories Weekly. It’s going to be a fun one.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-03-13T12:40:12-04:00", "date_modified": "2024-03-13T12:40:12-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "shortcuts" ] }, { "id": 
"https://www.macstories.net/?p=74479", "url": "https://www.macstories.net/linked/automation-academy-my-collection-of-advanced-shortcuts-for-things/", "title": "Automation Academy: My Collection of Advanced Shortcuts for Things", "content_html": "

Earlier today, Federico released a series of seven advanced shortcuts for the task manager Things as part of his Automation Academy column, an exclusive perk of Club MacStories+ and Club Premier.

\n

Federico explains in the introduction of the story why he returned to Things a few months ago and has been happy with the decision:

\n

\n not only does the design of the Things app create a more relaxed environment for me to manage my responsibilities, but Cultured Code’s embrace of Shortcuts automation has allowed me to create dozens of custom enhancements for Things.\n

\n

It’s the flexibility that Things’ Shortcuts actions offer that allows for such deep customization. The shortcuts shared today include automations to:

\n

All of the shortcuts are ready to be used immediately and are accompanied by a detailed walk-through of the techniques used to build them and an explanation of how Federico is using them.

\n
\"Discounts

Discounts are just one of the many Club MacStories perks.

\n

Automation Academy is just one of many perks that Club MacStories+ and Club Premier members enjoy, including:

\n

On top of that, Club Premier members get AppStories+, an extended, ad-free version of our flagship podcast that we deliver early every week in high-bitrate audio.

\n

Use the buttons below to learn more and sign up for Club MacStories+ or Club Premier.

\n

Join Club MacStories+:

\n
\nJoin Annual$100/yearJoin Monthly$10/month\n
\n

Join Club Premier:

\n
\nJoin Annual$120/yearJoin Monthly$12/month\n
\n\n

\u2192 Source: club.macstories.net

", "content_text": "Earlier today, Federico released a series of seven advanced shortcuts for the task manager Things as part of his Automation Academy column, an exclusive perk of Club MacStories+ and Club Premier.\nFederico explains in the introduction of the story why he returned to Things a few months ago and has been happy with the decision:\n\n not only does the design of the Things app create a more relaxed environment for me to manage my responsibilities, but Cultured Code’s embrace of Shortcuts automation has allowed me to create dozens of custom enhancements for Things.\n\nIt’s the flexibility that Things’ Shortcuts actions offer that allows for such deep customization. The shortcuts shared today include automations to:\nAutomatically move tasks scheduled for a certain time to Things’ Evening section\nPostponing evening tasks\nRescheduling tasks to the next evening\nTag selected tasks as active\nPin tasks\nSelect from a menu of Things shortcuts\nCreate tasks, an updated version of a previously-shared shortcut\nAll of the shortcuts are ready to be used immediately and are accompanied by a detailed walk-through of the techniques used to build them and an explanation of how Federico is using them.\nDiscounts are just one of the many Club MacStories perks.\nAutomation Academy is just one of many perks that Club MacStories+ and Club Premier members enjoy including:\nWeekly and monthly newsletters \nA sophisticated web app with search and filtering tools to navigate eight years of content\nCustomizable RSS feeds\nBonus columns\nAn early and ad-free version of our Internet culture and media podcast, MacStories Unwind\nA vibrant Discord community of smart app and automation fans who trade a wealth of tips and discoveries every day\nLive Discord audio events after Apple events and at other times of the year\nOn top of that, Club Premier members get AppStories+, an extended, ad-free version of our flagship podcast that we deliver early every week in high-bitrate 
audio.\nUse the buttons below to learn more and sign up for Club MacStories+ or Club Premier.\nJoin Club MacStories+:\n\nJoin Annual$100/yearJoin Monthly$10/month\n\nJoin Club Premier:\n\nJoin Annual$120/yearJoin Monthly$12/month\n\n\n\u2192 Source: club.macstories.net", "date_published": "2024-02-23T13:25:54-05:00", "date_modified": "2024-02-23T13:25:54-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "club", "shortcuts", "Linked" ] }, { "id": "https://www.macstories.net/?p=74364", "url": "https://www.macstories.net/vision/juno-1-1-for-visionos-adds-the-ability-to-open-youtube-com-urls-in-the-app/", "title": "Juno 1.1 for visionOS Adds the Ability to Open YouTube.com URLs in the App", "content_html": "
\"Opening

Opening videos in Juno from the YouTube website.

\n

John covered Juno, Christian Selig’s new YouTube client for visionOS, on MacStories last week, and I’ve been using the app for the past few days as my default way of watching YouTube videos on my Vision Pro. Today, Selig released version 1.1 of Juno with some welcome quality-of-life enhancements such as the ability to choose video quality, faster load times, and support for dropping YouTube links in the app to watch them directly in Juno. You can read more about the changes on Selig’s blog.

\n

The one new feature I want to call out here is the addition of URL schemes, which have, once again, come to the rescue to help me navigate the early limitations of a new Apple platform.

\n

\n

I come across a lot of YouTube links in RSS and social media, and I tend to save most of them in Play, which we also covered before on MacStories. In version 1.0 of Juno, there wasn’t a way to take a YouTube URL from Play (or any other app) and instantly open it in the app. Whenever I clicked a YouTube link in a visionOS app, it would take me to Safari, with no way to redirect that video to the Juno app instead. With the new Juno URL scheme in version 1.1, this is now possible. Simply replace https:// with juno:// in a YouTube URL, and you’ll be able to watch that video in Juno rather than YouTube’s website.

\n
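The scheme swap described above is trivial to script; here’s a minimal sketch of the transformation (the helper name is mine, not from the article or from Juno):

```python
def to_juno_url(url: str) -> str:
    # Replace the https:// scheme with juno:// so visionOS hands the
    # link to the Juno app instead of opening it in Safari.
    prefix = "https://"
    if not url.startswith(prefix):
        raise ValueError("expected an https:// YouTube URL")
    return "juno://" + url[len(prefix):]

print(to_juno_url("https://www.youtube.com/watch?v=abc123"))
# → juno://www.youtube.com/watch?v=abc123
```

This is exactly what the two shortcuts below automate so you never have to edit the URL by hand.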

But we are refined and tasteful people with our Vision Pros, and no one should edit URLs by hand to watch a YouTube video in 2024. So I’ve built two shortcuts that you can install on your Vision Pro to simplify the process of reopening those links in Juno.

\n

The first one is a basic shortcut that you can run on YouTube.com in Safari to open the video you’re watching in Juno. To use it, just select ‘From YouTube to Juno’ in the Safari share sheet, and the video will be opened in Juno, where you can watch it with a superior interface.

\n
\"Juno

Juno 1.1.

\n

The second shortcut I’ve created is one to pick from a list of videos saved in the Play app and watch the selected video in Juno. Aptly named ‘From Play to Juno’, the shortcut is a nice way to go through your Play queue on the Vision Pro and take advantage of Juno’s native YouTube UI to watch your saved videos.

\n
\"Play

Play for visionOS.

\n

The Vision Pro is a fantastic device to watch videos, and Juno is a solid YouTube client, so it’s only natural that I’d create shortcuts to make the process of opening those YouTube videos in the app easier. You can download the shortcuts below and find them in the MacStories Shortcuts Archive.

\n
\n
\n \"\"
\n

From YouTube to Juno

Open a video from the YouTube website in the Juno app for visionOS.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
\n \"\"
\n

From Play to Juno

Open a video previously saved in the Play app in the Juno app for visionOS.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Opening videos in Juno from the YouTube website.\nJohn covered Juno, Christian Selig’s new YouTube client for visionOS, on MacStories last week, and I’ve been using the app for the past few days as my default way of watching YouTube videos on my Vision Pro. Today, Selig released version 1.1 of Juno with some welcome quality-of-life enhancements such as the ability to choose video quality, faster load times, and support for dropping YouTube links in the app to watch them directly in Juno. You can read more about the changes on Selig’s blog.\nThe one new feature I want to call out here is the addition of URL schemes which have, once again, come to the rescue to help me navigate the early limitations of a new Apple platform.\n\nI come across a lot of YouTube links in RSS and social media, and I tend to save most of them in Play, which we also covered before on MacStories. In version 1.0 of Juno, there wasn’t a way to take a YouTube URL from Play (or any other app) and instantly open it in the app. Whenever I clicked a YouTube link in a visionOS app, it would take me to Safari, with no way to redirect that video to the Juno app instead. With the new Juno URL scheme in version 1.1, this is now possible. Simply replace https:// with juno:// in a YouTube URL, and you’ll be able to watch that video in Juno rather than YouTube’s website.\nBut we are refined and tasteful people with our Vision Pros, and no one should edit URLs by hand to watch a YouTube video in 2024. So I’ve built two shortcuts that you can install on your Vision Pro to simplify the process of reopening those links in Juno.\nThe first one is a basic shortcut that you can run on YouTube.com in Safari to open the video you’re watching in Juno. 
To use it, just select ‘From YouTube to Juno’ in the Safari share sheet, and the video will be opened in Juno, where you can watch it with a superior interface.\nJuno 1.1.\nThe second shortcut I’ve created is one to pick from a list of videos saved in the Play app and watch the selected video in Juno. Aptly named ‘From Play to Juno’, the shortcut is a nice way to go through your Play queue on the Vision Pro and take advantage of Juno’s native YouTube UI to watch your saved videos.\nPlay for visionOS.\nThe Vision Pro is a fantastic device to watch videos, and Juno is a solid YouTube client, so it’s only natural that I’d create shortcuts to make the process of opening those YouTube videos in the app easier. You can download the shortcuts below and find them in the MacStories Shortcuts Archive.\n\n \n \n From YouTube to JunoOpen a video from the YouTube website in the Juno app for visionOS.\nGet the shortcut here.\n\n \n \n\n\n \n \n From Play to JunoOpen a video previously saved in the Play app in the Juno app for visionOS.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": 
"2024-02-14T21:09:34-05:00", "date_modified": "2024-02-15T05:42:36-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "shortcuts", "URL Scheme", "Vision Pro", "visionOS", "youtube", "vision" ] }, { "id": "https://www.macstories.net/?p=74294", "url": "https://www.macstories.net/reviews/vision-pro-app-spotlight-shortcut-buttons-turns-your-shortcuts-into-spatial-launchers/", "title": "Vision Pro App Spotlight: Shortcut Buttons Turns Your Shortcuts into Spatial Launchers", "content_html": "
\"Shortcut

Shortcut Buttons for visionOS.

\n

I received my Apple Vision Pro yesterday (for the full story of how it eventually found its way to Italy, you don’t want to miss the next episode of AppStories), and, as you can imagine, I’ve been busy downloading all the apps, learning my way around visionOS and – just today – using the extended Mac display mode. The first 24 hours with a Vision Pro are a whirlwind of curiosity and genuine nerd excitement, but despite my attention being pulled to a hundred different places, I’ve found the time to test one app in particular: Shortcut Buttons by Finn Voorhees.

\n

\n

Now, I may be biased here for a series of obvious reasons. Finn is John’s son, a friend, and a collaborator who worked with me over the years to develop a suite of Obsidian plugins for Club MacStories Plus and Premier members. I know Finn well, which is why it’s important to have this disclaimer in the story, but it’s equally important because I know the quality of his work and how he listens to feedback and iterates quickly. The aspects of Shortcut Buttons I’m going to criticize here are the same comments I’d usually send Finn in private, only in this case we’re talking about an app that is available for sale on the App Store and therefore deserves the typical MacStories review treatment.

\n

The main idea behind Shortcut Buttons is an ingenious one. Just like iPadOS, visionOS unfortunately lacks the developer API that Apple rolled out years ago on macOS to allow third-party apps to invoke and run shortcuts in the background, without launching the Shortcuts app. On iPhone and iPad, if a third-party app wants to run a shortcut from the user’s library, it still needs to use URL schemes. So our dear John had an idea, which he passed over to Finn: what if you could create small launchers based on URL schemes to run shortcuts and turn those launchers into virtual objects the user could place around their environment thanks to Apple Vision Pro? You know, essentially the equivalent of an NFC tag that runs a shortcut, only built specifically for visionOS.

\n

That’s precisely what Shortcut Buttons does, albeit with some limitations that are (mostly) a byproduct of visionOS’ early nature. When you first open up Shortcut Buttons, you’re presented with a small window that lets you create a new button. The most important field to fill in this window is ‘Shortcut Name’, where you need to enter the exact name of the shortcut you want to associate with a button in your workspace. Alas, while on macOS apps like Raycast and BetterTouchTool can automatically import the list of all your existing shortcuts, that API doesn’t exist on visionOS (just like on iPadOS), which means you’ll have to type the name of the shortcut yourself, without being able to just pick one and move on.

\n

You have some additional options to choose from after entering the name of the shortcut you want to run. First, you can set a color for the button. Shortcut Buttons doesn’t support a full color picker at the moment, so you’ll be limited to the eight color options provided by the app by default. You can also set an icon for the button with a UI that is reminiscent of picking an icon from the Shortcuts editor. These icons appear to be a subset of SF Symbols organized in different categories but, unfortunately, there is no search function to simplify the process of finding a specific icon, which is something I hope Finn can work on next alongside more color options.

\n
\"Choosing

Choosing icons for your buttons.

\n

Shortcuts power users will also have the ability to define input text passed to the shortcut that’s tied to the button. In version 1.0 of the app, you can only choose between ‘Text’ (an arbitrary string of plain text) and ‘Clipboard’, which returns the current contents of the system clipboard on your Vision Pro as text. These options work and are the kind of workarounds we’ve seen before in apps like Launcher and Drafts when it comes to triggering shortcuts with URL schemes, but I’d like to see more flexibility on this front. For instance, it’d be useful to have an option to be prompted to enter text at runtime as soon as you press a button; a way to pass the current timestamp with formatting options for date and time; perhaps even a way to get an image from the clipboard and convert it on the fly to base64, which you could then re-convert to an image file using Shortcuts actions. In future versions of the app, I hope Finn considers extending the number of input types that can be assigned to a button.

\n
\"Setting

Setting input types.

\n

When you’re done setting up the button, you can tap the ‘Create’ button to turn it into a rounded rectangle that is actually a visionOS window you can drag around in your workspace or environment. These buttons intentionally look like shortcuts from the Shortcuts app, which is a design decision that I appreciate for consistency and – I want to believe – something Apple will eventually steal whenever they release a native Shortcuts app for visionOS with spatial capabilities.

\n

That brings me to my next criticism, which is – sadly – something Finn can’t do anything about right now given the platform’s limitations. When you click a button from the Shortcut Buttons app, the Shortcuts app for iPad running in compatibility mode on Apple Vision Pro will also come to the foreground. This happens because Shortcut Buttons is forced to use a URL scheme to run shortcuts, and as we know from iOS and iPadOS, URL schemes open an app. In practice, this means that you’ll always have to keep the Shortcuts window somewhere around you, so that when you press a button, the Shortcuts window doesn’t appear in the middle of your field of view, covering other windows you may be looking at. My advice to minimize the disruption caused by the sudden appearance of the Shortcuts app is to always keep it open – perhaps tuck it in a corner of the room – and place your buttons where you need them.

\n
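For context, running a shortcut by name via URL scheme comes down to composing Apple’s documented shortcuts://run-shortcut URL with percent-encoded query parameters. A minimal sketch of the kind of URL a launcher like Shortcut Buttons has to generate (the helper name is hypothetical):

```python
from urllib.parse import quote


def run_shortcut_url(name, text=None):
    # Compose Apple's run-shortcut URL scheme. The shortcut name (and
    # any text input) must be percent-encoded to survive as query values.
    url = "shortcuts://run-shortcut?name=" + quote(name)
    if text is not None:
        # Passing text input uses the input=text & text=... parameters.
        url += "&input=text&text=" + quote(text)
    return url


print(run_shortcut_url("From Play to Juno"))
# → shortcuts://run-shortcut?name=From%20Play%20to%20Juno
```

Because opening any URL scheme activates its owning app, there’s no way for a third-party visionOS app to trigger a shortcut like this without the Shortcuts window coming along for the ride.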

Shortcuts limitations on visionOS notwithstanding, I’ve only spent a day working with my Vision Pro (this article was written entirely in visionOS), but I’ve already been using Shortcut Buttons to great effect for my Shortcuts-based workflow. My favorite use case so far is using Shortcut Buttons as a way to facilitate running the dozens of shortcuts I’ve created for Things, which I plan to share soon for Club Plus and Premier members. On my Vision Pro, I can open Things, then respawn a bunch of buttons previously created with Shortcut Buttons and place them around the Things window as “accessories”. This way, when I look at the Things app, it’s almost as if I have additional functions around it that I can execute with one glance and tap. In my previous era as an iPad Pro user, I’d have to run these shortcuts from the Home Screen or the iPadOS dock, with the latter one usually being a preferred option since I can access the dock while using an app. However, space in the dock is limited, and that limitation no longer exists with visionOS and the Vision Pro.

\n

A button placed next to the Things app.

\n

Running one of my shortcuts for Things from the button placed next to the app’s window.

\n

Looking ahead at the future of Shortcut Buttons, besides the missing features I mentioned above, I’d like to see some way to create presets for groups of buttons that can be instantly recreated with one click. Right now, Shortcut Buttons comes with a history view that lets you quickly respawn a button you previously created. Instead, I’d like to see a way to define groups of buttons (such as “Things Shortcuts”) and make them all appear together with a single command rather than having to recreate each button manually. If this sounds pretty much like the ‘Shortcuts Folder’ widget from other Apple platforms, well, yeah – that’s exactly how I think Shortcut Buttons should behave.

\n

Despite the platform limitations I covered in this review and a series of missing features, Shortcut Buttons is a solid debut and, in my opinion, the first must-have Shortcuts companion utility on Apple Vision Pro. The concept of turning shortcuts into spatial launchers that can coexist in your physical workspace is a winning one, and I can’t wait to see how Finn takes the app even further. If you’re an Apple Vision Pro owner and Shortcuts power user, you owe it to yourself to have some fun scattering shortcuts around the house.

\n

Shortcut Buttons is available at $7.99 on the visionOS App Store.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Shortcut Buttons for visionOS.\nI received my Apple Vision Pro yesterday (for the full story of how it eventually found its way to Italy, you don’t want to miss the next episode of AppStories), and, as you can imagine, I’ve been busy downloading all the apps, learning my way around visionOS and – just today – using the extended Mac display mode. The first 24 hours with a Vision Pro are a whirlwind of curiosity and genuine nerd excitement, but despite my attention being pulled to a hundred different places, I’ve found the time to test one app in particular: Shortcut Buttons by Finn Voorhees.\n\nNow, I may be biased here for a series of obvious reasons. Finn is John’s son, a friend, and a collaborator who worked with me over the years to develop a suite of Obsidian plugins for Club MacStories Plus and Premier members. I know Finn well, which is why it’s important to have this disclaimer in the story, but it’s equally important because I know the quality of his work and how he listens to feedback and iterates quickly. The aspects of Shortcut Buttons I’m going to criticize here are the same comments I’d usually send Finn in private, only in this case we’re talking about an app that is available for sale on the App Store and therefore deserves the typical MacStories review treatment.\nThe main idea behind Shortcut Buttons is an ingenious one. Just like iPadOS, visionOS unfortunately lacks the developer API that Apple rolled out years ago on macOS to allow third-party apps to invoke and run shortcuts in the background, without launching the Shortcuts app. On iPhone and iPad, if a third-party app wants to run a shortcut from the user’s library, it still needs to use URL schemes. So our dear John had an idea, which he passed over to Finn: what if you could create small launchers based on URL schemes to run shortcuts and turn those launchers into virtual objects the user could place around their environment thanks to Apple Vision Pro? 
You know, essentially the equivalent of an NFC tag that runs a shortcut, only built specifically for visionOS.\nThat’s precisely what Shortcut Buttons does, albeit with some limitations that are (mostly) a byproduct of visionOS’ early nature. When you first open up Shortcut Buttons, you’re presented with a small window that lets you create a new button. The most important field you’ll have to fill in this window is the ‘Shortcut Name’ one, which is where you need to enter the exact name of the shortcut you want to associate to a button in your workspace. Alas, while on macOS apps like Raycast and BetterTouchTool can automatically import the list of all your existing shortcuts, the API doesn’t exist on visionOS (just like on iPadOS), which means you’ll have to type the name of the shortcut yourself, without being able to just pick one and move on.\nYou have some additional options to choose from after entering the name of the shortcut you want to run. First, you can set a color for the button. Shortcut Buttons doesn’t support a full color picker at the moment, so you’ll be limited to the eight color options provided by the app by default. You can also set an icon for the button with a UI that is reminiscent of picking an icon from the Shortcuts editor. These icons appear to be a subset of SF Symbols organized in different categories but, unfortunately, there is no search function to simplify the process of finding a specific icon, which is something I hope Finn can work on next alongside more color options.\nChoosing icons for your buttons.\nShortcuts power users will also have the ability to define input text passed to the shortcut that’s tied to the button. In version 1.0 of the app, you can only choose between ‘Text’ (an arbitrary string of plain text) and ‘Clipboard’, which returns the current contents of the system clipboard on your Vision Pro as text. 
These options work and are the kind of workarounds we’ve seen before in apps like Launcher and Drafts when it comes to triggering shortcuts with URL schemes, but I’d like to see more flexibility on this front. For instance, it’d be useful to have an option to be prompted for entering text at runtime as soon as you press a button; a way to pass the current timestamp with formatting options for date and time; perhaps even a way to get an image from the clipboard and convert it on the fly to base64, which you could then re-convert to an image file using Shortcuts actions. In future versions of the app, I hope Finn considers extending the number of input types that can be assigned to a button.\nSetting input types.\nWhen you’re done setting up the button, you can tap the ‘Create’ button to turn it into a rounded rectangle that is actually a visionOS window you can drag around in your workspace or environment. These buttons intentionally look like shortcuts from the Shortcuts app, which is a design decision that I appreciate for consistency and – I want to believe – something Apple will eventually steal whenever they release a native Shortcuts app for visionOS with spatial capabilities.\nThat brings me to my next criticism, which is – sadly – something Finn can’t do anything about right now given the platform’s limitations. When you click a button from the Shortcut Buttons app, the Shortcuts app for iPad running in compatibility mode on Apple Vision Pro will also come in the foreground. This happens because Shortcut Buttons is forced to use a URL scheme to run shortcuts, and as we know from iOS and iPadOS, URL schemes open an app. In practice, this means that you’ll always have to keep the Shortcuts window somewhere around you to make sure that when you press a button its window doesn’t appear in the middle of your field of view, covering other windows you may be looking at. 
My advice to minimize the disruption caused by the sudden appearance of the Shortcuts app is to always keep it open – perhaps tuck it in a corner of the room – and place your buttons where you need them.\nShortcuts limitations on visionOS notwithstanding, I’ve only spent a day working with my Vision Pro (this article was written entirely in visionOS), but I’ve already been using Shortcut Buttons to great effect for my Shortcuts-based workflow. My favorite use case so far is using Shortcut Buttons as a way to facilitate running the dozens of shortcuts I’ve created for Things, which I plan to share soon for Club Plus and Premier members. On my Vision Pro, I can open Things, then respawn a bunch of buttons previously created with Shortcut Buttons and place them around the Things window as “accessories”. This way, when I look at the Things app, it’s almost as if I have additional functions around it that I can execute with one glance and tap. In my previous era as an iPad Pro user, I’d have to run these shortcuts from the Home Screen or the iPadOS dock, with the latter one usually being a preferred option since I can access the dock while using an app. However, space in the dock is limited, and that limitation no longer exists with visionOS and the Vision Pro.\nA button placed next to the Things app.\nRunning one of my shortcuts for Things from the button placed next to the app’s window.\nLooking ahead at the future of Shortcut Buttons, besides the missing features I mentioned above, I’d like to see some way to create presets for groups of buttons that can be instantly recreated with one click. Right now, Shortcut Buttons comes with a history view that lets you quickly respawn a button you previously created. Instead, I’d like to see a way to define groups of buttons (such as “Things Shortcuts”) and make them all appear together with a single command rather than having to recreate each button manually. 
If this sounds pretty much like the ‘Shortcuts Folder’ widget from other Apple platforms, well, yeah – that’s exactly how I think Shortcut Buttons should behave.\nDespite the platform limitations I covered in this review and a series of missing features, Shortcut Buttons is a solid debut and, in my opinion, the first must-have Shortcuts companion utility on Apple Vision Pro. The concept of turning shortcuts into spatial launchers that can coexist in your physical workspace is a winning one, and I can’t wait to see how Finn takes the app even further. If you’re an Apple Vision Pro owner and Shortcuts power user, you owe it to yourself to have some fun scattering shortcuts around the house.\nShortcut Buttons is available at $7.99 on the visionOS App Store.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2024-02-09T14:27:12-05:00", "date_modified": "2024-03-28T07:35:32-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", 
"shortcuts", "Vision Pro", "Vision Pro App Spotlight", "visionOS", "reviews" ] }, { "id": "https://www.macstories.net/?p=73696", "url": "https://www.macstories.net/linked/an-investigation-into-the-home-apps-clean-power-forecast-feature/", "title": "An Investigation into the Home App\u2019s Clean Power Forecast Feature", "content_html": "

\n

Ever since Apple’s OSes were updated in the fall, I’ve been intrigued by the Home app’s new Clean Grid Forecast feature that predicts periods when the energy you use is ‘More Clean.’ The feature immediately reminded me of Clean Energy Charging, which works with Optimized Battery Charging to charge your iPhone during periods when the electricity generated in your area is cleanest.

\n

However, Clean Grid Forecast also raised more questions in my mind than it answered, like ‘What does More Clean mean?’, ‘How does Apple know if the energy is cleaner?’, and ‘How much cleaner is it anyway?’ These are the kinds of questions that GridStatus.io, a website that offers electrical grid data, set out to answer by comparing Apple’s ‘More Clean’ periods with publicly available energy generation data.

\n

\n

It turns out that there’s not a lot of information about Apple’s Clean Grid Forecast. As GridStatus explains, what we know is that:

\n
\n

It’s a forecast - this is important because a forecast is going to have some imperfections; and possibly change over time;

It’s location-based, perhaps down to individual towns or counties;

The forecast goes 12 hours out and seems to be in half-hour increments.

On the iPhone support page we learn one more piece of information, that the grid forecast is available in the contiguous U.S. only.\n

\n

That didn’t give GridStatus a lot with which to work, and although they tried to validate Apple’s forecasts, the results were inconclusive:

\n

\n without more info it’s hard to find evidence that what Apple’s doing here is little more than a novelty. Something for them to point to as “doing their part”, while providing no actionable information to verify or additional materials for their customers to learn from.\n

\n

That’s not to say that the forecasts are incorrect. It’s just that Apple hasn’t shared enough information to judge the quality of its forecasts with any degree of certainty, which is disappointing.

\n

With Clean Grid Forecast, Apple has an opportunity to extend its impact on energy consumption well beyond its products. Integrated with home automation, accurate forecasts could make it easier to schedule energy-intensive tasks during ‘More Clean’ periods. That’s not currently possible because the feature doesn’t extend beyond the Home app and its widgets. Sure, you can watch the forecast and defer tasks until you’re in a ‘More Clean’ zone, but a little automation would go a long way toward making task scheduling a reality.

\n

The first step, though, is convincing consumers that deferring tasks is worth the trouble. One way Apple can do that is by sharing more information about how the Clean Grid Forecast works, providing some level of confidence that deferring tasks makes a difference. Next, Apple should integrate the forecasts directly into OS features and automation tools like Shortcuts, Siri, and home automation. Finally, I’d like to see Apple open up the forecasts to third parties so they can be integrated directly into hardware and software outside of Apple’s product lineup. That may strike some as unlikely, but with the company’s goal to be carbon neutral by 2030, it’s a chance for Apple to have an impact well beyond its own products.

\n

I realize that this is all easier said than done. At least in the US, energy production is a vast patchwork of local facilities, and there are many variables that affect whether energy is ‘More Clean’ than usual. However, by being transparent about how ‘More Clean’ is determined and offering tools that give consumers better control over when they consume power, Apple has an opportunity to empower people to take proactive steps to reduce carbon emissions and promote thoughtful decisions about power consumption on a massive scale. My hope is that Clean Power Forecasts is just the tip of the iceberg of what Apple has planned for managing energy usage in 2024.

\n

\u2192 Source: blog.gridstatus.io

", "content_text": "Ever since Apple’s OSes were updated in the fall, I’ve been intrigued by the Home app’s new Clean Grid Forecast feature that predicts periods when the energy you use is ‘More Clean.’ The feature immediately reminded me of Clean Energy Charging, which works with Optimized Battery Charging, to charge your iPhone during periods when the electricity generated in your area is cleanest.\nHowever, Clean Grid Forecast also raised more questions in my mind than it answered, like ‘What does More Clean mean?’ and ‘How does Apple know if the energy is cleaner?,’ and ‘How much cleaner is it anyway?’ These are the kind of answers that GridStatus.io, a website that offers electrical grid data, set out to answer by comparing Apple’s ‘More Clean’ periods with publicly available energy generation data.\n\nIt turns out that there’s not a lot of information about Apple’s Clean Grid Forecast. As GridStatus explains, what we know is that:\n\nIt’s a forecast - this is important because a forecast is going to have some imperfections; and possibly change over time;\nIt’s location-based, perhaps down to individual towns or counties;\nThe forecast goes 12 hours out and seems to be in half-hour increments.\n On the iPhone support page we learn one more piece of information, that the grid forecast is available in the contiguous U.S. only.\n\nThat didn’t give GridStatus a lot with which to work, and although they tried to validate Apple’s forecasts, the results were inconclusive:\n\n without more info it’s hard to find evidence that what Apple’s doing here is little more than a novelty. Something for them to point to as “doing their part”, while providing no actionable information to verify or additional materials for their customers to learn from.\n\nThat’s not to say that the forecasts are incorrect. 
It’s just that Apple hasn’t shared enough information to judge the quality of its forecasts with any degree of certainty, which is disappointing.\nWith Clean Grid Forecast, Apple has an opportunity to extend its impact on energy consumption well beyond its products. Integrated with home automation, accurate forecasts could make it easier to schedule energy-intensive tasks during ‘More Clean’ periods. That’s not currently possible because the feature doesn’t extend beyond the Home app and its widgets. Sure, you can watch the forecast and defer tasks until you’re in a ‘More Clean’ zone, but a little automation would go a long way toward making task scheduling a reality.\nThe first step, though, is convincing consumers that deferring tasks is worth the trouble. One way Apple can do that is by sharing more information about how the Clean Grid Forecast works, providing some level of confidence that deferring tasks makes a difference. Next, Apple should integrate the forecasts directly into OS features and automation tools like Shortcuts, Siri, and home automation. Finally, I’d like to see Apple open up the forecasts to third parties so they can be integrated directly into hardware and software outside of Apple’s product lineup. That may strike some as unlikely, but with the company’s goal to be carbon neutral by 2030, it’s a chance for the company to have an impact well beyond its own products.\nI realize that this is all easier said than done. At least in the US, energy production is a vast patchwork of local facilities, and there are many variables that affect whether energy is ‘More Clean’ than usual. However, by being transparent about how ‘More Clean’ is determined and offering tools that give consumers better control over when they consume power, Apple has an opportunity to empower people to take proactive steps to reduce carbon emissions and promote thoughtful decisions about power consumption on a massive scale. 
My hope is that Clean Power Forecasts is just the tip of the iceberg of what Apple has planned for managing energy usage in 2024.\n\u2192 Source: blog.gridstatus.io", "date_published": "2023-12-28T09:25:31-05:00", "date_modified": "2023-12-28T09:37:11-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "environment", "home automation", "shortcuts", "Linked" ] }, { "id": "https://www.macstories.net/?p=73672", "url": "https://www.macstories.net/reviews/goodlinks-adds-even-deeper-shortcuts-integration-with-ability-to-retrieve-current-article-selections-and-more/", "title": "GoodLinks Adds Even Deeper Shortcuts Integration with Ability to Retrieve Current Article, Selections, and More", "content_html": "

The new Shortcuts actions for GoodLinks.

\n

A few weeks ago on AppStories, I mentioned to John that I was looking for the “Things of read-later apps”. What I meant is that I wanted to find an app to save articles for later that felt native to Apple platforms, had a reliable text parser, and, more importantly, featured deep Shortcuts integration to let me create automations for saved items. As I followed up after a few episodes, I realized the app I’d been looking for was the excellent GoodLinks, which we’ve covered on MacStories several times before.

\n

Today, GoodLinks developer Ngoc Luu released a small update that nonetheless cements the app as the premier solution for people who want a read-later utility for iOS and iPadOS that also features outstanding Shortcuts support.

\n

\n

With version 1.8.5, GoodLinks joins Cultured Code’s Things app in offering a Shortcuts action that returns the current “state” of the app. Specifically, GoodLinks now comes with a ‘Get Current Link’ action that can be used to get the article that you’re currently reading inside the GoodLinks app. The item that will be returned in Shortcuts is a variable that contains properties for the article such as its title, URL, author, and more.

\n

Additionally, the updated GoodLinks also offers a separate ‘Get Current Selection’ action that can return the currently-selected text in the article view as plain text, Markdown, or HTML.

\n

The new actions to get the current link and selection.

\n

These actions are interesting for a variety of reasons. At a high level, I’ve long been advocating for third-party Shortcuts actions that support the concept of “state” or “selection” in apps, so I’m happy to see GoodLinks follow in the footsteps of Cultured Code’s pioneering work in this area; more apps should do this. Furthermore, these kinds of Shortcuts actions are ideal candidates for Action button integration on iPhone 15 Pro or placement in the iPadOS dock. If you’re inside the GoodLinks app, you can run a shortcut tied to the Action button or saved to the iPad’s dock and perform an action that’s contextual to the article you’re reading. This “contextual automation” is an idea I’ve been developing and refining for a while, and I’ll have more to share soon.

\n

In any case, to demonstrate the power of GoodLinks’ Shortcuts actions, I put together a custom shortcut that I’ve been using to turn articles saved for later into linked posts on MacStories. I call it GoodLinked.

\n

As you can see from the images below, this shortcut gets the current article you’re reading in GoodLinks and extracts different properties from it, such as the title of the story, its author, and URL. Using another action, GoodLinked can see if you’ve selected any text in the article and, if so, save it as your selection. By retrieving the article’s selection as Markdown, I don’t have to do anything else to prepare a blockquote for MacStories.

\n
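Stripped of the Shortcuts actions themselves, the formatting step amounts to simple string templating. Here’s a rough Python sketch of the idea – the template, field names, and sample values are illustrative assumptions, not the actual shortcut’s layout:

```python
def linked_post_draft(title: str, author: str, url: str, selection_md: str) -> str:
    """Assemble a linked-post draft from article metadata and a Markdown
    selection, roughly mirroring what a Shortcuts 'Text' action plus a
    blockquote step might produce."""
    # Prefix each line of the selection with '> ' to form a Markdown blockquote.
    quoted = "\n".join("> " + line for line in selection_md.splitlines())
    return f"# {title}\n\n{quoted}\n\nvia [{author}]({url})"

print(linked_post_draft("Example Article", "Jane Doe",
                        "https://example.com/post",
                        "First line.\nSecond line."))
```

Because GoodLinks can hand over the selection as Markdown already, the only real work left is wrapping it in blockquote syntax and dropping in the attribution line.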

\n

With this shortcut and the new actions available in the latest GoodLinks update, I can go from reading an article, like this:

\n

Reading an article in GoodLinks.

\n

…to a draft post in Obsidian, already formatted with placeholders I can tweak before publishing on the site:

\n

And a draft in Obsidian, created by my GoodLinked shortcut.

\n

This is just an example of what you can build with Shortcuts actions that retrieve the article you’re reading in GoodLinks. You don’t have to be a blogger to take advantage of these actions: perhaps you want to put together a shortcut for the Action button that quickly shares on Threads what you’re reading; maybe you want to clip selected text to a note in Obsidian or Apple Notes. No matter your use case, there is value in being able to process the current state or selection of an app with Shortcuts and create more advanced workflows for all kinds of tasks.

\n

I also wanted to point out some other useful additions in this GoodLinks update. On iPad, you can now open the app’s settings screen with ⌘+, (just like on a Mac); the app now properly supports Dynamic Type and respects the system’s text size (great for Accessibility); if you’ve selected some text in an article, you can also copy the formatted selection from a refreshed context menu:

\n

The new context menu for selected text in GoodLinks.

\n

I’ve long been a fan of GoodLinks, but the app’s newfound Shortcuts integration convinced me that, at this point in my life, it’s the read-later experience I need in my iPad and iPhone workflow. If you, like me, have been looking for a clean, native read-later app that feels right at home on Apple platforms and can be easily automated, look no further than GoodLinks.

\n

You can download the latest version of GoodLinks on the App Store; my GoodLinked shortcut is available below and in the MacStories Shortcuts Archive.

\n
\n
\n
\n

GoodLinked

Create a draft for a linked post in Obsidian based on the article you’re currently reading in GoodLinks.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "The new Shortcuts actions for GoodLinks.\nA few weeks ago on AppStories, I mentioned to John that I was looking for the “Things of read-later apps”. What I meant is that I wanted to find an app to save articles for later that felt native to Apple platforms, had a reliable text parser, but, more importantly, featured deep Shortcuts integration to let me create automations for saved items. As I followed up after a few episodes, I realized the app I’d been looking for was the excellent GoodLinks, which we’ve covered on MacStories several times before.\nToday, GoodLinks developer Ngoc Luu released a small update to the app that, however, cements it as the premier solution for people who want a read-later utility for iOS and iPadOS that also features outstanding Shortcuts support.\n\nWith version 1.8.5, GoodLinks joins Cultured Code’s Things app in offering a Shortcuts action that returns the current “state” of the app. Specifically, GoodLinks now comes with a ‘Get Current Link’ action that can be used to get the article that you’re currently reading inside the GoodLinks app. The item that will be returned in Shortcuts is a variable that contains properties for the article such as its title, URL, author, and more.\nAdditionally, the updated GoodLinks also offers a separate ‘Get Current Selection’ action that can return the currently-selected text in the article view as plain text, Markdown, or HTML.\nThe new actions to get the current link and selection.\nThese actions are interesting for a variety of reasons. At a high level, I’ve long been advocating for third-party Shortcuts actions that support the concept of “state” or “selection” in apps, so I’m happy to see GoodLinks follow in the footsteps of Cultured Code’s pioneering work in this area; more apps should do this. Furthermore, these kinds of Shortcuts actions are ideal candidates for Action button integration on iPhone 15 Pro or placement in the iPadOS dock. 
If you’re inside the GoodLinks app, you can run a shortcut tied to the Action button or saved to the iPad’s dock and perform something contextually to the article you’re reading. This “contextual automation” is an idea I’ve been developing and refining for a while, and I’ll have more to share soon.\nIn any case, to demonstrate the power of GoodLinks’ Shortcuts actions, I put together a custom shortcut that I’ve been using to turn articles saved for later into linked posts on MacStories. I call it GoodLinked.\nAs you can see from the images below, this shortcut gets the current article you’re reading in GoodLinks and extracts different properties from it, such as the title of the story, its author, and URL. Using another action, GoodLinked can see if you’ve selected any text in the article and, if so, save it as your selection. By retrieving the article’s selection as Markdown, I don’t have to do anything else to prepare a blockquote for MacStories.\n\nWith this shortcut and the new actions available in the latest GoodLinks update, I can go from reading an article, like this:\nReading an article in GoodLinks.\n…to a draft post in Obsidian, already formatted with placeholders I can tweak before publishing on the site:\nAnd a draft in Obsidian, created by my GoodLinked shortcut.\nThis is just an example of what you can build with Shortcuts actions that retrieve the article you’re reading in GoodLinks. You don’t have to be a blogger to take advantage of these actions: perhaps you want to put together a shortcut for the Action button that quickly shares on Threads what you’re reading; maybe you want to clip selected text to a note in Obsidian or Apple Notes. No matter your use case, there is value in being able to process the current state or selection of an app with Shortcuts and create more advanced workflows for all kinds of tasks.\nI also wanted to point out some other useful additions in this GoodLinks update. 
On iPad, you can now open the app’s settings screen with ⌘+, (just like on a Mac); the app now properly supports Dynamic Type and respects the system’s text size (great for Accessibility); if you’ve selected some text in an article, you can also copy the formatted selection from a refreshed context menu:\nThe new context menu for selected text in GoodLinks.\nI’ve long been a fan of GoodLinks, but the app’s newfound Shortcuts integration convinced me that, at this point in my life, it’s the read-later experience I need in my iPad and iPhone workflow. If you, like me, have been looking for a clean, native read-later app that feels right at home on Apple platforms and can be easily automated, look no further than GoodLinks.\nYou can download the latest version of GoodLinks on the App Store; my GoodLinked shortcut is available below and in the MacStories Shortcuts Archive.\n\n \n \n GoodLinkedCreate a draft for a linked post in Obsidian based on the article you’re currently reading in GoodLinks.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2023-12-20T10:36:29-05:00", 
"date_modified": "2023-12-20T10:36:29-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "read-later", "shortcuts", "reviews" ] }, { "id": "https://www.macstories.net/?p=73634", "url": "https://www.macstories.net/linked/automation-academy-introducing-thingsbox-an-all-in-one-shortcuts-capture-system-for-the-things-inbox/", "title": "Automation Academy: Introducing ThingsBox, an All-In-One Shortcuts Capture System for the Things Inbox", "content_html": "
\"ThingsBox.\"

ThingsBox.

\n

One of the perks of a Club MacStories+ and Club Premier membership is the set of special columns that Federico and I publish periodically. In today’s Automation Academy, Federico shares ThingsBox, a Shortcuts capture system that can handle multiple media types on every Apple device, sending the results to the Things inbox.

\n

As Federico explains, ThingsBox originated from a suggestion I made on AppStories recently, which he took and ran with to add functionality tailored to each type of media he saves, creating:

\n

\n a versatile system for quickly capturing text, Safari webpages, URLs, App Store apps, and even images and saving them as new items in the Things inbox. ThingsBox runs on every Apple platform and can be used from a widget, the share sheet, or manually inside the Shortcuts app; it is optimized for the Apple Watch, where it defaults to dictation input; on the Mac, ThingsBox integrates with AppleScript to see what the frontmost window is and capture its data accordingly.\n
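The capture-and-route idea described above can be sketched outside of Shortcuts, too. Below is a minimal Python sketch of the concept; the classifier is my own invention (not ThingsBox's actual logic), and it hands items off via Cultured Code's documented `things:///add` URL scheme for the Things inbox:

```python
from urllib.parse import quote

def classify(item: str) -> str:
    # Hypothetical media-type routing, loosely mirroring ThingsBox's behavior.
    if item.startswith("https://apps.apple.com/"):
        return "app"
    if item.startswith(("http://", "https://")):
        return "webpage"
    return "text"

def things_add_url(item: str) -> str:
    """Build a things:///add URL that files the captured item in the Things inbox."""
    kind = classify(item)
    if kind == "text":
        title, notes = item, ""
    else:
        # For links, keep the URL in the notes field and use a generic title.
        title, notes = f"Check out this {kind}", item
    url = f"things:///add?title={quote(title)}"
    if notes:
        url += f"&notes={quote(notes)}"
    return url
```

Opening the resulting URL on any Apple device hands the item to Things; the real shortcut layers dictation, the share sheet, and AppleScript entry points on top of the same idea.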

\n
\"Sharing

Sharing different types of input with ThingsBox…

\n
\"…and

…and the resulting tasks in the inbox.

\n

Automation Academy is one of the many perks of a Club MacStories+ and Club Premier membership and an excellent way to learn advanced Shortcuts techniques that are explained in the context of solutions to everyday problems.

\n

Join Club MacStories+:

\n
\nJoin Annual$100/yearJoin Monthly$10/month\n
\n

Join Club Premier:

\n
\nJoin Annual$120/yearJoin Monthly$12/month\n
\n

\u2192 Source: club.macstories.net

", "content_text": "ThingsBox.\nOne of the perks of a Club MacStories+ and Club Premier membership are special columns that Federico and I publish periodically. In today’s Automation Academy, Federico shares ThingsBox, a shortcuts capture system that can handle multiple media types on every Apple device, sending the results to the Things inbox.\nAs Federico explains, ThingsBox originated from a suggestion I made on AppStories recently, which he took and ran with to add functionality tailored to each type of media he saves, creating:\n\n a versatile system for quickly capturing text, Safari webpages, URLs, App Store apps, and even images and save them as new items in the Things inbox. ThingsBox runs on every Apple platform and can be used from a widget, the share sheet, or manually inside the Shortcuts app; it is optimized for the Apple Watch, where it defaults to dictation input; on the Mac, ThingsBox integrates with AppleScript to see what the frontmost window is and capture its data accordingly.\n\nSharing different types of input with ThingsBox…\n…and the resulting tasks in the inbox.\nAutomation Academy is one of the many perks of a Club MacStories+ and Club Premier membership and an excellent way to learn advanced Shortcuts techniques that are explained in the context of solutions to everyday problems.\nJoin Club MacStories+:\n\nJoin Annual$100/yearJoin Monthly$10/month\n\nJoin Club Premier:\n\nJoin Annual$120/yearJoin Monthly$12/month\n\n\u2192 Source: club.macstories.net", "date_published": "2023-12-13T11:05:41-05:00", "date_modified": "2023-12-13T11:05:41-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "club", "shortcuts", "things", "Linked" ] }, { "id": "https://www.macstories.net/?p=73163", "url": 
"https://www.macstories.net/linked/automation-academy-leveraging-reminders-to-make-saving-tasks-to-things-more-reliable-on-the-go/", "title": "Automation Academy: Leveraging Reminders to Make Saving Tasks to Things More Reliable On-the-Go", "content_html": "
\"\"

\n

One of the perks of a Club MacStories+ and Club Premier membership is the set of special columns published periodically by Federico and John. In today’s Automation Academy, which debuts a refreshed format, Federico explains how he leveraged the tight integration of Reminders and Siri with Things by Cultured Code to improve the experience of saving tasks to Things on the go.

\n

As Federico explains:

\n

\n One of the features I missed from Reminders was its deep integration with Siri and background sync privileges. Whether you’re using Siri on the iPhone or Apple Watch, you can quickly dictate a new task with natural language and rest assured you’ll find it a few seconds later on any other device signed into your iCloud account. For instance, I can’t tell you how many times I added a reminder (with dates and times) using Siri while driving via my Apple Watch and immediately found it on my iPad once I got home. You just don’t have to worry about sync if you’re using iCloud and Reminders, which is one of the most important advantages of the app.\n

\n

Among other techniques, the post explains how to use ‘Repeat for Each’ blocks with magic variables and an always-on Mac running Lingon X, which is available for 20% off on the Club MacStories Discount page, to create a rock-solid way of creating new tasks from an Apple Watch or other device using Siri.

\n

Automation Academy is one of the many perks of a Club MacStories+ and Club Premier membership and an excellent way to learn advanced Shortcuts techniques that are explained in the context of solutions to everyday problems.

\n

Join Club MacStories+:

\n
\nJoin Annual$100/yearJoin Monthly$10/month\n
\n

Join Club Premier:

\n
\nJoin Annual$120/yearJoin Monthly$12/month\n
\n

\u2192 Source: club.macstories.net

", "content_text": "One of the perks of a Club MacStories+ and Club Premier membership are special columns published periodically by Federico and John. In today’s Automation Academy, which debuted a refreshed format, Federico explains how he leveraged the tight integration of Reminders and Siri with Things by Cultured Code to improve the experience of saving tasks to Things on the go.\nAs Federico explains:\n\n One of the features I missed from Reminders was its deep integration with Siri and background sync privileges. Whether you’re using Siri on the iPhone or Apple Watch, you can quickly dictate a new task with natural language and rest assured you’ll find it a few seconds later on any other device signed into your iCloud account. For instance, I can’t tell you how many times I added a reminder (with dates and times) using Siri while driving via my Apple Watch and immediately found it on my iPad once I got home. You just don’t have to worry about sync if you’re using iCloud and Reminders, which is one of the most important advantages of the app.\n\nAmong other techniques, the post explains how to use ‘Repeat for Each’ blocks with magic variables and an always-on Mac running Lingon X, which is available for 20% off on the Club MacStories Discount page, to create a rock-solid way of creating new tasks from an Apple Watch or other device using Siri.\nAutomation Academy is one of the many perks of a Club MacStories+ and Club Premier membership and an excellent way to learn advanced Shortcuts techniques that are explained in the context of solutions to everyday problems.\nJoin Club MacStories+:\n\nJoin Annual$100/yearJoin Monthly$10/month\n\nJoin Club Premier:\n\nJoin Annual$120/yearJoin Monthly$12/month\n\n\u2192 Source: club.macstories.net", "date_published": "2023-10-19T12:03:24-04:00", "date_modified": "2023-10-19T12:03:24-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": 
"https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "shortcuts", "things", "Linked" ] }, { "id": "https://www.macstories.net/?p=73003", "url": "https://www.macstories.net/ios/introducing-multibutton-assign-two-shortcuts-to-the-same-action-button-press-on-iphone-15-pro/", "title": "Introducing MultiButton: Assign Two Shortcuts to the Same Action Button Press on iPhone 15 Pro", "content_html": "
\"MultiButton

MultiButton for iPhone 15

\n

I got my iPhone 15 Pro Max last week, and I’m loving the possibilities opened by the Action button combined with the Shortcuts app. But as I was playing around with different ideas for the Action button, I had a thought:

\n

Wouldn’t it be great if instead of just one shortcut, I could toggle between two shortcuts with the same Action button press? That’s exactly what my new MultiButton shortcut does.

\n

With MultiButton, you’ll be able to assign two separate shortcuts to the Action button. Unlike other solutions you may have seen that always make you pick shortcuts from a menu, MultiButton automatically cycles between two shortcuts if you press the Action button multiple times in rapid succession. You don’t need to pick shortcuts from a list; just press the Action button and MultiButton will take care of everything.

\n
\n
\n

Toggling between two shortcuts with MultiButton.

\n
\n

Allow me to explain how MultiButton works and how you can configure it for your Action button. In the process, I’ll also share some new shortcut ideas that you can start using today on your iPhone 15 Pro.

\n

\n

MultiButton

\n

The core idea behind MultiButton is this: you press the Action button the first time, and it runs a primary shortcut; if you press it again within a few seconds, it’ll run a secondary shortcut. Effectively, MultiButton lets you double the shortcuts you can assign to the Action button, thus making you (hopefully) faster and more productive on your iPhone.

\n

I created MultiButton because I wanted to run two shortcuts from the same Action button press, but I didn’t want to always pick them from a menu. I wanted to define a “primary” shortcut that would be my default Action button press, plus a secondary shortcut that would only run if I pressed the Action button again after a few seconds.

\n

All of this wouldn’t have been necessary if Apple supported multiple gestures for the Action button, such as double- and triple-presses. Alas, in this version of iOS 17 for the iPhone 15 Pro, the Action button can only be associated with one shortcut at a time. That seemed like something I could fix with some clever programming in Shortcuts.

\n

The first time you download and install MultiButton, you’ll be asked to enter the names of the two shortcuts you want to run with it. In this step, pictured below, it is very important that you enter the exact names of the shortcuts you want to run with MultiButton. You can always change these shortcuts later in the MultiButton editor.

\n
\"You'll

You’ll be asked to enter the names of the two shortcuts you want to run with MultiButton at setup (left), but you can always change them later in the editor.

\n

When you run MultiButton the first time, you may also be asked to give it permission to access a folder in iCloud Drive and run other shortcuts. Files/Finder access is necessary because MultiButton needs to create and modify a configuration file (named MultiButton.json) stored in iCloud Drive ⇾ Shortcuts. Don’t worry: MultiButton won’t read any other files or contact any third-party servers. It simply needs to store a timestamp of the last time you used MultiButton.

\n
\"The

The initial setup flow for MultiButton. You’ll only see these prompts once.

\n

You see, the “clever” technique behind MultiButton is that it takes advantage of Shortcuts’ built-in date calculations to determine whether you pressed the Action button within 7 seconds of the last time you pressed it and ran a shortcut with it. If you did, MultiButton runs the secondary shortcut instead of the primary one. If you wait longer than 7 seconds, MultiButton runs the default, primary shortcut instead.

\n
\"By

By default, MultiButton runs the second shortcut if you press the Action button within 7 seconds of the first shortcut run. You can change this number if you want (right).

\n

The 7-second threshold is an arbitrary number that I picked based on my real-life usage of the Action button. In most scenarios, 7 seconds struck me as long enough to press the Action button again, but not so long that you’d have to wait for MultiButton to “reset” its state. If you want to modify this time limit, you can do so from the ‘Number’ action shown above.

\n

That’s all you need to know about MultiButton. There’s nothing else to configure and no other actions to tweak. Simply assign it as the Action button shortcut in the Settings app, and you’re good to go.

\n

Using Shortcuts with MultiButton and the Action Button

\n

I’m going to give you some examples of my own usage of MultiButton. On my iPhone, I wanted to have a way to perform two actions with the same Action button press:

By default, save a quick note;
If I pressed the button again, bring up a list of more shortcuts.

\n

As you can see, this is what MultiButton does on my iPhone 15 Pro Max if I press the Action button a second time after running the first shortcut:

\n
\"My

My MultiButton setup.

\n

There are some technical details and limitations worth explaining here. For starters, you cannot press the Action button a second time while the first shortcut is still running. You’ll have to wait for the primary shortcut to finish running. On iOS, Shortcuts doesn’t support running two shortcuts from the Action button at the same time. You can, however, cancel the first shortcut (if it comes with a ‘Cancel’ button) and press the Action button again; in that case, even if you canceled the first shortcut, the secondary one will still trigger.

\n

Furthermore, the Action button offers a handy ‘Show Folder…’ action for Shortcuts that brings up this UI element:

\n
\"This

This menu is exclusive to the Settings app.

\n

Alas, this new menu is exclusive to the Action button Settings screen. There is no action in the Shortcuts app to replicate the look of this menu with rounded buttons and the ‘Open App’ launcher. I hope that’s something Apple will add in the future; for now, the only way to have this menu is to choose it from the Settings app, in which case you won’t be able to use MultiButton.

\n

There is, however, a workaround I implemented to simulate the same behavior, and it’s a simple shortcut I created called ‘Run Shortcut From Folder’. With this shortcut, you can select any folder from the Shortcuts app and run one of the shortcuts contained in that folder. It’s a handy way to have access to a folder of “favorites” in the Shortcuts app and run one of them from MultiButton.

\n
\"A

A shortcut that presents a menu to run more shortcuts.
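On a Mac, you can approximate the same picker from the command line with Apple's `shortcuts` tool (macOS Monterey and later). A rough Python sketch, assuming the folder name matches one in your Shortcuts library; the plain-text menu here stands in for Shortcuts' native menu UI:

```python
import subprocess

def menu(names: list[str]) -> str:
    """Format a numbered, plain-text picker for a list of shortcut names."""
    return "\n".join(f"{i}. {name}" for i, name in enumerate(names, 1))

def run_shortcut_from_folder(folder: str) -> None:
    # `shortcuts list --folder-name` and `shortcuts run` ship with macOS.
    out = subprocess.run(
        ["shortcuts", "list", "--folder-name", folder],
        capture_output=True, text=True, check=True,
    ).stdout
    names = [line for line in out.splitlines() if line]
    print(menu(names))
    choice = int(input("Run which shortcut? "))
    subprocess.run(["shortcuts", "run", names[choice - 1]], check=True)
```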

\n
\n
\n \"\"
\n

Run Shortcut From Folder

Select and run one of your shortcuts from a specific folder.

\n

Get the shortcut here.

\n\n
\n
\n
\n

While I was putting together MultiButton, I also created some other shortcuts that would make for nice utilities for the Action button. For instance, my ‘Pause Media’ shortcut uses the free Actions app to detect if audio is playing and, if so, pause playback from the Action button.

\n
\"\"

\n
\n
\n \"\"
\n

Pause Media

If media is playing, pause playback.

\n

Get the shortcut here.

\n\n
\n
\n
\n

I find myself taking and sharing a lot of screenshots on my iPhone on a daily basis, so I created a couple of shortcuts to speed up the process from the Action button. The first one, called ‘Take Screenshot and Share’, does exactly what it says: it captures a screenshot of whatever is onscreen and immediately presents the share sheet to send the image to an app or someone else. This is a great one to assign as the primary shortcut in MultiButton.

\n
\"\"

\n
\n
\n \"\"
\n

Take Screenshot and Share

Take a screenshot and share it.

\n

Get the shortcut here.

\n\n
\n
\n
\n

A modified version of this shortcut, called ‘Screenshot, Markup, and Share’, lets you modify a screenshot with the native Markup editing UI before passing it to the share sheet:

\n
\"\"

\n
\n
\n \"\"
\n

Screenshot, Markup, and Share

Take a screenshot, edit it, and share it.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Lastly, I realized I could get a bit creative with conditions in Shortcuts and put together a shortcut to open all my shutters with a press of the Action button only if I’m on my home WiFi and it’s after 9 AM. The typical scenario: I wake up, and as I stumble my way across the living room to turn on my espresso machine, I can half-mindedly press the Action button to instantly open all the shutters and get some much-needed sunlight. That’s exactly what my ‘Blinds After 9’ shortcut does.

\n
\"\"

\n

When you first run the shortcut, you’ll be asked to enter the name of your home WiFi network and select a time after which the blinds should open via HomeKit. My recommendation is to enter a date and time in natural language (like “Today at 9 AM”), which Shortcuts will recognize. You’ll need, of course, to select the accessories you want to open from the ‘Control My Home’ action using your own HomeKit setup.
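The gate the shortcut builds is just two conditions chained together. A Python sketch of the check, with a placeholder network name and hour standing in for the values you'd configure:

```python
from datetime import datetime

HOME_WIFI = "My Home Network"  # placeholder: the SSID you enter at setup
OPEN_AFTER_HOUR = 9            # placeholder: the "Today at 9 AM" date in the shortcut

def should_open_blinds(current_ssid: str, now: datetime) -> bool:
    """Open the shutters only on the home network and only after the chosen hour."""
    return current_ssid == HOME_WIFI and now.hour >= OPEN_AFTER_HOUR
```

If both checks pass, the real shortcut hands off to the ‘Control My Home’ action for your HomeKit accessories.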

\n
\n
\n \"\"
\n

Blinds After 9

Open the blinds/shutters via HomeKit only if you’re on your home WiFi network and if it’s after a certain hour of the day.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Download MultiButton

\n

Despite the poor state of the Shortcuts app in iOS 17, Action button programming is creating a great opportunity for power and casual users alike to automate tasks on their iPhones with a single press of a button. And with MultiButton, you’ll be able to toggle between two shortcuts instead of one, preserving the benefit of having a default “fallback” shortcut while also having a secondary one ready to go with a second press of the Action button.

\n

I have some other ideas for how to evolve MultiButton in the future and I am, of course, open to feedback and requests. In the meantime, you can download MultiButton below and find it, alongside hundreds more shortcuts as well as the extra ones I mentioned above, in the MacStories Shortcuts Archive.

\n
\n
\n \"\"
\n

MultiButton

Toggle between two shortcuts from the Action button. MultiButton will run a secondary shortcut if you press the Action button within a few seconds of your first press.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "MultiButton for iPhone 15\nI got my iPhone 15 Pro Max last week, and I’m loving the possibilities opened by the Action button combined with the Shortcuts app. But as I was playing around with different ideas for the Action button, I had a thought:\nWouldn’t it be great if instead of just one shortcut, I could toggle between two shortcuts with the same Action button press? That’s exactly what my new MultiButton shortcut does.\nWith MultiButton, you’ll be able to assign two separate shortcuts to the Action button. Unlike other solutions you may have seen that always make you pick shortcuts from a menu, MultiButton automatically cycles between two shortcuts if you press the Action button multiple times in rapid succession. You don’t need to pick shortcuts from a list; just press the Action button and MultiButton will take care of everything.\n\n \nToggling between two shortcuts with MultiButton.\n\nAllow me to explain how MultiButton works and how you can configure it for your Action button. In the process, I’ll also share some new shortcut ideas that you can start using today on your iPhone 15 Pro.\n\nMultiButton\nThe core idea behind MultiButton is this: you press the Action button the first time, and it runs a primary shortcut; if you press it again within a few seconds, it’ll run a secondary shortcut. Effectively, MultiButton lets you double the shortcuts you can assign to the Action button, thus making you (hopefully) faster and more productive on your iPhone.\n\nMultiButton lets you double the shortcuts you can assign to the Action button.\n\nI created MultiButton because I wanted to run two shortcuts from the same Action button press, but I didn’t want to always pick them from a menu. 
I wanted to define a “primary” shortcut that would be my default Action button press, plus a secondary shortcut that would only run if I pressed the Action button again after a few seconds.\nAll of this wouldn’t have been necessary if Apple supported multiple gestures for the Action button, such as double- and triple-presses. Alas, in this version of iOS 17 for the iPhone 15 Pro, the Action button can only be associated with one shortcut at a time. That seemed like something I could fix with some clever programming in Shortcuts.\nThe first time you download and install MultiButton, you’ll be asked to enter the two names of the shortcuts you want to run with it. In this step, pictured below, it is very important that you enter the exact names of the shortcuts you want to run with MultiButton. You can always change these shortcuts later in the MultiButton editor.\nYou’ll be asked to enter the names of the two shortcuts you want to run with MultiButton at setup (left), but you can always change them later in the editor.\nWhen you run MultiButton the first time, you may also be asked to give it permission to access a folder in iCloud Drive and run other shortcuts. Files/Finder access is necessary because MultiButton needs to create and modify a configuration file (named MultiButton.json) stored in iCloud Drive ⇾ Shortcuts. Don’t worry: MultiButton won’t read any other files or contact any third-party servers. It simply needs to store a timestamp of the last time you used MultiButton.\nThe initial setup flow for MultiButton. You’ll only see these prompts once.\nYou see, the “clever” technique behind MultiButton is that it takes advantage of Shortcuts’ built-in date calculations to understand if you pressed the Action button within 7 seconds of the last time you pressed it and ran a shortcut with it. If you do, the shortcut will then run the secondary shortcut instead of the primary one. 
If you wait longer than 7 seconds, MultiButton will run the default, primary shortcut instead.\nBy default, MultiButton runs the second shortcut if you press the Action button within 7 seconds of the first shortcut run. You can change this number if you want (right).\nThe 7-second threshold is an arbitrary number that I picked based on my real-life usage of the Action button. In most scenarios, I thought 7 seconds was an amount of time short enough to press the Action button again, but not too long so that you’d need to wait for MultiButton to “reset” its state. If you want to modify this time limit, you can do so from the ‘Number’ action shown above.\nThat’s all you need to know about MultiButton. There’s nothing else to configure and no other actions to tweak. Simply assign it as the Action button shortcut in the Settings app, and you’re good to go.\nUsing Shortcuts with MultiButton and the Action Button\nI’m going to give you some examples about my own usage of MultiButton. On my iPhone, I wanted to have a way to perform two actions with the same Action button press:\nBy default, save a quick note;\nIf I pressed the button again, bring up a list of more shortcuts.\nAs you can see, this is what MultiButton does on my iPhone 15 Pro Max if I press the Action button a second time after running the first shortcut:\nMy MultiButton setup.\nThere are some technical details and limitations worth explaining here. For starters, you cannot press the Action button a second time while the first shortcut is still running. You’ll have to wait for the primary shortcut to be finished running. On iOS, Shortcuts doesn’t support running two shortcuts from the Action button at the same time. 
You can, however, cancel the first shortcut (if it comes with a ‘Cancel’ button) and press the Action button again; in that case, even if you canceled the first shortcut, the secondary one will still trigger.\nFurthermore, the Action button offers a handy ‘Show Folder…’ action for Shortcuts that brings up this UI element:\nThis menu is exclusive to the Settings app.\nAlas, this new menu is exclusive to the Action button Settings screen. There is no action in the Shortcuts app to replicate the look of this menu with rounded buttons and the ‘Open App’ launcher. I hope that’s something Apple will add in the future; for now, the only way to have this menu is to choose it from the Settings app, in which case you won’t be able to use MultiButton.\nThere is, however, a workaround I implemented to simulate the same behavior, and it’s a simple shortcut I created called ‘Run Shortcut From Folder’. With this shortcut, you can select any folder from the Shortcuts app and run one of the shortcuts contained in that folder. It’s a handy way to have access to a folder of “favorites” in the Shortcuts app and run one of them from MultiButton.\nA shortcut that presents a menu to run more shortcuts.\n\n \n \n Run Shortcut From FolderSelect and run one of your shortcuts from a specific folder.\nGet the shortcut here.\n\n \n \n\nWhile I was putting together MultiButton, I also created some other shortcuts that would make for nice utilities for the Action button. For instance, my ‘Pause Media’ shortcut uses the free Actions app to detect if audio is playing and, if so, pause playback from the Action button.\n\n\n \n \n Pause MediaIf media is playing, pause playback.\nGet the shortcut here.\n\n \n \n\nI find myself taking and sharing a lot of screenshots on my iPhone on a daily basis, so I created a couple of shortcuts to speed up the process from the Action button. 
The first one, called ‘Take Screenshot and Share’, does exactly what it says: it captures a screenshot of whatever is onscreen and immediately presents the share sheet to send the image to an app or someone else. This is a great one to assign as the primary shortcut in MultiButton.\n\n\n \n \n Take Screenshot and ShareTake a screenshot and share it.\nGet the shortcut here.\n\n \n \n\nA modified version of this shortcut, called ‘Screenshot, Markup, and Share’, lets you modify a screenshot with the native Markup editing UI before passing it to the share sheet:\n\n\n \n \n Screenshot, Markup, and ShareTake a screenshot, edit it, and share it.\nGet the shortcut here.\n\n \n \n\nLastly, I realized I could get a bit creative with conditions in Shortcuts and put together a shortcut to open all my shutters with a press of the Action button only if I’m on my home WiFi and if it’s after 9 AM. The typical scenario: I wake up, and as I stumble my way across the living room to turn on my espresso machine, I can half-mindedly press the Action button to instantly open all the shutters and get some much needed sunlight. That’s exactly what my ‘Blinds After 9’ shortcut does.\n\nWhen you first run the shortcut, you’ll be asked to enter the name of your home WiFi network and select a time after which the blinds should open via HomeKit. My recommendation is to enter a date and time in natural language (like “Today at 9 AM”), which Shortcuts will recognize. 
You’ll need, of course, to select the accessories you want to open from the ‘Control My Home’ action using your own HomeKit setup.\n\n \n \n Blinds After 9Open the blinds/shutters via HomeKit only if you’re on your home WiFi network and if it’s after a certain hour of the day.\nGet the shortcut here.\n\n \n \n\nDownload MultiButton\nDespite the poor state of the Shortcuts app in iOS 17, Action button programming is creating a great opportunity for power and casual users alike to automate tasks on their iPhones with a single press of a button. And with MultiButton, you’ll be able to toggle between two shortcuts instead of one, preserving the benefit of having a default “fallback” shortcut while also having a secondary one ready to go with a second press of the Action button.\nI have some other ideas for how to evolve MultiButton in the future and I am, of course, open to feedback and requests. In the meantime, you can download MultiButton below and find it, alongside hundreds more shortcuts as well as the extra ones I mentioned above, in the MacStories Shortcuts Archive.\n\n \n \n MultiButtonToggle between two shortcuts from the Action button. 
MultiButton will run a secondary shortcut if you press the Action button within a few seconds of your first press.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2023-09-27T11:35:04-04:00", "date_modified": "2023-10-10T19:34:27-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Action Button", "automation", "iOS", "iPhone 15 Pro", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=71818", "url": "https://www.macstories.net/ios/s-gpt-1-0-2-brings-date-and-time-awareness-integration-with-macos-services-menu-passthrough-mode-better-homepod-support-and-more/", "title": "S-GPT 1.0.2 Brings Date and Time Awareness, Integration with macOS Services Menu, Passthrough Mode, Better HomePod Support, and More", "content_html": "
\"S-GPT

S-GPT 1.0.2.

\n

I just published version 1.0.2 of S-GPT, the shortcut I released last week to have conversations with OpenAI’s ChatGPT and integrate it directly with native features of Apple’s OSes. You can find the updated download link at the end of this post, in the original article, and in the MacStories Shortcuts Archive; before you replace version 1.0.1 of S-GPT, save your existing OpenAI API key somewhere as you’ll have to paste it again in the shortcut later.

\n

I’m going to include the full changelog for S-GPT 1.0.2 below, but long story short: S-GPT is now aware of the current date and time, and I’ve heard all the requests about improving interactions with the HomePod and Siri, so I made that part much better. S-GPT can now perform a variety of date/time calculations with natural language, and you can end a conversation by saying “no” or “stop”.
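S-GPT itself is built entirely in Shortcuts, but the date-awareness trick generalizes to any ChatGPT client: inject the current date and time into the system message of a Chat Completions request. A hedged Python sketch of such a payload follows; the prompt wording and model choice are illustrative, not S-GPT's internals:

```python
from datetime import datetime

def build_request(question: str, now: datetime, model: str = "gpt-3.5-turbo") -> dict:
    """Payload for OpenAI's /v1/chat/completions endpoint with date context injected."""
    system = (
        "You are a helpful assistant. "
        f"The current date and time is {now.strftime('%A, %B %d, %Y at %H:%M')}."
    )
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": question},
        ],
    }
```

With the date in the system message, questions like “how many days until Christmas?” resolve against today's date instead of the model's training cutoff.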

\n

\n

Here are the full release notes for S-GPT 1.0.2:

Added support for current date and time in the System message. (Thanks, Leon.) S-GPT is now aware of the current date and time, which allows you to perform various date/time calculations (“How long until Christmas”, “If it’s 5 PM for me, what’s the time in NYC”, “How many days left this year? Give me a completion rate for the year”), ask about historical events (“Give me some events for today in history”), and more.

S-GPT now works better with HomePod and other Siri-enabled devices. You can simply say “No” or “Stop” to end the conversation when S-GPT asks if you want to follow up with another question.

Added support for asking follow-up questions with dictation enabled by default on Apple Watch.

Passthrough mode: S-GPT can now accept input text from any other shortcut, extension, or app that can pass some text to it. This allows you to integrate S-GPT as part of an existing workflow and turn it into a “feature” of other automations.

Further improved the logic for summarizing articles from Safari webpages without “hallucinating” results. S-GPT can now detect articles from Safari for Mac as well.

Integration with Services and Quick Actions on macOS: you can now run S-GPT for selected text using macOS’ Services menu and Quick Actions. The selected text will be passed to S-GPT as input.

\n

As you can see from the examples above, support for date and time calculations means S-GPT has gained a whole new layer of flexibility for natural language queries, which I find myself using on a regular basis to convert time zones or check specific dates in the year. Additionally, if you’re a Mac user, don’t underestimate the potential for integrating S-GPT with any shortcut, app, or text field via the Services menu: as long as you’re sharing some text with S-GPT, the shortcut will read it as input and make a request to the ChatGPT API for you.

\n
\"Triggering

Triggering S-GPT from the Services menu on macOS.

\n

On a similar note, passthrough mode means that S-GPT can now easily accept text input from any source and use it as the text for a request to ChatGPT. In the example below, you can see how I was able to hit the Tab key in Raycast to type my question for S-GPT directly from the launcher’s UI:
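Passthrough mode boils down to a simple branch: if text was handed to the shortcut, use it as the question; otherwise, ask the user for one. A hypothetical sketch in Python, with `prompt` standing in for Shortcuts’ “Ask for Input” action:

```python
def get_question(shortcut_input=None, prompt=input):
    """Return the question to send: passed-in text if any, else ask the user."""
    if shortcut_input:                    # text was passed in: use it as-is
        return shortcut_input.strip()
    return prompt("What do you want to ask S-GPT? ")  # fall back to asking
```

Because the input path is optional, the same entry point works both standalone and as a “feature” called by another automation.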

\n
\"Typing

Typing questions for S-GPT from Raycast.

\n

Better support for Siri interactions is also something I’ve heard from users a lot over the past week. In this version of S-GPT, I’ve made it easier to end a conversation while using a HomePod or Siri on the iPhone, but I wish I could do something about Siri being too fast when speaking its response and immediately dismissing the alert. Unfortunately, Apple doesn’t provide any tools at the moment to check whether a shortcut is running inside Siri. Ideally, I should be able to add a time delay only if a shortcut is being used in a Siri context. I’ll have to figure something out here.

\n

The response to S-GPT has been incredible, and I’m now working my way through a long list of feature requests and ideas, trying to prioritize what I would like to build for the next major updates to this shortcut. If you have ideas, you can reach out on Mastodon or find me on the Club MacStories Discord server.

\n

I also wanted to share two videos by Stephen Robles and Brandon Butch showing S-GPT in action and how you can integrate the shortcut in vastly different workflows:

\n
\n
\n

S-GPT Extras: Additional Personalities and ‘Making Of’ Class

\n

In addition to the main S-GPT shortcut (which is and will continue to be free for everyone), I also created some paid extras for Club MacStories members over the past week.

\n

In MacStories Weekly (available with a Club MacStories plan at $5/month or $50/year), I shared a set of prompts to infuse S-GPT with different personalities. There’s an evil one; there’s a personality that is modeled after Steve Jobs’ speech style and drops Apple references everywhere; there’s even one inspired by Roy Kent. You can sign up for a Club MacStories account and check out all the other perks it offers here, or by using the buttons below.

\n
\"Roy-GPT.\"

Roy-GPT.

\n
\nJoin Annual$50/yearJoin Monthly$5/month\n
\n

Today for Club MacStories+ and Premier members (the Premier tier is $12/month or $120/year and it’s the all-access pass for everything we do, including a longer, ad-free version of AppStories and a Discord server), I published an in-depth ‘Making Of’ post about S-GPT. Think of this as a masterclass about working with the ChatGPT API in Shortcuts, dealing with web APIs and Shortcuts in general, plus other advanced techniques that you can study and reuse in your automations. I’m already working on Part 2 of this series that will focus on the native integrations of S-GPT and how I built them.

\n
\"My

My latest Automation Academy class.

\n

You can read more about Club Premier here and sign up using the buttons below.

\n
\nJoin Annual$120/yearJoin Monthly$12/month\n
\n

Download S-GPT 1.0.2

\n

That’s all for now. You can download version 1.0.2 of S-GPT and S-GPT Encoder below or via the MacStories Shortcuts Archive. The original story has been updated with new links as well; remember to save your existing OpenAI API token somewhere since you’ll have to paste it again into S-GPT 1.0.2 at setup.

\n

If you have ideas and requests for S-GPT, you know where to find me.

\n
\n
\n \"\"
\n

S-GPT

S-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
\n \"\"
\n

S-GPT Encoder

This is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.

\n

Get the shortcut here.

\n\n
\n
\n
\n

You can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "S-GPT 1.0.2.\nI just published version 1.0.2 of S-GPT, the shortcut I released last week to have conversations with OpenAI’s ChatGPT and integrate it directly with native features of Apple’s OSes. You can find the updated download link at the end of this post, in the original article, and in the MacStories Shortcuts Archive; before you replace version 1.0.1 of S-GPT, save your existing OpenAI API key somewhere as you’ll have to paste it again in the shortcut later.\nI’m going to include the full changelog for S-GPT 1.0.2 below, but long story short: S-GPT is now aware of the current date and time, and I’ve heard all the requests about improving interactions with the HomePod and Siri, so I made that part much better. S-GPT can now perform a variety of date/time calculations with natural language, and you can end a conversation by saying “no” or “stop”.\n\nHere are the full release notes for S-GPT 1.0.2:\nAdded support for current date and time in the System message. (Thanks, Leon.)\nS-GPT is now aware of the current date and time, which allows you to perform various date/time calculations (“How long until Christmas”, “If it’s 5 PM for me, what’s the time in NYC”, “How many days left this year? 
Give me a completion rate for the year”), ask about historical events (“Give me some events for today in history”), and more.\n\nS-GPT now works better with HomePod and other Siri-enabled devices.\nYou can simply say “No” or “Stop” to end the conversation when S-GPT asks if you want to follow-up with another question.\n\nAdded support for asking follow-up questions with dictation enabled by default on Apple Watch.\nPassthrough mode: S-GPT can now accept input text from any other shortcut, extension, or app that can pass some text to it.\nThis allows you to integrate S-GPT as part of an existing workflow and turn it into a “feature” of other automations.\n\nFurther improved the logic for summarizing articles from Safari webpages without “hallucinating” results.\nS-GPT can now detect articles from Safari for Mac as well.\n\nIntegration with Services and Quick Actions on macOS: you can now run S-GPT for selected text using macOS’ Services menu and Quick Actions. The selected text will be passed to S-GPT as input.\nAs you can see from the examples above, support for date and time calculations means S-GPT has gained a whole new layer of flexibility for natural language queries, which I find myself using on a regular basis to convert time zones or check specific dates in the year. Additionally, if you’re a Mac user, don’t underestimate the potential for integrating S-GPT with any shortcut, app, or text field via the Services menu: as long as you’re sharing some text with S-GPT, the shortcut will read it as input and make a request to the ChatGPT API for you.\nTriggering S-GPT from the Services menu on macOS.\nOn a similar note, passthrough mode means that S-GPT can now easily accept text input from any source and use it as the text for a request to ChatGPT. 
In the example below, you can see how I was able to hit the Tab key in Raycast to type my question for S-GPT directly from the launcher’s UI:\nTyping questions for S-GPT from Raycast.\nBetter support for Siri interactions is also something I’ve heard from users a lot over the past week. In this version of S-GPT, I’ve made it easier to end a conversation while using a HomePod or Siri on the iPhone, but I wish I could do something about Siri being too fast when speaking its response and immediately dismissing the alert. Unfortunately, Apple doesn’t provide any tools at the moment to check whether a shortcut is running inside Siri. Ideally, I should be able to add a time delay only if a shortcut is being used in a Siri context. I’ll have to figure something out here.\nThe response to S-GPT has been incredible, and I’m now working my way through a long list of feature requests and ideas, trying to prioritize what I would like to build for the next major updates to this shortcut. If you have ideas, you can reach out on Mastodon or find me on the Club MacStories Discord server.\nI also wanted to share two videos by Stephen Robles and Brandon Butch showing S-GPT in action and how you can integrate the shortcut in vastly different workflows:\n\n\nS-GPT Extras: Additional Personalities and ‘Making Of’ Class\nIn addition to the main S-GPT shortcut (which is and will continue to be free for everyone), I also created some paid extras for Club MacStories members over the past week.\nIn MacStories Weekly (available with a Club MacStories plan at $5/month or $50/year), I shared a set of prompts to infuse S-GPT with different personalities. There’s an evil one; there’s a personality that is modeled after Steve Jobs’ speech style and drops Apple references everywhere; there’s even one inspired by Roy Kent. 
You can sign up for a Club MacStories account and check out all the other perks it offers here, or by using the buttons below.\nRoy-GPT.\n\nJoin Annual$50/yearJoin Monthly$5/month\n\nToday for Club MacStories+ and Premier members (the Premier tier is $12/month or $120/year and it’s the all-access pass for everything we go, including a longer, ad-free version of AppStories and a Discord server), I published an in-depth ‘Making Of’ post about S-GPT. Think of this as a masterclass about working with the ChatGPT API in Shortcuts, dealing with web APIs and Shortcuts in general, plus other advanced techniques that you can study and reuse in your automations. I’m already working on Part 2 for this series that will focus on the native integrations of S-GPT and how I built them.\nMy latest Automation Academy class.\nYou can read more about Club Premier here and sign up using the buttons below.\n\nJoin Annual$120/yearJoin Monthly$12/month\n\nDownload S-GPT 1.0.2\nThat’s all for now. You can download version 1.0.2 of S-GPT and S-GPT Encoder below or via the MacStories Shortcuts Archive. The original story has been updated with new links as well; remember to save your existing OpenAI API token somewhere since you’ll have to paste it again into S-GPT 1.0.2 at setup.\nIf you have ideas and requests for S-GPT, you know where to find me.\n\n \n \n S-GPTS-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. 
The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.\nGet the shortcut here.\n\n \n \n\n\n \n \n S-GPT EncoderThis is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.\nGet the shortcut here.\n\n \n \n\nYou can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2023-04-13T12:24:43-04:00", "date_modified": "2024-12-02T14:53:53-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "Automation April", "ChatGPT", "iOS", "mac", "S-GPT", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=71809", "url": "https://www.macstories.net/reviews/automation-april-the-loupedeck-live-s-is-a-more-portable-and-affordable-automation-control-panel-for-the-mac/", "title": "Automation April: The 
Loupedeck Live S Is a More Portable and Affordable Automation Control Panel for the Mac", "content_html": "
\"\"

\n

In 2021, I reviewed the Loupedeck Live, a programmable control panel for Mac and Windows PCs, for Club MacStories members as part of my Macintosh Desktop Experience column. It’s an excellent device, but its price put it at a disadvantage to a similarly sized Elgato Stream Deck despite several other advantages that I explained in the review.

\n

Last year, Loupedeck released the Loupedeck Live S, a smaller, more affordable Loupedeck that retains the core experience of the Loupedeck Live, but dispenses with a handful of physical buttons and dials. The new device retails for $189 compared to the Loupedeck Live, which is $269. That’s still $40 more than the 15-button Stream Deck MK.2, but a significantly narrower difference for a device that offers a wider range of functionality, making it worth another look if you were put off by the Loupedeck Live’s price.

\n

\n

Before digging into the details, I want to note that Loupedeck is providing us with a Loupedeck Live S free of charge as one of the prizes that we’ll be awarding to the winner of the Best Overall Shortcut as part of the Automation April Shortcuts Contest. However, this is not a sponsored post and reflects my opinions about the Loupedeck Live S, which I think you’ll find are in line with my previous reviews of Loupedeck products.

\n

The Live S has the same 15 LCD buttons in the center of the device as the Loupedeck Live, arranged in three rows of five, each of which can be assigned a variety of actions. What’s changed is that instead of six dials, the Live S has two, and instead of eight physical buttons, it has four.

\n
\"\"

\n

One advantage of the reduced number of buttons is the Loupedeck Live S’s size. At 5.9” (150 mm) × 4.3” (110 mm) × 1.18” (30 mm), it’s comparable in size to an iPhone 14 Pro Max and about 10 grams lighter. The Live S can sit flat on a table or propped up at a fixed angle using the included stand that snaps into place along the top edge of the device. There are rubber feet on the back, which help keep the device from sliding on your desk too.

\n

The removable stand is one of my favorite features of the Loupedeck Live S. When it’s removed, the stand is flat, making it far easier to take with you on the go. In contrast, the Stream Deck has a removable stand, but it’s a chunky v-shaped piece of plastic that isn’t easily packed in a bag.

\n
\"Going

Going mobile with the Loupedeck Live S.

\"A

A $15 solution to carrying the Live S with you.

\n

One of my first purchases after Loupedeck sent me the Live S to test was an inexpensive portable hard drive case. On one side of the hard-shell case’s interior, there’s an elastic strap that holds the Loupedeck Live S in place, and on the other side, there’s a mesh pouch that holds the stand and a short USB-C cable for connecting the device to my MacBook Air. I’ve never considered taking a Stream Deck or the Loupedeck Live with me in my backpack, but the diminutive size and light weight of the Loupedeck Live S make it an excellent travel companion.

\n
\"\"

\n

The Live S has fewer buttons and dials than the Live, but I don’t miss them. The dials are great for incremental adjustments like applying slider-based effects and filters to photos, controlling volume and brightness, or zooming in and out of a Logic Pro X timeline. The dials can also be pressed to execute an action, which makes them the most versatile controls on the device. However, only having two is not a major compromise if, like me, your primary use of a Loupedeck is to run shortcuts, macros, and other automations. If you’re doing the sort of photo or video editing where more dials would be an advantage, I’d suggest looking at the Loupedeck Live, Loupedeck+, which I reviewed in 2019, or CT, which are more focused on those sorts of workflows.

\n

On the Loupedeck Live, there are eight buttons along the bottom edge of the device for switching between workspaces, which I’ll cover more in depth below, or triggering actions. The Live S’s physical buttons are along the sides of the LCD buttons, which enables the smaller form factor. The buttons include a circular LCD element, which can be customized too.

\n
\"Building

Building images for the Loupedeck Live S with Button Maker.

\n

What the Live and Live S have in common are the fifteen full-color LCD buttons in the center of the device. The buttons, which trigger customizable actions, are touch sensitive and feature haptic feedback. Although the Live S’s buttons display color, I prefer to use black and white images for them, so the device fades into the background of my desk setup instead of drawing my attention. The icons I use are a combination of SF Symbols and the MacStories Shortcuts Icons by Silvia Gatta that I adapted for the Loupedeck Live S using Button Creator. You can also set up multiple pages of LCD buttons that are accessed with a swipe across the screen.

\n
\"Sitting

Sitting outside with the Loupedeck Live S.

\n

The build quality of the Loupedeck Live S is excellent. The physical buttons and dials are clicky when pressed, and the dials have a satisfying incremental click as you twist them. I’m a fan of the LCD buttons too. Icons and text are sharp and can be set at a brightness that works with your environment. It’s a design that’s head and shoulders above the Stream Deck, which is chunky and uses concave buttons that distort the images beneath them. In contrast, the Loupedeck Live S is far sleeker, which is important for something that’s going to sit on your desk every day.

\n
\"Loupedeck's

Loupedeck’s setup utility.

\n

Setting up the Live S’s buttons and dials is accomplished from the Loupedeck app, which has evolved since I reviewed the Loupedeck Live. The app isn’t going to win any awards for following conventional Mac metaphors; you don’t have to look farther than the keyboard shortcut to invoke the app’s Preferences, which is ⌘P, to see that. However, it’s also not an app you’ll be using day in and day out. You’ll use it a lot initially as you set up actions, but once you have a configuration you like, you’ll only use it now and then to add new actions or modify your existing setup, so I can’t get too worked up about it.

\n

That said, the app itself has improved quite a bit since I last reviewed it. The bulk of the app’s window is dominated by an image of your Loupedeck Live S. To the right is where you can add OS, device, custom, and plugin-based actions, and, at the very top of the window, is where you choose between devices and switch between profiles and workspaces.

\n
\"Working

Working with Photoshop’s profile.

\n

My main profile is my macOS Default profile, which is a set of actions that appears whenever I’m using an app for which I haven’t created a specific profile. Loupedeck comes with a bunch of predefined profiles for apps like Ableton Live, Adobe apps, and Capture One, and there’s also a marketplace you can browse for more. Alternatively, you can build your own profiles for other apps. I’ve been working on an Obsidian profile, for example.

\n
\"Switching

Switching profiles.

\n

What’s interesting about profiles is that they can be assigned to buttons or activated dynamically when you switch to an app. Dynamic switching is great if you work in a single app for long periods of time and have set up multiple pages of actions you rely on. That way, as soon as you open the app, your Loupedeck Live S switches to the controls you set up for that app. Because I use multiple apps at the same time so frequently, I haven’t used profiles a lot. Instead, I mostly stick to my macOS Default profile and manually switch to other profiles as needed.
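Dynamic switching is essentially a lookup from the frontmost app to a profile, with a default as the fallback. A hypothetical sketch of that behavior; the bundle identifiers and profile names here are illustrative only, not Loupedeck’s actual configuration format:

```python
# Illustrative mapping from an app's bundle identifier to a profile name.
PROFILES = {
    "md.obsidian": "Obsidian",
    "com.apple.logic10": "Logic Pro X",
}

def profile_for(frontmost_app, default="macOS Default"):
    """Return the profile for the active app, falling back to the default."""
    return PROFILES.get(frontmost_app, default)
```

Any app without a dedicated entry simply lands on the default profile, which matches how the device behaves when you haven’t created a profile for an app.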

\n

For example, my Obsidian profile is tied to the purple button on the right side of my Live S. Pressing it switches me to a dedicated page of Obsidian actions that I’m in the process of building out, and it stays there until I tap the green button on the left side of the Live S to return to my default profile.

\n
\"A

A Loupedeck workspace.

\n

Workspaces are similar to profiles and can be composed of unique combinations of actions that you switch between for different tasks. However, workspaces don’t include the physical round buttons of the Live S, allowing those buttons to be used to switch between workspaces. The conceptual difference between profiles and workspaces can be difficult to grasp until you use them, but at the same time, I expect most people can get a lot out of the Loupedeck Live S just by sticking to profiles.

\n

Once you get beyond the organizational structure of the Live S, it’s worth poking around to see the actions you can assign to buttons. There is a series of OS-level actions, including:

\n

There are also built-in actions for navigating and controlling the Loupedeck Live S itself.

\n
\"Loupedeck's

Loupedeck’s Custom actions offer AppleScript, keyboard shortcut, macro, and other support.

\n

However, the actions I think most MacStories readers will use the most are the Custom actions for:

\n

Most of the actions are self-explanatory, but it’s worth considering the difference between multi-toggle actions and multi-actions. Multi-toggle actions execute up to five separate actions sequentially, whereas multi-actions execute individual actions simultaneously or in sequence, allowing for some very powerful automations.
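As a rough mental model (these function names are my own, not Loupedeck’s API), a multi-toggle steps through its list of actions one button press at a time, while a multi-action fires a set of actions together:

```python
import itertools
import threading

def make_multi_toggle(actions):
    """Return a 'press' callable that runs the next action in the cycle each press."""
    cycle = itertools.cycle(actions[:5])   # modeled on the five-action limit
    return lambda: next(cycle)()

def run_multi_action(actions):
    """Kick off all actions at once and wait for them to finish."""
    threads = [threading.Thread(target=a) for a in actions]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

The toggle is a good fit for stateful pairs like mute/unmute, whereas the multi-action suits “do these five things now” automations.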

\n

I, however, use the Keyboard Shortcut action more than any other. That’s because most of what I trigger from the Loupedeck Live S are features of apps that have predefined keyboard shortcuts or shortcuts built using the Shortcuts app. Some of the shortcuts I’ve built could be constructed, in part, using Loupedeck Multi-toggle or Multi-actions, but then they wouldn’t be available when I’m not using the Live S, which is why I assign keyboard shortcuts to everything instead; that way, each action can be triggered with a press of a Live S button, its keyboard shortcut, or Raycast.

\n
\"I

I use Raycast for shortcuts that require text input like S-GPT, but others work better as buttons on the Live S.

\n

This brings me to a broader point about automation on the Mac and how my use of control panels like the Live S has changed since I reviewed the Loupedeck Live. Raycast has cut down on my use of control panels a lot because it’s always available, and I never have to take my hands off the keyboard to trigger actions. That makes a lot of automations faster to trigger with Raycast.

\n

Still, I’ve found that I like the Live S for activating menu bar apps without searching for their tiny and potentially hidden icons, controlling apps I keep hidden like Sleeve, and activating shortcuts for activities such as web browsing when my hands are less likely to be on the keyboard.

\n
\"My

My current, work-in-process default setup.

\n

Even these things could be handled by something like Raycast, but sometimes it just feels easier to tap a single button. For me, that’s meant using the Loupedeck Live S to primarily do things like:

\n

I’ve also got my Mac’s system volume and music volume connected to the Live S’s dials and profiles dedicated to Obsidian and Logic Pro X.

\n
\"\"

\n

I like what Loupedeck has done with the Live S a lot. The device retains the excellent design and versatility of the Loupedeck Live that I reviewed previously in a more affordable and smaller package. That makes it fit better with use cases like mine. I’m not streaming video or working in photo or video editors all day. Instead, I want a subset of my automations and app features available to me at the press of a button or twist of a dial. With the Loupedeck Live S, you get exactly that in a package that easily fits in a bag, making it a terrific Mac companion on your desk or on the go.

\n

You can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "In 2021, I reviewed the Loupedeck Live, a programmable control panel for the Mac and Windows PCs for Club MacStories members as part of my Macintosh Desktop Experience column. It’s an excellent device, but its price put it at a disadvantage to a similarly-sized Elgato Stream Deck despite several other advantages that I explained in the review.\nLast year, Loupedeck released the Loupedeck Live S, a smaller, more affordable Loupedeck that retains the core experience of the Loupedeck Live, but dispenses with a handful of physical buttons and dials. The new device retails for $189 compared to the Loupedeck Live, which is $269. That’s still $40 more than the 15-button Stream Deck MK.2, but a significantly narrower difference for a device that offers a wider range of functionality, making it worth another look if you were put off by the Loupedeck Live’s price.\n\nBefore digging into the details, I want to note that Loupedeck are providing us with a Loupedeck Live S free of charge as one of the prizes that we’ll be awarding to the winner of the Best Overall Shortcut as part of the Automation April Shortcuts Contest. However, this is not a sponsored post and reflects my options about the Loupedeck Live S, which I think you’ll find are in line with my previous reviews of Loupedeck products.\nLike the Loupedeck Live, the Live S has the same number of LCD buttons in the center of the device. There are three rows of five buttons for 15 in total that can be assigned a variety of actions. What’s changed is that instead of six dials, the Live S has two, and instead of eight physical buttons, the Live S has four.\n\nOne advantage of the reduced number of buttons is the Loupedeck Live S’s size. At 150x110x30mm, 5.9” (150 mm) × 4.3” (110 mm) × 1.18” (30 mm), it’s comparable in size to an iPhone 14 Pro Max and about 10 grams lighter. 
The Live S can sit flat on a table or propped up at a fixed angle using the included stand that snaps into place along the top edge of the device. There are rubber feet on the back, which help keep the device from sliding on your desk too.\nThe removable stand is one of my favorite features of the Loupedeck Live S. When it’s removed, the stand is flat, making it far easier to take with you on the go. In contrast, the Stream Deck has a removable stand, but it’s a chunky v-shaped piece of plastic that isn’t easily packed in a bag.\nGoing mobile with the Loupedeck Live S.A $15 solution to carrying the Live S with you.\nOne of my first purchases after Loupedeck sent me the Live S to test was an inexpensive portable hard drive case. On one side of the hard-shell case’s interior, there’s an elastic strap that holds the Loupedeck Live S in place, and on the other side, there’s a mesh pouch that holds the stand and a short USB-C cable for connecting the device to my MacBook Air. I’ve never considered taking a Stream Deck or the Loupedeck Live with me in my backpack, but the diminutive size and light weight of the Loupedeck Live S make it an excellent travel companion.\n\nThe Live S has fewer buttons and dials than the Live, but I don’t miss them. The dials are great for incremental adjustments like applying slider-based effects and filters to photos, controlling volume and brightness, or zooming in and out of a Logic Pro X timeline. The dials can also be pressed to execute an action, which makes them the most versatile controls on the device. However, only having two is not a major compromise if, like me, your primary use of a Loupedeck is to run shortcuts, macros, and other automations. 
If you’re doing the sort of photo or video editing where more dials would be an advantage, I’d suggest looking at the Loupedeck Live, Loupedeck+, which I reviewed in 2019, or CT, which are more focused on those sorts of workflows.\nOn the Loupedeck Live, there are eight buttons along the bottom edge of the device for switching between workspaces, which I’ll cover more in depth below, or triggering actions. The Live S’s physical buttons are along the sides of the LCD buttons, which enables the smaller form factor. The buttons include a circular LCD element, which can be customized too.\nBuilding images for the Loupedeck Live S with Button Maker.\nWhat the Live and Live S have in common are the fifteen full-color LCD buttons in the center of the device. The buttons, which trigger customizable actions, are touch sensitive and feature haptic feedback. Although the Live S’s buttons display color, I prefer to use black and white images for them, so the device fades into the background of my desk setup instead of drawing my attention. The icons I use are a combination of SF Symbols and the MacStories Shortcuts Icons by Slivia Gatta that I adapted for the Loupedeck Live S using Button Creator. You can also set up multiple pages of LCD buttons that are accessed with a swipe across the screen.\nSitting outside with the Loupedeck Live S.\nThe build quality of the Loupedeck Live S is excellent. The physical buttons and dials are clicky when pressed, and the dials have a satisfying incremental click as you twist them. I’m a fan of the LCD buttons too. Icons and text are sharp and can be set at a brightness that works with your environment. It’s a design that’s head and shoulders above the Stream Deck, which is chunky and uses concave buttons that distort the images beneath them. 
In contrast, the Loupedeck Live S is far sleeker, which is important for something that’s going to sit on your desk every day.\nLoupedeck’s setup utility.\nSetting up the Live S’s buttons and dials is accomplished from the Loupedeck app, which has evolved since I reviewed the Loupedeck Live. The app isn’t going to win any awards for following conventional Mac metaphors; you don’t have to look farther than the keyboard shortcut to invoke the app’s Preferences, which is ⌘P, to see that. However, it’s also not an app you’ll be using day in and day out. You’ll use it a lot initially as you set up actions, but once you have a configuration you like, you’ll only use it now and then to add new actions or modify your existing setup, so I can’t get too worked up about it.\nThat said, the app itself has improved quite a bit since I last reviewed it. The bulk of the app’s window is dominated by an image of your Loupedeck Live S. To the right is where you can add OS, device, custom, and plugin-based actions, and at the very top of the window, you can choose between devices and switch between profiles and workspaces.\nWorking with Photoshop’s profile.\nMy main profile is my macOS Default profile, which is a set of actions that appears whenever I’m using an app for which I haven’t created a specific profile. Loupedeck comes with a bunch of predefined profiles for apps like Ableton Live, Adobe apps, and Capture One, and there’s also a marketplace you can browse for more. Alternatively, you can build your own profiles for other apps. I’ve been working on an Obsidian profile, for example.\nSwitching profiles.\nWhat’s interesting about profiles is that they can be assigned to buttons or activated dynamically when you switch to an app. Dynamic switching is great if you work in a single app for long periods of time and have set up multiple pages of actions you rely on. That way, as soon as you open the app, your Loupedeck Live S switches to the controls you set up for that app. 
Because I use multiple apps at the same time so frequently, I haven’t used profiles a lot. Instead, I mostly stick to my macOS Default profile and manually switch to other profiles as needed.\nFor example, my Obsidian profile is tied to the purple button on the right side of my Live S. Pressing it switches me to a dedicated page of Obsidian actions that I’m in the process of building out, and it stays there until I tap the green button on the left side of the Live S to return to my default profile.\nA Loupedeck workspace.\nWorkspaces are similar to profiles and can be composed of unique combinations of actions that you switch between for different tasks. However, workspaces don’t include the physical round buttons of the Live S, allowing those buttons to be used to switch between workspaces. The conceptual difference between profiles and workspaces can be difficult to grasp until you use them, but at the same time, I expect most people can get a lot out of the Loupedeck Live S just by sticking to profiles.\nOnce you get beyond the organizational structure of the Live S, it’s worth poking around to see the actions you can assign to buttons. 
There is a series of OS-level actions, including:\nClipboard actions\nTime and date actions\nActions for controlling all the keys on your Mac\nMedia and mouse controls\nSystem app controls for Finder, Activity Monitor, and System Settings\nThere are also built-in actions for navigating and controlling the Loupedeck Live S itself.\nLoupedeck’s Custom actions offer AppleScript, keyboard shortcut, macro, and other support.\nHowever, the actions I think most MacStories readers will use the most are the Custom actions for:\nAppleScript\nKeyboard shortcuts\nOpening apps\nOpening webpages\nPlaying sounds\nInserting text\nExecuting multiple actions\nTriggering Multi-toggle actions\nAdjusting the device’s dials \nLaunching executable files or URLs\nMost of the actions are self-explanatory, but it’s worth considering the difference between multi-toggle actions and multi-actions. Multi-toggle actions execute up to five separate actions sequentially, whereas multi-actions execute individual actions simultaneously or in sequence, allowing for some very powerful automations.\nI, however, use the Keyboard Shortcut action more than any other. That’s because most of what I trigger from the Loupedeck Live S are features of apps that have predefined keyboard shortcuts or shortcuts built using the Shortcuts app. Some of the shortcuts I’ve built could be constructed, in part, using Loupedeck Multi-toggle or Multi-actions, but then they wouldn’t be available when I’m not using the Live S, which is why I assign keyboard shortcuts to everything; that way, each one can be triggered with a press of a Live S button, with its keyboard shortcut, or through Raycast.\nI use Raycast for shortcuts that require text input like S-GPT, but others work better as buttons on the Live S.\nThis brings me to a broader point about automation on the Mac and how my use of control panels like the Live S has changed since I reviewed the Loupedeck Live. 
Raycast has cut down on my use of control panels a lot because it’s always available, and I never have to take my hands off the keyboard to trigger actions. That makes a lot of automations faster to trigger with Raycast.\nStill, I’ve found that I like the Live S for activating menu bar apps without searching for their tiny, and potentially hidden, icons, controlling apps I keep hidden like Sleeve, and activating shortcuts for activities such as web browsing when my hands are less likely to be on the keyboard.\nMy current, work-in-progress default setup.\nEven these things could be handled by something like Raycast, but sometimes it just feels easier to tap a single button. For me, that’s meant using the Loupedeck Live S primarily to do things like:\nQuickly checking my calendar or the time in Rome using Dato’s menu bar app\nActivating Dato’s quick event entry window\nRevealing Sleeve’s playback controls\nLoving a song I’m listening to in Music\nSending links to Raindrop.io or Matter\nGenerating Markdown-formatted and plain links from webpages I’m reading\nToggling between Obsidian’s source and preview modes\nPutting my Mac to sleep when I’m finished working\nI’ve also got my Mac’s system volume and music volume connected to the Live S’s dials and profiles dedicated to Obsidian and Logic Pro X.\n\nI like what Loupedeck has done with the Live S a lot. The device retains the excellent design and versatility of the Loupedeck Live that I reviewed previously in a more affordable and smaller package. That makes it fit better with use cases like mine. I’m not streaming video or working in photo or video editors all day. Instead, I want a subset of my automations and app features available to me at the press of a button or twist of a dial. 
With the Loupedeck Live S, you get exactly that in a package that easily fits in a bag, making it a terrific Mac companion on your desk or on the go.\nYou can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.\nAccess Extra Content and Perks\nFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now

I just released a small bug fix update for S-GPT, my shortcut to integrate OpenAI’s ChatGPT large language model with the Shortcuts app on all Apple platforms.

\n

Version 1.0.1 of S-GPT is a relatively minor update that comes with an initial round of improvements based on early feedback I’ve received for the shortcut, including:

\n

Additionally, I also realized that the usage tips that S-GPT was displaying every time it asked you to enter some text may have been nice the first three times you used the shortcut, but became annoying very quickly. That was especially true when using S-GPT with Siri in a voice context since they would be read aloud every time. For these reasons, I removed tips and simplified the shortcut’s questions to “What do you want to ask?” and “Want to follow up?”.

\n

In case you missed my introduction of S-GPT earlier this week, you can read the original story here and find out more about how the shortcut works and what it does. I updated the links to the S-GPT and S-GPT Encoder shortcuts in the story to the latest version; you can also find the updated shortcuts in MacStories Shortcuts Archive.

\n
\n
\n
\n \"\"
\n

S-GPT

S-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
\n \"\"
\n

S-GPT Encoder

This is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.

\n

Get the shortcut here.

\n\n
\n
\n
\n

\u2192 Source: macstories.net

", "content_text": "I just released a small bug fix update for S-GPT, my shortcut to integrate OpenAI’s ChatGPT large language model with the Shortcuts app on all Apple platforms.\nVersion 1.0.1 of S-GPT is a relatively minor update that comes with an initial round of improvements based on early feedback I’ve received for the shortcut, including:\nA proper error-checking alert that tells what went wrong with a request to the ChatGPT API;\nA better summarization of Safari webpages passed from the share sheet that no longer “hallucinates” results but actually summarizes text extracted via Safari’s Reader technology from any web article;\nA new behavior for text input on watchOS, which now defaults to dictation rather than keyboard input. I’ve covered this more in detail in today’s issue of MacStories Weekly for Club members.\nAdditionally, I also realized that the usage tips that S-GPT was displaying every time it asked you to enter some text may have been nice the first three times you used the shortcut, but became annoying very quickly. That was especially true when using S-GPT with Siri in a voice context since they would be read aloud every time. For these reasons, I removed tips and simplified the shortcut’s questions to “What do you want to ask?” and “Want to follow up?”.\nIn case you missed my introduction of S-GPT earlier this week, you can read the original story here and find out more about how the shortcut works and what it does. 
I updated the links to the S-GPT and S-GPT Encoder shortcuts in the story to the latest version; you can also find the updated shortcuts in MacStories Shortcuts Archive.\n\n \n \n S-GPT\nS-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. 
The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.\nGet the shortcut here.\n\n \n \n\n\n \n \n S-GPT EncoderThis is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.\nGet the shortcut here.\n\n \n \n\n\u2192 Source: macstories.net", "date_published": "2023-04-07T12:06:22-04:00", "date_modified": "2024-12-02T14:53:16-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "Automation April", "ChatGPT", "S-GPT", "shortcuts", "Linked" ] }, { "id": "https://www.macstories.net/?p=71774", "url": "https://www.macstories.net/ios/introducing-s-gpt-a-shortcut-to-connect-openais-chatgpt-with-native-features-of-apples-operating-systems/", "title": "Introducing S-GPT, A Shortcut to Connect OpenAI\u2019s ChatGPT with Native Features of Apple\u2019s Operating Systems", "content_html": "
\"S-GPT

S-GPT for Shortcuts.

\n

Update, April 13: I’ve updated S-GPT to version 1.0.2. You can read the full changelog here. All download links have been updated.

\n

Update, April 13: For Club MacStories+ and Premier members, I’ve published Part 1 of an extensive ‘Making Of’ series about S-GPT. This is a technical deep dive for my Automation Academy series. You can find it here and sign up for or upgrade to a Premier account using the buttons below.

\n
\nJoin Annual$120/yearJoin Monthly$12/month\n
\n

Update, April 7: For Club MacStories members, I’ve shared some optional prompts to add different personalities to S-GPT, including two inspired by Roy Kent and Steve Jobs. You can get the prompts and read more here; the main S-GPT shortcut is and will remain free-to-use for everyone, of course.

\n
\nJoin Annual$50/yearJoin Monthly$5/month\n
\n

Update, April 7: I’ve updated S-GPT to version 1.0.1. You can read more details here. All download links to the shortcuts have been updated to the latest version.

\n

It’s the inaugural week of the second annual edition of Automation April, and to celebrate the occasion, I’ve been working on something special: today, I’m introducing S-GPT, an advanced conversational shortcut for ChatGPT that bridges OpenAI’s assistant to native system features of iOS, iPadOS, macOS, and watchOS.

\n

S-GPT (which stands for Shortcuts-GPT) is free to use for everyone, but it requires an OpenAI account with an associated pay-as-you-go billing plan since it takes advantage of OpenAI’s developer API, which has a cost. S-GPT was built with the latest ChatGPT API, and it can be used either with the existing ChatGPT 3.5 model or – if you have access to it – the ChatGPT 4 API.

\n

While the shortcut is free for MacStories readers, I will be publishing a detailed, in-depth Automation Academy class soon for Club MacStories Plus or Premier members to explain the techniques and strategies I used to build this shortcut. I genuinely think that S-GPT is, from a technical perspective, my best and most advanced work to date; I hope my Academy class will help others learn some useful tips for Shortcuts and, in return, make even better automations for our contest.

\n

With that said, let’s look at what S-GPT is and what you can do with it.

\n

\n

Getting Started with S-GPT

\n

As I noted above, the first thing you should do if you want to use S-GPT is create an OpenAI account and make sure you have billing set up with pay-as-you-go; you’re going to pay very little for what you’re actually using with the ChatGPT API. The shortcut uses the native ChatGPT API, and that costs money for every call to the API; since my shortcut is free to use, you’ll have to provide your own API key.

\n
\"Setting

Setting up S-GPT with your own API key.

\n

Thankfully, since S-GPT was built with the new ChatGPT API, the cost of those API calls is going to be extremely small: the new model used by the ChatGPT API is very cost-efficient, as you can read here. To give you some context, I’ve been testing S-GPT extensively for the past month, and my usage is up to $1.50 so far. (The actual cost for the GPT 3.5 model: $0.002 / 1K tokens.)

\n
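For a back-of-the-envelope sense of what pay-as-you-go means in practice, the math is simple. A Python sketch, using the GPT-3.5 rate quoted above (the 750,000-token figure is purely illustrative, chosen to land near my month-of-testing total):

```python
# Rough cost math for the ChatGPT API at the GPT-3.5 rate mentioned above.
PRICE_PER_1K_TOKENS = 0.002  # USD per 1,000 tokens for gpt-3.5-turbo

def api_cost(total_tokens: int) -> float:
    """Approximate USD cost for a given number of tokens (prompt + response)."""
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# An illustrative month of heavy testing (~750,000 tokens) comes out to $1.50:
print(f"${api_cost(750_000):.2f}")  # prints "$1.50"
```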

My recommendation is to not upgrade to ChatGPT Plus but instead set up a pay-as-you-go billing method with a spending limit. You can do so from the Billing page. Once you’ve done that, head over to the API Keys page, create a new secret key for your account, and copy it. You’ll be asked by S-GPT at setup to paste your API key, and that’s it.

\n
\"The

The Billing page on the OpenAI website.

\n

If you get an error from S-GPT without any response, it’s likely that you haven’t set up a billing method or are trying to use an old API key. I strongly recommend generating a new API key from scratch if you just set up a billing method on the OpenAI website.

\n
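For the curious, here's what that failure mode looks like at the API level: when the key or billing is wrong, the API returns an error object instead of any choices. A minimal sketch of checking for it, assuming the standard `{"error": {"message": …}}` shape of OpenAI's error responses (this is illustrative, not S-GPT's actual code):

```python
def extract_reply(body: dict) -> str:
    """Return the assistant's reply, or raise with the API's own error message."""
    if "error" in body:
        # e.g. an invalid key or missing billing method: surface the reason
        # instead of failing silently with an empty response.
        raise RuntimeError(body["error"].get("message", "Unknown API error"))
    return body["choices"][0]["message"]["content"]

# A successful response body carries the reply inside a list of choices:
ok = {"choices": [{"message": {"role": "assistant", "content": "Hello!"}}]}
print(extract_reply(ok))  # prints "Hello!"
```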

After pasting in your secret key, you’ll no longer have to see any other ChatGPT code or special syntax. I designed the shortcut to be intuitive, visual, and as native as possible on Apple platforms.

\n

There is one optional setting you can change in S-GPT: if you have access to the ChatGPT 4 API (which is invitation-only at the moment), you can replace the default model used by S-GPT with the updated one.

\n
\"If

If you have access to the ChatGPT 4 API, this is where you can replace which model to use in the shortcut.

\n

Toward the beginning of the shortcut, find the ‘Text’ action that contains gpt-3.5-turbo and replace it with gpt-4. Again: you should only do this if your account has access to the ChatGPT 4 API. If not, you should continue using the default ChatGPT 3.5 model, which is fast, inexpensive, and accurate. In my tests, I haven’t noticed meaningful performance improvements in the GPT 4 model compared to 3.5.

\n
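Outside of Shortcuts, that model swap is a one-string change in the request sent to OpenAI's chat completions endpoint. A minimal Python sketch of such a request (the endpoint URL and JSON shape follow OpenAI's public API documentation; this is not S-GPT's implementation):

```python
import json
import os
import urllib.request

MODEL = "gpt-3.5-turbo"  # change to "gpt-4" only if your account has API access

def build_payload(prompt: str, model: str = MODEL) -> dict:
    """The JSON body for a single-question chat completion request."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def ask(prompt: str) -> str:
    """POST the payload to the chat completions endpoint and return the reply."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            # Your own pay-as-you-go secret key, as described above:
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```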

S-GPT and Conversational Mode

\n

At a high level, S-GPT is a shortcut that lets you ask ChatGPT questions from an input box on your iPhone, iPad, or Mac; answers are returned by ChatGPT and displayed in an alert on your devices. You can ask whatever you want, wait a couple of seconds, and get a response back from the assistant. S-GPT only supports text, and there are no limits in terms of question length.

\n
\"Conversations

Conversations with S-GPT.

\n

There are several aspects of S-GPT, however, that set it apart from similar shortcuts you may have seen in recent months. Let me start from the underlying foundation of this shortcut.

\n

S-GPT uses the new chat API released by OpenAI, which is more cost-efficient than the previous text completion API and can produce high-quality results. More importantly, S-GPT supports conversational mode: as you talk to ChatGPT and ask follow-up questions in the same “session”, S-GPT retains the context of your previous questions and the assistant’s series of answers. In fact, you can stop the shortcut at any point and export a full log of an entire conversation as a single transcript.

\n
\"A

A full conversation saved to a text file.

\n

The ability to hold a back-and-forth conversation, as we’ll see later, brings some terrific advantages over using, say, Siri for certain tasks. Unlike other shortcuts for ChatGPT, your conversations are only ever sent to the OpenAI API: by default, the shortcut does not keep a log or cache of your chats unless you manually ask it to export a transcript.

\n

S-GPT was designed to provide users with concise and clear answers that can be read in just a few seconds. I did this because S-GPT can be used both as a shortcut launched from the Shortcuts app, an icon on the Home Screen, or a widget, and as a shortcut running inside Siri. When S-GPT runs inside Siri, it’s also running with more memory constraints, and I’m guessing you wouldn’t want Siri to speak an answer that takes two minutes to be read in full. So, thanks to the ChatGPT API, I was able to assign a specific “personality” and role to S-GPT. This is the system prompt that controls S-GPT’s default behavior:

\n

\n You are S-GPT, a fork of ChatGPT created by Federico Viticci. You have all the capabilities of ChatGPT but you run inside Apple’s Shortcuts app and Siri.

\n

Your responses should be informative and clear, but not excessively long. You have to help users quickly.

\n

Users need to be able to listen to your answers in 15 seconds. Never go longer unless I ask you to be more detailed.\n

\n
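Mechanically, this kind of conversational mode comes down to resending the entire message history on every API call, with the system prompt pinned at the top. A small Python sketch of that bookkeeping (illustrative names and an abbreviated prompt; not S-GPT's internals):

```python
SYSTEM_PROMPT = (  # abbreviated version of the prompt quoted above
    "You are S-GPT, a fork of ChatGPT created by Federico Viticci. "
    "Your responses should be informative and clear, but not excessively long."
)

class Conversation:
    """Accumulates chat messages so each request carries the full context."""

    def __init__(self, system_prompt: str = SYSTEM_PROMPT):
        self.messages = [{"role": "system", "content": system_prompt}]

    def add_user(self, text: str) -> list:
        self.messages.append({"role": "user", "content": text})
        return self.messages  # this whole list is POSTed to the API each time

    def add_assistant(self, text: str) -> None:
        self.messages.append({"role": "assistant", "content": text})

    def transcript(self) -> str:
        """A plain-text log of the chat, like S-GPT's exported transcripts."""
        return "\n\n".join(
            f"{m['role'].title()}: {m['content']}" for m in self.messages[1:]
        )
```

Because nothing lives outside this in-memory list, discarding it ends the session, which matches the no-log, no-cache behavior described above.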

At any point, you can ask S-GPT to be more detailed or to tell you more about a specific topic; whenever possible, S-GPT will prioritize brevity and short, informative responses.

\n
\"You

You can always ask S-GPT for more detail about any topic.

\n

As I will explain in the Automation Academy class for Club Plus and Premier members, S-GPT requires an additional helper shortcut to be installed. The shortcut, which you can find at the end of this article, is called S-GPT Encoder and it’s a helper utility that runs as a sub-module. The shortcut helps S-GPT properly encode and decode information; it was necessary to offer the ability to ask a single question or have an entire conversation with S-GPT.

\n

The defining feature of S-GPT, however, is its native integration with Apple’s platforms.

\n

With S-GPT, I wanted to start building a bridge between ChatGPT and Apple’s OSes. I know that this is a lofty goal, and there’s only so much I can do with Shortcuts, but I look at how Microsoft is integrating ChatGPT with Windows, and I’m jealous that the same isn’t true on Apple’s platforms (and likely will never be).

\n

So, more than a simple bot to have a conversation with ChatGPT in Shortcuts, I set out to create a tool that would connect ChatGPT responses to native iOS, iPadOS, macOS, and watchOS functionalities. I wanted to create a ChatGPT-based utility that would help you process your data and make things happen on your computer rather than simply answer trivia questions or write poems.

\n

The most powerful aspect of S-GPT is how Shortcuts becomes the glue between ChatGPT and your devices with a local, on-device, privacy-conscious approach. There are several native integrations in S-GPT already, and I have a long list of future ones to add in subsequent updates.

\n

Let’s take a look.

\n

The Native Integrations of S-GPT

\n

In this first version of S-GPT, the shortcut supports the following integrations on iOS, iPadOS, and macOS:

\n
\"S-GPT

S-GPT integrations.

\n

As you can see from the list above, I tried to come up with a series of features for version 1.0 of this shortcut that would appeal to a wide range of users on different platforms, and I have more planned for future updates. The integrations are triggered by a set of prebuilt words or sentences (which I listed above) and, right now, S-GPT only supports the English language.

\n

Now, allow me to dig a little deeper into a few examples of the integrations supported by S-GPT and what you can do with this shortcut.

\n

By far my favorite feature of S-GPT is the ability to ask ChatGPT for a list of songs and turn that into an actual playlist in the Music app via Shortcuts’ playlist actions. What’s amazing about this is that the command can be issued immediately with details of the kind of playlist you’re looking for or later in a conversation, retaining the context of what was discussed before.

\n

For instance, if you ask S-GPT this:

\n

\n Make me a playlist with 20 popular songs by Oasis and Blur\n

\n

ChatGPT will use its intelligence to understand what you mean, it’ll pass back a list of songs to S-GPT, and you’ll be asked to enter a name for the playlist. Wait a few moments, open the Music app, and boom:

\n
\"Making

Making a playlist with S-GPT.

\n

To me, this is incredible: ChatGPT can turn a moderately complex natural language query into a list of songs; the engine I created in S-GPT translates that into a playlist inside Apple’s Music app. But we can do better than this.

\n
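For a flow like this to work, the model's reply has to arrive in a shape an automation can iterate over. One common approach — an assumption on my part, not necessarily how S-GPT does it — is to request one "Title - Artist" pair per line and split the reply before handing each song to a playlist action:

```python
import re

def parse_song_list(reply: str) -> list:
    """Turn a one-song-per-line 'Title - Artist' reply into (title, artist) pairs."""
    songs = []
    for line in reply.splitlines():
        line = re.sub(r"^\s*\d+[.)]\s*", "", line).strip()  # drop "1. " numbering
        if " - " in line:
            title, artist = line.split(" - ", 1)
            songs.append((title.strip(), artist.strip()))
    return songs

reply = """1. Wonderwall - Oasis
2. Song 2 - Blur
3. Live Forever - Oasis"""
# Each pair can then drive a playlist-building step, one song at a time.
print(parse_song_list(reply))
```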

One of the perks of ChatGPT is that it can go multiple levels deep into the meaning of a query. So, imagine this prompt:

\n

\n I want a playlist with the top 15 songs by the members of boygenius\n

\n

Ask this, and S-GPT will create a playlist with the top songs by Phoebe Bridgers, Julien Baker, and Lucy Dacus. ChatGPT knows how to search for “the members of boygenius” and it returns songs from the individual artists who are members of this supergroup.

\n
\"ChatGPT

ChatGPT knew what I meant, and S-GPT created the playlist for my query.

\n

Siri, by comparison, has no idea what to do with this query.

\n
\"Oh,

Oh, Siri.

\n

This is still the surface. ChatGPT can find and recommend songs by vibe, release date, mood, and more. Imagine this:

\n

\n I’m feeling nostalgic. Make me a playlist with 25 mellow indie rock songs released between 2000 and 2010 and sort them by release year, from oldest to most recent.\n

\n
\"The

The natural language prompt in S-GPT.

\n

That prompt will take a while to execute (more on why in the Automation Academy), but it’ll work, and it’ll generate this playlist:

\n
\"You

You can’t go wrong with Death Cab and Modest Mouse.

\n

The list of mind-blowing music examples could go on forever, and even though ChatGPT isn’t perfect at music recommendations, it’s pretty good, and I just love the ability to quickly and dynamically make a new playlist based on a set of arbitrary commands. This integration reminds me of the never-forgotten Sentence feature from Beats Music.

\n

The last thing I’d point out is that S-GPT holds the context of the current conversation, which means asking it to generate a playlist later in the chat works too. So, if S-GPT returns a list of songs and you then decide to turn that into a playlist, that flow will also work. Check out the screenshots below for an example of this:

\n
\"From

From a conversation to a playlist in the Music app, all thanks to ChatGPT and Shortcuts.

\n

The other integration I’d like to call out is the clipboard one. By simply asking S-GPT to do “something” with the clipboard, the shortcut will be able to access the text contents of your system clipboard and pass that to ChatGPT for processing. For example, this command…

\n

\n Check the paragraphs of text in my clipboard for grammar mistakes. Provide a list of mistakes, annotate them, and offer suggestions for fixes.\n

\n

…will allow S-GPT to access multiple paragraphs of text you’ve previously copied to the clipboard, and it’ll ask ChatGPT to process them for grammar mistakes based on its language model.

\n
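Inside Shortcuts this relies on the native clipboard actions, but the idea translates directly to a script. A sketch for macOS, where `pbpaste` ships with the system (the prompt template here is my own illustration, not S-GPT's):

```python
import subprocess

def clipboard_text() -> str:
    """Read the text clipboard on macOS via the built-in pbpaste tool."""
    return subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

def build_clipboard_prompt(instruction: str, clipboard: str) -> str:
    """Splice the copied text into the user's request before it goes to the API."""
    return f"{instruction}\n\nText:\n{clipboard}"

# e.g. build_clipboard_prompt("Check this text for grammar mistakes",
#                             clipboard_text())
```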
\"An

An example of S-GPT checking text in my clipboard for errors.

\n

The flexibility of this integration is only limited by your imagination. Want to quickly summarize an article from Safari? Open an article’s Safari Reader view, copy all text in it, then ask S-GPT…

\n

\n Summarize the text in my clipboard\n

\n

…and you’ll have a summary, ready to be copied or saved somewhere.

\n
\"As

As long as the article isn’t too long (don’t try and give 5,000 words to ChatGPT via the API), this will work.

\n

How about asking for a list of adjectives and adverbs contained in your clipboard? Sure thing:

\n
\"Adverbs

Adverbs and adjectives as scanned from my iPad’s clipboard.

\n

And what about going back to the original text and asking to also translate it to Italian? That also works.

\n
\"Context

Context retention and translation in S-GPT.

\n

Speaking of copying and saving chat transcripts: S-GPT comes with a series of actions that you can perform to save or export the conversation you had with it. To invoke the list of actions, simply say…

\n

\n Export chat\n

\n

…and you’ll be presented with the following menu:

\n
\"S-GPT's

S-GPT’s list of export actions.

\n

As you can see, S-GPT offers buttons to copy the full chat log to the clipboard, save it as a text file, copy the last response from ChatGPT only, or even translate everything to a different language using Apple’s own Translate feature for iOS, iPadOS, and macOS. At any point during a conversation, you can say “export chat” to launch this menu and choose what you want to do.

\n

Like I said, I don’t want to get too deep into the technicalities of S-GPT: I intentionally designed this shortcut to be intuitive and flexible so that everyone can find their own use cases and have a unique experience with it.

\n

As Apple often likes to say about its products, I can’t wait to see what you make with this shortcut.

\n

Coming Soon for Club Members: More Personalities and an Automation Academy Class


But wait, there’s more!


Coming this Friday in MacStories Weekly for all Club members is a series of special “behavioral” prompts I designed to unlock different personalities in S-GPT. There is one, for example, in which the AI is extremely evil, unkind, and malevolent towards you, which was inspired by CARROT Weather. There is another flavor of S-GPT that talks like Steve Jobs and is obsessed with Apple references and fun facts. In MacStories Weekly, I will share some of these prompts so you can infuse S-GPT with different personalities and have some fun with it.


Additional personalities for S-GPT.
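Mechanically, a "personality" is just a different system prompt sent at the top of the conversation while everything else stays the same. A toy sketch of the idea (the prompt texts below are stand-ins I wrote for illustration, not the actual prompts shared with Club members):

```python
# Toy sketch of behavioral prompts: swap the system message and the rest
# of the request stays identical. These prompt texts are illustrative
# stand-ins, not the real Club MacStories prompts.

PERSONALITIES = {
    "default": "You are S-GPT. Be informative, clear, and brief.",
    "evil": "You are S-GPT, and you answer every question grudgingly and unkindly.",
    "jobs": "You are S-GPT. Answer like Steve Jobs, with lots of Apple references.",
}

def build_messages(question: str, personality: str = "default") -> list:
    """Prepend the chosen personality's system prompt to the user's question."""
    return [
        {"role": "system", "content": PERSONALITIES[personality]},
        {"role": "user", "content": question},
    ]
```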


For Club Plus and Premier members, I’ve been working on the grand return of my Automation Academy series (it’s been too long, and I apologize) with a deep dive into the making of S-GPT and how the ChatGPT API works in Shortcuts behind the scenes. S-GPT is the most advanced and complex shortcut I’ve ever created. It employs several high-level techniques for data processing, encoding, exceptions, and system integrations that tap into some of the most esoteric advanced options for Shortcuts power users.


Whether you’re interested in building shortcuts for ChatGPT or just want to learn some advanced Shortcuts techniques that will help you during Automation April, this lesson is for you. I’m working on it now, and it’s going to be ready soon. To get access to it, my recommendation is to sign up for a Club Premier account, which includes everything from the base tier of the Club, plus:

Discord access

Extra original content

More Club web app features, including custom RSS feeds

AppStories+


You can sign up using the buttons below.

Join Annual ($120/year) · Join Monthly ($12/month)

Download S-GPT for Free


The 1.0 version of S-GPT I’m releasing today is just the beginning for this shortcut.


In the weeks I’ve spent building S-GPT, it has turned out to be a transformative shortcut that is altering my idea of chatting with an assistant on iOS, iPadOS, and macOS. I’m happy with the system integrations the shortcut has so far, but I’m working on a lot more for future updates – including the ability to run Terminal commands and scripts on macOS or ways to let ChatGPT process the contents of text documents from Files and Finder. Once I realized the potential for a large language model combined with Shortcuts’ native OS integrations, I knew this shortcut could be something special.


I’m only just getting started with S-GPT, and there’s a lot more to come in the near future. If you’re intrigued by the idea of blending Apple’s OSes and ChatGPT using Shortcuts, you can follow me for updates on Mastodon, and get started today with the first version of S-GPT.


S-GPT

S-GPT is a shortcut to have conversations with OpenAI’s ChatGPT assistant on your iPhone, iPad, and Mac. The shortcut supports both text conversations as well as voice interactions when used inside Siri. S-GPT comes with native system integrations on Apple platforms including the ability to process text from your clipboard, summarize text found in photos, export conversations to Files and Finder, and even create playlists in the Music app. The shortcut requires an OpenAI API token and a helper shortcut called S-GPT Encoder that needs to be downloaded separately.


Get the shortcut here.


S-GPT Encoder

This is a helper shortcut for S-GPT that needs to be downloaded and installed separately. Without this shortcut, S-GPT won’t work.


Get the shortcut here.


You can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.


Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.


What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.


Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;


Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;


Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.


Learn more here and from our Club FAQs.


Join Now", "date_published": "2023-04-05T11:10:06-04:00", "date_modified": "2024-12-02T14:52:19-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "Automation April", "ChatGPT", "OpenAI", "S-GPT", "shortcuts", "iOS" ] }, { "id": "https://www.macstories.net/?p=71757", "url": "https://www.macstories.net/stories/automation-april-thinking-about-linking/", "title": "Automation April: Thinking About Linking", "content_html": "

Links are the currency of information overload and distraction. There’s more media available than we could ever get to in a lifetime, and more things we might want to buy, places we may want to visit, and other things to explore online than can fit into a day.


The same problem exists in our work lives. That’s especially true for the kind of work I do. Links are part of everything. Whether I’m researching, writing, or preparing to record a podcast, I’m collecting, managing, and sharing links. I could follow all those trails as they cross my path, but I’d never get anything done.


Instead of flitting from one online discovery to the next with no plan, wasting precious time, I save links for later, putting them aside until I have time for them. I’ve been doing this forever, but I’ve also never been happy with my system. So, it was inevitable that I’d begin tinkering with my setup again, both with the apps I use and the shortcuts that support them.



Whether it’s work or pleasure reading, most of my reading happens in Matter these days.


After thinking about past experiments, I realized that those setups were over-complicated. I had tried to create a system where every link was magically saved and organized the moment I came across it. That was a mistake because it didn’t reflect how I actually work.


With those old setups, there was also tension between the apps I wanted to use and my desire to automate my link collecting. The more I complicated my automation setup, the more I limited the apps I could use.


This time, I decided to put the apps first and automation second. Shortcuts and AppleScript are still important parts of the system I’ve built, but they’re the glue that holds it together, not the system itself. Instead, my primary focus has been on creating something that works the way I do and with the apps I want to use.


Where I Find Links


Links come from a lot of sources, including iMessage.


For starters, it’s worth sharing where I find most of the links I save because that has the biggest impact on the process of saving and managing them. RSS is by far the most important source of the links I save. I have roughly 150 feeds divided into topics like Apple, General Tech, Music, and Videogames, which I spoke about in more detail on a recent episode of AppStories. Other links come from social media, iMessage conversations, email, and old-fashioned web browsing.


The trouble with any system is trying to account for all those contexts in a single shortcut or other automation that has to work across multiple platforms. Things get messy quickly, especially if you add organizing and managing the links into the mix too. So, instead, I decided to break the process into three parts: collection, management, and use.


Collecting Links


Catching up on the latest news in Reeder using Feedly to sync my RSS feeds.


When I started this journey, I knew I wanted to use Reeder by Silvio Rizzi (iOS/Mac) for RSS. It’s been one of my favorite apps for a long time, but I was skeptical that I could get it to work the way I wanted on the Mac. You see, sharing from Reeder on the Mac is bound to the share menu, which isn’t great. iOS and iPadOS are a different story. With the share sheet, it’s simple to send links from Reeder to any app that accepts URLs.


Raindrop for Mac.


On the Mac, though, I immediately ran into trouble. I’d decided that I wanted to use a combination of Raindrop.io (iOS/Mac) and Matter to collect my links. I chose Raindrop.io because it’s web-based and has an API. That means its sync is solid, and it can be integrated with apps like Obsidian whether or not the developer directly supports it. I chose Matter because I like its balance between design and functionality. It’s not as full-featured as Readwise Reader, which I know a lot of people love, but I prefer Matter’s more focused approach and reading environment.
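That API is what makes Raindrop.io scriptable well beyond its official apps. As a sketch of what saving a link programmatically might look like using only Python's standard library: the endpoint and field names below reflect Raindrop.io's public REST API as I understand it, so treat them as assumptions and verify them against the API docs (and supply your own API token) before relying on this.

```python
# Sketch of saving a link via Raindrop.io's REST API. Endpoint and field
# names are my understanding of the public API -- verify against the docs.
import json
import urllib.request

API_URL = "https://api.raindrop.io/rest/v1/raindrop"

def build_save_request(url: str, token: str, title: str = "") -> urllib.request.Request:
    """Build an authenticated POST request that saves one bookmark."""
    body = {"link": url}
    if title:
        body["title"] = title
    return urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Actually sending it requires a real token:
# with urllib.request.urlopen(build_save_request("https://example.com", TOKEN)) as r:
#     print(r.status)
```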


I primarily use Matter in a pinned Safari tab on my Mac.


I toyed with the idea of using only Raindrop.io or only Matter, but I’m better off using both. Matter is reserved for articles that I want to save for later. In my current system, the only things that go directly to Matter are pleasure reading that is unrelated to my work. MacStories-related materials may end up in Matter, too, but not in this link collection phase. Everything else goes to Raindrop.io because it’s designed to handle more than just articles.


For my purposes, Matter’s one Shortcuts action was enough, but I’d like to see more added in the future.


The biggest hurdles my app choices posed are that Raindrop.io doesn’t support Shortcuts or the share menu, and Matter offers a single ‘Add to Matter’ Shortcuts action. Also, even though Matter works on M-series Macs, its share menu extension only seems to work with some apps, but not Reeder. It’s at this point that I started to think I might have to pick a different set of apps for collecting links, which Federico and I touched on in the latest episode of the Club MacStories podcast, MacStories Unplugged. However, despite the limitations, I’ve come up with a workflow that includes those apps plus a set of four shortcuts that tie everything together nicely.

\n

Because most of my link collection starts with Reeder on the Mac, that’s where I concentrated my automation efforts. It’s also where saving links can be surprisingly clunky compared to the iPhone and iPad’s superior share sheet. As a result, my Mac system was historically no system at all. If I found something in Reeder I wanted to save, I’d open it in Safari, where I could use the Matter and Raindrop.io browser extensions to save articles. However, that resulted in a cluttered mess of tabs and articles I forgot about and never saved. Plus, it was slow compared to saving links on iOS and iPadOS. The solution to integrating Raindrop.io and Matter with Reeder was Shortcuts.

\n
\"Reeder

Reeder -> Matter.

\n

The shortcuts called Reeder -> Matter and Reeder -> Raindrop begin by grabbing the URL of a post I want to save. Reeder has a built-in ‘Copy Link’ action that can be assigned a keyboard shortcut, which makes it possible to copy the link with Shortcuts even though Reeder doesn’t offer that as a Shortcuts action itself. To get the link, my Reeder -> Matter shortcut first runs a simple AppleScript that simulates pressing the keyboard shortcut for copying a Reeder link.
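The article doesn’t include the script itself, but the keystroke-simulation idea can be sketched roughly like this in Python via macOS’s `osascript` tool. The key binding (Option-Shift-C here) and the use of System Events are assumptions for illustration, not the shortcut’s actual contents:

```python
import subprocess

# Hypothetical sketch: drive AppleScript from Python via osascript.
# The Option-Shift-C binding for Reeder's 'Copy Link' action is an assumption.
COPY_LINK_SCRIPT = """
tell application "Reeder" to activate
delay 0.2
tell application "System Events" to keystroke "c" using {option down, shift down}
"""

def copy_reeder_link() -> None:
    """Simulate Reeder's 'Copy Link' keyboard shortcut (macOS only)."""
    subprocess.run(["osascript", "-e", COPY_LINK_SCRIPT], check=True)
```

On a Mac with Reeder frontmost, calling `copy_reeder_link()` would leave the current item’s URL on the clipboard, ready for the next action in the shortcut to read.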

\n

A nice side benefit of the fact that Matter works with M-series Macs is that its Shortcuts action is available, making it easy to pass the link from the clipboard to a “Save to Matter” action. The final step of the shortcut sends a notification, so I know the link has been saved.

\n

What makes saving links to Matter fast with this shortcut is that I’ve tied it to a keyboard shortcut in Raycast and a button on the Loupedeck Live S, the automation accessory that we’re giving away as part of the Automation April Shortcuts Contest. That way, I can trigger the shortcut either with a keyboard shortcut via Raycast, which supports Shortcuts, or with the Loupedeck Live S, whichever is easier at the moment. I could bind Reeder -> Matter to a keyboard shortcut directly in Shortcuts, but I’ve found that’s unreliable, so I use Raycast instead.

\n
\"Reeder

Reeder -> Raindrop

\n

My Reeder -> Raindrop shortcut is similar but a little more involved than the Matter one. The Raindrop version starts off the same by invoking my Reeder keyboard shortcut for copying a link using AppleScript. Next, the shortcut pastes the link from the clipboard into Raindrop.io’s URL scheme, which opens a Safari page that populates the URL field with the copied link. The shortcut waits three seconds to make sure the Raindrop.io webpage fully loads and then uses AppleScript to close the tab. Finally, an ‘Open App’ action reopens Reeder.
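For illustration, the URL-scheme step might be sketched like this in Python; the exact Raindrop.io ‘add’ URL and its parameter name are assumptions, not details taken from the shortcut:

```python
from urllib.parse import quote

# Hypothetical sketch of building the Raindrop.io save URL.
# The "https://app.raindrop.io/add?link=" scheme is an assumption.
def raindrop_add_url(link: str) -> str:
    """Percent-encode the copied link and append it to the add URL."""
    return "https://app.raindrop.io/add?link=" + quote(link, safe="")

print(raindrop_add_url("https://www.macstories.net/?p=77681"))
```

Percent-encoding the whole link matters here: a raw `?` or `&` in the saved URL would otherwise be read as part of the Raindrop.io page’s own query string.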

\n

What I like about both of these shortcuts is that they use simple actions that historically haven’t been buggy, which means they should be reliable. They also work quickly, so they don’t interrupt my RSS reading in a meaningful way.

\n

You may wonder why I don’t bother filling out any other Raindrop.io fields that are available when using the app’s URL scheme. I’ll get to that more below, but it comes down to speed and staying in the flow of scanning RSS. The point of these shortcuts is to avoid context switching while saving links by reducing the process to a single keyboard shortcut or tap of a button.

\n
\"lire

lire is another option worth considering.

\n

Another option I considered when exploring better ways to collect links was switching my RSS reader to lire, another excellent RSS client. The advantage of lire is that it supports Raindrop.io directly, and for whatever reason, Matter’s share menu extension works with it on the Mac. Switching would have eliminated the need for Shortcuts entirely, but when it comes to reading, I’m very picky, and I prefer Reeder. Thanks to Shortcuts, I didn’t have to compromise on the apps I like using most.

\n

You can download Reeder -> Raindrop here and Reeder -> Matter here.

\n
\n
\n \"\"
\n

Reeder -> Raindrop

To use this shortcut with Reeder, first assign the keyboard shortcut ⌥↑c to Reeder’s Copy Link action. The AppleScript will simulate that keyboard shortcut for the item you’re viewing in Reeder and then send the link to Raindrop.io’s URL scheme, which will open a tab in Safari, save the link to Raindrop.io, close the tab, and return to Reeder after a three-second wait.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
\n \"\"
\n

Reeder -> Matter

To use this shortcut with Reeder, first assign the keyboard shortcut ⌥↑c to Reeder’s Copy Link action. The AppleScript will simulate that keyboard shortcut for the item you’re viewing in Reeder and then send the link to the read-later app Matter, confirming the save with a notification.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Managing Links

\n
\"Raindrop

Raindrop on the iPad.

\n

One of the insights I had while reevaluating my link workflow was that managing links was getting in the way of saving them. No matter what app I used, whether it was Raindrop.io, GoodLinks, Anybox, or something else, filing and tagging was an interruption in the reading process that slowed me down and was distracting. That’s why I save links without any tags in the initial collection step.

\n
\"Just

Just two Unsorted links so far today because I’ve been editing.

\n

Instead, I take advantage of Raindrop.io’s Unsorted category, which is a folder where links go if they haven’t been assigned a category or tag yet. GoodLinks has a similar concept of read/unread that could be used the same way. At least once a day, I review the links I’ve saved, categorizing them and adding tags.

\n

I do most of this link review and management on an iPhone or iPad because Raindrop.io’s new iOS and iPadOS app is a cut above its Mac app, making it easy to move quickly. Some links remain in Raindrop.io for future reference, and others are sent to Matter for reading more carefully.

\n
\"I

I try to keep my tags list simple.

\n

I don’t have an elaborate filing or tagging system because Raindrop.io’s full-text search works well. Instead, I use tags to assign links to areas of my life. Links get tagged ‘Weekly’ if I’m going to include them in the Interesting Links section of MacStories Weekly, our weekly Club newsletter. Others are tagged with ‘Post’ for linking on MacStories or ‘AS’ for adding to the show notes of an upcoming AppStories episode. I also use tags for specific projects, travel ideas and plans, and things I am considering buying. Using tags that mirror many of the projects I use in my task manager also makes it easier to work with tasks and links together.

\n

The one thing that no amount of process or automation can solve, though, is my own behavior. The weak spot in this workflow is the need to review links regularly. That hasn’t been a problem yet, but I know at some point I’ll get busy, and it will be. I’ve tried to counter that inevitability by being thoughtful about the number of links I save, but also by creating what I call the Link Lost and Found. It’s a Raindrop.io collection where I drag links that I can’t get to in a timely way. Maybe I’ll get to them when I free up, or perhaps they’ll sit there forever, but at least they’ll be searchable if I remember something I saved and want to find it again. That keeps my Unsorted collection manageable because it only ever stores recent materials.

\n

Using Links

\n
\"My

My shortcuts for formatting my links are available from the share sheet, but on the Mac, I usually trigger them with Raycast.

\n

Once you’ve collected and sorted links, the idea is to use them. Sometimes that just means reading an article in Matter or opening a site from Raindrop.io and buying something, but for me, it often means pasting a Markdown-formatted version of the link in a story I’m writing or adding a plain link to podcast show notes in Notion.

\n

I began building this last piece of the puzzle over a year ago with a shortcut that took the URL and name of a Safari page and created a Markdown-formatted link using regular expressions to trim extraneous text from the page’s name. That shortcut was good but too limited, so I expanded it and built another for when I’m not using Safari.
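As an illustration of that kind of title clean-up, here is a minimal sketch; the suffix pattern is an assumption based on the common convention of sites appending ‘– SiteName’ to page titles:

```python
import re

# Hypothetical sketch of trimming extraneous text from a page title.
# Strips a trailing "- Site", "– Site", or "| Site" style suffix, if present.
def trim_title(title: str) -> str:
    return re.sub(r"\s*[-\u2013|]\s*[^-\u2013|]+$", "", title).strip()

print(trim_title("BANG!CASE: Push-Button iPhone Automation – MacStories"))
```

Titles without a recognizable suffix pass through unchanged, so the clean-up is safe to run on every link.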

\n

By the time I’m ready to copy a Markdown or plain link, I’m usually using Raindrop.io or Matter. For those apps, it’s easy to copy a URL. Once the link is on the clipboard, I use a shortcut called Clip Link to create a Markdown-formatted link.

\n
\"Clip

Clip Link.

\n

First, Clip Link checks whether there is a URL on the clipboard using a regular expression that’s stored in a dictionary. If not, the shortcut ends and alerts me to the problem. Otherwise, the shortcut uses a ‘Repeat with each item’ action to iterate over each URL that was found, pulling the title of the page with the Actions app and combining it with the URL to create a Markdown link. The final step copies the results to the clipboard.
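The same logic can be sketched in Python rather than Shortcuts actions. The URL pattern and the stand-in title fetcher below are simplifications; the real shortcut gets page titles from the Actions app:

```python
import re

# Hypothetical sketch of Clip Link's logic. The URL regex is a
# simplified stand-in for the one stored in the shortcut's dictionary.
URL_RE = re.compile(r"https?://[^\s\"'<>]+")

def markdown_links(clipboard: str, fetch_title) -> list[str]:
    """Turn every URL found on the clipboard into a Markdown link."""
    urls = URL_RE.findall(clipboard)
    if not urls:
        raise ValueError("No URL on the clipboard")
    # Mirrors the shortcut's 'Repeat with each item' loop.
    return [f"[{fetch_title(url)}]({url})" for url in urls]

# Example with a stand-in title fetcher:
links = markdown_links(
    "see https://www.macstories.net/ for more",
    lambda url: "MacStories",
)
print("\n".join(links))  # → [MacStories](https://www.macstories.net/)
```

Bailing out early when no URL is found matches the shortcut’s behavior of alerting rather than silently producing an empty result.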

\n
\"Safari

Safari Link.

\n

Because I use Raycast’s clipboard manager, the result is that I have both the Markdown and plain links sitting on my clipboard, ready to paste elsewhere. Safari Link is similar, but because I’m already on a webpage, I don’t need to check the clipboard; I can simply use the name and URL of the active tab to create the Markdown and plain links.

\n

Because I don’t have the benefit of a clipboard manager on iOS or iPadOS, I always make the Markdown version of the link the last thing copied to the clipboard. I could send both versions to an app for temporary storage when I’m using my iPhone or iPad, but the plain version of a link is usually so easy to copy anyway that I decided it wasn’t worth the additional overhead.

\n

You can download Clip Link here and Safari Link here.

\n
\n
\n \"\"
\n

Clip Link

Clip Link uses regex to verify there’s at least one URL on the clipboard. If it finds one or more, it converts each into a Markdown link using the Actions app, which is a dependency of this shortcut.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
\n \"\"
\n

Safari Link

Safari Link uses the active browser tab to create a Markdown link using the Actions app, which is a dependency of this shortcut.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Saving Highlights

\n
\"Obsidian

Obsidian for Mac’s beta bookmark sidebar.

\n

Aside from saving URLs, I’ve begun experimenting more with saving highlights from articles. This is the newest part of my system and one I’m currently using exclusively with Matter since that’s where most of my reading occurs. I haven’t historically highlighted a lot of what I read because it doesn’t have a long shelf life. However, Matter, its Obsidian plugin, and Obsidian for Mac’s new beta bookmarking feature gave me an idea.

\n

I collect links every week to publish in an Interesting Links section in MacStories Weekly. My process for collecting those links has always involved saving links and tagging them with ‘Weekly’ but not much more. Now, I’m sending those links to Matter where I read them throughout the week, tagging and highlighting them as I go. Matter’s Obsidian plugin automatically syncs the highlights to a folder in my Obsidian notes vault.

\n
\"A

A search that combines the path to my Matter highlights and the tag ‘weekly.’

\n

Obsidian search has special parameters for searching the path to a specific folder and tags. The new bookmarking feature, which is currently in beta on the Mac only, lets you bookmark searches. That means I now have a saved search of highlights from Matter articles that are tagged ‘Weekly’ that I can access at any time with one click. I’ve only been using this highlights workflow for a short time, but I can already see it coming in handy for research-intensive projects like writing my annual macOS review.
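Assuming Matter’s plugin syncs into a folder named ‘Matter’ (the folder name here is an assumption), the bookmarked search combining those two operators might look something like:

```
path:"Matter" tag:#weekly
```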

\n

Also, Raindrop.io supports highlighting, and Readwise can sync those highlights to Obsidian, but I haven’t gone down that path yet. Anything that is part of my writing eventually ends up in Matter, so it’s good to know that functionality is there in Raindrop.io, but for now, I’m sticking to syncing my Matter highlights only.

\n

This new multi-phase workflow has been far better than my past overly complex or haphazard efforts. The difference is links stashed in fewer places and a staged approach that separates collecting links from processing and using them. With a sprinkling of Shortcuts on top to make the apps I want to use work better together, it’s a much more efficient system than I was using before.

\n
\"Will

Will my latest link workflow stick? I think so.

\n

The lesson I’ve taken from redoing my link workflow is that it really pays to think about how you work before you set out to streamline it. Too often, I’ve wound up trying to fit my work habits to a system instead of creating a system that works the way I do. It also helps to start with modest goals and figure out how to balance built-in app functionality with automation solutions like Shortcuts and Obsidian plugins. Only after breaking down a workflow into its component parts can you make the call about where automation fits into the setup you need. Sometimes the right solution is to build everything in an app like Shortcuts, and other times automation works better as the glue that holds other apps together, but you’ll never know which will work best until you really understand what you need first.

\n

You can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Links are the currency of information overload and distraction. There’s more media available than we could ever get to in a lifetime, and more things we might want to buy, places may want to visit, and other things to explore online than can be fit into a day.\nThe same problem exists in our work lives. That’s especially true for the kind of work I do. Links are part of everything. Whether I’m researching, writing, or preparing to record a podcast, I’m collecting, managing, and sharing links. I could follow all those trails as they cross my path, but I’d never get anything done.\nInstead of flitting from one online discovery to the next with no plan, wasting precious time, I save links for later, putting them aside until I have time for them. I’ve been doing this forever, but I’ve also never been happy with my system. So, it was inevitable that I’d begin tinkering with my setup again, both with the apps I use and the shortcuts that support them.\n\nWhether it’s work or pleasure reading, most of my reading happens in Matter these days.\nAfter thinking about past experiments, I realized that those setups were over-complicated. I had tried to create a system where every link was magically saved and organized the moment I came across it. That was a mistake because it didn’t reflect how I actually work.\nWith those old setups, there was also tension between the apps I wanted to use and my desire to automate my link collecting. The more I complicated my automation setup, the more I limited the apps I could use.\nThis time, I decided to put the apps first and automation second. Shortcuts and AppleScript are still important parts of the system I’ve built, but they’re the glue that holds it together, not the system itself. 
Instead, my primary focus has been on creating something that works the way I do and with the apps I want to use.\nWhere I Find Links\nLinks come from a lot of sources, including iMessage.\nFor starters, it’s worth sharing where I find most of the links I save because that has the biggest impact on the process of saving and managing them. RSS is by far the most important source of the links I save. I have roughly 150 feeds divided into topics like Apple, General Tech, Music, and Videogames, which I spoke about in more detail on a recent episode of AppStories. Other links come from social media, in iMessage conversations, via email, and old-fashioned web browsing.\nThe trouble with any system is trying to account for all those contexts in a single shortcut or other automation that has to work across multiple platforms. Things get messy quickly, especially if you add organizing and managing the links into the mix too. So, instead, I decided to break the process into three parts: collection, management, and use.\nCollecting Links\nCatching up on the latest news in Reeder using Feedly to sync my RSS feeds.\nWhen I started this journey, I knew I wanted to use Reeder by Silvio Rizzi (iOS/Mac) for RSS. It’s been one of my favorite apps for a long time, but I was skeptical that I could get it to work the way I wanted on the Mac. You see, sharing from Reeder on the Mac is bound to the share menu, which isn’t great. iOS and iPadOS are a different story. With the share sheet, it’s simple to send links from Reeder to any app that accepts URLs.\nRaindrop for Mac.\nOn the Mac, though, I immediately ran into trouble. I’d decided that I wanted to use a combination of Raindrop.io (iOS/Mac) and Matter to collect my links. I chose Raindrop.io because it’s web-based and has an API. That means its sync is solid, and it can be integrated with apps like Obsidian whether or not the developer directly supports it. I chose Matter because I like its balance between design and functionality. 
It’s not as full-featured as Readwise Reader, which I know a lot of people love, but I prefer Matter’s more focused approach and reading environment.\nI primarily use Matter in a pinned Safari tab on my Mac.\nI toyed with the idea of using only Raindrop.io or only Matter, but I’m better off using both. Matter is reserved for articles that I want to save for later. In my current system, the only things that go directly to Matter are pleasure reading that is unrelated to my work. MacStories-related materials may end up in Matter, too, but not in this link collection phase. Everything else goes to Raindrop.io because it’s designed to handle more than just articles.\nFor my purposes, Matter’s one Shortcuts action was enough, but I’d like to see more added in the future.\nThe biggest hurdles my app choices posed are that Raindrop.io doesn’t support Shortcuts or the share menu, and Matter offers a single ‘Add to Matter’ Shortcuts action. Also, even though Matter works on M-series Macs, its share menu extension only seems to work with some apps, but not Reeder. It’s at this point that I started to think I might have to pick a different set of apps for collecting links, which Federico and I touched on in the latest episode of the Club MacStories podcast, MacStories Unplugged. However, despite the limitations, I’ve come up with a workflow that includes those apps plus a set of four shortcuts that tie everything together nicely.\nBecause most of my link collection starts with Reeder on the Mac, that’s where I concentrated my automation efforts. It’s also where saving links can be surprisingly clunky compared to the iPhone and iPad’s superior share sheet. As a result, my Mac system was historically no system at all. If I found something in Reeder I wanted to save, I’d open it in Safari, where I could use the Matter and Raindrop.io browser extensions to save articles. However, that resulted in a cluttered mess of tabs and articles I forgot about and never saved. 
Plus, it was slow compared to saving links on iOS and iPadOS. The solution to integrating Raindrop.io and Matter with Reeder was Shortcuts.\nReeder -> Matter.\nThe shortcuts called Reeder -> Matter and Reeder -> Raindrop begin by grabbing the URL of a post I want to save. Reeder has a built-in ‘Copy Link’ action that can be assigned a keyboard shortcut, which makes it possible to copy the link with Shortcuts even though Reeder doesn’t offer that as a Shortcuts action itself. To get the link, my Reeder -> Matter shortcut first runs a simple AppleScript that simulates the pressing of the keyboard shortcut for copying a Reeder link.\nA nice side benefit of the fact that Matter works with M-series Macs is that its Shortcuts action is available, making it easy to pass the link from the clipboard to a “Save to Matter” action. The final step of the shortcut sends a notification, so I know the link has been saved.\nWhat makes saving links to Matter fast with this shortcut is that I’ve tied it to a keyboard shortcut in Raycast and a button on the Loupedeck Live S, the automation accessory that we’re giving away as part of the Automation April Shortcuts Contest. That way I can trigger the shortcut with Raycast, which supports Shortcuts, using a keyboard shortcut, or the Loupedeck Live S, depending on whichever is easiest. I could bind Reeder -> Matter to a keyboard shortcut directly in Shortcuts, but I’ve found that’s unreliable, so I use Raycast instead.\nReeder -> Raindrop\nMy Reeder -> Raindrop shortcut is similar but a little more involved than the Matter one. The Raindrop version starts off the same by invoking my Reeder keyboard shortcut for copying a link using AppleScript. Next, the shortcut pastes the link from the clipboard into Raindrop.io’s URL scheme, which opens a Safari page populating the URL field with the copied link. The shortcut waits three seconds to make sure the Raindrop.io webpage fully loads and then uses AppleScript to close the tab. 
Finally, an Open App action reopens Reeder.\nWhat I like about both of these shortcuts is that they use simple actions that historically haven’t been buggy, which means they should be reliable. They also work quickly, so they don’t interrupt my RSS reading in a meaningful way.\nYou may wonder why I don’t bother filling out any other Raindrop.io fields that are available when using the app’s URL scheme. I’ll get to that more below, but it comes down to speed and staying in the flow of scanning RSS. The point of these shortcuts is to avoid context switching while saving links by reducing the process to a single keyboard shortcut or tap of a button.\nlire is another option worth considering.\nAnother option I considered when exploring better ways to collect links was switching my RSS reader to lire, another excellent RRS client. The advantage of lire is that it supports Raindrop.io directly, and for whatever reason, Matter’s share menu extension works with it on the Mac. Switching would have eliminated the need for Shortcuts entirely, but when it comes to reading, I’m very picky, and I prefer Reeder. Thanks to Shortcuts, I didn’t have to compromise on the apps I like using most.\nYou can download Reeder -> Raindrop here and Reeder -> Matter here.\n\n \n \n Reeder -> RaindropTo use this shortcut with Reeder, first assign the keyboard shoortcut ⌥↑c to Reeder’s Copy Link action. The Apple Script will simulate that keyboard shortcut for the item you’re viewing in Reeder and then send it to Raindrop.io’s URL scheme, which will open a tab in Safari, save the link to Raindrop.io, close the tab, and return to Reeder after a 3 second wait.\nGet the shortcut here.\n\n \n \n\n\n \n \n Reeder -> MatterTo use this shortcut with Reeder, first assign the keyboard shoortcut ⌥↑c to Reeder’s Copy Link action. 
The Apple Script will simulate that keyboard shortcut for the item you’re viewing in Reeder and then send it to the read-later app Matter, confirming that the save has occurred with a notification.\nGet the shortcut here.\n\n \n \n\nManaging Links\nRaindrop on the iPad.\nOne of the insights I had while reevaluating my link workflow was that managing links was getting in the way of saving them. No matter what app I used, whether it was Raindrop.io, GoodLinks, Anybox, or something else, filing and tagging was an interruption in the reading process that slowed me down and was distracting. That’s why I save links without any tags in the initial collection step.\nJust two Unsorted links so far today because I’ve been editing.\nInstead, I take advantage of Raindrop.io’s Unsorted category, which is a folder where links go if they haven’t been assigned a category or tag yet. GoodLinks has a similar concept of read/unread that could be used the same way. At least once a day, I review the links I’ve saved, categorizing them and adding tags.\nI do most of this link review and management on an iPhone or iPad because Raindrop.io’s new iOS and iPadOS app is a cut above its Mac app, making it easy to move quickly. Some links remain in Raindrop.io for future reference, and others are sent to Matter for reading more carefully.\nI try to keep my tags list simple.\nI don’t have an elaborate filing or tagging system because Raindrop.io’s full-text search works well. Instead, I use tags to assign links to areas of my life. Links get tagged ‘Weekly’ if I’m going to include them in the Interesting Links section of MacStories Weekly, our weekly Club newsletter. Others are tagged with ‘Post’ for linking on MacStories or ‘AS’ for adding to the show notes of an upcoming AppStories episode. I also use tags for specific projects, travel ideas and plans, and things I am considering buying. 
Using tags that mirror many of the projects I use in my task manager also makes it easier to work with tasks and links together.\nThe one thing that no amount of process or automation can solve, though, is my own behavior. The weak spot in this workflow is the need to review links regularly. That hasn’t been a problem yet, but I know at some point I’ll get busy, and it will be. I’ve tried to counter that inevitability by being thoughtful about the number of links I save, but also by creating what I call the Link Lost and Found. It’s a Raindrop.io collection where I drag links that I can’t get to in a timely way. Maybe I’ll get to them when I free up, or perhaps they’ll sit there forever, but at least they’ll be searchable if I remember something I saved and want to find it again. That keeps my Unsorted collection manageable because it only ever stores recent materials.\nUsing Links\nMy shortcuts for formatting my links are available from the share sheet, but on the Mac, I usually trigger them with Raycast.\nOnce you’ve collected and sorted links, the idea is to use them. Sometimes that just means reading an article in Matter or opening a site from Raindrop.io and buying something, but for me, it often means pasting a Markdown-formatted version of the link in a story I’m writing or adding a plain link to podcast show notes in Notion.\nI began building this last piece of the puzzle over a year ago with a shortcut that took the URL and name of a Safari page and created a Markdown-formatted link using regular expressions to trim extraneous text from the page’s name. That shortcut was good but too limited, so I expanded it and built another for when I’m not using Safari.\nBy the time I’m ready to copy a Markdown or plain link, I’m usually using Raindrop.io or Matter. For those apps, it’s easy to copy a URL. 
Once the link is on the clipboard, I use a shortcut called Clip Link to create a Markdown formatted link.\nClip Link.\nFirst, Clip Link checks whether there is a URL on the clipboard using a regular expression that’s stored in a dictionary. If not, the shortcut ends and alerts me of the problem. Otherwise, the shortcut uses a ‘Repeat with each item’ action to iterate over each URL that was found pulling the title of the page using the app Actions and the URL to create a Markdown link. The final step copies the results to the clipboard.\nSafari Link.\nBecause I use Raycast’s clipboard manager, the result is that I have both the Markdown and plain links sitting on my clipboard, ready to paste elsewhere. Safari Clip is similar, but because I’m already on a webpage, I don’t need to check the clipboard, and I can simply use the name and URL of the active tab to create the Markdown and plain link.\nBecause I don’t have the benefit of a clipboard manager on iOS or iPadOS, I always make the Markdown version of the link the last thing copied to the clipboard. I could send both versions to an app for temporary storage when I’m using my iPhone or iPad, but the plain version of a link is usually so easy to copy anyway that I decided it wasn’t worth the additional overhead.\nYou can download Clip Link here and Safari link here.\n\n \n \n Clip LinkClip Link uses regex to verify there’s at least one URL on the clipboard. If it finds one or more, it takes each and coverts them into Markdown links using the Actions app, which is a dependency of this shortcut.\nGet the shortcut here.\n\n \n \n\n\n \n \n Safari LinkSafari Link uses the active browser tab to create a Markdown link using the Actions app, which is a dependency of this shortcut.\nGet the shortcut here.\n\n \n \n\nSaving Highlights\nObsidian for Mac’s beta bookmark sidebar.\nAside from saving URLs, I’ve begun experimenting more with saving highlights from articles. 
This is the newest part of my system and one I’m currently using exclusively with Matter since that’s where most of my reading occurs. I haven’t historically highlighted a lot of what I read because it doesn’t have a long shelf life. However, Matter, its Obsidian plugin, and Obsidian for Mac’s new beta bookmarking feature gave me an idea.\nI collect links every week to publish in an Interesting Links section in MacStories Weekly. My process for collecting those links has always involved saving links and tagging them with ‘Weekly’ but not much more. Now, I’m sending those links to Matter where I read them throughout the week, tagging and highlighting them as I go. Matter’s Obsidian plugin automatically syncs the highlights to a folder in my Obsidian notes vault.\nA search that combines the path to my Matter highlights and the tag ‘weekly.’\nObsidian search has special parameters for searching the path to a specific folder and tags. The new bookmarking feature, which is currently in beta on the Mac only, lets you bookmark searches. That means I now have a saved search of highlights from Matter articles that are tagged ‘Weekly’ that I can access at any time with one click. I’ve only been using this highlights workflow for a short time, but I can already see it coming in handy for research-intensive projects like writing my annual macOS review.\nAlso, Raindrop.io supports highlighting, and Readwise can sync those highlights to Obsidian, but I haven’t gone down that path yet. Anything that is part of my writing eventually ends up in Matter, so it’s good to know that functionality is there in Raindrop.io, but for now, I’m sticking to syncing my Matter highlights only.\nThis new multi-phase workflow has been far better than my past overly-complex or haphazard efforts. The differences are links stashed in fewer places and a staged approach that separates collecting links from processing and using them. 
With a sprinkling of Shortcuts on top to make the apps I want to use work better together, it’s a much more efficient system than I was using before.\nWill my latest link workflow stick? I think so.\nThe lesson I’ve taken from redoing my link workflow is that it really pays to think about how you work before you set out to streamline it. Too often, I’ve wound up trying to fit my work habits to a system instead of creating a system that works the way I do. It also helps to start with modest goals and figure out how to balance built-in app functionality with automation solutions like Shortcuts and Obsidian plugins. Only after breaking down a workflow into its component parts can you make the call about where automation fits into the setup you need. Sometimes the right solution is to build everything in an app like Shortcuts, and other times automation works better as the glue that holds other apps together, but you’ll never know which will work best until you really understand what you need first.\nYou can also follow MacStories’ Automation April coverage through our dedicated hub, or subscribe to its RSS feed.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more 
here and from our Club FAQs.\nJoin Now", "date_published": "2023-04-04T11:03:01-04:00", "date_modified": "2023-06-22T10:35:22-04:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": "https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "Obsidian", "read-later", "reeder", "RSS", "shortcuts", "stories" ] }, { "id": "https://www.macstories.net/?p=71580", "url": "https://www.macstories.net/linked/apple-frames-3-1-1-with-support-for-passthrough-mode/", "title": "Apple Frames 3.1.1 with Support for Passthrough Mode", "content_html": "
\"The

The ‘Shortcut Result’ variable, used as an image variable in a shortcut that calls Apple Frames.

\n

I just released a small update to Apple Frames 3.1, which came out earlier this week, with a new output command: &passthrough. With this output command for the Apple Frames API, you’ll be able to generate a framed image (from whatever source you like) and simply pass its result to the next action in a shortcut as a native image variable.

\n

I wrote about this as part of my Extension column in MacStories Weekly today, where I also covered the ability to run Apple Frames from the command line on macOS. Here’s the excerpt about version 3.1.1 of Apple Frames and the new passthrough mode:

\n

\n As I was researching this column for Weekly, I realized there was an obvious candidate for an output command I did not include in Apple Frames 3.1: a passthrough command to, well, pass framed images along as input for the next action of a shortcut.

\n

Here’s what I mean: when you run Apple Frames from a helper shortcut using the ‘Run Shortcut’ action, that action produces an output variable called ‘Shortcut Result’. If you’re running Apple Frames as a function, thus turning it into a feature of another workflow, it can be useful to take the framed images it produces and use them as a native variable in other actions of the shortcut. The problem is that the output commands I launched with Apple Frames 3.1 all involved “storing” the framed images somewhere, whether it was Files or the system clipboard.

\n

This is no longer the case with the &passthrough output command I added to Apple Frames 3.1.1, which you can redownload from the MacStories Shortcuts Archive or directly from this link. If you run the Apple Frames API with this command, framed images will be passed along as native output of the shortcut, which you can reuse as a variable elsewhere in a shortcut that’s invoking Apple Frames.\n

\n

And:

\n

\n Any shortcut or longer workflow that involves running Apple Frames in the background and retrieving the screenshots it frames can take advantage of this method, allowing you to bypass the need to store images in the clipboard, even if temporarily. Essentially, passthrough mode turns Apple Frames into a native action of the Shortcuts app that returns a standard image variable as its output.\n

\n

This is the only change in version 3.1.1 of Apple Frames, and I’m excited to see how people will take advantage of it to chain Apple Frames with other shortcuts on their devices. You can download the updated version of Apple Frames below.
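As an illustration, pairing the API’s latest input command (which grabs the most recent screenshot) with the new output command yields a one-line recipe for framing that screenshot and handing the result straight to the next action; the specific pairing is my example, not the only use of passthrough:

```
latest&passthrough
```

In the helper shortcut that calls Apple Frames, the ‘Shortcut Result’ variable produced by ‘Run Shortcut’ should then be set as an ‘Image’ variable, as shown in the screenshot at the top of this post.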

\n
\n
\n \"\"
\n

Apple Frames

Add device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.

\n

Get the shortcut here.

\n\n
\n
\n
\n

\u2192 Source: club.macstories.net

", "content_text": "The ‘Shortcut Result’ variable, used as an image variable in a shortcut that calls Apple Frames.\nI just released a small update to Apple Frames 3.1, which came out earlier this week, with a new output command: &passthrough. With this output command for the Apple Frames API, you’ll be able to generate a framed image (from whatever source you like) and simply pass its result to the next action in a shortcut as a native image variable.\nI wrote about this as part of my Extension column in MacStories Weekly today, where I also covered the ability to run Apple Frames from the command line on macOS. Here’s the excerpt about version 3.1.1 of Apple Frames and the new passthrough mode:\n\n As I was researching this column for Weekly, I realized there was an obvious candidate for an output command I did not include in Apple Frames 3.1: a passthrough command to, well, pass framed images along as input for the next action of a shortcut.\n Here’s what I mean: when you run Apple Frames from a helper shortcut using the ‘Run Shortcut’ action, that action produces an output variable called ‘Shortcut Result’. If you’re running Apple Frames as a function, thus turning it into a feature of another workflow, it can be useful to take the framed images it produces and use them as a native variable in other actions of the shortcut. The problem is that the output commands I launched with Apple Frames 3.1 all involved “storing” the framed images somewhere, whether it was Files or the system clipboard.\n This is no longer the case with the &passthrough output command I added to Apple Frames 3.1.1, which you can redownload from the MacStories Shortcuts Archive or directly from this link. 
If you run the Apple Frames API with this command, framed images will be passed along as native output of the shortcut, which you can reuse as a variable elsewhere in a shortcut that’s invoking Apple Frames.\n\nAnd:\n\n Any shortcut or longer workflow that involves running Apple Frames in the background and retrieving the screenshots it frames can take advantage of this method, allowing you to bypass the need to store images in the clipboard, even if temporarily. Essentially, passthrough mode turns Apple Frames into a native action of the Shortcuts app that returns a standard image variable as its output.\n\nThis is the only change in version 3.1.1 of Apple Frames, and I’m excited to see how people will take advantage of it to chain Apple Frames with other shortcuts on their devices. You can download the updated version of Apple Frames below.\n\n \n \n Apple FramesAdd device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. 
The shortcut also supports an API for automating input images and framed results.\nGet the shortcut here.\n\n \n \n\n\u2192 Source: club.macstories.net", "date_published": "2023-03-03T12:19:11-05:00", "date_modified": "2024-03-21T08:38:46-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Apple Frames", "automation", "iOS", "shortcuts", "Linked" ] }, { "id": "https://www.macstories.net/?p=71536", "url": "https://www.macstories.net/stories/apple-frames-3-1-extending-screenshot-automation-with-the-new-apple-frames-api/", "title": "Apple Frames 3.1: Extending Screenshot Automation with the New Apple Frames API", "content_html": "
\"Apple

Apple Frames 3.1 comes with a lightweight Apple Frames API to extend its automation capabilities.

\n

Update, March 3: Version 3.1.1 of Apple Frames has been released with support for a new passthrough output command. This post has been updated to reflect the changes. You can redownload the updated shortcut at the end of this post.

\n

Today, I’m happy to introduce something I’ve been working on for the past couple of months: Apple Frames – my shortcut to put screenshots captured on Apple devices inside physical device frames – is getting a major upgrade to version 3.1 today. In addition to offering support for more devices that I missed in version 3.0 as well as some bug fixes, Apple Frames 3.1 brings a brand new API that lets you automate and extend the Apple Frames shortcut itself.

\n

By making Apple Frames scriptable, I wanted to allow power users – such as designers and developers who rely on this shortcut to frame hundreds of images each week – to save valuable time without compromising the accessible nature of Apple Frames for other people. This is why all of the new advanced features of Apple Frames are optional and hidden until you go look for them specifically. Furthermore, even if you do want to use the Apple Frames API, you’ll see that I designed it in the spirit of Shortcuts: it does not require any code and it’s entirely powered by simple, visual ‘Text’ actions.

\n

I’m incredibly excited about what Apple Frames can do in version 3.1, so let’s dive in.

\n

\n

Installation and Update

\n

To download Apple Frames 3.1, you can click the download link at the end of this article or find it in the MacStories Shortcuts Archive. If you already have an older version of Apple Frames installed, you will be prompted to replace it while setting up the shortcut. As with version 3.0, during setup you’ll be asked to pick a folder in iCloud Drive that will serve as the destination folder for the ‘Quick Save’ functionality; in version 3.1, you’ll be asked to pick a source folder too.

\n
\"The

The new setup flow of Apple Frames 3.1. You can now choose a destination folder for framed images (like in version 3.0) and, for the first time, a source folder that you can use with the Apple Frames API.

\n

Given that this version of Apple Frames includes support for more devices, you’ll also be prompted to download a new Frames.zip file from the MacStories CDN. You can just click ‘Allow’ here without worrying about anything else. The shortcut will extract the necessary assets, save them in iCloud Drive/Shortcuts, and continue automatically. You’ll only be prompted to do this once.

\n

New Devices and Fixes

\n

Apple Frames 3.1 re-introduces support for the following devices, which I missed in the transition to version 3.0 last year:

\n

iPhone 8 Plus
Apple Watch Series 7/8 in the 41mm size
MacBook Pro 13”

\n

Additionally, I fixed an issue with Apple Frames 3.0 that was causing screenshots taken on an iPad Pro 11” in portrait orientation with the ‘More Space’ display scaling mode enabled to not be framed properly. My apologies to all those who tried running Apple Frames on their 11” iPad Pros and couldn’t get it to work over the past few months. I ended up buying an 11” iPad Pro myself so I could debug the issue and figure out what was going on.

\n

Introducing the Apple Frames API

\n

The big change in Apple Frames 3.1 is the availability of a lightweight API that lets you control the shortcut’s behavior with simple text commands. It may seem silly to make an “API” for a shortcut running on your iPhone or Mac, but this is, after all, a little programming interface for Apple Frames, so I think it’s only fair to call it that.

\n

Here’s the gist: you can now script Apple Frames with commands that tell it where to take images from (input commands) and where to save the framed images (output commands). You can still run Apple Frames manually like you’ve always done; however, if you want to save even more time, you can also program Apple Frames 3.1 to get screenshots from a specific source and perform a specific action with the output without having to manually pick images or options from a list.

\n
\"Helper

Helper shortcuts for the Apple Frames API on my iPad Pro.

\n

The Apple Frames API is entirely optional and designed for advanced users of the shortcut who want to take their usage of Apple Frames to the next level.

\n

The new Apple Frames API comprises input commands and output commands. These are bits of text that you can pass to the shortcut to tell it where to get images and what to do with them. And here’s my favorite part: you can mix and match input and output commands however you like, creating diverse “recipes” for automating Apple Frames in different contexts. For example, you can create an Apple Frames automation to get the most recent screenshot, frame it, and save it to the Photos app with no manual interactions; or you can automate Apple Frames to get the 10 most recent screenshots from your library, frame them individually without merging them, and save the results in a folder of the Files app.

\n

When I refer to the ability to “pass” commands to Apple Frames 3.1, I mean it. The Apple Frames API was designed to be as simple as giving Apple Frames a ‘Text’ action to use as input, as shown in the image below:

\n
\"All

All you need to get started with the Apple Frames API are the ‘Text’ and ‘Run Shortcut’ actions.

\n

Of course, if you’re an advanced user, you have other options too. You can pass commands to Apple Frames 3.1 via the URL scheme or, if you’re a Mac user, via the shell. Any input method that can pass some text to Shortcuts will work.
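For instance, a command can ride along as the text input of the standard shortcuts:// URL scheme; the shortcut name and command below are illustrative, and note that the ampersand introducing an output command has to be percent-encoded as %26 inside a URL:

```
shortcuts://run-shortcut?name=Apple%20Frames&input=text&text=latest%26copy
```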

\n

Input Commands

\n

With that being said, here are the input commands supported by Apple Frames 3.1:

\n

pick: Pick images manually. (This command is mandatory if you don’t want to use any other input command but still want an output command later.)
clipboard: Get an image from the system clipboard.
latest: Get the latest screenshot.
number(n): Pass a numeric value that tells Apple Frames how many recent screenshots to retrieve from the photo library; e.g. number(5) will get the five most recent screenshots.
capture: Ask Apple Frames to capture a screenshot and frame it immediately. Best used on macOS, where you can trigger shortcuts with the keyboard without a UI shown.
folder: Get images from a specific folder in Files or Finder.

\n

These input commands should be fairly self-explanatory: they tell Apple Frames where to get images, giving you more flexibility than having to select images manually every time.

\n

The only input command that needs to be configured is the folder one: in the shortcut, you’ll find a ‘Folder’ action that determines the folder in Files/Finder that you want to use as the source of images for the Apple Frames API. You can change this to whatever you want.

\n
\"You

You can configure the source and destination folders in the shortcut.

\n

You can use an input command without an output one. For instance, you can get images from the clipboard and then be presented with Apple Frames’ classic menu of actions at the end. However, if you use an output command, you’ll always have to include an input command too. You can’t pass &quickLook without an input command before it, for example.

\n

Output Commands

\n

So those are the input commands. Here is the list of output commands you can use in Apple Frames 3.1:

\n

&quickLook: Preview images with Quick Look.
&photos: Save framed images to the Photos app.
&quickSave: Save framed images to a specific folder of your choice in Files/Finder.
&copy: Copy framed images to the clipboard.
&airDrop: Instantly AirDrop framed images.
&upload: If you have a shortcut that uploads images to your own server or other places, this is the output command that can trigger it.
&passthrough: Pass framed images as output to the next Shortcuts action. The resulting variable should be set as an ‘Image’ variable.

\n

Output commands, as I noted above, can be chained after an input command; using an output command automates the process of saving a framed image somewhere, and it lets you bypass the final menu of actions that gets shown by Apple Frames at the end of the shortcut. By combining input and output commands, you’ll be able to fully automate the image generation process of Apple Frames, from source files to the resulting framed image, with zero user interaction in the middle. Make sure to check out the list of examples I prepared in the next section to get a sense of how you can combine these.
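To make the grammar concrete, here are a few illustrative recipes, each pairing one input command with one output command; the individual commands come from the Apple Frames API, but these particular combinations are my own examples:

```
latest&photos        frame the most recent screenshot, save it to Photos
clipboard&quickLook  frame the image on the clipboard, preview it with Quick Look
folder&airDrop       frame everything in the source folder, AirDrop the results
```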

\n

Merge Mode

\n

Lastly, Apple Frames 3.1 comes with an option that has been requested dozens of times by advanced users: the ability to frame images individually, without merging them in a single composite image at the end. This option is available as a shortcut-wide toggle as well as a flag in the API.

\n

For starters, toward the beginning of the shortcut, you’ll see a new Merge Images variable that is set to ‘True’. This means that, by default, Apple Frames will merge multiple framed screenshots into a single image, as it’s always done. If you want to alter the default behavior of Apple Frames, you can change this variable to ‘False.’

\n
\"The

The new variable that controls the merging behavior in Apple Frames 3.1 (left) and images that have been framed individually (right).

\n

If you want to keep Apple Frames’ default merging behavior but override it every once in a while when needed, you can use the Apple Frames API. By passing +mergeImages=False to Apple Frames (alongside at least one input command), you’ll temporarily override the merging behavior of Apple Frames and tell it to save images individually if it normally merges them, and vice versa. For example, this…

\n

pick+mergeImages=False

\n

…will let you pick screenshots to frame, then save the results as individual images at the end.

\n
\"With

With this new merge option in the Apple Frames API, you’ll be able to temporarily override the merging behavior of Apple Frames.

\n

Examples of the Apple Frames API with Helper Shortcuts

\n

Since the Apple Frames API is based on passing some text input to the shortcut, you can put together “helper shortcuts” to perform specific actions in Apple Frames and rely on them as standalone utilities. These are simple, two-action shortcuts that send specific commands to Apple Frames.

\n
\"Helper

Helper shortcuts designed for the Apple Frames API.

\n

If you’re a Mac user, the Apple Frames API opens a world of possibilities: not only can you create all the helper shortcuts you want and assign a global hotkey to each one of them, but you can even trigger these dedicated utilities from the menu bar, via scripts, or via utilities like Raycast and Alfred.

\n

Let me give you some practical examples of how you can use these commands and chain them together.

\n

The first helper shortcut I created for Apple Frames 3.1 tells it to get the latest screenshot from the Photos app, frame it, and immediately copy the framed image to the clipboard. This lets you go from a screenshot to a framed image ready to be shared in iMessage or other apps in two seconds, especially if you trigger this shortcut from a widget on your Home Screen or Siri. Here’s how easy it is to put together:

\n
\"Framing

Framing the latest screenshot I’ve taken with this helper shortcut so I can quickly paste it in iMessage.
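Sketched in text, the helper boils down to two actions, assuming a latest&copy command per the API’s syntax:

```
Text:         latest&copy
Run Shortcut: Apple Frames (passing the Text action as input)
```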

\n
\n
\n \"\"
\n

Frame Latest and Copy

Frame the most recent screenshot from the Photos app and copy it to the clipboard. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

What if you have a screenshot in your clipboard and want to quickly save it to a predefined folder in the Files app or Finder? Easy: using the clipboard&quickSave commands with the Apple Frames API, you can put together a helper shortcut to do exactly that in seconds:

\n
\"Copy

Copy a screenshot, frame it, and save it into a Files folder in two seconds.

\n
\n
\n \"\"
\n

Frame Clipboard

Frame an image from the clipboard and save it into a specific folder of Files or Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

How about getting all images from a folder, framing them, and deciding what to do with them at the end? Not a problem: with the Apple Frames API, you can omit an output command and just tell Apple Frames to work with a folder as an input command, like so:

\n
\"Take

Take all images from a folder, frame them, and present a list of actions.
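In other words, the helper’s ‘Text’ action carries an input command with no output command attached, so Apple Frames presents its usual menu of actions at the end:

```
folder
```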

\n
\n
\n \"\"
\n

Frame Folder

Frame all images from a specific folder in Files or Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Let me get a bit fancier now. Let’s say you want to frame a specific number of recent screenshots, save them in a predefined folder, and override Apple Frames’ merging behavior so that you get a bunch of standalone framed images instead of a single composite one. By chaining together multiple commands, the Frame Number helper shortcut does it: it’ll ask you for a number, get those screenshots, frame them individually, and save them.

\n
\"Given

Given a number of recent screenshots, frame them individually, and save them into a folder in Files.
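For five screenshots, the command such a helper assembles would look something like the line below; I’m inferring the placement of the merge flag from the pick+mergeImages=False example earlier in this post:

```
number(5)+mergeImages=False&quickSave
```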

\n
\n
\n \"\"
\n

Frame Number

Get a specific number of screenshots, frame them as individual images, and save them in a specific folder of Files and Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

The next one is a request I’ve heard multiple times: you have a folder full of screenshots and want to frame them all at once, saving the results to another folder as individual images. With the Frame Folder helper shortcut, you can fetch screenshots from a specific location in Files/Finder, frame them, and save them as standalone, framed images in a different location.

\n
\"Take

Take a folder full of images, frame each individually, then save the results to another folder.
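Going by the same grammar, this helper’s command presumably combines the folder input, the merge override, and the quick-save output:

```
folder+mergeImages=False&quickSave
```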

\n
\n
\n \"\"
\n

Frame Folder and Save As Individual Images

Get all images from a specific folder, frame them as individual images, and save them in another folder of Files and Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Lastly, let’s say you want to use Apple Frames on macOS and put together a helper shortcut that instantly captures what you’re looking at, frames it, then puts the framed image in your system clipboard. With Frame Capture and Copy, you can do just that. Assign a keyboard shortcut to this helper utility, and you’ll be able to instantly frame whatever you’re looking at on your Mac with a single keystroke.

\n
\"Capture

Capture what’s onscreen, frame it, and copy it immediately to the clipboard.
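Following the same pattern, the command behind this helper would be a capture input chained to a copy output:

```
capture&copy
```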

\n
\n
\n \"\"
\n

Frame Capture and Copy

Capture what’s onscreen, frame it, and copy it to the clipboard. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.

\n

Get the shortcut here.

\n\n
\n
\n
\n

By creating helper shortcuts based on the ‘Text’ and ‘Run Shortcut’ actions, you now have the power to fully automate Apple Frames regardless of whether you’re using an iPhone, iPad, or Mac.

\n

Download Apple Frames 3.1

\n
\"A

A custom menu I created for Apple Frames 3.1.

\n

I’m incredibly excited about the Apple Frames API as a tool to help people save even more time with Apple Frames. You don’t have to use this advanced feature; if you don’t need it, just go ahead and install Apple Frames 3.1 below, and keep using it like you’ve always done. You can entirely ignore what I just described and simply enjoy support for more devices and bug fixes.

\n

If you seek more power and flexibility in Apple Frames, however, the barebones API I created will let you control the input and output behavior of the shortcut. My hope is that these new options will help you fit Apple Frames into different and more elaborate workflows, allowing you to generate and share better-looking screenshots with fewer interactions than before.

\n

I had fun creating this new flavor of Apple Frames, and I hope you’ll find it as useful as I do. You can download Apple Frames 3.1, with full support for the new Apple Frames API, below.

\n
\n
\n \"\"
\n

Apple Frames

Add device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Apple Frames 3.1 comes with a lightweight Apple Frames API to extend its automation capabilities.\nUpdate, March 3: Version 3.1.1 of Apple Frames has been released with support for a new passthrough output command. This post has been updated to reflect the changes. You can redownload the updated shortcut at the end of this post.\nToday, I’m happy to introduce something I’ve been working on for the past couple of months: Apple Frames – my shortcut to put screenshots captured on Apple devices inside physical device frames – is getting a major upgrade to version 3.1 today. In addition to offering support for more devices that I missed in version 3.0 as well as some bug fixes, Apple Frames 3.1 brings a brand new API that lets you automate and extend the Apple Frames shortcut itself.\nBy making Apple Frames scriptable, I wanted to allow power users – such as designers and developers who rely on this shortcut to frame hundreds of images each week – to save valuable time without compromising the accessible nature of Apple Frames for other people. This is why all of the new advanced features of Apple Frames are optional and hidden until you go look for them specifically. Furthermore, even if you do want to use the Apple Frames API, you’ll see that I designed it in the spirit of Shortcuts: it does not require any code and it’s entirely powered by simple, visual ‘Text’ actions.\nI’m incredibly excited about what Apple Frames can do in version 3.1, so let’s dive in.\n\nInstallation and Update\nTo download Apple Frames 3.1, you can click the download link at the end of this article or find it in the MacStories Shortcuts Archive. If you already have an older version of Apple Frames installed, you will be prompted to replace it while setting up the shortcut. 
As with version 3.0, during setup you’ll be asked to pick a folder in iCloud Drive that will serve as the destination folder for the ‘Quick Save’ functionality; in version 3.1, you’ll be asked to pick a source folder too.\nThe new setup flow of Apple Frames 3.1. You can now choose a destination folder for framed images (like in version 3.0) and, for the first time, a source folder that you can use with the Apple Frames API.\nGiven that this version of Apple Frames includes support for more devices, you’ll also be prompted to download a new Frames.zip file from the MacStories CDN. You can just click ‘Allow’ here without worrying about anything else. The shortcut will extract the necessary assets, save them in iCloud Drive/Shortcuts, and continue automatically. You’ll only be prompted to do this once.\nNew Devices and Fixes\nApple Frames 3.1 re-introduces support for the following devices, which I missed in the transition to version 3.0 last year:\niPhone 8 Plus\nApple Watch Series 7/8 in the 41mm size \nMacBook Pro 13”\nAdditionally, I fixed an issue with Apple Frames 3.0 that was causing screenshots taken on an iPad Pro 11” in portrait orientation with the ‘More Space’ display scaling mode enabled to not be framed properly. My apologies to all those who tried running Apple Frames on their 11” iPad Pros and couldn’t get it to work over the past few months. I ended up buying an 11” iPad Pro myself so I could debug the issue and figure out what was going on.\nIntroducing the Apple Frames API\nThe big change in Apple Frames 3.1 is the availability of a lightweight API that lets you control the shortcut’s behavior with simple text commands. 
It may seem silly to make an “API” for a shortcut running on your iPhone or Mac, but this is, after all, a little programming interface for Apple Frames, so I think it’s only fair to call it that.\nHere’s the gist: you can now script Apple Frames with commands that tell it where to take images from (input commands) and where to save the framed images (output commands). You can still run Apple Frames manually like you’ve always done; however, if you want to save even more time, you can also program Apple Frames 3.1 to get screenshots from a specific source and perform a specific action with the output without having to manually pick images or options from a list.\nHelper shortcuts for the Apple Frames API on my iPad Pro.\nThe Apple Frames API is entirely optional and designed for advanced users of the shortcut who want to take their usage of Apple Frames to the next level.\nThe new Apple Frames API is comprised of input commands and output commands. These are bits of text that you can pass to the shortcut to tell it where to get images and what to do with them. And here’s my favorite part: you can mix and match input and output commands however you like, creating diverse “recipes” for automating Apple Frames in different contexts. For example, you can create an Apple Frames automation to get the most recent screenshot, frame it, and save it to the Photos app with no manual interactions; or you can automate Apple Frames to get the 10 most recent screenshots from your library, frame them individually without merging them, and save the results in a folder of the Files app.\nWhen I refer to the ability to “pass” commands to Apple Frames 3.1, I mean it. The Apple Frames API was designed to be as simple as giving Apple Frames a ‘Text’ action to use as input, as shown in the image below:\nAll you need to get started with the Apple Frames API are the ‘Text’ and ‘Run Shortcut’ actions.\nOf course, if you’re an advanced user, you have other options too. 
You can pass commands to Apple Frames 3.1 via the URL scheme or, if you’re a Mac user, via the shell. Any input method that can pass some text to Shortcuts will work.\nInput Commands\nWith that being said, here are the input commands supported by Apple Frames 3.1:\npick Pick images manually. (This command is mandatory if you don’t want to use any other input command, but still want an output command later.)\nclipboard Get an image from the system clipboard.\nlatest Get the latest screenshot.\nnumber(n) Pass a numeric value to Apple Frames that tells it how many recent screenshots to retrieve from the photo library. e.g. number(5) will tell Apple Frames to get the five most recent screenshots.\ncapture Ask Apple Frames to capture a screenshot and frame it immediately. Best used on macOS, where you can trigger shortcuts with the keyboard without a UI shown.\nfolder Get images from a specific folder in Files or Finder.\nThese input commands should be fairly self-explanatory: they tell Apple Frames where to get images, giving you more flexibility than having to select images manually every time.\nThe only input command that needs to be configured is the folder one: in the shortcut, you’ll find a ‘Folder’ action that determines the folder in Files/Finder that you want to use as the source of images for the Apple Frames API. You can change this to whatever you want.\nYou can configure the source and destination folders in the shortcut.\nYou can use an input command without an output one. For instance, you can get images from the clipboard, and then be presented with Apple Frames’ classic menu of actions at the end. However, if you use an output command, you’ll always have to include an input command too. You can’t pass &quickLook without an input command before it, for example.\n\nOutput Commands\nSo those are the input commands. 
Here is the list of output commands you can use in Apple Frames 3.1:\n&quickLook Preview images with Quick Look.\n&photos Save framed images to the Photos app.\n&quickSave Save framed images to a specific folder of your choice in Files/Finder.\n&copy Copy framed images to the clipboard.\n&airDrop Instantly AirDrop framed images.\n&upload If you have a shortcut that uploads images to your own server or other places, this is the output command that can trigger it.\n&passthrough Pass framed images as output to the next Shortcuts action. The resulting variable should be set as an ‘Image’ variable. More details here.\nOutput commands, as I noted above, can be chained after an input command; using an output command automates the process of saving a framed image somewhere, and it lets you bypass the final menu of actions that gets shown by Apple Frames at the end of the shortcut. By combining input and output commands, you’ll be able to fully automate the image generation process of Apple Frames, from source files to the resulting framed image, with zero user interaction in the middle. Make sure to check out the list of examples I prepared in the next section to get a sense of how you can combine these.\nMerge Mode\nLastly, Apple Frames 3.1 comes with an option that has been requested dozens of times by advanced users: the ability to frame images individually, without merging them in a single composite image at the end. This option is available as a shortcut-wide toggle as well as a flag in the API.\nFor starters, toward the beginning of the shortcut, you’ll see a new Merge Images variable that is set to ‘True’. This means that, by default, Apple Frames will merge multiple framed screenshots into a single image, as it’s always done. 
If you want to alter the default behavior of Apple Frames, you can change this variable to ‘False.’\nThe new variable that controls the merging behavior in Apple Frames 3.1 (left) and images that have been framed individually (right).\nIf you want to keep Apple Frames’ default merging behavior but override it every once in a while when needed, you can use the Apple Frames API. By passing +mergeImages=False to Apple Frames (alongside at least one input command), you’ll temporarily override the merging behavior of Apple Frames and tell it to save images individually if it normally merges them, and vice versa. For example, this…\npick+mergeImages=False\n…will let you pick screenshots to frame, then save the results as individual images at the end.\nWith this new merge option in the Apple Frames API, you’ll be able to temporarily override the merging behavior of Apple Frames.\nExamples of the Apple Frames API with Helper Shortcuts\nSince the Apple Frames API is based on passing some text input to the shortcut, you can put together “helper shortcuts” to perform specific actions in Apple Frames and rely on them as standalone utilities. These are simple, two-action shortcuts that send specific commands to Apple Frames.\nHelper shortcuts designed for the Apple Frames API.\nIf you’re a Mac user, the Apple Frames API opens a world of possibilities: not only can you create all the helper shortcuts you want and assign a global hotkey to each one of them, but you can even trigger these dedicated utilities from the menu bar, via scripts, or via utilities like Raycast and Alfred.\nLet me give you some practical examples of how you can use these commands and chain them together.\nThe first helper shortcut I created for Apple Frames 3.1 tells it to get the latest screenshot from the Photos app, frame it, and immediately copy the framed image to the clipboard. 
This lets you go from a screenshot to a framed image ready to be shared in iMessage or other apps in two seconds, especially if you trigger this shortcut from a widget on your Home Screen or Siri. Here’s how easy it is to put together:\nFraming the latest screenshot I’ve taken with this helper shortcut so I can quickly paste it in iMessage.\n\n \n \n Frame Latest and CopyFrame the most recent screenshot from the Photos app and copy it to the clipboard. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nWhat if you have a screenshot in your clipboard and want to quickly save it to a predefined folder in the Files app or Finder? Easy: using the clipboard&quickSave commands with the Apple Frames API, you can put together a helper shortcut to do exactly that in seconds:\nCopy a screenshot, frame it, and save it into a Files folder in two seconds.\n\n \n \n Frame ClipboardFrame an image from the clipboard and save it into a specific folder of Files or Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nHow about getting all images from a folder, framing them, and deciding what to do with them at the end? Not a problem: with the Apple Frames API, you can omit an output command and just tell Apple Frames to work with a folder as an input command, like so:\nTake all images from a folder, frame them, and present a list of actions.\n\n \n \n Frame FolderFrame all images from a specific folder in Files or Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nLet me get a bit fancier now. Let’s say you want to frame a specific number of recent screenshots, save them in a predefined folder, and override Apple Frames’ merging behavior so that you get a bunch of standalone framed images instead of a single composite one. 
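In API terms, that recipe is just another command string passed to the shortcut as text. Here’s a minimal sketch of how such a string could be assembled and handed to Apple Frames from outside Shortcuts via Apple’s generic shortcuts://run-shortcut URL scheme; the helper function names and the exact order in which an output command and the +mergeImages flag are concatenated are my assumptions, not part of the shortcut itself:

```python
from urllib.parse import quote

def frames_command(input_cmd, output_cmd=None, merge=None):
    """Assemble an Apple Frames API command string: one input command,
    an optional output command (prefixed with '&'), and an optional
    +mergeImages flag. The concatenation order shown is an assumption."""
    command = input_cmd
    if output_cmd:
        command += "&" + output_cmd
    if merge is not None:
        command += "+mergeImages=" + merge
    return command

def run_shortcut_url(shortcut_name, text):
    """Build a shortcuts://run-shortcut URL that passes `text` to the
    named shortcut as its text input."""
    return ("shortcuts://run-shortcut?name=" + quote(shortcut_name)
            + "&input=text&text=" + quote(text))

recipe = frames_command("number(5)", output_cmd="quickSave", merge="False")
# recipe == "number(5)&quickSave+mergeImages=False"
url = run_shortcut_url("Apple Frames", recipe)
```

The resulting URL can be opened from any app that can launch a link, which is one way to trigger a fully automated recipe without opening the Shortcuts app first.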
By chaining together multiple commands, the Frame Number helper shortcut does it: it’ll ask you for a number, get those screenshots, frame them individually, and save them.\nGiven a number of recent screenshots, frame them individually, and save them into a folder in Files.\n\n \n \n Frame NumberGet a specific number of screenshots, frame them as individual images, and save them in a specific folder of Files and Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nThe next one is a request I’ve heard multiple times: you have a folder full of screenshots and want to frame them all at once, saving the results to another folder as individual images. With the Frame Folder and Save As Individual Images helper shortcut, you can fetch screenshots from a specific location in Files/Finder, frame them, and save them as standalone, framed images in a different location.\nTake a folder full of images, frame each individually, then save the results to another folder.\n\n \n \n Frame Folder and Save As Individual ImagesGet all images from a specific folder, frame them as individual images, and save them in another folder of Files and Finder. This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nLastly, let’s say you want to use Apple Frames on macOS and put together a helper shortcut that instantly captures what you’re looking at, frames it, then puts the framed image in your system clipboard. With Frame Capture and Copy, you can do just that. Assign a keyboard shortcut to this helper utility, and you’ll be able to instantly frame whatever you’re looking at on your Mac with a single keystroke.\nCapture what’s onscreen, frame it, and copy it immediately to the clipboard.\n\n \n \n Frame Capture and CopyCapture what’s onscreen, frame it, and copy it to the clipboard. 
This shortcut is based on the Apple Frames API and requires Apple Frames 3.1 or above.\nGet the shortcut here.\n\n \n \n\nBy creating helper shortcuts based on the ‘Text’ and ‘Run Shortcut’ actions, you now have the power to fully automate Apple Frames regardless of whether you’re using an iPhone, iPad, or Mac.\nDownload Apple Frames 3.1\nA custom menu I created for Apple Frames 3.1.\nI’m incredibly excited about the Apple Frames API as a tool to help people save even more time with Apple Frames. You don’t have to use this advanced feature; if you don’t need it, just go ahead and install Apple Frames 3.1 below, and keep using it like you’ve always done. You can entirely ignore what I just described and simply enjoy support for more devices and bug fixes.\nIf you seek more power and flexibility in Apple Frames, however, the barebones API I created will let you control the input and output behavior of the shortcut. My hope is that these new options will help you fit Apple Frames into different and more elaborate workflows, allowing you to generate and share better-looking screenshots with fewer interactions than before.\nI had fun creating this new flavor of Apple Frames, and I hope you’ll find it as useful as I do. You can download Apple Frames 3.1, with full support for the new Apple Frames API, below.\n\n \n \n Apple FramesAdd device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. 
The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet. The shortcut also supports an API for automating input images and framed results.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2023-03-01T11:12:05-05:00", "date_modified": "2024-03-21T08:39:37-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Apple Frames", "automation", "iOS", "iPadOS", "macOS", "stories" ] }, { "id": "https://www.macstories.net/?p=71405", "url": "https://www.macstories.net/stories/raycast-adds-deeplinking-of-commands/", "title": "Raycast Adds Deeplinking of Commands", "content_html": "
\"\"

\n

Raycast, the app launcher and command utility that was our MacStories Selects Best Mac app of 2022, introduced URL scheme support for its extensive collection of built-in and third-party commands. The app’s existing system of hotkey and alias triggers is still the best way to send a command to Raycast in most circumstances, but with deeplinks, Raycast has opened up new automation possibilities.

\n

\n

The structure of a Raycast deeplink is:

\n

raycast://extensions/<author-or-owner>/<extension-name>/<command-name>

\n

However, you don’t need to find each of those components yourself because Raycast has made it easy to copy deeplinks to commands inside Raycast itself. Just invoke Raycast, search for the command you want, and press ⌘K. One of the options in the menu will be Copy Deeplink, which also uses the keyboard shortcut ⌘⇧C.
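For scripting contexts, a deeplink can also be assembled by hand from that three-part structure. A minimal sketch (the owner/extension/command path used here is hypothetical; copy real paths with Copy Deeplink):

```python
from urllib.parse import quote

def raycast_deeplink(owner, extension, command):
    """Join the three components into the documented
    raycast://extensions/<author-or-owner>/<extension-name>/<command-name>
    structure, percent-encoding each component just in case."""
    parts = [quote(part, safe="") for part in (owner, extension, command)]
    return "raycast://extensions/" + "/".join(parts)

# Hypothetical component names for illustration only.
link = raycast_deeplink("raycast", "system", "toggle-system-appearance")
# On macOS, the link can then be opened with `open` in Terminal or an
# 'Open URLs' action in Shortcuts.
```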

\n
\"You'll

You’ll be prompted by macOS to allow Raycast commands to be triggered with a deeplink.

\n

To test a newly copied URL, paste it into Safari and hit Return to launch the command. The first time you use a deeplink, you’ll be prompted to allow it to trigger the command once or always.

\n

What I love about Raycast’s URL scheme is that it can be used to fill in gaps in Shortcuts. With Ventura, Apple didn’t expand the system-level actions available in Shortcuts. There’s not even a Shortcuts action to toggle Stage Manager. However, with Raycast’s URL scheme, you can do just that, as well as toggle light and dark mode, sleep your Mac, and shut it down, none of which is possible with a Shortcuts action unless you resort to scripting.

\n
\"CleanShot

CleanShot X doesn’t support Shortcuts, but with Raycast’s URL scheme, it does now.

\n

Raycast’s URL scheme also adds actions for Mac apps that don’t support Shortcuts yet. With an extensive catalog of commands for popular Mac apps and services, that opens a lot of new options. For instance, I love CleanShot X, but it doesn’t support Shortcuts. However, it does have an excellent Raycast extension. That made it simple to create a shortcut that lets me select an area of an image and copy unselectable text from it using CleanShot X’s ‘Capture Text (OCR)’ command.

\n

In my initial testing, not every deeplink works when opened from Shortcuts. I had hoped to create a shortcut to open a set of apps and place them on different parts of my screen using Raycast’s ‘Window Management’ commands, but the deeplink changes the focus from a newly-opened window to Raycast, making that impossible. Of course, there are other ways to accomplish the same thing, but they’re not as simple as opening a URL.

\n

Still, I highly recommend thinking about Raycast the next time you’re working on a shortcut. It was already a great way to launch shortcuts, but now, it’s also one of the many third-party utilities that extends Shortcuts.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Raycast, the app launcher and command utility that was our MacStories Selects Best Mac app of 2022, introduced URL scheme support for its extensive collection of built-in and third-party commands. The app’s existing system of hotkey and alias triggers is still the best way to send a command to Raycast in most circumstances, but with deeplinks, Raycast has opened up new automation possibilities.\n\nThe structure of a Raycast deeplink is:\nraycast://extensions/<author-or-owner>/<extension-name>/<command-name>\nHowever, you don’t need to find each of those components yourself because Raycast has made it easy to copy deeplinks to commands inside Raycast itself. Just invoke Raycast, search for the command you want, and press ⌘K. One of the options in the menu will be Copy Deeplink, which also uses the keyboard shortcut ⌘⇧C.\nYou’ll be prompted by macOS to allow Raycast commands to be triggered with a deeplink.\nTo test a newly copied URL, paste it into Safari and hit Return to launch the command. The first time you use a deeplink, you’ll be prompted to allow it to trigger the command once or always.\nWhat I love about Raycast’s URL scheme is that it can be used to fill in gaps in Shortcuts. With Ventura, Apple didn’t expand the system-level actions available in Shortcuts. There’s not even a Shortcuts action to toggle Stage Manager. However, with Raycast’s URL scheme, you can do just that, as well as toggle light and dark mode, sleep your Mac, and shut it down, none of which is possible with a Shortcuts action unless you resort to scripting.\nCleanShot X doesn’t support Shortcuts, but with Raycast’s URL scheme, it does now.\nRaycast’s URL scheme also adds actions for Mac apps that don’t support Shortcuts yet. With an extensive catalog of commands for popular Mac apps and services, that opens a lot of new options. For instance, I love CleanShot X, but it doesn’t support Shortcuts. However, it does have an excellent Raycast extension. 
That made it simple to create a shortcut that lets me select an area of an image and copy unselectable text from it using CleanShot X’s ‘Capture Text (OCR)’ command.\nIn my initial testing, not every deeplink works when opened from Shortcuts. I had hoped to create a shortcut to open a set of apps and place them on different parts of my screen using Raycast’s ‘Window Management’ commands, but the deeplink changes the focus from a newly-opened window to Raycast, making that impossible. Of course, there are other ways to accomplish the same thing, but they’re not as simple as opening a URL.\nStill, I highly recommend thinking about Raycast the next time you’re working on a shortcut. It was already a great way to launch shortcuts, but now, it’s also one of the many third-party utilities that extends Shortcuts.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2023-02-02T11:54:27-05:00", "date_modified": "2024-11-06T11:13:09-05:00", "authors": [ { "name": "John Voorhees", "url": "https://www.macstories.net/author/johnvoorhees/", "avatar": 
"https://secure.gravatar.com/avatar/5a1475dcd87638ed2f250b6213881115?s=512&d=mm&r=g" } ], "tags": [ "automation", "mac", "Raycast", "URL Scheme", "utility", "stories" ] }, { "id": "https://www.macstories.net/?p=70980", "url": "https://www.macstories.net/ios/masto-redirect-a-mastodon-shortcut-to-redirect-profiles-and-posts-to-your-own-instance/", "title": "Masto-Redirect, a Mastodon Shortcut to Redirect Profiles and Posts to Your Own Instance", "content_html": "
\"Using

Using Masto-Redirect in Safari.

\n

Like many others over the past month, I’ve been thinking deeply about my experience with Twitter and whether I want to align my social media usage with the kind of platform Twitter is rapidly becoming. It’s a complex discussion (if my readers are still on Twitter, am I doing them a disservice by not using Twitter?), but in the meantime, I’ve decided to learn more about Mastodon. And in doing so, I came across an aspect of the service that I wanted to improve with a shortcut.

\n

I created an account on Mastodon.social all the way back in 2018, and you can find me as @viticci there as well. I don’t want to turn this post into a guide to Mastodon (you can find an excellent one here), but, long story short, Mastodon is a decentralized service that is based on a federated network of instances. Essentially, there isn’t a single “Mastodon website” like, say, twitter.com; instead, there can be multiple Mastodon instances across different domains (hence why it’s “decentralized”) but, thanks to an underlying API, you can follow and be followed by people regardless of the instance they’re on. I can be on Mastodon.social, and you can be on Journa.host or Mastodon.online (different instances of Mastodon), but we can still communicate with one another via the protocol Mastodon uses. It’s like living in different countries but speaking the same language. You can read more about this here.

\n

\n

At this point, you may be wondering: if someone has an account on a different instance, or posted something I want to reply to, how can I do this from my account on a separate Mastodon instance?

\n

This is where my friend Jason Snell comes in: a few days ago, he shared a post in which he noted that the default method for redirecting a post or profile from another Mastodon instance back to yours is, well, somewhat convoluted. If you come across a profile or post from a different Mastodon server, you have to copy its original URL, go to your instance, manually paste it into the search box, and find the result you’re looking for; only then can you interact with it. That works, but it’s not intuitive, and I figured I could improve this aspect of the Mastodon experience with a shortcut.

\n

That’s why I built Masto-Redirect, a universal shortcut that works on all Apple platforms and can redirect both user profiles and posts from an external Mastodon instance back to your own. You can run this shortcut from the share sheet, with a URL in your clipboard, or even from the current tab in Safari for Mac; it’ll detect whether you’re looking at an account or an individual post and redirect it so you can view it on your Mastodon instance without having to paste the URL into a search box manually.
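For instance, Mastodon’s web UI commonly uses /@user URLs for profiles and /@user/&lt;numeric id&gt; URLs for individual posts, which is one way a script can tell the two apart. The regular expressions below are my sketch of that distinction, not the shortcut’s actual logic, and other URL shapes exist:

```python
import re

# Profiles look like https://instance/@user (possibly @user@remote.domain);
# posts append a numeric status ID. This sketch only handles those shapes.
PROFILE_RE = re.compile(r"^https://[^/]+/@[\w.@-]+/?$")
POST_RE = re.compile(r"^https://[^/]+/@[\w.@-]+/\d+/?$")

def classify(url: str) -> str:
    """Return 'post', 'profile', or 'unknown' for a Mastodon web URL."""
    if POST_RE.match(url):
        return "post"
    if PROFILE_RE.match(url):
        return "profile"
    return "unknown"
```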

\n
\"Redirecting

Redirecting a post to your own Mastodon instance by pasting its original URL.

\n

There is one caveat here: if you want to use Masto-Redirect to redirect individual posts, you’ll have to create an API token on your local Mastodon instance. Don’t worry: the process is extremely easy, and the permissions you’ll grant to the shortcut are scoped to the search API; the shortcut won’t be able to do anything on your behalf except search for specific posts.

\n

To create an API token for Masto-Redirect, head over to Settings ⇾ Development on your Mastodon instance (on Mastodon.social, it’s here) and click the ‘New application’ button.

\n
\"\"

\n

Enter ‘Masto-Redirect’ as the application name, then scroll down to the ‘Scopes’ section and uncheck all the items that are selected by default. We don’t need any top-level permissions for the shortcut to work. As pictured below, the only item you need to check is the read:search scope, which will grant Masto-Redirect the ability to look up posts from other instances on your behalf.

\n
\"\"

\n

Next, scroll down, click the ‘Submit’ button, and your new API-enabled app will be saved. Return to the Development page, and Masto-Redirect will be listed under ‘Your applications’.

\n
\"\"

\n
\"The

The newly created app for the Mastodon API.

\n

Open the application, and you’ll see a section at the top with your client key, client secret, and access token. Do not share these with anyone else. What we need for the shortcut is the access token, so save that somewhere safe, and proceed to install the shortcut by clicking the link at the end of this article.

\n
\"Copy

Copy the access token, which you’ll have to paste in Shortcuts.

\n

Upon installing Masto-Redirect, you’ll be asked to paste your API token. Then, you’ll have to enter the URL of your local Mastodon instance without a trailing slash – for example, https://mastodon.social or https://hachyderm.io. It does not matter which instance you’re on: as long as the instance conforms to the Mastodon API spec, the shortcut will work – which is the beauty of a federated, decentralized structure.
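Under the hood, resolving a remote URL on your own instance goes through Mastodon’s v2 search endpoint with resolve=true, which is the call the read:search scope permits. Here’s a sketch of the pieces of that request (the instance, token, and remote URL values are placeholders; the shortcut’s exact actions may differ):

```python
from urllib.parse import urlencode

def search_request(instance, token, remote_url):
    """Build the URL and headers for a Mastodon GET /api/v2/search call
    that asks your own instance to resolve a remote profile or post."""
    endpoint = instance + "/api/v2/search?" + urlencode(
        {"q": remote_url, "resolve": "true", "limit": 1})
    headers = {"Authorization": "Bearer " + token}
    return endpoint, headers

endpoint, headers = search_request(
    "https://mastodon.social", "YOUR-ACCESS-TOKEN",
    "https://tapbots.social/@example/123456")
# Sending this request returns JSON whose `accounts` or `statuses`
# array contains the item as resolved by your instance.
```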

\n
\"Upon

Upon installing the shortcut, you’ll be asked to paste the access token you just generated.

\n
\"Lastly,

Lastly, you’ll be asked to enter the base URL of the Mastodon instance you’re using, without a trailing slash at the end.

\n

And that’s it in terms of what you need to do to use Masto-Redirect. Once the shortcut is configured, you’ll be able to run it in Safari (or via URLs copied to the clipboard or entered manually in a prompt) when viewing user profiles or individual posts, and you’ll be correctly redirected to the pages for those items on your Mastodon instance. In the screenshot below, you can see how I was able to run Masto-Redirect on a post by Tapbots’ Mark Jardine (who has an account at tapbots.social) and redirect it to Mastodon.social, where I can interact with it.

\n
\"The

The same post viewed on its original Mastodon instance (left) vs. the instance I’m using (right). Notice how I’m signed in and can interact with the post in the Safari window on the right.

\n

Similarly, when viewing an account on a different instance, you can run Masto-Redirect from the share sheet and be instantly taken to that account’s page on your own instance.

\n
\"The

The same account on its original instance (left) and redirected to the one I’m on (right).

\n

If you’re running Masto-Redirect on a Mac, the shortcut will be even faster in that a) it can be triggered system-wide with a keyboard shortcut and b) it uses AppleScript to determine if Safari is the frontmost window and, if so, grab the URL of the current tab. (For this to work, make sure to enable ‘Allow Running Scripts’ in Shortcuts ⇾ Settings ⇾ Advanced on macOS.)

\n

So that’s Masto-Redirect. Like I said, I’m trying to use Mastodon more, and I quickly came across a slightly confusing aspect of its decentralized approach that seemed ripe for an automation based on Shortcuts.

\n

I hope you’ll find Masto-Redirect as useful as I do. You can download Masto-Redirect below or from the MacStories Shortcuts Archive, and you can find me on Mastodon here.

\n
\n
\n \"\"
\n

Masto-Redirect

Redirect user profiles and individual posts from their original Mastodon instance back to your own instance.

\n

Get the shortcut here.

\n\n
\n
\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Using Masto-Redirect in Safari.\nLike many others over the past month, I’ve been thinking deeply about my experience with Twitter and whether I want to align my social media usage with the kind of platform Twitter is rapidly becoming. It’s a complex discussion (if my readers are still on Twitter, am I doing them a disservice by not using Twitter?), but in the meantime, I’ve decided to learn more about Mastodon. And in doing so, I came across an aspect of the service that I wanted to improve with a shortcut.\nI created an account on Mastodon.social all the way back in 2018, and you can find me as @viticci there as well. I don’t want to turn this post into a guide to Mastodon (you can find an excellent one here), but, long story short, Mastodon is a decentralized service that is based on a federated network of instances. Essentially, there isn’t a single “Mastodon website” like, say, twitter.com; instead, there can be multiple Mastodon instances across different domains (hence why it’s “decentralized”) but, thanks to an underlying API, you can follow and be followed by people regardless of the instance they’re on. I can be on Mastodon.social, and you can be on Journa.host or Mastodon.online (different instances of Mastodon), but we can still communicate with one another via the protocol Mastodon uses. It’s like living in different countries but speaking the same language. You can read more about this here.\n\nAt this point, you may be wondering: if someone has an account on a different instance, or posted something I want to reply to, how can I do this from my account on a separate Mastodon instance?\nThis is where my friend Jason Snell comes in: a few days ago, he shared a post in which he noted that the default method for redirecting a post or profile from another Mastodon instance back to yours is, well, somewhat convoluted. 
If you come across a profile or post from a different Mastodon server, you have to copy its original URL, go to your instance, manually paste it into the search box, find the result you’re looking for, and only then you can interact with it. That works, but it’s not intuitive, and I figured I could improve this aspect of the Mastodon experience with a shortcut.\nThat’s why I built Masto-Redirect. This is a universal shortcut that works on all Apple platforms, which can redirect both user profiles and posts from an external Mastodon instance back to your instance. You can run this shortcut from the share sheet, with a URL in your clipboard, or even from the current tab in Safari for Mac; it’ll understand whether you’re looking at an account or an individual post, and it’ll redirect it so you can view it on your Mastodon instance without having to paste the URL manually in a search box.\nRedirecting a post to your own Mastodon instance by pasting its original URL.\nThere is one caveat here: if you want to use Masto-Redirect to redirect individual posts, you’ll have to create an API token on your local Mastodon instance. Don’t worry: the process is extremely easy and the permissions you’ll grant to the shortcut are scoped to the search API: the shortcut won’t be able to do anything else on your behalf except searching for specific posts.\nTo create an API token for Masto-Redirect, head over to Settings ⇾ Development on your Mastodon instance (on Mastodon.social, it’s here) and click the ‘New application’ button.\n\nEnter ‘Masto-Redirect’ as the application name, then scroll down to the ‘Scopes’ section and uncheck all the items that are selected by default. We don’t need any top-level permissions for the shortcut to work. 
As pictured below, the only item you need to check is the read:search scope, which will grant Masto-Redirect the ability to look up posts from other instances on your behalf.\n\nNext, scroll down, click the ‘Submit’ button, and your new API-enabled app will be saved. Return to the Development page, and Masto-Redirect will be listed under ‘Your applications’.\n\nThe newly created app for the Mastodon API.\nOpen the application, and you’ll see a section at the top with your client key, client secret, and access token. Do not share these with anyone else. What we need for the shortcut is the access token, so save that somewhere safe, and proceed to install the shortcut by clicking the link at the end of this article.\nCopy the access token, which you’ll have to paste in Shortcuts.\nUpon installing Masto-Redirect, you’ll be asked to paste your API token. Then, you’ll have to enter the URL of your local Mastodon instance without a trailing slash – for example, https://mastodon.social or https://hachyderm.io. It does not matter which instance you’re on: as long as the instance comforms to the Mastodon API spec, the shortcut will work – which is the beauty of a federated, decentralized structure.\nUpon installing the shortcut, you’ll be asked to paste the access token you just generated.\nLastly, you’ll be asked to enter the base URL of the Mastodon instance you’re using, without a trailing slash at the end.\nAnd that’s it in terms of what you need to do to use Masto-Redirect. Once the shortcut is configured, you’ll be able to run it in Safari (or via URLs copied to the clipboard or entered manually in a prompt) when viewing user profiles or individual posts, and you’ll be correctly redirected to the pages for those items on your Mastodon instance. 
In the screenshot below, you can see how I was able to run Masto-Redirect on a post by Tapbots’ Mark Jardine (who has an account at tapbots.social) and redirect it to Mastodon.social, where I can interact with it.\nThe same post viewed on its original Mastodon instance (left) vs. the instance I’m using (right). Notice how I’m signed in and can interact with the post in the Safari window on the right.\nSimilarly, when viewing an account on a different instance, you can run Masto-Redirect from the share sheet and be instantly taken to that account’s page on your own instance.\nThe same account on its original instance (left) and redirected to the one I’m on (right).\nIf you’re running Masto-Redirect on a Mac, the shortcut will be even faster in that a) it can be triggered system-wide with a keyboard shortcut and b) it uses AppleScript to determine if Safari is the frontmost window and, if so, grab the URL of the current tab. (For this to work, make sure to enable ‘Allow Running Scripts’ in Shortcuts ⇾ Settings ⇾ Advanced on macOS.)\nSo that’s Masto-Redirect. Like I said, I’m trying to use Mastodon more, and I quickly came across a slightly confusing aspect of its decentralized approach that seemed ripe for an automation based on Shortcuts.\nI hope you’ll find Masto-Redirect as useful as I do. 
You can download Masto-Redirect below or from the MacStories Shortcuts Archive, and you can find me on Mastodon here.\n\n \n \n Masto-RedirectRedirect user profiles and individual posts from their original Mastodon instance back to your own instance.\nGet the shortcut here.\n\n \n \n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2022-11-29T11:33:30-05:00", "date_modified": "2022-12-05T20:27:38-05:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "mastodon", "shortcuts", "iOS" ] }, { "id": "https://www.macstories.net/?p=70874", "url": "https://www.macstories.net/ios/apple-frames-3-0-completely-rewritten-support-for-iphone-14-pro-and-dynamic-island-new-devices-multiple-display-resolutions-and-more/", "title": "Apple Frames 3.0: Completely Rewritten, Support for iPhone 14 Pro and Dynamic Island, New Devices, Multiple Display Resolutions, and More", "content_html": "
\"Apple

Apple Frames 3.0.

\n

Today, I’m pleased to announce the release of version 3.0 of Apple Frames, my shortcut to put screenshots taken on various Apple devices inside physical frames for iPhone, iPad, Mac, and Apple Watch.

\n

Apple Frames 3.0 is a major update that involved a complete re-architecture of the shortcut to improve its performance and reliability on all Apple platforms. For Apple Frames 3.0, I entirely rebuilt its underlying file structure to move away from base64 and embrace Files/Finder to store assets. As a result, Apple Frames 3.0 is faster, easier to debug, and – hopefully – easier to maintain going forward.

\n

But Apple Frames 3.0 goes beyond a new technical foundation. This update to the shortcut introduces full compatibility with the iPhone 14 Pro and 14 Pro Max with Dynamic Island, Apple Watch Ultra, and the M2 MacBook Air. And that’s not all: Apple Frames 3.0 also brings full support for resolution scaling on all iPad models that offer the ‘More Space’ display mode in iPadOS 16. And in the process, I also added support for ‘Default’ and ‘More Space’ options on the Apple Silicon-based MacBook Airs, MacBook Pros, and iMac. All of this, as always, in a native shortcut designed for high performance that uses Apple’s official device images and requires no manual configuration whatsoever.

\n

Apple Frames 3.0 is the biggest, most versatile version of Apple Frames to date, and I’m proud of the results. Let’s dive in.

\n

\n

A Rewritten Foundation

\n

As I began working on Apple Frames 3.0, it became clear that my previous approach for storing graphical assets had reached its limits.

\n

If you recall, Apple Frames 2.0 was built around a single Frames.json file that contained plain text representations of device frames as base64-encoded strings. That technique worked well for a long time, but when I started adding new frames for the latest Macs and iPhones, I realized that the file had grown too large for Shortcuts and other apps. The JSON file was so big, Jayson couldn’t open it anymore – thus preventing me from debugging it – and Shortcuts would crash when prompting users to grant access to the file. Something had to change.

\n
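The arithmetic behind that bloat is simple: base64 maps every 3 bytes of binary data to 4 ASCII characters, so each frame grows by roughly a third before it’s even wrapped in a JSON string. A quick illustration in Python (the byte string is just a stand-in for real PNG data):

```python
import base64
import json

# Stand-in for ~768 KB of PNG frame data; real assets are much larger.
raw = bytes(range(256)) * 3000

# base64 turns every 3 input bytes into 4 output characters (~33% overhead)...
encoded = base64.b64encode(raw).decode("ascii")
overhead = len(encoded) / len(raw)

# ...and a JSON file of many such strings must still be loaded and parsed
# as one giant document, unlike a folder of images read one at a time.
frames_json = json.dumps({"iPhone 14 Pro": encoded})
```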

Apple Frames 3.0 features a new underlying structure entirely based on storing assets in a folder in iCloud Drive. Upon installing Apple Frames 3.0, you’ll be prompted to download a .zip archive from cdn.macstories.net. Once that’s done, you’ll find all the assets used by Apple Frames in iCloud Drive ⇾ Shortcuts ⇾ Frames.

\n
\"The

The initial setup prompt. You’ll only get this once.

\n
\"\"

\n
\"The

The new file-based structure of Apple Frames.

\n

And that’s it! Once the initial download is done, you’ll be able to start using the shortcut and frame screenshots of your favorite devices. As long as you never modify the contents of the Frames folder in iCloud Drive ⇾ Shortcuts, there will never be anything else for you to do.

\n

With this new foundation based on storing assets in the filesystem as images, Apple Frames should be faster and less prone to strange compositing errors that used to happen in older versions. Furthermore, this approach will make it easier for me to debug and maintain Apple Frames going forward. Viewing and renaming actual images in Files is much, much easier than having to deal with a giant JSON file containing base64-encoded text, which should result in faster updates in the future.

\n

Ideally, Shortcuts should let me store assets inside a shortcut as linked resources instead of forcing me to host them on the MacStories CDN and save them as a folder in iCloud Drive. (Automator used to support this.) But that’s a feature request for another time.

\n

New Devices: iPhone 14 Pro, iPhone 14 Pro Max, Apple Watch Ultra, and M2 MacBook Air

\n

Once you’re set with Apple Frames 3.0, run it, pick an iPhone 14 Pro or Pro Max screenshot from your library (or pass one via the share sheet or Quick Actions), wait a couple seconds, and voilà – you’ll be staring at a nicely framed iPhone screenshot featuring the Dynamic Island in any of its layout modes.

\n
\"Dynamic

Dynamic Island support in Apple Frames 3.0. The app is Shelf by Michael Tigas.

\n

Regardless of the kind of screenshot you capture on your iPhone 14 Pro with the Dynamic Island “active” or not, Apple Frames will put it inside a matching frame. This was made possible by Silvia’s excellent work in slicing Apple’s assets and allowing me to overlay the necessary parts of iPhone 14 Pro frames on the Dynamic Island. So, whether you have zero, one, or two activities in the Dynamic Island, Apple Frames 3.0 will work.

\n
\"iPhone

iPhone 14 Pro Max screenshots with no activities in the Dynamic Island.

\n

Apple Frames 3.0 also supports the latest Apple Watch Ultra and its 49mm display. If you’re one of the adventurous explorers and extreme sports people (and John) who purchased an Apple Watch Ultra recently, you’ll be happy to know that you can now show off its large watchOS UI with properly framed images.

\n
\"Apple

Apple Watch Ultra screenshots framed by Apple Frames 3.0.

\n

Apple Frames 3.0 also works with the M2 MacBook Air Apple released a few months ago. Take a screenshot on your new MacBook Air, run Apple Frames (on macOS, I like passing screenshots from the desktop to the shortcut via Finder’s Quick Actions), and you’ll get a framed version of it featuring a latest-generation MacBook Air in the Midnight color (this is the only color version provided by Apple).

\n
\"Myke

Myke Hurley’s M2 MacBook Air, framed with Apple Frames 3.0.

\n

Display Scaling Support on iPadOS 16 and macOS

\n

One of my favorite features of iPadOS 16 is the ability to set a different display scale for a supported iPad Pro or iPad Air. This new option is labeled ‘More Space’, and you can find it under Settings ⇾ Display & Brightness ⇾ Display Zoom ⇾ More Space. Once activated, display scaling will increase the iPad’s virtual resolution to make every UI element smaller and therefore let you fit more content onscreen.

\n
\"The

The ‘More Space’ resolution on iPadOS 16…

\n
\"...versus

…versus the old ‘Default’ resolution.

\n

When you capture a screenshot with More Space display scaling enabled, images have a higher resolution than those taken with an iPad set to ‘Default’ display scaling mode. Effectively, this meant that I had to add support for three new device sizes (iPad Pro 12.9”, iPad Pro 11”, and iPad Air) with two different orientations each when More Space is enabled.

\n

Apple Frames 3.0 will seamlessly work with iPad screenshots captured at Default and More Space display scaling modes.1 Run Apple Frames with some iPad screenshots captured on iPadOS 16 with display scaling activated, and you’ll get a nicely framed iPad image showing more content onscreen than before.

\n
\"My

My current iPad Pro Home Screen.

\n

Of course, if you don’t want to use display scaling on iPadOS 16 but are instead sticking to the iPad’s default resolution, that’s fine. Apple Frames will continue working in that case as before.

\n

While I was at it, I figured I could add support for the ‘More Space’ resolution on compatible Macs too. And, well, I’m pleased to say that Apple Frames 3.0 supports both ‘Default’ and ‘More Space’ resolutions on the following Mac models:
2020 MacBook Air (M1)
2021 iMac (M1)
2021 MacBook Pro 14”
2021 MacBook Pro 16”
2022 MacBook Air (M2)

\n

To test Apple Frames’ support for display scaling on macOS, head over to the Displays section of System Settings (or System Preferences if you haven’t updated to Ventura yet), select ‘More Space’ as your computer’s resolution, take a screenshot, and pass it to Apple Frames.

\n
\"The

The ‘More Space’ resolution on the M1 MacBook Air.

\n

New Action: Quick Save to Files and Finder

\n

The last addition to Apple Frames I want to mention is a new action displayed in Apple Frames’ final menu: Quick Save to Files/Finder.

\n

Apple Frames was already capable of saving framed images to Files or Finder by letting you rename them and pick a destination folder. However, over time I’ve realized that I often want to just save an image or multiple ones to the Files app without having to do anything else. This is exactly what Quick Save does: it saves framed images to either Files or Finder immediately, without asking you to confirm anything else.

\n
\"The

The setup process for Quick Save. You can pick any folder you want.

\n

Upon installing Apple Frames 3.0, you’ll be asked to pick a destination folder for this new Quick Save functionality. By default, I picked iCloud Drive’s root folder, but you can use whatever you prefer. If you change your mind, you can always change the default folder for Quick Save later.

\n

Welcome to Apple Frames 3.0

\n

Apple Frames 3.0 is the biggest, most versatile version of Apple Frames to date. If my calculations are correct, Apple Frames 3.0 should support a total of 43 unique devices, which are:
iPhone 8
iPhone SE
iPhone 11
iPhone 11 Pro and Pro Max
iPhone 12 mini
iPhone 12
iPhone 12 Pro and Pro Max
iPhone 13 mini
iPhone 13
iPhone 13 Pro and Pro Max
iPhone 14 and 14 Plus
iPhone 14 Pro and Pro Max
Apple Watch Series 4, 5, 6, 7, and 8
Apple Watch Ultra
iPad (6th, 7th, 8th, and 9th generations)
iPad mini (6th generation)
iPad Air (4th and 5th generations)
iPad Pro 11” (all generations)
iPad Pro 12.9” (3rd, 4th, 5th, and 6th generations)
MacBook Pro 14” (2021)
MacBook Pro 16” (2021)
iMac 24” (2021)
MacBook Air with M1 (2020)
MacBook Air with M2 (2022)

\n

When you factor in the two possible orientations for iPhone and iPad, plus the two resolution modes of compatible Macs and iPads, you end up with hundreds of potential combinations that are supported in Apple Frames 3.0. And, at the end of the day, it still remains a native shortcut that makes a one-time web request to MacStories to download some image assets. Apple Frames 3.0 is fast, respects your privacy, and, best of all, provides a better foundation for me to work with in the future.

\n

I hope you’ll enjoy Apple Frames 3.0. As always, feel free to send me ideas, feature requests, or bug reports on Twitter, and I’ll try my best to handle them.

\n

You can download Apple Frames 3.0 below or by visiting the MacStories Shortcuts Archive.

\n
\n
\n \"\"
\n

Apple Frames

Add device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
  1. \nAs before, Apple Frames will not work with the old ‘Larger Text’ option of Display Zoom on either iPhone or iPad. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "Apple Frames 3.0.\nToday, I’m pleased to announce the release of version 3.0 of Apple Frames, my shortcut to put screenshots taken on various Apple devices inside physical frames for iPhone, iPad, Mac, and Apple Watch.\nApple Frames 3.0 is a major update that involved a complete re-architecture of the shortcut to improve its performance and reliability on all Apple platforms. For Apple Frames 3.0, I entirely rebuilt its underlying file structure to move away from base64 and embrace Files/Finder to store assets. As a result, Apple Frames 3.0 is faster, easier to debug, and – hopefully – easier to maintain going forward.\nBut Apple Frames 3.0 goes beyond a new technical foundation. This update to the shortcut introduces full compatibility with the iPhone 14 Pro and 14 Pro Max with Dynamic Island, Apple Watch Ultra, and the M2 MacBook Air. And that’s not all: Apple Frames 3.0 also brings full support for resolution scaling on all iPad models that offer the ‘More Space’ display mode in iPadOS 16. And in the process, I also added support for ‘Default’ and ‘More Space’ options on the Apple Silicon-based MacBook Airs, MacBook Pros, and iMac. All of this, as always, in a native shortcut designed for high performance that uses Apple’s official device images and requires no manual configuration whatsoever.\nApple Frames 3.0 is the biggest, most versatile version of Apple Frames to date, and I’m proud of the results. Let’s dive in.\n\nA Rewritten Foundation\nAs I began working on Apple Frames 3.0, it became clear that my previous approach for storing graphical assets had reached its limits.\nIf you recall, Apple Frames 2.0 was built around a single Frames.json file that contained plain text representations of device frames as base64-encoded strings. That technique worked well for a long time, but when I started adding new frames for the latest Macs and iPhones, I realized that the file had grown too large for Shortcuts and other apps. 
The JSON file was so big, Jayson couldn’t open it anymore – thus preventing me from debugging it – and Shortcuts would crash when prompting users to grant access to the file. Something had to change.\nApple Frames 3.0 features a new underlying structure entirely based on storing assets in a folder in iCloud Drive. Upon installing Apple Frames 3.0, you’ll be prompted to download a .zip archive from cdn.macstories.net. Once that’s done, you’ll find all the assets used by Apple Frames in iCloud Drive ⇾ Shortcuts ⇾ Frames.\nThe initial setup prompt. You’ll only get this once.\n\nThe new file-based structure of Apple Frames.\nAnd that’s it! Once the initial download is done, you’ll be able to start using the shortcut and frame screenshots of your favorite devices. As long as you never modify the contents of the Frames folder in iCloud Drive ⇾ Shortcuts, there will never be anything else for you to do.\nIf you have an existing version of the Frames.json file stored in the root level of iCloud Drive ⇾ Shortcuts, you can delete it since it will no longer be necessary to Apple Frames. That’s the old Frames.json file used by previous versions of Apple Frames. The new, much lighter version of Frames.json lives in iCloud Drive ⇾ Shortcuts ⇾ Frames and that is the one you’re not supposed to touch.\n\nWith this new foundation based on storing assets in the filesystem as images, Apple Frames should be faster and less prone to strange compositing errors that used to happen in older versions. Furthermore, this approach will make it easier for me to debug and maintain Apple Frames going forward. Viewing and renaming actual images in Files is much, much easier than having to deal with a giant JSON file containing base64-encoded text, which should result in faster updates in the future.\nIdeally, Shortcuts should let me store assets inside a shortcut as linked resources instead of forcing me to host them on the MacStories CDN and save them as a folder in iCloud Drive. 
(Automator used to support this.) But that’s a feature request for another time.\nNew Devices: iPhone 14 Pro, iPhone 14 Pro Max, Apple Watch Ultra, and M2 MacBook Air\nOnce you’re set with Apple Frames 3.0, run it, pick an iPhone 14 Pro or Pro Max screenshot from your library (or pass one via the share sheet or Quick Actions), wait a couple seconds, and voilà – you’ll be staring at a nicely framed iPhone screenshot featuring the Dynamic Island in any of its layout modes.\nDynamic Island support in Apple Frames 3.0. The app is Shelf by Michael Tigas.\nRegardless of the kind of screenshot you capture on your iPhone 14 Pro with the Dynamic Island “active” or not, Apple Frames will put it inside a matching frame. This was made possible by Silvia’s excellent work in slicing Apple’s assets and allowing me to overlay the necessary parts of iPhone 14 Pro frames on the Dynamic Island. So, whether you have zero, one, or two activities in the Dynamic Island, Apple Frames 3.0 will work.\niPhone 14 Pro Max screenshots with no activities in the Dynamic Island.\nApple Frames 3.0 also supports the latest Apple Watch Ultra and its 49mm display. If you’re one of the adventurous explorers and extreme sports people (and John) who purchased an Apple Watch Ultra recently, you’ll be happy to know that you can now show off its large watchOS UI with properly framed images.\nApple Watch Ultra screenshots framed by Apple Frames 3.0.\nApple Frames 3.0 also works with the M2 MacBook Air Apple released a few months ago. 
Take a screenshot on your new MacBook Air, run Apple Frames (on macOS, I like passing screenshots from the desktop to the shortcut via Finder’s Quick Actions), and you’ll get a framed version of it featuring a latest-generation MacBook Air in the Midnight color (this is the only color version provided by Apple).\nMyke Hurley’s M2 MacBook Air, framed with Apple Frames 3.0.\nDisplay Scaling Support on iPadOS 16 and macOS\nOne of my favorite features of iPadOS 16 is the ability to set a different display scale for a supported iPad Pro or iPad Air. This new option is labeled ‘More Space’, and you can find it under Settings ⇾ Display & Brightness ⇾ Display Zoom ⇾ More Space. Once activated, display scaling will increase the iPad’s virtual resolution to make every UI element smaller and therefore let you fit more content onscreen.\nThe ‘More Space’ resolution on iPadOS 16…\n…versus the old ‘Default’ resolution.\nWhen you capture a screenshot with More Space display scaling enabled, images have a higher resolution than those taken with an iPad set to ‘Default’ display scaling mode. Effectively, this meant that I had to add support for three new device sizes (iPad Pro 12.9”, iPad Pro 11”, and iPad Air) with two different orientations each when More Space is enabled.\nApple Frames 3.0 will seamlessly work with iPad screenshots captured at Default and More Space display scaling modes.1 Run Apple Frames with some iPad screenshots captured on iPadOS 16 with display scaling activated, and you’ll get a nicely framed iPad image showing more content onscreen than before.\nMy current iPad Pro Home Screen.\nOf course, if you don’t want to use display scaling on iPadOS 16 but are instead sticking to the iPad’s default resolution, that’s fine. Apple Frames will continue working in that case as before.\nWhile I was at it, I figured I could add support for the ‘More Space’ resolution on compatible Macs too. 
And, well, I’m pleased to say that Apple Frames 3.0 supports both ‘Default’ and ‘More Space’ resolutions on the following Mac models:\n2020 MacBook Air (M1)\n2021 iMac (M1)\n2021 MacBook Pro 14”\n2021 MacBook Pro 16”\n2022 MacBook Air (M2)\nTo test Apple Frames’ support for display scaling on macOS, head over to the Displays section of System Settings (or System Preferences if you haven’t updated to Ventura yet), select ‘More Space’ as your computer’s resolution, take a screenshot, and pass it to Apple Frames.\nThe ‘More Space’ resolution on the M1 MacBook Air.\nNew Action: Quick Save to Files and Finder\nThe last addition to Apple Frames I want to mention is a new action displayed in Apple Frames’ final menu: Quick Save to Files/Finder.\nApple Frames was already capable of saving framed images to Files or Finder by letting you rename them and pick a destination folder. However, over time I’ve realized that I often want to just save an image or multiple ones to the Files app without having to do anything else. This is exactly what Quick Save does: it saves framed images to either Files or Finder immediately, without asking you to confirm anything else.\nThe setup process for Quick Save. You can pick any folder you want.\nUpon installing Apple Frames 3.0, you’ll be asked to pick a destination folder for this new Quick Save functionality. By default, I picked iCloud Drive’s root folder, but you can use whatever you prefer. If you change your mind, you can always change the default folder for Quick Save later.\nWelcome to Apple Frames 3.0\nApple Frames 3.0 is the biggest, most versatile version of Apple Frames to date. 
If my calculations are correct, Apple Frames 3.0 should support a total of 43 unique devices, which are:\niPhone 8\niPhone SE\niPhone 11\niPhone 11 Pro and Pro Max\niPhone 12 mini\niPhone 12\niPhone 12 Pro and Pro Max\niPhone 13 mini\niPhone 13\niPhone 13 Pro and Pro Max\niPhone 14 and 14 Plus\niPhone 14 Pro and Pro Max\nApple Watch Series 4, 5, 6, 7, and 8\nApple Watch Ultra\niPad (6th, 7th, 8th, and 9th generations)\niPad mini (6th generation)\niPad Air (4th and 5th generations)\niPad Pro 11” (all generations)\niPad Pro 12.9” (3rd, 4th, 5th, and 6th generations)\nMacBook Pro 14” (2021)\nMacBook Pro 16” (2021)\niMac 24” (2021)\nMacBook Air with M1 (2020)\nMacBook Air with M2 (2022)\nWhen you factor in the two possible orientations for iPhone and iPad, plus the two resolution modes of compatible Macs and iPads, you end up with hundreds of potential combinations that are supported in Apple Frames 3.0. And, at the end of the day, it still remains a native shortcut that makes a one-time web request to MacStories to download some image assets. Apple Frames 3.0 is fast, respects your privacy, and, best of all, provides a better foundation for me to work with in the future.\nI hope you’ll enjoy Apple Frames 3.0. As always, feel free to send me ideas, feature requests, or bug reports on Twitter, and I’ll try my best to handle them.\nYou can download Apple Frames 3.0 below or by visiting the MacStories Shortcuts Archive.\n\n \n \n Apple FramesAdd device frames to screenshots for iPhones (11, 8/SE, and 12-13-14 generations in mini/standard/Plus/Pro Max sizes), iPad Pro (11” and 12.9”, 2018-2022 models), iPad Air (10.9”, 2020-2022 models), iPad mini (2021 model), Apple Watch S4/5/6/7/8/Ultra, iMac (24” model, 2021), MacBook Air (2020-2022 models), and MacBook Pro (2021 models). The shortcut supports portrait and landscape orientations, but does not support Display Zoom; on iPadOS and macOS, the shortcut supports Default and More Space resolutions. 
If multiple screenshots are passed as input, they will be combined in a single image. The shortcut can be run in the Shortcuts app, as a Home Screen widget, as a Finder Quick Action, or via the share sheet.\nGet the shortcut here.\n\n \n \n\n\n\nAs before, Apple Frames will not work with the old ‘Larger Text’ option of Display Zoom on either iPhone or iPad. ↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2022-11-08T10:23:04-05:00", "date_modified": "2024-03-21T08:39:53-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "Apple Frames", "automation", "iOS", "macOS", "shortcuts" ] }, { "id": "https://www.macstories.net/?p=70650", "url": "https://www.macstories.net/ios/creating-lock-screen-widgets-for-specific-notes-via-the-apple-notes-url-scheme/", "title": "Creating Lock Screen Widgets for Specific Notes via the Apple Notes URL Scheme", "content_html": "
\"All

All I wanted was a widget.

\n

A few days ago, as I was playing around with my Lock Screen on iOS 16, I wondered: would it be possible to use the hidden Apple Notes URL scheme to create widget launchers to reopen specific notes in the Notes app?

\n

That led me down a fascinating rabbit hole filled with hidden Shortcuts tricks and discoveries I thought would be useful to document on MacStories for everyone to see.

\n

You know, for posterity.

\n

\n

It all started with the idea that I wanted to use the Widgetsmith app to make a Lock Screen widget to open an individual note in the Notes app. I have a ‘Scratchpad’ note I keep in Notes, and I wanted a one-click button to launch it from the Lock Screen. As I explained in my iOS 16 review, the Notes team hasn’t built any Lock Screen widgets in iOS 16, but I figured I could fix the problem myself by using the undocumented Apple Notes URL scheme that’s been an open secret in the iOS automation community for the past few years. There are multiple layers to this technique, so let’s dig in.

\n

First, let me explain my position. If you want to make a simple widget that opens a specific note in the Notes app, you can also do it by using a shortcut. The Shortcuts app has an ‘Open Note’ action that you can point to a specific note, and that will immediately reopen it in the app. Then, you can use an app like LockFlow (once again, because Shortcuts doesn’t have Lock Screen widgets) to turn that shortcut into a Lock Screen launcher, and it’s going to work just fine.

\n
\"If

If you want to open a note with Shortcuts, it’s as easy as doing this.

\n

However, since I knew of the existence of an Apple Notes URL scheme, I wanted to remove the Shortcuts dependency and come up with a solution that wouldn’t involve creating a shortcut upfront. Think about it: if you create a launcher that relies on a shortcut and the Shortcuts app crashes when you trigger it, you’ve defeated the whole purpose of creating a quick launcher in the first place. Given the…problematic state of Shortcuts for iOS 16 at the moment, I figured that it was preferable to invoke Notes directly via its own URL scheme.

\n

Now, something you should know about the Notes app is that it’s had a hidden URL scheme to reopen individual notes for years. The URL scheme, which is unchanged in iOS 16, is the following:1

\n

mobilenotes://showNote?identifier=UUID

\n

As you can imagine, the URL scheme uses a unique identifier (UUID) to reopen a specific note in the app. I’m not 100% sure about this, but I believe this URL scheme is the one Apple itself uses behind the scenes to open notes directly via widgets, shortcuts, and deep-links generated by Siri.

\n

Unsurprisingly, Apple doesn’t want you to worry about these UUIDs, and they do not advertise this URL scheme anywhere in iOS, except for one obscure place: the Content Graph action of the Shortcuts app.

\n

Without rehashing what I wrote eight (!) years ago about the Content Graph in the Workflow app, it is essentially the engine that tells Shortcuts how to interpret data that gets passed in and out of actions. The beauty of this approach is that, if you know what you’re doing, you can poke around in the Content Graph engine itself and take a look at the raw data representations of each data type using the ‘Show Content Graph’ action. It’s wild, and it looks like this:

\n
\"The

The Content Graph in Shortcuts.

\n

This context is necessary to understand where I’m going with this. I don’t know who first did this, but a few years ago someone in the automation community realized that you could look at the raw representation of a note from the Notes app inside the Content Graph and find its identifier and URL scheme. There are various Reddit threads about this technique. That was the only way to find the UUID of a specific note: you had to invoke the Content Graph and manually select a note’s UUID to copy it. There was no action in Shortcuts to programmatically extract the UUID of a note, nor was the UUID a property you could extract from a ‘Note’ variable. All you could do was combine the ‘Find Notes’ and ‘Show Content Graph’ actions, open the graph, and manually copy a note’s identifier out of it.

\n

So, as I was putting together my widget launcher for the iOS 16 Lock Screen earlier this week, I naturally tried to follow the same approach I’ve been using for years. And much to my surprise, I noticed that Apple had changed the representation of a note in the Content Graph with a different format for identifiers, as shown below:

\n
\"The

The new identifier format in iOS and iPadOS 16.

\n

Obviously, that’s not a valid URL scheme. But in looking at the string of text inside the identifier field, it seemed to me like the last part of it was a UUID, so I just copied that bit and tried it in Shortcuts…

\n
\"This

This is the note’s UUID.

\n

…and it didn’t work. At this point – you know me – I had to keep going. I reached out to a few folks in the Shortcuts community, and they had been equally stuck with UUIDs for individual notes reported by Shortcuts in iOS 16. That wasn’t an encouraging sign.

\n

After wasting way more time than I’d like to admit on this problem, I’m pleased to say that I was able to get this to work. And as is often the case with these things, the “fix” is the silliest thing you can imagine. So here we go:

\n

In iOS 16, Shortcuts’ Content Graph action shows note UUIDs with a lowercase format; they should be uppercase instead. That’s it.

\n

I don’t know why Shortcuts is doing this in iOS 16, but that’s your fix if you’re trying to use the old approach for copying note UUIDs: convert them to uppercase, and they will start working again.
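To illustrate the fix, here’s a minimal sketch (in Python, outside of Shortcuts) of the same transformation the shortcut performs; the mobilenotes:// scheme is the undocumented one discussed above, and the sample UUID is hypothetical:

```python
# Hypothetical lowercase UUID, as the iOS 16 Content Graph reports it.
raw = "4b3c21f0-9abc-4d2e-8f10-123456789abc"

# The fix: the Notes URL scheme only resolves the note when the UUID is uppercase.
url = f"mobilenotes://showNote?identifier={raw.upper()}"

print(url)
# mobilenotes://showNote?identifier=4B3C21F0-9ABC-4D2E-8F10-123456789ABC
```

The hyphens are untouched by the case conversion, so the standard 8-4-4-4-12 UUID shape is preserved; only the hex digits change case.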

\n

To simplify the process of grabbing note UUIDs and generating URL scheme launchers for them, I created a shortcut that you can download for free today. To use the shortcut, type the title of a note you have in the Notes app, select it from a list of results, and the Content Graph will open.

\n
\"Search

Search for a note by name, confirm the result, and open it in the Content Graph.

\n

At this point, find the blue node labeled ‘Note’ and tap it:

\n
\"\"

\n

Next, select the ‘LNEntity’ type from this list:

\n
\"\"

\n

Finally, copy the alphanumeric UUID that is contained inside the identifier field. It comes after the NoteEntity/notes:note/ part and before a comma, like so:

\n
\"Once

Once again, this part is the UUID of a note. Select it and copy it to the clipboard. This is the only way to get this detail out of Shortcuts.
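If you’d rather not eyeball the boundaries, the same extraction can be sketched with a regular expression. This is a Python illustration, not something Shortcuts runs; the identifier string below is a hypothetical example of the iOS 16 format described above:

```python
import re

# Hypothetical identifier string in the iOS 16 Content Graph format:
# the UUID sits between "NoteEntity/notes:note/" and a comma.
identifier = "NoteEntity/notes:note/4b3c21f0-9abc-4d2e-8f10-123456789abc, (hypothetical trailing fields)"

# A UUID is 36 characters of hex digits and hyphens.
match = re.search(r"NoteEntity/notes:note/([0-9a-fA-F-]{36})", identifier)
if match:
    # Uppercase it so the URL scheme works in iOS 16.
    uuid = match.group(1).upper()
    print(uuid)
```

Anchoring the pattern on the NoteEntity/notes:note/ prefix avoids accidentally matching other UUID-shaped strings elsewhere in the graph’s output.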

\n

That’s it! The shortcut will then convert the UUID to uppercase and assemble the proper Apple Notes URL scheme to reopen the selected note. To test this, paste the URL in Safari, and you’ll see that a specific note will open in the Notes app:

\n
\"\"

\n

In my case, once I had this system working again, I opened Widgetsmith, created a Lock Screen widget, and gave it a custom URL scheme generated by Shortcuts. Then, I installed the new widget on my Lock Screen, and I now have a pretty icon that launches the note I want directly from the Lock Screen:

\n
\"Creating

Creating a Lock Screen widget for Notes in Widgetsmith.

\n

Like I said, I wanted to document this for all the Notes and Shortcuts users out there who may have come across this change in iOS 16 and have been unable to figure it out.

\n

In addition to quick launchers, there are other potential applications for having local URLs that point to specific notes in the Notes app. For example, the same system could be used to add wiki-linking functionality (reminiscent of Craft and Obsidian) to Apple’s Notes app – which is exactly what I’m going to document in the Monthly Log for Club MacStories members later this week.

\n

In the meantime, you can download my shortcut to generate note URL schemes below and find it in the MacStories Shortcuts Archive.

\n
\n
\n \"\"
\n

Copy Note UUID and URL Scheme

Find a note in the Notes app and copy its UUID from the Shortcuts Content Graph. The shortcut uses the UUID to generate a URL scheme launcher for that specific note in the Notes app.

\n

Get the shortcut here.

\n\n
\n
\n
\n
\n
  1. \nThe URL scheme is different on macOS, but I’ll talk about this in the Monthly Log later this week. ↩︎\n
\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "All I wanted was a widget.\nA few days ago, as I was playing around with my Lock Screen on iOS 16, I wondered: would it be possible to use the hidden Apple Notes URL scheme to create widget launchers to reopen specific notes in the Notes app?\nThat led me down a fascinating rabbit hole filled with hidden Shortcuts tricks and discoveries I thought would be useful to document on MacStories for everyone to see.\nYou know, for posterity.\n\nIt all started with the idea that I wanted to use the Widgetsmith app to make a Lock Screen widget to open an individual note in the Notes app. I have a ‘Scratchpad’ note I keep in Notes, and I wanted a one-click button to launch it from the Lock Screen. As I explained in my iOS 16 review, the Notes team hasn’t built any Lock Screen widgets in iOS 16, but I figured I could fix the problem myself by using the undocumented Apple Notes URL scheme that’s been an open secret in the iOS automation community for the past few years. There are multiple layers to this technique, so let’s dig in.\nFirst, let me explain my position. If you want to make a simple widget that opens a specific note in the Notes app, you can also do it by using a shortcut. The Shortcuts app has an ‘Open Note’ action that you can point to a specific note, and that will immediately reopen it in the app. Then, you can use an app like LockFlow (once again, because Shortcuts doesn’t have Lock Screen widgets) to turn that shortcut into a Lock Screen launcher, and it’s going to work just fine.\nIf you want to open a note with Shortcuts, it’s as easy as doing this.\nHowever, since I knew of the existence of an Apple Notes URL scheme, I wanted to remove the Shortcuts dependency and come up with a solution that wouldn’t involve creating a shortcut upfront. 
Think about it: if you create a launcher that relies on a shortcut, if the Shortcuts app crashes upon triggering the shortcut, then you’ve defeated the whole purpose of creating a quick launcher altogether. Given the…problematic state of Shortcuts for iOS 16 at the moment, I figured that it was preferable to invoke Notes directly via its own URL scheme.\nNow, something you should know about the Notes app is that it’s had a hidden URL scheme to reopen individual notes for years. The URL scheme, which is unchanged in iOS 16, is the following:1\nmobilenotes://showNote?identifier=UUID\nAs you can imagine, the URL scheme uses a unique identifier (UUID) to reopen a specific note in the app. I’m not 100% sure about this, but I believe this URL scheme is the one Apple itself uses behind the scenes to open notes directly via widgets, shortcuts, and deep-links generated by Siri.\nUnsurprisingly, Apple doesn’t want you to worry about these UUIDs, and they do not advertise this URL scheme anywhere in iOS, except for one obscure place: the Content Graph action of the Shortcuts app.\nWithout rehashing what I wrote eight (!) years ago about the Content Graph in the Workflow app, it is essentially the engine that tells Shortcuts how to interpret data that gets passed in and out of actions. The beauty of this approach is that, if you know what you’re doing, you can poke around in the Content Graph engine itself and take a look at the raw data representations of each data type using the ‘Show Content Graph’ action. It’s wild, and it looks like this:\nThe Content Graph in Shortcuts.\nThis context is necessary to understand where I’m going with this. I don’t know who first did this, but a few years ago someone in the automation community realized that you could look at the raw representation of a note from the Notes app inside the Content Graph and find its identifier and URL scheme. There are various Reddit threads about this technique. 
That was the only way to find the UUID of a specific note: you had to invoke the Content Graph and manually select a note’s UUID to copy it. There was no action in Shortcuts to programmatically extract the UUID of a note, nor was the UUID a property you could extract from a ‘Note’ variable. All you could do was combine the ‘Find Notes’ and ‘Show Content Graph’ actions, open the graph, and manually copy a note’s identifier out of it.\nSo, as I was putting together my widget launcher for the iOS 16 Lock Screen earlier this week, I naturally tried to follow the same approach I’ve been using for years. And much to my surprise, I noticed that Apple had changed the representation of a note in the Content Graph with a different format for identifiers, as shown below:\nThe new identifier format in iOS and iPadOS 16.\nObviously, that’s not a valid URL scheme. But in looking at the string of text inside the identifier field, it seemed to me like the last part of it was a UUID, so I just copied that bit and tried it in Shortcuts…\nThis is the note’s UUID.\n…and it didn’t work. At this point – you know me – I had to keep going. I reached out to a few folks in the Shortcuts community, and they had been equally stuck with UUIDs for individual notes reported by Shortcuts in iOS 16. That wasn’t an encouraging sign.\nAfter wasting way more time than I’d like to admit on this problem, I’m pleased to say that I was able to get this to work. And as is often the case with these things, the “fix” is the silliest thing you can imagine. So here we go:\nIn iOS 16, Shortcuts’ Content Graph action shows note UUIDs with a lowercase format; they should be uppercase instead. 
That’s it.\nI don’t know why Shortcuts is doing this in iOS 16, but that’s your fix if you’re trying to use the old approach for copying note UUIDs: convert them to uppercase, and they will start working again.\nTo simplify the process of grabbing note UUIDs and generating URL scheme launchers for them, I created a shortcut that you can download for free today. To use the shortcut, type the title of a note you have in the Notes app, select it from a list of results, and the Content Graph will open.\nSearch for a note by name, confirm the result, and open it in the Content Graph.\nAt this point, find the blue node labeled ‘Note’ and tap it:\n\nNext, select the ‘LNEntity’ type from this list:\n\nFinally, copy the alphanumeric UUID that is contained inside the identifier field. It comes after the NoteEntity/notes:note/ part and before a comma, like so:\nOnce again, this part is the UUID of a note. Select it and copy it to the clipboard. This is the only way to get this detail out of Shortcuts.\nThat’s it! The shortcut will then convert the UUID to uppercase and assemble the proper Apple Notes URL scheme to reopen the selected note. To test this, paste the URL in Safari, and you’ll see that a specific note will open in the Notes app:\n\nIn my case, once I had this system working again, I opened Widgetsmith, created a Lock Screen widget, and gave it a custom URL scheme generated by Shortcuts. Then, I installed the new widget on my Lock Screen, and I now have a pretty icon that launches the note I want directly from the Lock Screen:\nCreating a Lock Screen widget for Notes in Widgetsmith.\nLike I said, I wanted to document this for all the Notes and Shortcuts users out there who may have come across this change in iOS 16 and have been unable to figure it out.\nIn addition to quick launchers, there are other potential applications for having local URLs that point to specific notes in the Notes app. 
For example, the same system could be used to add a wiki-linking functionality (reminiscent of Craft and Obsidian) to Apple’s Notes app. Which is exactly what I’m going to document in the Monthly Log for Club MacStories members later this week.\nIn the meantime, you can download my shortcut to generate note URL schemes below and find it in the MacStories Shortcuts Archive.\n\n \n \n Copy Note UUID and URL SchemeFind a note in the Notes app and copy its UUID from the Shortcuts Content Graph. The shortcut uses the UUID to generate a URL scheme launcher for that specific note in the Notes app.\nGet the shortcut here.\n\n \n \n\n\n\nThe URL scheme is different on macOS, but I’ll talk about this in the Monthly Log later this week. ↩︎\n\n\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2022-09-28T12:28:54-04:00", "date_modified": "2022-09-28T12:28:54-04:00", "authors": [ { "name": "Federico Viticci", "url": "https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "iOS 16", "Notes" ] }, { 
"id": "https://www.macstories.net/?p=70472", "url": "https://www.macstories.net/stories/shorcuts-in-ios-16-the-potential-of-app-shortcuts-for-everyone/", "title": "Shortcuts in iOS 16: The Potential of App Shortcuts for Everyone", "content_html": "
\"App

App Shortcuts in iOS 16.

\n

A note from Federico: This year, I’ve decided to try some new things for my annual iOS 16 review. Some you’ll see on Monday. One of them is previewing small excerpts from the review in the OS Preview series on MacStories and MacStories Weekly for Club MacStories. Today, I’m posting a preview of a section of the Shortcuts chapter here, and a section of the Everything Else chapter in MacStories Weekly. I hope you enjoy these. I’ll see you for the full story – and more reveals – on Monday.

\n

In iOS 16, the Shortcuts app hasn’t undergone a major redesign or technical rewrite; instead, Apple’s efforts have focused on adding more actions for system apps, extending the developer API, bringing more stability, and making Shortcuts more approachable for new users.

\n

The last point is both important and likely the reason why some Shortcuts power users will be disappointed by this year’s update. There isn’t a lot for them in this new version of the app: as we’ll see in my iPadOS review, there’s no integration with Files quick actions, no support for Stage Manager actions, and still no system-wide hotkeys. If you’re an advanced Shortcuts user and were wishing for more system-level enhancements in addition to stability this year: I hear you, but we’ll talk about this later on.

\n

What we do have in iOS 16 is a fascinating new feature to get newcomers started with the Shortcuts app, a grab bag of useful new actions for Apple apps, and some solid developer-related enhancements that will make third-party actions much better than before. Let’s take a look.

\n

\n

App Shortcuts

\n

With iOS 16, Apple is launching a new feature designed to make Shortcuts more discoverable, and therefore useful to more people, overcoming the steep learning curve that has long scared thousands of users away from the app.

\n

Now, forgive Apple for their historically poor choice of words when it comes to this app, but the feature is called App Shortcuts. I know, I know. Jokes aside, App Shortcuts are interesting for two reasons: they weren’t made for users like me; and they show Apple going in a completely different direction when it comes to onboarding users with the Shortcuts app.

\n

App Shortcuts are simple, one-action shortcuts that are ready to use as soon as you install an app from the App Store that has been updated to include them. App Shortcuts come bundled with apps, and you don’t have to do anything to set them up. In the past, users had to discover ‘Siri shortcuts’ inside apps and explicitly register them in Siri with a custom phrase; that entire system is gone with App Shortcuts. Essentially, Apple is now putting the burden on developers rather than users: app makers will have to decide which parts of their apps’ functionalities they want to make available as App Shortcuts, which you can start using right away with no setup required.

\n
\"App

App Shortcuts in iOS 16.

\n

The whole point of App Shortcuts is removing pressure and friction from users so they can get immediate value out of the Shortcuts app and its integration with iOS. Apple has accomplished this in a variety of ways with App Shortcuts in iOS 16. The first one: once you’ve upgraded to iOS 16 and have updated some of your favorite apps as well, when you open the Shortcuts app you’ll no longer see an empty library or, at best, a handful of Apple’s generic recommended shortcuts. Instead, you’ll find a new section at the bottom of the Shortcuts sidebar filled with pre-installed App Shortcuts from your apps. Technically speaking, these shortcuts aren’t doing anything out of the ordinary: they’re one-action shortcuts to perform common actions in apps. The difference from before is that app developers created them for you based on what they think is going to be useful in their apps.

\n

This is immediately a better experience for folks who were turned off by the Shortcuts app in the past and found the blank shortcut editor daunting. App Shortcuts are ready to go: just click one and run it. Apple is advising developers to keep their selection of App Shortcuts flexible but reasonable, covering the actions that users typically perform in their apps. For example, the videogame tracking app GameTrack has an App Shortcut to see your finished games in a popup window; Voice Memos has one to quickly record a new voice memo; Focused Work has App Shortcuts to quickly pick a session or pause one (I used these for my writing process this summer); GoodLinks has a handy one to open a random article you saved for later.

\n
\"Some

Some examples of upcoming App Shortcuts.

\n

App Shortcuts are one-action shortcuts that do one thing quickly; their parameters are pre-configured by developers; their names (and therefore invocation phrases) should be short and memorable so that users can also run into them in other places of the OS.

\n
\"App

App Shortcuts for Focused Work can also be used from Siri and Spotlight in addition to the Shortcuts app.

\n

That’s the other key design trait of App Shortcuts: discoverability. In iOS 16, App Shortcuts can be discovered in a couple of additional ways. First off, if you ask Siri to show you something in an app and that also happens to be a pre-installed App Shortcut, it’ll work right away in the assistant. Second, if you search for an app in Spotlight, in addition to seeing the actual app in search results, the system will also show you the first App Shortcut included in it, which is another way to discover these shortcuts. You can, of course, also find App Shortcuts by typing their full name or by expanding the list of results in Spotlight. And third, Apple is replacing the ‘Add to Siri’ button with ‘Siri Tips’ – small and dismissible banners inside apps that tell you about App Shortcuts you can use for the current app.

\n
\"An

An example of a Siri Tip. I haven’t seen these in third-party apps yet.

\n
\"If

If you want, you can hide an app’s App Shortcuts from Siri or Spotlight by pressing the ‘i’ button in the app’s dedicated section.

\n

I can’t know for certain whether Apple’s strategy will be successful: only time will tell if new users will flock to the Shortcuts app thanks to App Shortcuts they discover in iOS 16. I do believe, however, that App Shortcuts are Apple’s best shot to date when it comes to introducing users to the incredible potential of Shortcuts. This approach makes sense, and I find it more valuable than the built-in Shortcuts Gallery because a) App Shortcuts can be discovered throughout the system, not just inside the Shortcuts app, and b) they’re inherently personalized for users since they’re based on the apps they have installed.

\n

What’s also fascinating about App Shortcuts is the opportunity they present in terms of creating a natural progression path for users who want to go from a pre-installed App Shortcut to a custom one. App Shortcuts cannot be modified: whenever you tap on one, you’ll always run it instead of opening the editor view. If you long-press on an App Shortcut in the dedicated section of the Shortcuts app, you’ll see that you can use it as a starting point for a custom shortcut via the ‘Use in New Shortcut’ button.

\n
\"Using

Using an App Shortcut as the starting point for a custom shortcut.

\n

If you decide to use an App Shortcut as the foundation for a new, custom shortcut, you’ll be able to build upon the developer’s work and maybe tweak the shortcut a little, or add a menu for multiple options, or perhaps even bring in new actions from other apps.

\n

This progressive disclosure of the Shortcuts experience is the aspect I’m most intrigued to witness over the coming months. App Shortcuts weren’t created for someone like me, and that’s exactly the point: we need new users in the Shortcuts app. I hope Apple’s plan works this time.

\n

You can also follow our 2022 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.

\n

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

\n

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

\n

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

\n

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

\n

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

\n

Learn more here and from our Club FAQs.

\n

Join Now", "content_text": "App Shortcuts in iOS 16.\nA note from Federico: This year, I’ve decided to try some new things for my annual iOS 16 review. Some you’ll see on Monday. One of them is previewing small excerpts from the review in the OS Preview series on MacStories and MacStories Weekly for Club MacStories. Today, I’m posting a preview of a section of the Shortcuts chapter here, and a section of the Everything Else chapter in MacStories Weekly. I hope you enjoy these. I’ll see you for the full story – and more reveals – on Monday.\nIn iOS 16, the Shortcuts app hasn’t undergone a major redesign or technical rewrite; instead, Apple’s efforts have focused on adding more actions for system apps, extending the developer API, bringing more stability, and making Shortcuts more approachable for new users.\nThe last point is both important and likely the reason why some Shortcuts power users will be disappointed by this year’s update. There isn’t a lot for them in this new version of the app: as we’ll see in my iPadOS review, there’s no integration with Files quick actions, no support for Stage Manager actions, and no system-wide hotkeys still. If you’re an advanced Shortcuts user and were wishing for more system-level enhancements in addition to stability this year: I hear you, but we’ll talk about this later on.\nWhat we do have in iOS 16 is a fascinating new feature to get newcomers started with the Shortcuts app, a grab bag of useful new actions for Apple apps, and some solid developer-related enhancements that will make third-party actions much better than before. 
Let’s take a look.\nSupported By\nConcepts\n\n\nConcepts: Infinite, Flexible Sketching.\n\nApp Shortcuts\nWith iOS 16, Apple is launching a new feature designed to make Shortcuts more discoverable, and therefore useful to more people, overcoming the steep learning curve that has long scared thousands of users away from the app.\nNow, forgive Apple for their historically poor choice of words when it comes to this app, but the feature is called App Shortcuts. I know, I know. Jokes aside, App Shortcuts are interesting for two reasons: they weren’t made for users like me; and they show Apple going in a completely different direction when it comes to onboarding users with the Shortcuts app.\nApp Shortcuts are simple, one-action shortcuts that are ready to use as soon as you install an app from the App Store that has been updated to include them. App Shortcuts come bundled with apps, and you don’t have to do anything to set them up. In the past, users had to discover ‘Siri shortcuts’ inside apps and explicitly register them in Siri with a custom phrase; that entire system is gone with App Shortcuts. Essentially, Apple is now putting the burden on developers rather than users: app makers will have to decide which parts of their apps’ functionalities they want to make available as App Shortcuts, which you can start using right away with no setup required.\nApp Shortcuts in iOS 16.\nThe whole point of App Shortcuts is removing pressure and friction from users so they can get immediate value out of the Shortcuts app and its integration with iOS. Apple has accomplished this in a variety of ways with App Shortcuts in iOS 16. The first one: once you’ve upgraded to iOS 16 and have updated some of your favorite apps as well, when you open the Shortcuts app you’ll no longer see an empty library or, at best, a handful of Apple’s generic recommended shortcuts. 
Instead, you’ll find a new section at the bottom of the Shortcuts sidebar filled with pre-installed App Shortcuts from your apps. Technically speaking, these shortcuts aren’t doing anything out of the ordinary: they’re one-action shortcuts to perform common actions in apps. The difference from before is that app developers created them for you based on what they think is going to be useful in their apps.\nThis is immediately a better experience for folks who were turned off by the Shortcuts app in the past and found the blank shortcut editor daunting. App Shortcuts are ready to go: just click one and run it. Apple is advising developers to keep their selection of App Shortcuts flexible but reasonable, covering the typical actions that users typically perform in their apps. For example, the videogame tracking app GameTrack has an App Shortcut to see your finished games in a popup window; Voice Memos has one to quickly record a new voice memo; Focused Work has App Shortcuts to quickly pick a session or pause one (I used these during for my writing process this summer); GoodLinks has a handy one to open a random article you saved for later.\nSome examples of upcoming App Shortcuts.\nApp Shortcuts are one-action shortcuts that do one thing quickly; their parameters are pre-configured by developers; their names (and therefore invocation phrases) should be short and memorable so that users can also run into them in other places of the OS.\nApp Shortcuts for Focused Work can also be used from Siri and Spotlight in addition to the Shortcuts app.\nThat’s the other key design trait of App Shortcuts: discoverability. In iOS 16, App Shortcuts can be discovered in a couple additional ways. First off, if you ask Siri to show you something in an app and that also happens to be a pre-installed App Shortcut, it’ll work right away in the assistant. 
Second, if you search for an app in Spotlight, in addition to seeing the actual app in search results, the system will also show you the first App Shortcut included in it, which is another way to discover these shortcuts. You can, of course, also find App Shortcuts by typing their full name or by expanding the list of results in Spotlight. And third, Apple is replacing the ‘Add to Siri’ button with ‘Siri Tips’ – small and dismissible banners inside apps that tell you about App Shortcuts you can use for the current app.\nAn example of a Siri Tip. I haven’t seen these in third-party apps yet.\nIf you want, you can hide an app’s App Shortcuts from Siri or Spotlight by pressing the ‘i’ button in the app’s dedicated section.\nI can’t know for certain whether Apple’s strategy will be successful: only time will tell if new users will flock to the Shortcuts app thanks to App Shortcuts they discover in iOS 16. I do believe, however, that App Shortcuts are Apple’s best shot to date when it comes to introducing users to the incredible potential of Shortcuts. This approach makes sense, and I find it more valuable than the built-in Shortcuts Gallery because a) App Shortcuts can be discovered throughout the system, not just inside the Shortcuts app and b) they’re inherently personalized for users since they’re based on the apps they have installed.\n\nApp Shortcuts are Apple’s best shot to date when it comes to introducing users to the incredible potential of Shortcuts.\n\nWhat’s also fascinating about App Shortcuts is the opportunity they present in terms of creating a natural progress path for users who want to go from a pre-installed App Shortcut to a custom one. App Shortcuts cannot be modified: wherever you tap on one, you’ll always run it instead of opening the editor view. 
If you long-press on an App Shortcut in the dedicated section of the Shortcuts app, you’ll see that you can use it as a starting point for a custom shortcut via the ‘Use in New Shortcut’ button.\nUsing an App Shortcut as the starting point for a custom shortcut.\nIf you decide to use an App Shortcut as the foundation for a new, custom shortcut, you’ll be able to build upon the developer’s work and maybe tweak the shortcut a little, or add a menu for multiple options, or perhaps even bring in new actions from other apps.\nThis progressive disclosure of the Shortcuts experience is the aspect I’m most intrigued to witness over the coming months. App Shortcuts weren’t created for someone like me, and that’s exactly the point: we need new users in the Shortcuts app. I hope Apple’s plan works this time.\nYou can also follow our 2022 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.\nAccess Extra Content and PerksFounded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.\nWhat started with weekly and monthly email newsletters has blossomed into a family of memberships designed every MacStories fan.\nClub MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;\nClub MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;\nClub Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.\nLearn more here and from our Club FAQs.\nJoin Now", "date_published": "2022-09-09T09:21:33-04:00", "date_modified": "2022-09-09T15:50:33-04:00", "authors": [ { "name": "Federico Viticci", "url": 
"https://www.macstories.net/author/viticci/", "avatar": "https://secure.gravatar.com/avatar/94a9aa7c70dbeb9440c6759bd2cebc2a?s=512&d=mm&r=g" } ], "tags": [ "automation", "iOS", "iOS 16", "OS Preview 2022", "shortcuts", "stories" ] } ] }