
MacStories




AppStories, Episode 38 – An Interview with James Thomson, Creator of PCalc and DragThing

Permalink - Posted on 2018-01-18 13:17

On this week's episode of AppStories, we interview James Thomson about the origins of PCalc and DragThing, life as an indie developer, selling apps on the App Store and Mac App Store, and more.

Sponsored by:

  • Spark – The future of email.

Want more from MacStories?

Club MacStories offers exclusive access to extra MacStories content, delivered every week.

Club MacStories will help you discover the best apps for your devices and get the most out of your iPhone, iPad, and Mac. Plus, it's made in Italy.

Join Now


My Must-Have iOS Apps, 2017 Edition

Permalink - Posted on 2018-01-09 12:20

With the transition to iPad Pro as my primary computer fully achieved in 2016 and not surprising anymore, in 2017 I turned my attention to three other key areas of my life: working with the MacStories team, managing my time, and finding my favorite apps among many competing alternatives.

For the first time in several years, I didn't publish a story documenting my journey towards the iPad and iOS in 2017. In many ways, that's a closed chapter of my career: the iPad Pro has convinced millions of people that it can be a suitable replacement for or addition to a Mac; with iOS 11 and its productivity features, Apple only cemented that belief. While part of me misses arguing in favor of the iPad against widespread skepticism, I felt it was time to move on from explaining the "why" of the iPad to helping others get the most out of the device. For this reason, I spent the better part of 2017 covering iOS 11 (first with my wish list, then with an in-depth review), discussing the details of iPad productivity, and creating advanced workflows for Club MacStories.

As much as I like to write in isolation, MacStories is also a team that requires a direction and a business that begets further responsibilities. Learning how to balance the multifaceted nature of my job with my hobbies and personal life (which got busier thanks to two puppies we adopted in April) has been an interesting challenge this year, and one that taught me a lot about allocating my time and attention, as well as the kind of writer I am and aspire to be.

A recurring theme has characterized my relationship with iOS in 2017: I've made a conscious effort to try as many new apps and services as possible, ensuring I would have a basic knowledge of all the available options on the market in different categories.

As I was settling on a routine and set of apps that worked well for me, I realized that I didn't want to lose the spark of excitement I used to feel when trying new apps in previous years. My job is predicated upon writing about software and having a sense of where our industry is going; while finding something that works and using it for years is great, I don't want to become the kind of tech writer who's stuck in his ways and doesn't consider the possibility that better software might exist and is worth writing about. Even though my experiments didn't always lead to switching to a different app, they made me appreciate the state of the iOS ecosystem and helped me understand my app preferences in 2017.

Thus, I'm going back to basics for my annual roundup this year. In the collection below, you'll find the 75 apps I consider my must-haves – no web services, just apps for iPhone and iPad. Apps are organized by category and, whenever possible, include links to past coverage on MacStories.

As in previous years, you'll find a series of personal awards at the end of the story. These include my App of the Year and Runners-Up; this year, I also picked winners for Feature, Redesign, Update, and Debut of the Year.


Work Essentials

Ulysses. I moved to Ulysses last year, and, even after considering dozens of other options, I always come back to this app. Ulysses offers a combination of two features no other text editor has: a beautiful reinvention of inline Markdown editing and powerful x-callback-url automation. I love writing and editing longform stories in Ulysses: the app simplifies the management of elements such as footnotes and links with dedicated menus that abstract Markdown's syntax; in addition, I can use my custom theme and the typeface I prefer (currently, I'm editing with IBM Plex Mono).

Ulysses and SF Mono.


Ulysses plays an essential role in the creation of Club MacStories content: thanks to its advanced URL schemes, we've come up with workflows that extract cards via the Trello API and turn them into sections in Ulysses sheets. None of this could be done with other iOS text editors, and it's saved the entire MacStories team several hours this past year. Ulysses truly is one of a kind: thoughtful and minimalistic, yet powerful and extensible. [Review and previous coverage]
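To make the shape of such a workflow concrete, here's a minimal Python sketch of the idea: take card data (mimicking what Trello's API returns for a list) and build a Ulysses x-callback-url that creates a new sheet. The card fields and content are hypothetical; Ulysses' actual `new-sheet` action accepts more parameters than shown here.

```python
from urllib.parse import quote

def cards_to_sheet_url(cards):
    """Turn a list of Trello-style cards into a Ulysses new-sheet URL.

    `cards` is a list of dicts with "name" and "desc" keys, loosely
    mimicking objects returned by Trello's GET /1/lists/{id}/cards endpoint.
    """
    # Render each card as a Markdown section: heading plus body text.
    sections = [f"## {c['name']}\n\n{c['desc']}" for c in cards]
    markdown = "\n\n".join(sections)
    # Ulysses' x-callback-url scheme creates a new sheet from the "text" param.
    return "ulysses://x-callback-url/new-sheet?text=" + quote(markdown)

url = cards_to_sheet_url([
    {"name": "App Debuts", "desc": "Three new apps this week."},
    {"name": "Tips", "desc": "A drag and drop trick for iPad."},
])
print(url.split("?")[0])  # → ulysses://x-callback-url/new-sheet
```

On iOS, opening a URL built this way hands the Markdown off to Ulysses; the heavy lifting in a real workflow is fetching and filtering the cards first.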

Bear. While Apple Notes is my general purpose note-taking app, I realized a few months ago that I was missing a Markdown app that could hold my thoughts and reference material before turning them into drafts for Ulysses. This is why I came back to Bear: I can start outlining a story by adding links and a few images while I'm still taking notes about a topic. Later, when I'm ready to go from note-taking to actual writing, I can send a Markdown version of a note from Bear to Ulysses, which will keep all formatting and attachments intact. It took me a while to understand that I needed a dedicated note-taking app to complement Apple Notes; I'm happy that Bear and Ulysses get along well. [Review]

Things. My move to Things has been a hot topic of conversation on AppStories and Connected, so I won't repeat it in full, but here's the gist: while not as flexible or scriptable as Todoist, Things' design doesn't make me stress over my due tasks and it helps me cope with the anxiety often induced by having too many responsibilities. Things is incredibly polished and it looks amazing on iOS 11, both on the iPhone X and iPad Pro. The app is heavily skewed towards managing tasks in the Today and Upcoming views, though I also find myself organizing projects in sections (a feature I haven't seen anywhere else) and using Quick Find to search for tasks by title and tag.

Initially, I thought I liked Things because it was new and shiny, but with time I've come to understand that it's the app's elegant aesthetic and calming approach that makes it the task manager I need at this stage of my life. Things is the epitome of attention to detail in iOS UI design, and the app for those who don't want to feel guilty about task management. [Review]

Google Docs. Google's collaborative word processor isn't my favorite iOS app – in fact, I often criticize the company's slow adoption of new iOS productivity features such as iPad multitasking and iOS 11 drag and drop. However, of all the similar apps and services I tried over the years, Google Docs remains the most consistent and reliable one. We use Google Docs to prepare topics for AppStories and shows on Relay FM. As an iOS app, Google Docs' interface conventions and text selection mechanism leave a lot to be desired, but I depend on it, and I can't work without it.

Slack. Another case of a fantastic web service with a native iOS presence riddled with bugs. We pay for Slack at MacStories, and it's our primary communication system that has replaced email for all our day-to-day discussions. Alas, there are always some annoying bugs introduced with new versions of Slack for iOS, whether they involve scrolling performance, notifications, unread messages, or other visual elements of the app. Still, Slack the service enables us to save time thanks to integrations with other tools we use (including Zapier and MailChimp), and we even created our custom Storybot assistant to help us with scheduling team assignments and reminders. Despite its problems on iOS, I can't see the MacStories team using anything else.

Workflow. Like many others, I was shocked by the news of Apple acquiring Workflow earlier this year. Workflow is, by far, the most important app on my iPhone and iPad: every automation for every important service in my life, from Trello to Toggl, passes through Workflow. Years of coverage on MacStories should give you an idea of how critical Workflow is to what I do on iOS. While the big-picture future of Workflow is still unclear, it's good to see that Apple is maintaining the app by supporting new iOS technologies and fixing bugs. I hope to see Workflow become a system feature in iOS 12 next year, and I want to believe Apple will remember its importance for everyone who uses an iPad as their primary computer. [Review and previous coverage]

Editorial. Ole Zorn's text editor may not receive the same attention (or updates) it did four years ago, but it still plays a key role in my text editing workflow as it does things no other app can – namely, Markdown automation based on action presets and Python scripting. Whenever I need to edit a long story that spans multiple pages or requires custom syntax for MacStories (such as this one, or my iOS reviews), I turn to Editorial. The (Multi)Markdown workflows I created years ago are still as powerful and stable as they were when Editorial was a younger app, and they save me hours I would otherwise spend manually inserting custom code into my stories. It saddens me that Editorial is essentially on life support now (it hasn't even received basic iOS 11 bug fixes yet), but I hope Zorn will find the motivation to work on it again. [Review and previous coverage]

DEVONthink. I use DEVONtechnologies' app to store reference material for my articles and Club newsletters in the form of PDF documents and web clippings (as .webarchive files). As I explained on MacStories a few months ago, DEVONthink is an advanced file manager with terrific automation features for those who don't want to waste time manually saving documents (of any kind) into specific folders. I keep a searchable database of all our Club newsletters in DEVONthink, and I regularly use workflows to archive webpages and screenshots in the app. With iOS 11, DEVONthink has added support for drag and drop and Files, further speeding up how I can access and save documents from the app's databases.

Working Copy. Here at MacStories, all of our posts for the site and Club sections are collaboratively edited through shared repositories in GitHub. I wrote about the decision to use GitHub for Markdown last year, and, unsurprisingly, the system has been so good for us, we haven't felt the need to try anything else. Working Copy is the best iOS GitHub client for what we need from the service: it supports Markdown syntax highlighting and word-based diffs, and developer Anders Borum recently added a way to drag individual revisions of documents out of Working Copy and into other apps.

Working Copy and its file provider extension.


With iOS 11, Working Copy gained full support for Files with a file provider extension, which makes it even easier to save articles into repositories and open them in different iOS apps. My entire iOS 11 review was backed up in Working Copy over the course of three months, too. If you want to work with GitHub on iOS, Working Copy is the app to get.

Trello. In 2017, we've continued to use Trello as an editorial calendar and organizational tool for MacStories and Club MacStories. All the sections in our newsletters are based on Trello cards that we either manually create or that are submitted by readers to Google Forms, which Zapier then converts to cards in Trello. Our Trello setup is a complex one and it involves dozens of scripts and workflows, but it's been working well for over a year now, and we trust it. Trello's iOS app isn't great: it doesn't support iOS 11 drag and drop, for instance, but it's passable, and it gets the job done. I'm disappointed by the lack of power-ups on iPad though, and I'd like the company to bring all the features of Trello's web app to iOS.

Gladys. I was hoping someone would make a third-party shelf app inspired by my iOS 11 concept after Apple announced drag and drop at WWDC, but I would have never guessed how many options we ended up having. More importantly, I could have never imagined a utility as powerful and well thought-out as Gladys.

Inspecting dropped items in Gladys.


Gladys lets you save anything you can drag on iOS 11 and store it in iCloud so all your items are available on both the iPhone and iPad. Think of Gladys as a mix of a file manager, Copied, and my idea of an iPad shelf: literally anything that is draggable on iOS 11 – whether it's text, an image, a link, or a PDF document – can be dropped in the app and stored for future usage. Ideally, you'll want to keep Gladys in the dock and use it in Slide Over, so the ability to save an item you're dragging is only a couple of taps away. Even better though, Gladys deeply integrates with the drag and drop APIs, so each item is represented with multiple versions that you can individually export and save elsewhere.

Gladys is the missing link between the iPad's drag and drop and iOS power users, and it's constantly getting better thanks to frequent updates. Until Apple makes a system-wide clipboard manager/shelf for iOS, Gladys is the app every iPad power user should install and learn to master. [Previous coverage]
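The "multiple versions" of a dropped item come from the way iOS drag and drop exposes the same content under several type identifiers (UTIs), letting the receiving app pick the richest one it understands. A schematic Python sketch of that selection step follows; the UTI strings are real identifiers, but the item data and preference order are illustrative.

```python
# A dropped item keyed by UTI, mimicking how iOS drag and drop exposes
# multiple representations of the same content (hypothetical data).
item = {
    "public.url": "https://www.macstories.net",
    "public.plain-text": "MacStories",
}

# Preference order: richer types first, plain text as a fallback.
PREFERRED = ["com.adobe.pdf", "public.png", "public.url", "public.plain-text"]

def best_representation(item):
    """Return the first available representation in preference order."""
    for uti in PREFERRED:
        if uti in item:
            return uti, item[uti]
    raise ValueError("no usable representation")

uti, value = best_representation(item)
print(uti)  # → public.url
```

An app like Gladys goes one step further by keeping every representation around, so you can later export whichever version a destination app wants.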

Numbers. I moved to Apple's spreadsheet app last year, and while it's been a bumpy road, I'm a fan of what Apple eventually settled on. Numbers now integrates with drag and drop and Files' document browser, which gives its main view a consistent set of recently opened documents as well as access to every spreadsheet stored elsewhere on the system. I'm still using Numbers to track expenses with historical currency conversions; John and I use Numbers' built-in iCloud collaboration to track sponsorship sales and other business-related tasks.

More recently, I used Numbers to visualize battery charging stats for the iPhone X, and I discovered that some of the advanced chart editing features still aren't available on iOS. I'd like Apple to bring full feature parity between Numbers for Mac and iOS next year.

Dropbox. As much as I've tried to consolidate all my file management habits and workflows in iCloud Drive since the advent of Files with iOS 11, there are several things that Dropbox simply does better than iCloud Drive, or that Apple isn't interested in supporting altogether. Dropbox allows me to easily share files as well as entire folders with other people just by generating a shareable link; this is only partially supported by iCloud Drive, and it's not as intuitive either. Dropbox can keep track of file versions, has an API that integrates with Workflow, and it's cross-platform. I love the simplicity of iCloud Drive (which has been working well for me), but I also need the versatility of Dropbox. Thankfully, the service now offers a native Files extension on iOS 11, which gives me the best of both worlds – Apple's UI and Dropbox's excellent service.

Spark. This year, I have come to the conclusion that the email client of my dreams doesn't exist. I'd love to use an email client that has Airmail's customization and app integrations, the fluidity of Apple Mail's interface, the search prowess of Gmail, and the team collaboration features of Polymail. This beautiful Frankenstein monster of an email app will probably never be made, which I understand as I have very specific preferences.

The closest I can get to this vision is Spark for Mac, but its iOS counterpart doesn't offer the same feature set as the desktop version. According to Readdle, however, all of this should be happening with Spark 2.0, a major update that should introduce integrations on iOS in addition to team collaboration. To prepare for that possibility, I went back to Spark a few months ago and rediscovered everything I noted in my original review. Spark is a smart email client with an elegant design and clever features such as natural language search, UI personalization, and snoozing. I like using Spark for iOS: I don't love it, but I feel that, if Spark 2.0 lives up to Readdle's promises, it could be the app that makes me stop wishing for the perfect email client. [Review]

Fantastical. Like email clients, I've switched back and forth between different calendar apps this year until I returned to my beloved and trusted Fantastical. Flexibits' app is fast, offers natural language input (which I was missing from other clients) and, most of all, it's a well designed piece of software that is always updated for the latest iOS and watchOS technologies – something I'm valuing more and more in the apps I use. Also, Fantastical has easily accessible Move and Duplicate functions for events, which I use a lot. [Review and previous coverage]

iThoughts. I would have liked to give MindNode 5 a try this year, but, unfortunately, two issues keep me from using the app's latest version: its theming options don't allow me to make the kind of wireframe layout I like my mind maps to have, and an unspecified iCloud bug prevents MindNode from uploading documents stored in its iCloud Drive container.

This is how I like my mind maps.


iThoughts, while not nearly as gorgeous or integrated with iOS 11 as MindNode 5, does what I need: my mind maps have a sharp-angled look with different weights and styles of San Francisco for each level, and they're synced with Dropbox. If MindNode gains deeper styling controls and if the developers can figure out my iCloud Drive issues (which I only have with MindNode on all my devices), I'd like to see if it could replace iThoughts to outline my iOS review next year.

Kaleidoscope. This was one of my favorite surprises of 2017. The folks at Black Pixel have brought Kaleidoscope's powerful diff functionalities from the Mac to iOS, crafting a polished text comparison tool that integrates with iOS 11's Files app and drag and drop. Ever since its release in September, I've switched to using Kaleidoscope to keep track of changes in document revisions for all the documents we share with the team on GitHub.

A diff in Kaleidoscope.


The ability to drag out individual versions of an article from Working Copy and drop them into Kaleidoscope is exactly what I needed when comparing edits suggested by Ryan and John against my drafts. Kaleidoscope's split-screen design is brilliant, and if you also need to track changes to Markdown text files over time, I can't recommend it enough.
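Under the hood, a comparison tool like this works on the line level: two revisions of a file go in, and a diff highlighting added and removed lines comes out. Python's standard `difflib` module can sketch the idea (the file names and text are made up for illustration; this is not Kaleidoscope's actual algorithm).

```python
import difflib

# Two revisions of a Markdown draft, e.g. as dragged out of a Git client.
draft = "Ulysses is my text editor.\nI write every day.\n"
edited = "Ulysses is my favorite text editor.\nI write every day.\n"

# unified_diff yields header lines, hunk markers, and +/- changed lines.
diff = list(difflib.unified_diff(
    draft.splitlines(keepends=True),
    edited.splitlines(keepends=True),
    fromfile="draft.md",
    tofile="edited.md",
))
print("".join(diff))
```

A GUI diff viewer renders the same information side by side instead of as `+`/`-` lines, but the underlying comparison is equivalent.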

Textastic. Speaking of editing plain text files stored in Working Copy, it doesn't get any better than Textastic's ability to bookmark individual repositories as fast-access folders. Thanks to Files' document picker and Working Copy's document provider, I can pin team repositories in Textastic and make changes to Markdown files using the app's superior text editor. Edits from Textastic are mirrored in Working Copy – no need to manually save or duplicate files after editing them. I don't take advantage of Textastic's more advanced features, but as a front-end to Working Copy's GitHub repositories, it's perfect for my needs.

1Password. AgileBits' password manager may have evolved into a full-featured web service available on multiple platforms, but its native iOS app is as solid as ever. As a 1Password for Families subscriber, I set up shared vaults in 1Password so that my girlfriend and I can share important logins and private documents securely. This year, 1Password 7 introduced a refreshed Favorites screen and Quick Copy – two fantastic additions that, together with iOS 11 drag and drop, have made 1Password an even better iOS citizen. 1Password has been in my Must-Have list since I started this series several years ago, and quite possibly always will be. I trust 1Password and want it to stay around forever. [Review and previous coverage]

Media

Spotify. Another year has passed, and I still can't pick one streaming service between Apple Music and Spotify – there are features I like in both, and each complements the other. My girlfriend and I listen to a lot of music (she's a choreographer and needs to discover new music as well as upload her own mixes to iCloud) and we have family subscriptions to both services. However, I'd pick Spotify as my primary service, as it's where I spend the most time listening to music every day.

I love Spotify's support for multiple types of connected speakers.


Spotify still eclipses Apple Music when it comes to music discovery: features such as Discover Weekly, Release Radar, Daily Mix, and Your Time Capsule put Spotify well ahead of Apple and the company's three weekly mixes. While I prefer Apple Music's design and love its support for lyrics and native Watch playback, I keep discovering more new music and getting superior recommendations thanks to Spotify's advanced intelligence. Spotify also integrates with my Sonos, Amazon Echo, Nvidia Shield TV, Chromecast Ultra, and PlayStation 4 Pro, and I like the freedom of playing my music anywhere. Spotify's iOS apps could be a lot better (especially the boring iPad version), but the quality of its intelligence is unparalleled. [Previous coverage]

SongShift. As someone who often switches between Apple Music and Spotify, it is essential for me to have an easy way to seamlessly move playlists between the two services. SongShift is the best app for the job I've found. After connecting multiple streaming services to the app, SongShift lets you pick a source playlist and copy it to a destination – in my case, that often means copying a Spotify playlist into Apple Music so I have the latest version of my recent music discoveries on both services. Version 3.0 of SongShift added a redesigned UI and the ability to automatically update existing shifts in the background once new songs are added to a playlist, which has made using two services on my iPhone less cumbersome than it used to be. [Review]
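Copying a playlist between services boils down to matching the same song across two catalogs and adding whatever the destination is missing. Here's a deliberately simplified Python sketch of that matching step; real services match on catalog IDs, ISRCs, and fuzzy metadata, and the track fields below are hypothetical.

```python
def normalize(track):
    """Reduce a track to a comparable key. A toy simplification:
    real matchers also use ISRCs, durations, and fuzzy matching."""
    return (track["title"].casefold().strip(),
            track["artist"].casefold().strip())

def missing_tracks(source, destination):
    """Return source tracks not yet present in the destination playlist."""
    have = {normalize(t) for t in destination}
    return [t for t in source if normalize(t) not in have]

spotify_playlist = [
    {"title": "First Song", "artist": "Some Band"},
    {"title": "Second Song", "artist": "Another Band"},
]
apple_music_playlist = [
    {"title": "first song", "artist": "some band"},  # same song, different casing
]
to_copy = missing_tracks(spotify_playlist, apple_music_playlist)
print([t["title"] for t in to_copy])  # → ['Second Song']
```

SongShift's background auto-update feature is essentially this comparison run on a schedule, with only the `missing_tracks` result pushed to the destination service.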

Shazam. Recently acquired by Apple, Shazam has been my favorite app for identifying songs through the iPhone's microphone for years now. Before the acquisition, Shazam was heavily investing in refreshing its design to bolster artist discovery and recommendations, which has turned the app into more than a button to find the name of a song. Since getting an Apple Watch Series 3 last month, I've also started tagging songs directly from my wrist because the faster hardware in the Series 3 makes the experience almost as quick as on the iPhone. I'm not sure Shazam will be around as a standalone app next year, or if it'll become part of Siri and exclusive to Apple Music. If the standalone app goes away, I'll probably use SoundHound as a third-party utility that integrates with Spotify. [Previous coverage]

Record Bird. While Spotify tends to be pretty good at highlighting new music releases I could be interested in, Record Bird brings the convenience of a dedicated app whose sole purpose is showing you new releases from artists you like.

What sets Record Bird apart is that it's neither a music streaming service nor a song recognition tool: it's a quasi-RSS feed that highlights music releases with articles from popular music blogs, YouTube videos, and songs you can play by connecting your Apple Music and Spotify accounts. Even better though, Record Bird can show you a digest of new releases if you haven't opened the app in a while and send you notifications for new singles or albums you can't miss. Where Spotify's New Music Friday playlist falls short, Record Bird is always there to help. [Review]

Brain.FM. When people see me working in real life, they always ask about this app. Brain.FM plays procedurally generated "music" to help you in different contexts – there are sessions to focus and get some work done, music to help you fall asleep, and meditation audio to help you relax and clear your mind. The music played by Brain.FM is created by algorithms using sounds that various neuroscientists have observed to be effective in stimulating different parts of our brains. The company has a whitepaper on its website if you're interested in reading more about the science behind it. Brain.FM has been incredibly useful for me to concentrate when I'm working on a big story: the iPad Pro in Do Not Disturb mode, AirPods on, and the iPhone next to me playing a Brain.FM session is the best way I've found to isolate myself and focus exclusively on getting words into Ulysses. I also recommend giving Brain.FM a try if you have trouble falling asleep and don't want to wake up with rock music playing in the middle of the night.
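To give a flavor of what "procedurally generated audio" means at the simplest level, the Python sketch below synthesizes two slightly detuned sine waves, whose mix produces a slow "beat" – one basic building block of generative ambient sound. This is purely illustrative and is not Brain.FM's actual algorithm.

```python
import math

SAMPLE_RATE = 8000  # samples per second; a low rate keeps the example small

def two_tone(f1, f2, seconds):
    """Mix two sine waves. Slightly detuned frequencies produce a slow
    'beat' at |f1 - f2| Hz. An illustrative toy, not Brain.FM's method."""
    n = int(SAMPLE_RATE * seconds)
    return [
        0.5 * (math.sin(2 * math.pi * f1 * t / SAMPLE_RATE)
               + math.sin(2 * math.pi * f2 * t / SAMPLE_RATE))
        for t in range(n)
    ]

samples = two_tone(220.0, 224.0, 0.5)  # a 4 Hz beat between two tones
print(len(samples))  # → 4000
```

Writing `samples` to a WAV file (e.g. with the standard `wave` module) would make the beating audible; a real generative-audio engine layers many such parameterized sources.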

Overcast. After a summer stint with Apple's Podcasts app for iOS 11, I went back to Marco Arment's Overcast because I can't listen to podcasts without Smart Speed and Audio Boost. Overcast's audio effects are technically superior to every alternative I've tested, and the savings granted by Smart Speed add up over time, allowing me to listen to more episodes, and with better audio quality, than I would otherwise be able to. With version 4.0, Overcast gained support for drag and drop on both the iPhone and iPad, which helps me easily add episodes to my Queue and rearrange them as I see fit. [Review and previous coverage]

Musixmatch. Spotify's lack of built-in lyrics is remedied by Musixmatch, which I've used for a few years now primarily through the widget that displays real-time lyrics for the song currently playing in Spotify. Musixmatch (which is made in Italy) looks great as a third-party client for Spotify too: after connecting your account, you can browse charts and view playlists, starting playback directly from the app.

Musixmatch's latest version looks fantastic on iOS 11.


Musixmatch also supports offline lyrics for Premium users, which I found convenient when traveling, and its latest update has introduced a new iOS 11-inspired design that feels perfect for the app. [Previous coverage]

YouTube. There's no video without YouTube these days, and the app now has a permanent spot on all my Home screens. Over the past few months, I've started using YouTube even more thanks to the built-in "casting" feature to stream videos to my Nvidia Shield TV and Chromecast Ultra; the latter is connected to my LG OLED 4K TV, which, unlike YouTube on the iPhone X, can stream HDR videos. I'd like YouTube to extend mobile 4K HDR playback beyond Android devices in the future.

Plex. A couple of months ago, I consolidated our TV and movie setup in two places: Apple's TV app for content we buy or rent on iTunes, and Plex for everything else. I've always been a fan of Plex, but I hadn't been using it much in the past two years because my Synology NAS (a DS214play model) was too slow to transcode videos to an iOS-compatible format on the fly, resulting in slow loading times. Ultimately, that issue forced me to use a basic video player to open one video file at a time without organizing my media library at all. So after much deliberation, I tackled the problem from another angle: I bought an Nvidia Shield TV and started using it as a Plex Media Server instead of my slow NAS.

The Shield is an Android TV device, and, thanks to its Tegra X1 SoC, is fast enough to handle multiple transcoding sessions in Plex as well as stream 4K content. This was a lot of work (I had to configure the Synology as a storage location in the Shield, update Android, then install the beta version of Plex from the Google Play Store), but it was worth it: we now have fast and smooth Plex playback on all our iPads and Apple TV 4K, and couldn't be happier.

nPlayer. When I need to play a video that is in an odd format or isn't stored on my Synology, I use nPlayer. Any format, codec, streaming protocol, external connection, or aspect ratio you can think of, nPlayer likely supports it. This is an incredibly feature-rich and powerful video player that has never failed to play a file I've thrown at it. nPlayer can connect to your local NAS or Dropbox account, it supports streaming videos via AirPlay, Chromecast, and even other Smart TV protocols, has a large selection of gesture shortcuts and playlist management features, and more. There are too many options and supported formats to mention here, so just know this: if you've been looking for an advanced video player for iOS, this is the one to get.

Couchy. I switched to Couchy as my TV show tracker because its Collection and Calendar views work just like my brain wants them to. In the Collection, Couchy displays shows with unwatched episodes, which are sorted by the most recent one and carry a badge that tells you how many episodes you're behind – this is what the app calls Smart ordering. In the Calendar, Couchy has a feed of past and upcoming episodes so you can get a sense of what you can watch next, and what you missed over the past few days. Couchy integrates with Trakt to sync my library across multiple devices and platforms, and it also offers a handy Statistics screen to feel guilty about the time we've spent watching TV.

Location

Google Maps. I've tried to use Apple Maps more because, as I stated many times before, I prefer its clean and native aesthetic. The problem, though, is that Google Maps' data is simply better for my area in Rome: I can find more stores and POIs on Google Maps, which also supports public transit information. I'm also a heavy user of Street View, which I consult whenever I'm planning to park my car in a street I'm not familiar with. Until all these features show up in Apple Maps too, I'm going to keep using Google Maps.

Waze. Google's other mapping app, Waze, is the best way to beat traffic in Rome and make last-minute decisions about alternative routes. Waze's power lies in its community of drivers who report accidents, roadwork, and even speed cameras in the app; while I'm driving, even if I know where I'm going, I leave Waze open to receive important alerts and keep an eye on traffic.

Waze's Spotify integration is great for quick access to music controls.


There are other aspects of Waze I love, such as the Spotify integration to easily play songs from the map view, or the ability to plan drives and get time-to-leave notifications that are more accurate than anything else I've tried as they're based on real-time data. Plus, I'm a fan of Waze's colorful design and its POI data is top notch as well.

Google Trips. It's fair to say I'm partial to Google's data when it comes to location apps. Google Trips automatically assembles a travel itinerary based exclusively on emails you've received in your Gmail account. Google Trips figures out everything on its own: hotel reservations and plane tickets create a new "trip" in the app, where you have a dashboard for things to see in a new city, places to eat, other points of interest, plus useful information such as how to call emergency services or find local pharmacies. Google Trips represents the best (and what some might see as the most unsettling) parts of Google: sitting in the background, making sense of different data points in your email. I use Google Trips every time I travel, and I always discover something new and useful about it.

Lyft. I used to have Uber on my iPhone, but with time I've realized that Uber as a company is steeped in a toxic culture and questionable practices, so I switched to Lyft when I went to the U.S. in June for WWDC. To my surprise, Lyft didn't feel like a lesser alternative to Uber – it worked perfectly, and, since the last redesign, Lyft looks fantastic on iOS 11. Lyft supports Apple Pay, and it has an option to round up your fares to the nearest dollar and donate the difference to a charitable cause, including Girls Who Code and the ACLU. I hope Lyft will someday launch in Rome as well.

myTaxi. This doesn't happen often, but I occasionally need to move around Rome without using my car or public transit (usually for meetings during the day or to come back home after a night out clubbing). To avoid using Uber in Rome as well, I signed up for myTaxi last year and haven't regretted it. myTaxi essentially provides a service and a pretty iPhone app on top of the existing taxi network – here in Rome, all the rides I've taken with myTaxi were in regular taxis whose drivers had signed up for the service. myTaxi looks nice and is easy to use – I especially like the menu where you can book a car for a specific time and set driver tips and other car options beforehand. myTaxi is also available in other European cities – I used it in Barcelona earlier this year, and it worked well.

Health

Workouts++. While Apple's built-in Workout app does a good job at starting and monitoring workouts on the Apple Watch, it doesn't have a proper iPhone counterpart, and its personalization options are limited. Workouts++, as the name implies, enhances every aspect of workouts on iOS and watchOS: the iPhone app lets you inspect advanced workout stats with charts and other useful metrics, but it also allows you to customize what the Watch app shows you when you're working out.

On the Watch, Workouts++ supports haptic feedback, real-time heart rate monitoring, and even native podcast playback by caching or streaming episodes. Workouts++ is for everyone who wants more from Apple's Workout app. [Review]

AutoSleep. I've been wearing my Watch at night and tracking my sleep with this app since it launched almost two years ago. AutoSleep, developed by David Walsh, makes sleep tracking automatic and effortless: there are no buttons to press on the Watch, and no special modes to engage before you go to sleep. AutoSleep uses data captured by sensors on the Watch to understand when you fell asleep, how restless your sleep was, and what your heart rate was like while you were sleeping. Then, all these data points are visualized in the app (which has a bit of a learning curve due to its advanced controls) and saved in HealthKit for other apps to read. Thanks to AutoSleep, I know that I should make an effort to try and get more sleep each night. [Review]

HeartWatch. What Workouts++ is to workouts, HeartWatch is to your heart rate: it helps you monitor it and make sense of data captured by the Watch. Also by David Walsh, HeartWatch aggregates all heart rate data points from HealthKit and helps you understand how low or elevated your heart rate is at different moments of the day, or in different contexts (such as sitting down vs. deep sleep). Like AutoSleep, HeartWatch can be somewhat intimidating at first – there's a lot to uncover in this app. HeartWatch offers a unique feature set that can make wearing the Apple Watch literally lifesaving thanks to its notifications for elevated heart rate. [Review]

Gyroscope. This is my personal dashboard where every HealthKit data point comes together to paint an accurate picture of my life. I use Gyroscope to keep my heart rate, sleep, steps, and workout data in one place so I can refer back to individual days and months of the year to see how active I was.

I also connected Gyroscope to my Instagram, Foursquare, Moves, and Twitter accounts so that, in addition to health data, I can contextualize past events and entire weeks using what I shared or places I visited as reference. Gyroscope continues to expand with new integrations, redefining the "quantified self" movement with an astoundingly futuristic design and intelligence worth the cost of the Pro subscription.

Home

Home. One of the strangest aspects of HomeKit is that certain APIs made available to developers either haven't been built into Apple's own Home app, or they're too difficult to find. Consider, for instance, groups and automation triggers based on value ranges: groups can only be created by navigating into an accessory's detail screen, which is not intuitive at all; range-based triggers can't be created with Apple's app, but if you create them with a third-party HomeKit client, they show up in the Home app. This can get confusing pretty quickly, so instead I use the Home app developed by Matthias Hochgatterer.
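The range-based triggers mentioned above can be modeled as a callback that fires when a value enters a range. Here's a conceptual sketch in Python – this illustrates the idea only, and is in no way HomeKit's actual API:

```python
# A sketch of a range-based automation trigger, the concept described
# above: run an action when a characteristic's value enters a range.
# This models the idea only; it is not HomeKit's actual API.

def make_range_trigger(low, high, action):
    """Return a callback that fires `action` when a new reading
    crosses into [low, high] from outside it."""
    state = {"inside": False}

    def on_reading(value):
        inside = low <= value <= high
        if inside and not state["inside"]:
            action(value)          # fire once, on entering the range
        state["inside"] = inside

    return on_reading

fired = []
trigger = make_range_trigger(18.0, 21.0, lambda v: fired.append(v))
for temp in (23.5, 20.2, 19.8, 24.0, 18.5):
    trigger(temp)
print(fired)  # → [20.2, 18.5]
```

Note that the trigger fires only on entering the range, not on every in-range reading – the same debouncing a real automation needs to avoid running a scene repeatedly.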

A HomeKit UI that makes sense.

His Home app makes sense: there are tabs for rooms, scenes, and groups (no need to drill down into nested detail screens) and you can create multiple widgets by assigning your favorite items to them. Home provides a more conventional interface for managing HomeKit accessories and related services that I find easier to use, and the best part is, everything you create and modify in the app is supported by Apple's Home app and Siri as well.

Logi. We bought a wireless Logi Circle 2 camera a few months ago and installed it by our front door. I like the convenience of a wireless camera, but its battery life hasn't been nearly as good as advertised by Logitech (we have to put in our spare battery every couple of weeks). Also, because the camera has to aggressively conserve battery, it needs to wake up from low power mode every time you launch the Logi app, resulting in tedious loading times. Still, the Logi app has some interesting features: you can scrub through individual events on the right side of the screen, view a timelapse for the entire day, and talk to people and pets in the house using the iPhone's microphone. Unfortunately, the app hasn't been updated for the iPhone X yet.

Canary. I'm not a fan of how Canary moved certain features to subscription-only, but we still have to use their app to monitor two rooms in our house. Many of the features we used to have for free are now paywalled, so we can't access them anymore in the app, but I have to admit that since an update Canary released in August, performance has vastly improved. Basically, we still use Canary only because we have the cameras and haven't sold them yet. I'd like them gone by next year so I can upgrade my entire setup to HomeKit cameras.

News

Nuzzel. I've made an effort to spend less time on Twitter over the past year, and Nuzzel has proven to be more useful than ever. Thanks to Nuzzel, I can stay away from my Twitter timeline without missing out on the interesting links people I follow are sharing. Nuzzel is a great way to separate the wheat from the chaff (or, in 2017 terms, find the cool new app/videogame/article among extremist propaganda, harassment, and threats of an impending nuclear war – all of which is tacitly condoned by Twitter). Nuzzel also launched a Pro subscription with ad-free reading and keyword filtering, but I'm not sure that's worth $99/year for me. I've been a free Nuzzel user for years though, so I should probably consider it as a way to support the company. [Review]

lire. Earlier this year, I spent a couple of months re-testing all the major RSS services to ensure Inoreader was still the best option for me. I eventually went back to Inoreader because of its filtering features, which allow me to subscribe to high-volume feeds but trim them down with rules so I only see articles that contain certain keywords in their titles.

While I was switching services, I also tried all the modern RSS apps I could find. I settled on lire, an RSS reader that's been around for years and that never grabbed my attention before. lire has been fully redesigned for iOS 11, taking advantage of Apple's large title design style to neatly indicate different folders and sections. In addition to a clean design that feels good on the iPhone X and iPad Pro, lire has two peculiarities: it supports all the most popular RSS services (including Inoreader) and it comes with its own text extraction tool to load the full text of truncated stories. The full-text option can be enabled on a per-site basis, and it works well in combination with caching for read articles. lire looks native to iOS in a way that the official apps by Inoreader, NewsBlur, and Feedly don't, and it's actively supported by its developer with frequent updates.
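The title-based filtering described above boils down to a simple keyword match over incoming feed items. Here's a minimal sketch of the idea in Python – the item structure and keywords are hypothetical examples, not Inoreader's actual rule syntax:

```python
# Sketch of keyword-based feed filtering, similar in spirit to
# Inoreader's rules: keep only items whose titles contain one of
# the configured keywords. The item structure here is hypothetical.

KEYWORDS = {"ios", "ipad", "shortcuts"}  # example keywords, not from the article

def keep(item: dict) -> bool:
    """Return True if the item's title mentions any keyword."""
    title = item.get("title", "").lower()
    return any(k in title for k in KEYWORDS)

items = [
    {"title": "A Deep Dive into iPad Multitasking"},
    {"title": "Weekend Open Thread"},
    {"title": "iOS 11 Review"},
]

filtered = [i["title"] for i in items if keep(i)]
print(filtered)  # → ['A Deep Dive into iPad Multitasking', 'iOS 11 Review']
```

The appeal of running this server-side, as Inoreader does, is that high-volume feeds are trimmed before they ever reach the client.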

Apollo. I'm not a heavy Reddit user – I read and "lurk" a lot, but I don't post much unless someone mentions me in a comment or sends me a DM. Apollo, which is the result of years of work by Christian Selig, is the best Reddit experience I've ever had on any platform.

With a clean design that works well in light and dark mode, customizable gestures, and advanced features such as filters and fast access to my favorite subreddits, Apollo lets me personalize Reddit to my needs and fine-tune it so I'm not overwhelmed by the amount of available topics or distracted by content I don't want to see. There's a lot of good stuff to discover on Reddit besides its memes and questionable threads; Apollo is a product of love that exudes elegance and power through its many personalization options. [Review]

Social

Tweetbot. On multiple occasions this year, I considered giving Twitterrific a try as my main Twitter client. I like Twitterrific's design and The Iconfactory's frequent update cycle. Still, there are features exclusive to Tweetbot that keep me in the app because they represent how I want to use Twitter. Tweetbot's multi-column layout on iPad is a must-have for me: I need to be able to scroll two instances of my mentions while I reply to readers. Also, Tweetbot lets me peek at a tweet's retweet and like counts with 3D Touch, and I can tap on those counts for my own tweets to see who shared and liked them.

Multi-column support in Tweetbot for iPad is still unparalleled on the platform.

Additionally, I like that Tweetbot lets me preview images I've already attached to a tweet I'm composing so I can double-check them in full screen. Tweetbot isn't evolving at the pace I'd like it to, but it's deeply entrenched in the way I use Twitter and I can't use anything else for now. [Review]

Instagram. I'm not a huge Instagram user: I post photos rarely and I don't document my daily life with Stories. I like keeping up with my friends by watching their stories though. I never got into Snapchat too much because none of my friends were using it, but Instagram's adoption of the feature has been a smashing success here in Italy (and elsewhere). While my Facebook has turned into a cringe-inducing feed of people fighting over politics, Instagram has remained fun and lighthearted, and I think Stories played an important role in that.

Linky. Tweetbot gained a share extension this year after Apple's removal of the native Twitter share option from iOS 11, but I still prefer Linky's dedicated extension to share links from other apps. Linky's supercharged share sheet comes with syntax highlighting, clipboard link detection, and an easy way to switch between multiple accounts.

You can drag webpage elements from Safari into Linky's extension.

In iOS 11, you can even pick up elements from a Safari webpage and drop them into Linky's extension. If I'm not sending tweets from Tweetbot, I use Linky. [Previous coverage]

WhatsApp. I often complain about having to use WhatsApp, even if better alternatives exist, because all my friends are on it and it's the de facto messaging app in Italy. I should also note, however, that Facebook has been iterating on WhatsApp at a faster pace over the past year. WhatsApp now lets you delete messages you've sent in the last 7 minutes, offers live location sharing and in-chat search, and supports asking Siri to read your latest messages. WhatsApp has been substantially improved in 2017, and even if it still doesn't offer an iPad app, it's not as lackluster as it used to be.

Photo and Video

Google Photos. iCloud Photo Library is my primary photo management service, but I keep Google Photos on my iPhone as a backup option and because its search results are sometimes more accurate than Apple's. I don't pay for Google Photos, but I like the peace of mind of knowing that, if something catastrophic were to happen to Apple's servers, at least I'd have a decent-quality copy of my library. I don't spend a lot of time interacting with Google Photos, but I've occasionally saved recommendations provided by the app's assistant, such as edited photos and animations.

Pixelmator. The only advanced graphics editor I know how to use on iOS. Pixelmator's greatest strength is making complex operations intuitive and consistent with the interactions we'd expect from the iOS platform.

I probably need a Mac Pro for these complex operations.

I'm not a professional graphic designer, but whenever I need to make edits to an image that Apple's Photos app doesn't support, Pixelmator never disappoints. I'm excited about Pixelmator Pro coming to iPad in the future.

Annotable. If you see an annotated screenshot in my reviews that features arrows, rectangular selections, or magnification loupes, it was edited with Annotable. The app started as a spiritual successor to Skitch, but it has grown into something much more powerful with dozens of annotation tools and advanced controls for color and sizes of onscreen elements. I use the app's redaction and arrow tools almost daily. I'm especially fond of the ability to edit images from Photos through Annotable's extension, which carries all the functionality of the main app. [Review and previous coverage]

GIF Toaster. This lesser-known utility is the best way to make GIFs out of videos on iOS, and it works particularly well with iOS 11's native screen recording capabilities. GIF Toaster allows you to convert videos to GIFs with a wide range of advanced controls: you can change the FPS value, tweak the range of a video to convert, adjust speed, and even change the orientation of a video and crop it beforehand.

GIF Toaster can generate GIFs with two types of encoders: a TrueColor one that is slower but more accurate, and a hardware encoder that is much faster, but prone to image artifacts. All the GIFs I use in my stories are created with GIF Toaster's TrueColor encoder, which is fast enough on the iPhone X. GIF Toaster is the kind of power-user app that does one thing incredibly well.

Utilities

AnyFont. This is the app I've been using for years to install custom fonts on iOS. Thanks to AnyFont, I can manually install San Francisco, Nitti, and IBM Plex on my devices and enjoy them in my favorite text editors with just a few taps. [Previous coverage]

Blink. I was a heavy Blink user well before John, its creator, joined MacStories. Blink is the best way to generate iTunes affiliate links for App Store content, which I use in all my stories, including this one. Blink saves me a lot of time every day, and it's the utility behind one of MacStories' key revenue segments. [Review]

Terminology. Greg Pierce's fantastic dictionary and thesaurus app isn't on my devices because of its primary functionality – Apple's built-in dictionary is enough for me. Instead, I use Terminology to save new words I come across while reading. Every few weeks or so, I give my iPhone to my girlfriend and she quizzes me on the meaning of words I saved in Terminology's Favorites screen. As a non-native English speaker, this has been a useful exercise to perform on a regular basis. [Review]

CARROT Weather. I resisted switching to CARROT Weather for a long time because, while I loved its approach that wedded personality to great weather data presentation, I needed to use Weather Underground. With the launch of version 4.0 this year, developer Brian Mueller outdid himself: not only is CARROT as funny as ever, but the app has been redesigned to accommodate even more views and stats, with the ability to unlock Weather Underground as a data provider.

I hear that this app is kind and loving and wants to hug us all.

CARROT now lets me use the weather station by the end of the street where I live, has fully customizable iPhone and Watch apps, and it combines insane dialogue with a flexible forecast UI. CARROT Weather is a case study on how to stand out in a crowded market. [Review and previous coverage]

DS File. We have a Synology NAS at home, and this is the company's official file manager for their DSM software. DS File isn't an amazing app: it just gets the job done, and it doesn't even support the iPhone X resolution yet. I open DS File when I need to move a TV show into the appropriate Plex Media Server location, and that's about it. [Previous coverage]

Deliveries. I may or may not have an Amazon shopping problem, and Deliveries may or may not be responsible for facilitating my habits by making it super-simple to track packages and get notifications for status updates. I use this app more than I like to admit.

1Blocker. I want to support my favorite websites, but trackers that follow me around the web and slow down my iPhone, draining its battery on 4G, are beyond my level of acceptance. 1Blocker makes the web a slightly less terrible place by blocking those creepy banners and scripts that often make webpages unusable, while still allowing me to whitelist websites I like. [Review of Mac version]

SpamHound. I've always had a problem with SMS spam in Italy, and thankfully Apple intervened with iOS 11 by opening up a new extension point for developers to write SMS spam filtering apps. Unlike others, SpamHound – even though it only runs in iPhone compatibility mode on the iPad – lets me sync my SMS filtering rules across devices, and it supports writing complex filters with regex and wildcards.
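Regex-based message filtering of the kind SpamHound supports can be illustrated with a short sketch – the patterns below are invented examples, not the app's actual rule syntax:

```python
import re

# Illustrative regex-based SMS filtering in the spirit of SpamHound's
# rules. These patterns are hypothetical examples, not the app's syntax.
SPAM_PATTERNS = [
    re.compile(r"(?i)\bfree\b.*\bprize\b"),   # "free ... prize" in any case
    re.compile(r"^\+?899\d+$"),               # made-up premium-rate sender prefix
]

def is_spam(sender: str, body: str) -> bool:
    """Flag a message if the sender or body matches any pattern."""
    return any(p.search(sender) or p.search(body) for p in SPAM_PATTERNS)

print(is_spam("+8991234567", "Hi!"))                  # → True
print(is_spam("+393331234567", "Free prize inside"))  # → True
print(is_spam("+393331234567", "Dinner tonight?"))    # → False
```

Syncing rules across devices, as SpamHound does, means a filter like this only has to be written once.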

Opener. As you can probably imagine, I often prefer to use third-party clients in lieu of official apps for popular services. For instance, I prefer Tweetbot to Twitter and Apollo to Reddit. Opener is the glue between Universal Links (which always open in a service's official app on iOS) and third-party clients, allowing me to seamlessly open a URL directly in the app I like to use. I invoke the Opener extension to view twitter.com URLs in Tweetbot on a daily basis.

PCalc. You'd think that most people now open PCalc to play a car game and fling bananas in AR mode, but no, I'm still primarily using it as a calculator. PCalc lets me customize its layout and perform currency and unit conversions from a unified interface, and it's been on my Home screen for years now. [Previous coverage]

Bobby. As more and more apps adopt a subscription model, it's become necessary to keep track of all the subscriptions we pay for every month. Bobby solves this problem by providing a beautiful dashboard for all your recurring subscriptions.

Bobby supports iCloud backup and sync, has some great touches such as the ability to set custom icons and colors for each subscription, offers a built-in database of popular services, and provides various customization options. All my subscriptions are tracked in Bobby now. [Review]

Grocery. This utility by Conrad Stoll is a genius take on the classic grocery shopping list app that uses machine learning to automatically sort items for you. As you add items and check them off while you're grabbing them off the shelf at the grocery store, the app learns the order in which you shop. The next time you re-add items to your list, they will be automatically sorted based on your habits. There's a lot to like in Grocery: it looks great on the iPhone X's OLED display, it supports alternate icons (I recommend the strawberry one), and it also auto-completes previously purchased items.

GIPHY. It is of paramount importance for me to be able to find the appropriate Eddy Cue or Jean-Ralphio GIF when I need it. GIPHY is the biggest and most popular GIF search engine around these days, integrated with a variety of services from Twitter to Slack. GIPHY's iPhone app is solid: you can mark GIFs as favorites in your account, browse by category or trending GIFs, and even create your own GIFs by uploading a video from your device.

Lookmark. Originally an alternative wish list for the iTunes Store and App Store, Lookmark has grown into a powerful utility I use to track app updates and price drops. Thanks to a web service that monitors changes to app listings on the App Store, Lookmark can now send you push notifications whenever one of your favorite apps gets an update or goes on sale.

Lookmark even lets you see an app's changelog from an expanded notification.

This has been a terrific addition for me: I can now keep only the apps I actually use on my devices, but still stay on top of interesting updates for everything else through Lookmark. [Review]

TextExpander. Who's got time to constantly type email addresses or the full names of iOS APIs? I've been using TextExpander since I got my first Mac in 2008; these days, I'm a subscriber to their online service and use its snippets when I'm writing in Bear and Ulysses. My wish is that someday Apple will integrate TextExpander support into their own apps too, as it's considerably more powerful than native text replacements.

Copied. I've long used Copied as my clipboard manager on the iPhone, iPad, and Mac. Copied syncs with iCloud and supports a surprisingly wide range of advanced features, from JavaScript automation to templates and merging. With iOS 11, Copied gained full integration with drag and drop, making it an ideal Split View or Slide Over companion when you want to archive multiple bits of text in the app. I use Copied as a reference tool for text templates and URLs I frequently share over email, and as a way to quickly save interesting links thanks to its widget. [Review and previous coverage]

TransferWise. Since discovering this person-to-person payment service a few months ago, I've been trying to move away from PayPal as much as possible. TransferWise uses the mid-market exchange rate between currencies, thus avoiding the fees PayPal hides in the custom rate it sets. In addition, TransferWise supports Apple Pay for funding transfers directly from your iOS devices, making it an even more attractive option for secure peer-to-peer payments (that is, until Apple Pay Cash launches for everyone).
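The difference between a mid-market rate and a marked-up rate is easy to quantify. A tiny sketch with invented numbers – neither rate reflects any real provider's pricing:

```python
# Hypothetical numbers to illustrate why the exchange rate matters:
# a provider that marks up the rate effectively charges a hidden fee,
# even before any explicit transfer fee. Both rates below are invented.
amount_eur = 100.0
mid_market = 1.20      # assumed EUR→USD mid-market rate
marked_up = 1.16       # assumed provider rate with a markup baked in

usd_mid = amount_eur * mid_market
usd_marked = amount_eur * marked_up
hidden_fee = usd_mid - usd_marked
print(f"Recipient loses ${hidden_fee:.2f} to the rate markup")  # → $4.00
```

The markup never appears as a line item, which is what makes mid-market pricing the more transparent option.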

PayPal. Unfortunately, I still have to use PayPal to send and receive payments to and from friends who are not on TransferWise. I recommend using the PayPal app on iPhone instead of the website: navigation is more intuitive, sending money is easier, and you can also manage payments using Siri, which I've done a few times this year.

Kpressor. One of Apple's most perplexing decisions in iOS 11 is the absence of archive-related functionality in the Files app to create and extract .zip files. I used Readdle's Documents file manager to handle .zip files on iOS for years, but I wanted a dedicated .zip utility after consolidating all my file management in iOS 11's Files. Kpressor does what I need: the app can be used as an extension within Files to decompress a selected archive and save it in-place, or you can share multiple files from the app and compress them with the extension as well. Alternatively, you can open Kpressor to manually pick files to compress with a native Files picker, or open existing archives from Files to extract their contents. In the future, I hope apps like Kpressor can become extensions natively integrated with Files' toolbar and action menu.

PDF Viewer. I switched from Readdle's PDF Expert because PDF Viewer supports iOS 11's document browser, allowing me to manage my PDFs with a consistent working set of files and folders instead of having a separate file manager UI. PDF Viewer is based on the excellent PSPDFKit engine used by thousands of apps; the app has all the basic tools I need to annotate PDFs, but most importantly it's embedded within Files and iCloud Drive. [Review]

Feed Hawk. Most RSS readers on iOS don't offer the ability to subscribe to new feeds directly from the app. Feed Hawk, developed by Golden Hill Software (the same company behind Unread), is a handy utility that uses an action extension to add feeds to your RSS service of choice. After invoking Feed Hawk's action extension in Safari, it'll automatically scan the webpage for RSS feeds and ask you to add them to the service you've configured in the app. Feed Hawk supports the most popular RSS services, including Inoreader and NewsBlur. [Review]
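Feed autodiscovery of this kind conventionally works by scanning a page's head for link tags marked as alternate representations with a feed MIME type. Here's a minimal sketch using Python's standard library – an illustration of the technique, not Feed Hawk's actual implementation:

```python
from html.parser import HTMLParser

# A minimal sketch of RSS/Atom feed autodiscovery, the technique an
# app like Feed Hawk relies on: scan a page for <link rel="alternate">
# tags whose type is a feed MIME type.
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and a.get("rel") == "alternate"
                and a.get("type") in FEED_TYPES and "href" in a):
            self.feeds.append(a["href"])

html = """<html><head>
<link rel="alternate" type="application/rss+xml" href="/feed/">
<link rel="stylesheet" href="/style.css">
</head><body></body></html>"""

finder = FeedFinder()
finder.feed(html)
print(finder.feeds)  # → ['/feed/']
```

A real implementation would also resolve relative URLs against the page's address before subscribing.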

Launcher. I switched to Launcher a few years ago and I still use it to quickly open specific views inside my favorite apps or run workflows for MacStories and personal research.

I should update these launchers for Things.

Launcher lets me customize icons so they have a smaller size or custom artwork, and it uses iCloud to back up and restore its widgets across devices. I'm in the process of rethinking my launchers for Spotify and Things, which I'll write about in the near future. [Review]

Feature of the Year

Apollo's Jump Bar

Most Reddit apps come with clunky navigation systems that require you to dig deep into nested views to get to your favorite subreddits. With its Jump Bar, Apollo rethinks the concept of a favorite page altogether, allowing you to easily switch between different views in the app simply by tapping the title bar, which combines favorites with autocomplete for popular subreddit search results. Apollo's Jump Bar is a fantastic example of how a classic iOS interface element can be reimagined and improved, and it's my favorite feature of the year.

Runner-Up

Lookmark's Notifications

I write about apps for a living, and as such I need to keep tabs on app updates as much as possible. Lookmark makes it possible for me to never miss an update even if I don't have the apps I'm monitoring installed on my iOS devices. If you care about keeping on top of app releases, Lookmark's new notification service is a must-have.


Redesign of the Year

Things 3.0

Earlier this year, Cultured Code relaunched their popular task manager with a focus on elegance and simplicity. The result was a bold aesthetic that, while predating iOS 11, felt instantly at home on Apple's new operating system when it launched. Several months later, Things 3.0 still manages to strike a balance of consistency with the platform while also looking unlike anything else on my devices. Things 3.0 is stunning, and an inspiration for other designers on how to build upon Apple's design language.

Runner-Up

Ulysses 12

With an update that blended Apple's large title approach with a major overhaul of the app's layout, Ulysses 12 is a rare instance of a redesign that feels new without causing any confusion or initial perplexity. It simply makes sense. Ulysses 12's redesign has dramatically improved navigation in the app, and it's remarkable on the 12.9-inch iPad Pro.


Update of the Year

CARROT Weather 4.0

It's hard to stand out in the App Store these days, and it's even harder if your app is part of a crowded category such as weather utilities. CARROT Weather has always been unique, but with version 4.0 developer Brian Mueller managed to fundamentally improve every aspect of the experience – from the app's personality and weather sources to deep customization on iPhone and, with the 4.3 update, even Apple Watch. CARROT Weather's evolution into a mature product that hasn't compromised on its original vision has been outstanding, and Mueller's work in 2017 deserves to be recognized and celebrated.

Runners-Up

Bear 1.3 and 1.4

In the span of two months, Shiny Frog brought advanced drag and drop to Bear with the Drop Bar and support for rich text drag items, then supercharged its tagging system with auto-complete and icons for popular tags. Bear shows how a well thought-out subscription model – not a hastily implemented one – can let small development shops build sustainable productivity apps that are constantly iterated upon.

Workouts++ 2.0

David Smith's advanced fitness app encapsulates everything I want from a utility that keeps track of my workouts: stats, the right amount of customization, and a little more flexibility than Apple's built-in solution. Workouts++ 2.0 may have launched towards the end of the year, but it's clearly one of the best app updates we've seen in 2017.


Debut of the Year

Gladys

An app that flew under everyone's radar when it launched and somehow managed to become the best drag and drop assistant for the iPad. Gladys is the missing shelf from iOS 11 – an app that can hold anything you throw at it, sync it with iCloud, and keep multiple versions of each item in its library. In just a few months, Gladys has become my go-to app for dealing with all kinds of file attachments, rich text clippings, and images I need to move between apps. If Gladys' update cycle in 2017 is any indication, we should keep an eye on this app next year.

Runner-Up

Apollo

I didn't think it was possible, but Apollo made me like browsing Reddit again. The years of work that developer Christian Selig poured into Apollo are evident if you use his app just for a few minutes, and it's the kind of passion project with a deep attention to detail that ought to be studied and admired.


App of the Year

Ulysses

I've spent hundreds of hours writing in Ulysses this year. The app is a superb reinvention of Markdown for the modern age of iOS text editing, which enhances the writing experience with features such as inline link and footnote editors, smart paste, glued sheets, and filters. Ulysses is more than a plain text editor: it's a professional writing suite based on plain text and integrated with other iOS apps through automation, drag and drop, and extensions.

Ulysses has defined the past year of writing at MacStories: there's nothing else like it on iOS, and its developers have adopted a business model that allows them to continuously improve, fix, and innovate.

Ulysses represents the modern pro app for iOS and, without hesitation, it is my App of the Year.

Runners-Up

A few honorable mentions for apps that also defined my iOS usage in 2017. These are fantastic pieces of software, showing how the iOS developer community is still vibrant and thriving.


2017

Looking back at the apps I've used in 2017, I see two emerging themes: iOS 11 is reshaping many of my favorite apps, primarily because of iPad multitasking or drag and drop; and, as time goes on, I increasingly value good UI design and have less tolerance for cross-platform apps that don't feel native on iOS. The former is something I expected, as I wrote in last year's conclusion. As for the latter: perhaps, as I approach my thirties, I'm getting older and wiser; personally, I just think the best hardware products Apple has ever shipped – the iPhone X and 2017 iPad Pros – demand the absolute best software they can run.

In hindsight, I was wrong about Workflow having a bigger impact on my iOS usage in 2017 – or, at least, I was off by a year, as I couldn't predict Apple's acquisition. So I'm going to repeat what I wrote at the end of 2016: it'll be interesting to see what an entirely new Workflow made by Apple could unlock for iOS productivity in 2018, and how it could tie into Siri and HomePod as well. Even though its development cycle has slowed down, I remain optimistic about the future of Workflow.

I expect Apple to continue iterating on iPad productivity enhancements next year (I don't think they're done with iOS 11), and, obviously, I will continue discovering and experimenting with new apps. iOS 11 revitalized the App Store, and I'm excited to see what developers will invent next.

As always, let's check back in a year.


  1. I noticed that the app now even displays a prompt for the most important release of the week based on your previous listening habits. ↩︎
  2. We also started experimenting with Plex DVR with a digital tuner we bought a while back, but I think I'll have to update the tuner's software using Windows for the best experience, which I haven't been able to do yet. ↩︎
  3. But should you really feel guilty for watching Parks and Rec from start to finish twice? ↩︎

Want more from MacStories?

Club MacStories offers exclusive access to extra MacStories content, delivered every week.

Club MacStories will help you discover the best apps for your devices and get the most out of your iPhone, iPad, and Mac. Plus, it's made in Italy.

Join Now


iOS 11: The MacStories Review

Permalink - Posted on 2017-07-11 15:05, modified on 2017-09-19 00:22

For the second time in three years, the iPad isn’t following in the iPhone’s footsteps. With iOS 11, the iPad is going in its own direction – this time, with no cliffhanger.


App Extensions Are Not a Replacement for User Automation

Permalink - Posted on 2016-12-21 22:03, modified on 2017-01-10 16:36

Here’s a thought experiment. Let’s imagine that Apple decided to combine their engineering resources to form app teams that delivered both iOS and macOS versions of applications.

In such a scenario it may seem logical to retain application features common to both platforms and to remove those that were perceived to require extra resources. Certainly Automation would be something examined in that regard, and the idea might be posited that: “App Extensions are equivalent to, or could be a replacement for, User Automation in macOS.” And by User Automation, I’m referring to Apple Event scripting, Automator, Services, the UNIX command line utilities, etc.

Let’s examine the validity of that conjecture, beginning with overviews of App Extensions and User Automation.

What are App Extensions?

App Extensions (often called simply “extensions”) are essentially application (or framework) plugins that provide the functionality to perform specific tasks, such as manipulating an image, posting media items to social services, sharing items and documents, or filtering an audio stream. There are many types of extensions, and you control their use in the Extensions preference pane of the System Preferences application.

But the most important ability of App Extensions is that they can be used by apps other than the one in which they are contained. For example, an application that creates QR codes may also contain an app extension that enables other applications to use the containing application’s functions to create and place QR codes within their own documents.

Extensions physically exist as bundles (with the file extension “appex”) placed within their containing application’s bundle. However, macOS will make them available while you are using other applications by placing them at “extension points,” such as in the Share Menu, or in the Photos application when it is in edit mode.
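Because extensions are just “appex” bundles nested inside the containing app's bundle (conventionally under Contents/PlugIns), one quick way to see which extensions an app ships is to search for them from the shell. A minimal sketch — the function name and the example path are placeholders:

```shell
# List the App Extension (.appex) bundles inside an app bundle.
# The function name is hypothetical; pass the path to a .app bundle,
# e.g. list_appex "/Applications/Example.app".
list_appex() {
  find "$1" -type d -name '*.appex'
}
```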

In the example shown above, extensions included in the Pixelmator application are available as filter and editing options in the Photos application. When an extension is chosen, the Photos interface changes to display the image and extension controls.

You can make edits to the image using the tools provided by the extension, click the “Save Changes” button, and you will be returned to the standard Photos interface.

Extension Points are exposed in multiple locations in macOS, from additional data sharing options added to the Share Menu, to inline content such as the Markup extension in Mail. The chart below lists some of the macOS extension points (types):

How do App Extensions work?

By design, most App Extensions follow specific rules governing how they function. In general:

  • Some app extensions, like content blockers, can run without user interaction. Others, like those that manipulate images, are triggered only by direct user action, such as selecting a menu item or pressing a button.
  • Once triggered, the extension is provided a copy of the object or data currently selected in the hosting application. The extension may display a user interface (if needed), process the data, and return the edited data to the hosting app.
  • By design, App Extensions have restricted access to resources outside of their bundle, and restrictions are placed on their access to the macOS frameworks, networks, and the user environment.

Who creates App Extensions?

App Extensions are created by Apple developers in Xcode using either the Swift or Objective-C programming languages. Non-programmers are not the target audience for developing app extensions.

App Extensions Summary

App Extensions are developer-created application plugins, exposing powerful functionality within multiple applications at pre-determined “Extension Points,” such as in the Share Menu, or in the Today view of the Notification Center. App Extensions are executed within a defined set of parameters and restrictions, and their scope is often limited to processing the selection in the host application.

What is User Automation?

The term User Automation represents multiple technologies and architectures whose purpose is to enable the user to create simple or extended workflows for editing documents, processing data, and performing repetitive tasks.

The technologies of User Automation operate with the user’s level of authority and are not restricted in the scope of their abilities: whatever the user can do, they can do. They have unrivaled access to the frameworks of the operating system, and to the extensive library of UNIX tools that ship with macOS. And most importantly, AppleScript, JavaScript (JXA), and Automator can directly query and control scriptable applications using Apple Events, the inter-application communication technology that has existed on the Mac since System 7.

macOS applications that support Apple Events are said to be “scriptable,” in that they allow the built-in Cocoa scripting frameworks of macOS to communicate with and control the application and its elements. Every “scriptable” application includes a “dictionary” providing detailed information about every scriptable object in the application and the commands used to interact with them. As an example, the dictionary excerpt shown below is from the Keynote application’s dictionary, and describes the slide object of a Keynote presentation document.

slide: [inh. iWork container; see also Compatibility Suite] : A slide in a presentation document.

elements

contains audio clips, charts, iWork items, lines, movies, shapes, tables, text items; contained by document.

properties

base slide (master slide) : The master slide this slide is based upon.

body showing (boolean) : Is the default body text item visible on the slide?

default body item (text item, r/o) : The default text item container for the body text of the slide. Its visibility is determined by the value of the body showing property.

default title item (text item, r/o) : The default text item container for the title text of the slide. Its visibility is determined by the value of the title showing property.

presenter notes (rich text) : The presenter notes for the slide.

skipped (boolean) : Is the slide skipped?

slide number (integer, r/o) : The index of the slide from the beginning of the document. Note: skipped slides have a slide number value of -1. Un-skipped slides have a positive slide number value.

title showing (boolean) : Is the default title text item visible on the slide?

transition properties (transition settings) : A list of key/value pairs for the properties of the slide’s transition.

responds to

get, set, delete, exists, make, move, duplicate

Using this dictionary, scripts written using the macOS scripting languages can create, move, duplicate, and delete slides, as well as change their master slide, read, set, and edit their displayed text, add images and video elements, and even adjust whether they are skipped or shown. No other architecture or framework in macOS offers the ability to query and control the internal workings of applications like Apple Events. And most of the important productivity applications, such as Microsoft Office, Adobe InDesign, and the excellent suite of apps from the Omni Group, include robust and extensive scripting support, and are the “go-to apps” for implementing powerful user automation workflows.
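To make the dictionary concrete, here is a small sketch (ours, not from the original article, and untested) that uses the skipped and presenter notes properties documented above to skip every slide that has no presenter notes:

```applescript
-- Hypothetical, untested sketch: skip every slide that has no
-- presenter notes, using the "skipped" and "presenter notes"
-- properties from the dictionary excerpt above.
tell application "Keynote"
    tell the front document
        repeat with aSlide in every slide
            if (presenter notes of aSlide as text) is "" then
                set skipped of aSlide to true
            end if
        end repeat
    end tell
end tell
```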

Apple Events are used to communicate with applications. For communication with the OS and its tools, the User Automation technologies of AppleScript and JavaScript (JXA) have integrated code “bridges” that provide them access to all of the Cocoa frameworks, some of which are listed in this graphic.

With these frameworks, as well as the default UNIX tools of macOS, the scope of what scripts can do expands to an impressive range of abilities.
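One common bridge to that UNIX layer is AppleScript's built-in `do shell script` command, which hands a command line to the shell and returns its output. The pipeline itself is ordinary shell; as a hedged sketch (the function name and the file are placeholders), a script might count the unique words in a text file like this:

```shell
# The kind of pipeline an AppleScript could pass to `do shell script`:
# split a file into one word per line, de-duplicate, and count.
count_unique_words() {
  tr -s '[:space:]' '\n' < "$1" | sort -u | wc -l
}
```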

And since User Automation can freely work across applications and frameworks, creating workflows that involve multiple apps and the transfer and transformation of data is common practice. For example, the workflow shown in the animation below begins with converting a table in a Numbers document into a chart in Keynote, then adding text, generating a QR code, and finally sending the completed Keynote document in an email message. All of these actions are possible with user automation.

Below is the AppleScript code used to transfer and transform Numbers table data into a new chart in Keynote. As you can see, scripting support (Apple Events) provides direct access to the internal workings of the iWork applications. An Apple Events-based script can perform a deep query of the selected table object in Numbers to retrieve a range of data, and then use the retrieved data to create a new slide and chart in Keynote.

tell application "Numbers"
    tell the front document
        set aTable to the first table of active sheet whose class of selection range is range
        set aRange to cell range of aTable
        tell aTable
            copy {column count, row count} to {columnCount, rowCount}
            copy {header column count, header row count, footer row count} to {headerColumnCount, headerRowCount, FooterRowCount}
            set dataColumnCount to columnCount - headerColumnCount
            set dataRowCount to rowCount - headerRowCount - FooterRowCount
            if headerRowCount is 0 then
                set columnNames to {}
                repeat dataColumnCount times
                    set the end of the columnNames to ""
                end repeat
            else
                set columnNames to the formatted value of cells (headerColumnCount + 1) thru -1 of row headerRowCount
            end if
            if headerColumnCount is 0 then
                set rowNames to {}
                repeat dataRowCount times
                    set the end of the rowNames to ""
                end repeat
            else
                set rowNames to the formatted value of cells (headerRowCount + 1) thru (FooterRowCount - 1) of column headerColumnCount
            end if
            set topLeftRangeCellID to the name of cell (headerColumnCount + 1) of row (headerRowCount + 1)
            set bottomRightRangeCellID to the name of last cell of row ((FooterRowCount + 1) * -1)
            set tableDataRange to range (topLeftRangeCellID & ":" & bottomRightRangeCellID)
            set cellValuesArray to value of every cell of tableDataRange
            set chartData to {}
            set x to 1
            repeat ((count of cellValuesArray) div dataColumnCount) times
                set the end of chartData to items x thru (x + (dataColumnCount - 1)) of cellValuesArray
                set x to x + dataColumnCount
            end repeat
        end tell
    end tell
end tell
tell application "Keynote"
    activate
    tell front document
        set aSlide to make new slide with properties {base slide:master slide "Blank"}
        tell aSlide
            add chart row names rowNames column names columnNames data chartData type vertical_bar_2d group by chart column
        end tell
    end tell
end tell

Open in Script Editor

And if you prefer using a scripting language other than AppleScript, the same Apple Event functionality could be delivered via a JavaScript (JXA) script…

var Keynote = Application('com.apple.iWork.Keynote')
Keynote.activate()
var frontDocument = Keynote.documents[0]
var aMasterSlide = frontDocument.masterSlides["Blank"]
var newSlide = Keynote.Slide({baseSlide:aMasterSlide})
frontDocument.slides.push(newSlide)
// rwNames, colNames, and chartData are built earlier in the full
// script, from the Numbers table data.
Keynote.addChart(newSlide,{rowNames:rwNames, columnNames:colNames, data:chartData, type:"vertical_bar_2d", groupBy:"chart row"})

Excerpt shown here, open in script editor to see the full script.

…or by using Automator actions for Keynote.

Automator

And then there’s Automator, the unrivaled tool for creating your own “automation recipes” using a simple drag-and-drop process of stringing together nuggets of predefined functionality (Automator actions) into “workflows.” Saved workflows can be executed at integrated extension points throughout macOS: as contextual System Services, as Image Capture and Print to PDF plugins, as Folder Actions, as Script Menu items, as Calendar alarms, and even as stand-alone applets or droplets.

There’s a common misconception that Automator is “visual AppleScript,” but Automator is actually language-agnostic. Its “actions” can be written in nearly every macOS-supported automation language: AppleScript, AppleScriptObj-C, JavaScript, shell, Perl, Python, Ruby, and Objective-C. Automator can drive Apple Events, the UNIX utilities, and the Cocoa frameworks within a single workflow. Automator also has an extensive variable architecture for capturing, generating, and reusing data during the execution of its workflows.
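For instance, the body of Automator's Run Shell Script action (with its "Pass input" option set to stdin) is just a script that reads the previous action's output line by line and writes lines for the next action to consume. A minimal illustrative sketch — the function name is ours, and the logic is only an example:

```shell
# Illustrative body for Automator's "Run Shell Script" action with
# "Pass input" set to stdin: strip directories from incoming file
# paths, emitting one basename per line for the next action.
basenames() {
  while IFS= read -r path; do
    printf '%s\n' "${path##*/}"
  done
}
```

In the actual action there is no wrapper function — the loop body is the whole script; it is wrapped here only so it can be exercised directly.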

Automator is unprecedented in its scope and abilities in macOS. No application or system service comes close to matching what it can do.

But the most amazing thing about Automator is that powerful automation tools can be created by users without writing a single line of code – like this contextual system service that combines the PDF files selected in the Finder:

Who creates User Automation tools?

Which brings up the question: who creates User Automation tools?

In my 23 years of working with customers and businesses around the globe, my observation is that the majority of automation solutions are created in-house by customers and employees motivated to address the challenges, complexities, and redundancies they face in their day-to-day work.

Of course, this is not to say that I haven't witnessed numerous solutions created by professionals for businesses, some involving very elaborate and powerful codebases; much of the time, though, an automation solution is as simple as a script or collection of scripts created by someone who is not a programmer or developer.

Automation tools aren't limited to scripts; some, such as Automator actions, may display interfaces for interacting dynamically with the user. These tools are usually written in Xcode, using the provided AppleScript application and Automator action templates. Xcode automation projects can incorporate all of the standard UI elements, outlets, actions, and bindings used in the creation of traditional macOS and iOS applications.

For writing AppleScript or JavaScript (JXA) scripts, you can use the Script Editor application that ships with every copy of macOS; for composing and editing AppleScript and AppleScriptObj-C scripts, there are also feature-rich third-party editors like Script Debugger that include code completion and step-by-step debugging.

Summary

User Automation involves a variety of languages and technologies. The native Apple Events architecture of macOS provides the means for communicating with applications via scripts or code. The Objective-C bridges for JavaScript (JXA) and AppleScript (AppleScriptObj-C) provide access to the Cocoa frameworks that are the foundation of macOS. And specialized scripting additions (OSAX) deliver access to the UNIX command line and the numerous utilities that come with being a UNIX-based OS. And then there’s Automator, which has access to practically everything.

But putting the technicalities aside, the whole purpose of User Automation is to serve the user of the computer, to enable a motivated customer to use and create automation tools, like scripts, workflows, and applets, without restrictions or the requirement of being a developer proficient in Xcode and Objective-C or Swift. User Automation is for the rest of us. Ah, remember that old chestnut?

Conclusion

Tubes and Wires. I came across this quote the other day:

AppleScript has survived and remained relevant during a turbulent decade-long transition, despite its unbeloved language syntax and technical hurdles, for the simple reason that it solves real-world problems in a way that no other OS X technology does. In theory, AppleScript could be much better; in practice, though, it’s the best thing we have that works. It exemplifies the Mac’s advantages over iOS for tinkerers and advanced users.
—John Gruber, Macworld, Dec 12, 2012, The unlikely persistence of AppleScript

App Extensions and the User Automation technologies have some similarities but many differences. App Extensions are written by developers. User Automation is often written by customers. App Extensions provide restricted manipulation of selected data, while User Automation enables open query and control of applications and frameworks. App Extensions represent a habitat of “approved” developer-created tubes, while User Automation is about connecting wires to application and framework APIs to create a flow. App Extensions exist as app plugins. User Automation technologies are manifested as scripts, workflows, applets, droplets, applications, services, and plugins, available globally and at extension points throughout macOS.

Based upon the information presented in this overview, it is clear that App Extensions do not provide the same abilities and functionality as the User Automation technologies of macOS, and objectively should not be considered a comparable replacement for them. Of course, this conclusion assumes that customers should retain the same level of control over their devices as provided by the current User Automation in macOS. Such is the foundation of the credo that the power of the computer should reside in the hands of the one using it.

But let’s take a step back, and think about this topic differently. Why not have both?

Perhaps it is time for Apple and all of us to think of User Automation and App Extensions in terms of "AND" instead of "OR." To embrace the development of a new cross-platform automation architecture, maybe called “AutomationKit,” that would incorporate the “everyman openness” of User Automation with the focused abilities of developer-created plugins. App Extensions could become the new macOS System Services, and Automator could save workflows as Extensions with access to the Share Menu and new “non-selection” extension points. And AutomationKit could even include an Apple Event bridge so that it would work with the existing macOS automation tools.

App Extensions could become another type of User Automation. What a concept.




iOS 10 Actionable Notifications, the Lock Screen, and 3D Touch

Permalink - Posted on 2016-09-18 11:15

Junjie, developer of Due for iOS, on changes to the Lock screen and actionable notifications in iOS 10:

To my surprise, when users upgraded their iOS 9 devices to iOS 10 this week, I started receiving feedback that they were no longer able to snooze or complete their reminders from their Lock Screen. Many thought I’ve removed the feature from Due, or that there was a bug with Due in iOS 10. Of course, neither of which is the case.
[...]
Unlike iOS 8 and iOS 9, swiping a notification from right to left in iOS 10 no longer reveals the notification actions. Instead, depending on the device that you use, it now displays either View and Clear on non-3D Touch devices, or just Clear on 3D Touch devices.
[...]
So while users can now access all four notification actions in iOS 10, they need to go through an additional, unintuitive step of pressing the View button. However, for users with 3D Touch enabled phones like the iPhones 6s and 7, pressing firmly on the notification will reveal the notification actions menu.

I was talking about this with my girlfriend earlier today, and it's something I didn't consider in my review. For some reason, she can't use 3D Touch. Every time she presses on the screen, she ends up swiping or activating tap & hold accidentally. I don't know what it is about the way she grips the phone or touches the screen – we've tried every setting, and she just can't take advantage of 3D Touch in her daily iPhone usage. She ended up disabling 3D Touch altogether because it's useless to her.

Here's a problem, though: with iOS 10's notification design, this means she can't swipe on a notification and have instant access to actions. There's an extra step:

  1. Swipe notification on the Lock screen;
  2. Tap the new 'View' button;
  3. Tap actions in the expanded notification.

Step 2 is what people who don't use 3D Touch need to go through now, and it feels like a regression. I wish I had mentioned this in my story, but I didn't think of it because I use 3D Touch and pressing notifications is second nature to me.

Perhaps Apple could improve this by automatically expanding a notification with a long swipe. Instead of revealing two buttons – View and Clear – a long swipe to the left could trigger the View button, expand a notification, and avoid the additional tap required for non-3D Touch users in iOS 10.

→ Source: news.dueapp.com


Image Tests

Permalink - Posted on 2016-09-08 02:15, modified at 10:47

CARROT's widget can be customized with two styles.





Extra features for inspection

Permalink - Posted on 2016-09-02 17:00, modified on 2017-06-24 16:41

Fusce dapibus, tellus ac cursus commodo, tortor mauris condimentum nibh, ut fermentum massa justo sit amet risus. Nullam id dolor id nibh ultricies vehicula ut id elit. Nulla vitae elit libero, a pharetra augue. Maecenas faucibus mollis interdum. Sed posuere consectetur est at lobortis. Donec id elit non mi porta gravida at eget metus. Praesent commodo cursus magna, vel scelerisque nisl consectetur et.

Etiam porta sem malesuada magna mollis euismod. Sed posuere consectetur est at lobortis. Sed posuere consectetur est at lobortis. Aenean eu leo quam. Pellentesque ornare sem lacinia quam venenatis vestibulum. Integer posuere erat a ante venenatis dapibus posuere velit aliquet. Morbi leo risus, porta ac consectetur ac, vestibulum at eros.

Nulla vitae elit libero, a pharetra augue. Sed posuere consectetur est at lobortis. Vivamus sagittis lacus vel augue laoreet rutrum faucibus dolor auctor. Nulla vitae elit libero, a pharetra augue. Aenean lacinia bibendum nulla sed consectetur.

Cras mattis consectetur purus sit amet fermentum. Praesent commodo cursus magna, vel scelerisque nisl consectetur et. Cras justo odio, dapibus ac facilisis in, egestas eget quam. Donec sed odio dui. Vestibulum id ligula porta felis euismod semper. Morbi leo risus, porta ac consectetur ac, vestibulum at eros.

Donec ullamcorper nulla non metus auctor fringilla. Aenean eu leo quam. Pellentesque ornare sem lacinia quam venenatis vestibulum. Integer posuere erat a ante venenatis dapibus posuere velit aliquet. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Cras justo odio, dapibus ac facilisis in, egestas eget quam. Donec ullamcorper nulla non metus auctor fringilla.

Vivamus sagittis lacus vel augue laoreet rutrum faucibus dolor auctor. Nullam id dolor id nibh ultricies vehicula ut id elit. Etiam porta sem malesuada magna mollis euismod. Maecenas sed diam eget risus varius blandit sit amet non magna.

Donec ullamcorper nulla non metus auctor fringilla. Donec sed odio dui. Duis mollis, est non commodo luctus, nisi erat porttitor ligula, eget lacinia odio sem nec elit. Curabitur blandit tempus porttitor. Maecenas sed diam eget risus varius blandit sit amet non magna. Vestibulum id ligula porta felis euismod semper.

Nullam id dolor id nibh ultricies vehicula ut id elit. Donec ullamcorper nulla non metus auctor fringilla. Aenean lacinia bibendum nulla sed consectetur. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus.

Integer posuere erat a ante venenatis dapibus posuere velit aliquet. Duis mollis, est non commodo luctus, nisi erat porttitor ligula, eget lacinia odio sem nec elit. Vivamus sagittis lacus vel augue laoreet rutrum faucibus dolor auctor. Nullam quis risus eget urna mollis ornare vel eu leo.

Cras justo odio, dapibus ac facilisis in, egestas eget quam. Aenean lacinia bibendum nulla sed consectetur. Donec id elit non mi porta gravida at eget metus. Maecenas faucibus mollis interdum. Aenean eu leo quam. Pellentesque ornare sem lacinia quam venenatis vestibulum. Nullam id dolor id nibh ultricies vehicula ut id elit.

Integer posuere erat a ante venenatis dapibus posuere velit aliquet. Etiam porta sem malesuada magna mollis euismod. Sed posuere consectetur est at lobortis. Nulla vitae elit libero, a pharetra augue. Etiam porta sem malesuada magna mollis euismod. Maecenas faucibus mollis interdum.

Vivamus sagittis lacus vel augue laoreet rutrum faucibus dolor auctor. Nulla vitae elit libero, a pharetra augue. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Morbi leo risus, porta ac consectetur ac, vestibulum at eros. Nullam id dolor id nibh ultricies vehicula ut id elit.

Aenean lacinia bibendum nulla sed consectetur. Cras mattis consectetur purus sit amet fermentum. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Nullam quis risus eget urna mollis ornare vel eu leo.

Nulla vitae elit libero, a pharetra augue. Donec ullamcorper nulla non metus auctor fringilla. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec id elit non mi porta gravida at eget metus. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Nulla vitae elit libero, a pharetra augue. Morbi leo risus, porta ac consectetur ac, vestibulum at eros.

Donec sed odio dui. Maecenas sed diam eget risus varius blandit sit amet non magna. Donec id elit non mi porta gravida at eget metus. Curabitur blandit tempus porttitor. Morbi leo risus, porta ac consectetur ac, vestibulum at eros.

Cras justo odio, dapibus ac facilisis in, egestas eget quam. Aenean lacinia bibendum nulla sed consectetur. Donec id elit non mi porta gravida at eget metus. Maecenas faucibus mollis interdum. Aenean eu leo quam. Pellentesque ornare sem lacinia quam venenatis vestibulum. Nullam id dolor id nibh ultricies vehicula ut id elit.




iOS 10: The MacStories Review

Permalink - Posted on 2016-08-16 21:55, modified on 2016-09-13 10:41

Sometimes, change is unexpected. More often than not, change sneaks in until it feels grand and inevitable. Gradually, and then suddenly. iOS users have lived through numerous tides of such changes over the past three years.

iOS 7, introduced in 2013 as a profound redesign, was a statement from a company ready to let go of its best-selling OS' legacy. It was time to move on. With iOS 8 a year later, Apple proved that it could open up to developers and trust them to extend core parts of iOS. In the process, a new programming language was born. And with last year's iOS 9, Apple put the capstone on iOS 7's design ethos with a typeface crafted in-house, and gave the iPad the attention it deserved.

You wouldn't have expected it from a device that barely accounted for 10% of the company's revenues, but iOS 9 was, first and foremost, an iPad update. After years of neglect, Apple stood by its belief in the iPad as the future of computing and revitalized it with a good dose of multitasking. Gone was the long-held dogma of the iPad as a one-app-at-a-time deal; Slide Over and Split View – products of the patient work that went into size classes – brought a higher level of efficiency. Video, too, ended its tenure as a full-screen-only feature. Even external keyboards, once first-party accessories and then seemingly forgotten in the attic of the iPad's broken promises, made a comeback.

iOS 9 melded foundational, anticipated improvements with breakthrough feature additions. The obvious advent of Apple's own typeface in contrast to radical iPad updates; the next logical step for web views and the surprising embrace of content-blocking Safari extensions. The message was clear: iOS is in constant evolution. It's a machine sustained by change – however that may happen.

It would have been reasonable to expect the tenth iteration of iOS to bring a dramatic refresh to the interface or a full Home screen makeover. It happened with another version 10 before – twice. And considering last year's iPad reboot, it would have been fair to imagine a continuation of that work in iOS 10, taking the iPad further than Split View.

There's very little of either in iOS 10, which is an iPhone release focused on people – consumers and their iPhone lifestyles; developers and a deeper trust bestowed on their apps. Like its predecessors, iOS 10 treads the line of surprising new features – some of which may appear unforeseen and reactionary – and improvements to existing functionalities.

Even without a clean slate, and with a release cycle that may begin to split across platforms, iOS 10 packs deep changes and hundreds of subtle refinements. The final product is a major leap forward from iOS 9 – at least for iPhone users.

At the same time, iOS 10 is more than a collection of new features. It's the epitome of Apple's approach to web services and AI, messaging as a platform, virtual assistants, and the connected home. And as a cornucopia of big themes rather than trivial app updates, iOS 10 shows another side of Apple's strategy:

Sometimes, change is necessary.

Supported Devices

As more features have been added to iOS over the years, its first-run setup flow has become bloated, if not downright unintuitive.

iOS 10 doesn't take any meaningful steps to simplify the setup of a new iOS device, which is mostly unchanged from iOS 9. The only notable difference is the action required to begin the setup process, which is now "press Home to open". As I'll explore later, there's a reason for this.

Where iOS 10 does break away from the old is in the system requirements needed to install the OS. Most devices from 2011 and 2012 aren't compatible with iOS 10, including:

  • iPhone 4S
  • iPad 2
  • iPad (3rd generation)
  • iPad mini
  • iPod touch (5th generation)

Devices supported by iOS 10.

Progress, of course, marches on, but there are other notable points in this move.

The iPad 2 – perhaps the most popular iPad model to date – supported iOS 9 (in a highly constrained fashion) despite developers clamoring for its demise. After five years of service, Apple is cutting ties with it in iOS 10. By leaving the A5 and A5X CPUs behind, Apple frees developers to create more computationally intensive iPad apps without the iPad 2's lack of a Retina display or the third-generation iPad's performance issues holding them back.

Look closer, and you'll also notice that Apple is dropping support for all devices with the legacy 30-pin dock connector. If a device can run iOS 10, it is equipped with a Lightning port.

In addition to Lightning, every iOS 10-eligible iPad has a Retina display, but not every device comes with a Touch ID sensor, let alone a 64-bit processor, Apple Pay, or background 'Hey Siri' support.

It's going to be a while until Apple can achieve its vision of 64-bit and one-tap payments across the board, but it's good to see them moving in that direction by phasing out hardware that no longer fits what iOS has grown into. iOS 10 is starting this transition today.

The Lock Screen

One of the first interactions with iOS 10 is likely going to be an accidental swipe.

For the first time since the original iPhone, Apple is changing the "Slide to Unlock" behavior of the iOS Lock screen. iOS 10 gets rid of the popular gesture altogether, bringing tighter integration with Touch ID and an overhauled Lock screen experience.

Press to Unlock

Let's back up a bit and revisit Steve Jobs' famous unveiling of the iPhone and Slide to Unlock.

At a packed Macworld in January 2007, Jobs wowed an audience of consumers and journalists by demonstrating how natural unlocking an iPhone was going to be. Apple devised an unlocking gesture that combined the security of an intentional command with the spontaneity of multitouch. In Jobs' words:

And to unlock my phone I just take my finger and slide it across.

We wanted something you couldn't do by accident in your pocket. Just slide it across...and boom.

As the iPhone evolved to accommodate stronger passcodes, a fingerprint sensor, and a UI redesign, its unlocking mechanism stayed consistent. The passcode number pad remained on the left side of the Lock screen; even on the iPad's bigger display, the architecture of the Lock screen was no different from the iPhone.

With the iPhone 6s, it became apparent that Slide to Unlock was drifting away from its original purpose. Thanks to substantial speed and accuracy improvements, the second-generation Touch ID sensor obviated the need to slide and type a passcode. However, because users were accustomed to waking an iPhone by pressing the Home button, Touch ID would register that initial click as a successful fingerprint read, often unlocking the iPhone and blowing past the Lock screen with no time to check notifications.

Ironically, the convenience of Touch ID became too good for the Lock screen. As I wrote in my story on the iPhone 6s Plus:

The problem, at least for my habits, is that there is useful information to be lost by unlocking an iPhone too quickly. Since Apple's move to a moderately bigger iPhone with the iPhone 5 and especially after the much taller iPhone 6 Plus, I tweaked my grip to click the Home button not only to unlock the device, but to view Lock screen notifications as well. While annoying, the aforementioned slowness of previous Touch ID sensors wasn't a deal-breaker: a failed Touch ID scan meant I could at least view notifications. When I wanted to explicitly wake my locked iPhone's screen to view notifications, I knew I could click the Home button because Touch ID wouldn't be able to register a quick (and possibly oblique) click anyway.

That's not the case with the iPhone 6s Plus, which posed a peculiar conundrum in the first days of usage. Do I prefer the ability to reliably unlock my iPhone with Touch ID in a fraction of a second, or am I bothered too much by the speed of the process as it now prevents me from viewing notifications on the Lock screen?

Apple is making two changes to the unlocking process in iOS 10 – a structural one, with a redesign of the Lock screen and its interactivity; and a behavioral one to rethink how unlocking works.

Apple hopes that you'll no longer need to click any button to wake an iPhone. iOS 10 introduces Raise to Wake, a feature that, like the Apple Watch, turns on the iPhone's display as soon as it's picked up.

Raise to Wake is based on a framework that uses sensors – such as the motion coprocessor, accelerometer, and gyroscope – to understand if a phone has been taken out of a pocket, but also if it's been picked up from a desk or if it was already in the user's hands and its elevation changed. Due to ergonomics and hardware requirements, Raise to Wake is only available on the iPhone 6s/7 generations and it's not supported on the iPad.

Apple has learned from the first iterations of watchOS: Raise to Wake on the iPhone 6s and iOS 10 is more accurate than the similar Watch feature that shipped in 2015. In my tests, Raise to Wake has worked well when taking the iPhone out of my pocket or picking it up from a flat surface; it occasionally struggled when the iPhone was already in my hands and it was tricky for the system to determine if it was being raised enough. In most everyday scenarios, Raise to Wake should wake an iPhone without having to click the Home or sleep buttons.

Raise to Wake is only one half of the new unlocking behavior in iOS 10: you'll still need to authenticate and unlock a device to leave the Lock screen. This is where the iPhone's original unlocking process is changing.

To unlock a device running iOS 10, you need to click the Home button. If the display is already on and you place your finger on the Touch ID sensor without clicking it – as you used to do in iOS 9 – that won't unlock the device. By default, iOS 10 wants you to physically press the Home button.

Bye, slide to unlock.


This alteration stems from the unbundling of fingerprint recognition and Home button click, which are now two distinct steps. Placing a finger on Touch ID authenticates without unlocking; pressing the Home button unlocks.

In Apple's view, while Raise to Wake turns on the display, authentication may be required to interact with features on the Lock screen – such as actionable notifications, widgets, or Spotlight results. With iOS 10, users can pick up an iPhone, view what's new on the Lock screen, and authenticate (if necessary1) without the risk of unlocking it.

From a design standpoint, this change is reflected in the icons and messages displayed to the user on the Lock screen. When the display turns on with Raise to Wake, a padlock icon in the status bar indicates that the user has not yet authenticated with Touch ID. At the bottom, a 'Press home to unlock' message replaces the old 'slide to unlock' one.

Locked.


With the display on and after Touch ID authentication, 'Press home to unlock' becomes 'Press home to open' and the status bar lock switches to an 'Unlocked' message.

Unlocked.


Under the hood, clicking the Home button and placing a finger on Touch ID are two separate actions. However, the wording of 'Press home to unlock' feels like Apple wants you to think of them as one. The entire message is an illusion – pressing the Home button by itself doesn't actually unlock a device – but Raise to Wake combined with the second-generation Touch ID will make you believe in it.

On an iPhone 6s, one click on the Home button is all that's needed to exit the Lock screen – at least most of the time. If the iPhone's display is off because Raise to Wake didn't work (or because you manually locked it while holding it), the experience is similar to iOS 9. Clicking the Home button with a Touch ID-enabled finger will wake up the display and bypass the Lock screen.

You can revert to a pre-iOS 10 unlocking experience if you don't like the new one. First, Raise to Wake can be disabled in Settings > Display & Brightness, and your iPhone will no longer turn on when picked up. Additionally, tucked away in Settings > Accessibility > Home Button, you'll find an option called 'Rest Finger to Open'. When enabled, your iPhone will unlock through Touch ID alone, without having to press the Home button.

It takes some time to get used to the new unlocking behavior of iOS 10. The apparent unification of Home button click and Touch ID makes less sense on devices without the second-generation sensor, where one click is rarely enough and tends to bring up the passcode view for a second attempt. And, nostalgically speaking, I miss the old 'slide to unlock' message, although for reasons that are merely emotional and not related to function.

After three months, Raise to Wake and Press to Unlock have made the overall unlocking experience faster and more intuitive. I now expect my iPhone to know when it's time to wake up and show me the Lock screen, and I don't miss the old unlocking process. Raise to Wake eliminates the need to click a button to wake an iPhone; having to press the Home button to unlock removes the risk of accidentally leaving the Lock screen.

But it all goes back to that accidental swipe. Picture this: you've just upgraded to iOS 10, or you've bought a new iPhone with iOS 10 pre-installed, and, instinctively, you slide to unlock. What you're going to see isn't an error message, or the Lock screen bouncing back, telling you that you need to press the Home button instead. You're going to see the biggest change to the Lock screen – potentially, a better way of interacting with apps without unlocking a device at all.

Slide to unlock, and you'll meet the new Lock screen widgets.

Lock Screen Widgets

Technically, Lock screen widgets predate iOS 10. On both the iOS 8 and iOS 9 Lock screens, users could swipe down to reveal Notification Center and its Today view. However, iOS 10 adds an entirely new dimension to the Lock screen, as well as a refreshed design for widgets throughout the system.

The Lock screen's renovation in iOS 10 starts with three pages: widgets and search on the left, the Lock screen (with notifications and media controls) in the middle, and the Camera on the right. You can swipe to move across pages, as suggested by pagination controls at the bottom of the Lock screen.

The Lock screen's new horizontal hierarchy, with widgets on the left.


The leftmost page, called the Search screen, isn't completely new either. Apple took the functionality of Spotlight search and Proactive of iOS 9, mixed it up with widgets, and made it a standalone page on the iOS 10 Lock screen (and Home screen, too).

From left to right: Lock screen widgets on the Search screen; Notification Center; widgets in Notification Center.


Notably absent from iOS 10's Lock screen is the Camera launcher button. By getting rid of the tiny shortcut in the bottom right corner, Apple has made the Camera easier to launch: swiping anywhere to move between Lock screen and Camera is easier than carefully grabbing an icon from a corner. I've been taking more spontaneous, spur-of-the-moment pictures and videos thanks to iOS 10's faster Camera activation on the Lock screen.

Apple's sloppy swiping for Lock screen navigation has one caveat. If notifications are shown, swiping horizontally can either conflict with actionable buttons (swipe to the left) or open the app that sent a notification (swipe right). You'll have to remember to swipe either on the clock/date at the top or from the edge of the display; such is the trade-off of using the same gestures for page navigation and notification actions.

Where to swipe when notifications fill the Lock screen. (Tap for full size)


Three changes stand out when swiping right to open the Search screen:

  • There's a search field at the top, shown by default;
  • The clock2 stays pinned to the right3;
  • Widgets have a new design that favors richer, bigger content areas.

Unlike their predecessors, widgets in iOS 10 don't blend in with the dark background of Notification Center. This time, Apple opted for standalone units enclosed in light cells with an extensive use of custom interfaces, buttons, images, and dark text.

Widgets in Notification Center on iOS 9 and iOS 10.


There's a common thread between widgets and notifications (also redesigned in iOS 10): they're self-contained boxes of information, they sit on top of the wallpaper rather than meshing with it, and they display an app's icon and name in a top bar.

Notifications and widgets. Spot the trend.


The new design is more than an aesthetic preference: the makeover has also brought functional changes that will encourage users and developers to rethink the role of widgets.

A widget in iOS 10 supports two modes: collapsed and expanded. The system loads all widgets in collapsed mode by default, which is about the height of two table rows (about 110 points). All widgets compiled for iOS 10 must support collapsed mode and consider the possibility that some users will never switch to the expanded version. Apps cannot activate expanded mode on the user's behalf; switching from compact to expanded is only possible by tapping on a 'Show More' button in the top right corner of a widget.

Compact and expanded widgets.

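The two display modes correspond to a small addition to the Today-extension API. A minimal sketch of how a widget might opt in to expansion and resize itself (the 300-point expanded height is an arbitrary example, not a system requirement):

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Compact is the default; a widget must opt in for the system
        // to show the 'Show More' button at all.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    // Called when the user toggles between 'Show More' and 'Show Less'.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        switch activeDisplayMode {
        case .compact:
            // Compact height is fixed by the system (roughly 110 points).
            preferredContentSize = maxSize
        case .expanded:
            // Example height; a real widget would size to its content.
            preferredContentSize = CGSize(width: maxSize.width, height: 300)
        }
    }
}
```

Because apps can't trigger the switch themselves, both branches have to produce a usable layout.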

This is no small modification, as it poses a problem for apps that have offered widgets since iOS 8. Under the new rules, apps updated for iOS 10 can't show a widget that takes up half of the display as soon as it's installed. Any widget that wants to use more vertical space for content – such as a todo list, a calendar, or even a list of workflows – will have to account for the default compact mode.

For some developers, this will mean going back to the drawing board and creating two separate widget designs, as they'll no longer be able to enforce a single expanded one. Others will have to explain the difference to their users. Workflow, which used to offer a widget that could dynamically expand and collapse, is updating the widget for iOS 10 with a label to request expansion upon running a workflow that needs more space.

Workflow's new iOS 10 widget.


There's one exception: legacy iOS 9 apps that haven't been updated for iOS 10. In that case, the system won't impose compact mode and it won't cut off old widgets (which keep a darker background), but there's a strong possibility that they won't look nice next to native iOS 10 ones.

The same widget in iOS 9 legacy mode and with native iOS 10 support.


I don't see how Apple could have handled this transition differently. Design updates aside, there's an argument to be made about some developers abusing Notification Center with needlessly tall and wasteful widgets in the past. Compact mode is about giving control to the users and letting them choose how they prefer to glance at information. Want to install a widget, but don't need its full UI? Use it in compact mode. Need to get more out of it? Switch to expanded.

Apple's decision to adopt compact and expanded modes in iOS 10 is a nod to developers who shipped well-designed widgets in the past, and it provides a more stable foundation going forward.

I've been able to test a few third-party iOS 10 widgets that illustrate the advantages of these changes.

PCalc, James Thomson's popular iOS calculator, has a new widget that displays a mini calculator in compact mode with numbers and basic operations split in two rows.

Despite the small touch targets, the compact interface is usable. If you want bigger buttons and a more familiar layout, you can switch to expanded mode, which looks like a small version of PCalc living inside a widget – edge-to-edge design included.

Launcher doesn't modify its widget's interface when toggling between compact and expanded, but the constraints of the smaller layout force you to prioritize actions that are most important to you.

Using compact mode for summary-style UIs will be a common trend in iOS 10. CARROT Weather is a good example: it shows a summary of current conditions when the widget is compact, but it adds forecasts for the day and week ahead when expanded.

CARROT's widget can be customized with two styles.


Even better, slots in the compact layout can be customized in the app, and you can choose to use the widget in light or dark mode.

Drafts has an innovative implementation of compact and expanded layouts, too. In compact mode, the widget features four buttons to create a note or start dictation. When the widget is expanded, it grows taller with a list of items from the app's inbox, which can be tapped to resume editing.

In the past, developer Greg Pierce would have had to ask users to customize the widget or make it big by default; in iOS 10, they can switch between modes as needed.

Widgets' ubiquitous placement pushes them to a more visible stage; as soon as more developers adapt4, iOS 10 has the potential to take widgets to the next level.

I believe the new design will play an essential role in this.

The Design of Widgets

Apple advertises legibility and consistency as core tenets of widgets in iOS 10, and I agree: widget content and labels are easier to read than in iOS 9. Standalone light cells separate widgets with further precision; I haven't found translucency with the Lock screen wallpaper to be an issue.

In addition, the light design brings deeper consistency between apps and widgets. Most iOS apps have light backgrounds and they employ color to outline content and indicate interactivity. In iOS 10, widgets are built the same way: the combination of light backgrounds, buttons, and custom interfaces is often consistent with the look of the containing app.

In this regard, widgets feel more like mini-apps available anywhere rather than smaller, less capable extras. The line between widget and full app UIs is more blurred than ever in iOS 10.

Apple's new Notes and Calendar widgets showcase this newfound cohesiveness. The Notes widget displays the same snippets of the list in the Notes app. Buttons to create new notes and checklists are also the same. The widget looks and feels like a small version of Notes available anywhere on iOS.

From app to widget.


The Calendar widget is even more indicative. Glancing at events and recognizing their associated calendar wasn't easy in iOS 9, as they only had a thin stripe of color for the calendar to which they belonged.

The Calendar widget is more contextual on iOS 10.


In iOS 10, forgoing a dark background has allowed Apple to show Calendar events as tinted blocks matching the look of the app. Discerning events and the calendars they belong to is easier and familiar.

Consistency of apps and widgets.


I wouldn't expect every app to adopt a widget design that exactly mirrors the interface users already know, but it can be done. Switching to a light design has given Apple a chance to reimagine widgets for consistency with apps and lively combinations of color, text, and icons. They are, overall, a step up from iOS 9 in both appearance and function.

The new direction also opens up a future opportunity: what is light can be more easily converted to dark. I could see a system dark mode working well for widgets.

The iPad Lock Screen

The iPad's Lock screen doesn't break any new ground, but there are some differences from the iPhone.

On the iPad, notifications are displayed on the left side of the screen when in landscape. They're aligned with the system clock, and they leave room for media controls to be displayed concurrently on the right. Dealing with notifications while controlling music playback is a task well suited for the iPad's larger display.

Unfortunately, Apple doesn't think portrait orientation should warrant the same perks. If a notification comes in while album artwork is displayed on the Lock screen, the artwork will be hidden. Apple decided against using a two-column layout in portrait, which I don't understand: they're already doing it for widgets on the iPad.

No artwork for you, Mr. Portrait.


Furthermore, if no music is playing on an iPad in landscape, having notifications aligned to the left for no apparent reason looks odd and seems...unnecessary.

The right side seems cozy.


Widgets fare a little better. Apple has kept the two-column design first introduced in the Today view of iOS 9; you can still scroll the two lists of widgets independently.

I would have appreciated the ability to further control the resizing and placement of widgets on the iPad, and the Lock screen design seems uninspired. We'll have to make do with this bare minimum for now.

Apple's Widgets

Widgets are more modular in iOS 10. Apple has done away with grouping multiple types of content under Siri Suggestions – most Apple apps and features have their own widget, which can be disabled from a revamped configuration screen.

Widgets' new configuration screen.


Here's an overview of what's changed.

Activity

Your Activity rings from the Apple Watch, with a summary of Move, Exercise, and Stand statistics.

Calendar

A mini calendar interface. Events are displayed as colored blocks matching the calendar they belong to. You can tap on an event to open it, and expand the widget to reveal more events.

Favorites

Shortcuts to your favorite contacts with different ways to get in touch with them. New in iOS 10, you can assign iMessage as well as third-party communication apps (messaging and VoIP) to contact entries in Favorites, which will be displayed in the widget.

Mail

...yeah.


The Mail widget is the weakest of the bunch: it only displays shortcuts for VIP contacts. I would have preferred to see a preview of the unified inbox, or perhaps an option to show flagged messages.

Maps

Maps has three widgets: destinations, nearby, and transit. While the latter isn't available for my area (Rome, Italy), the other two have worked inconsistently. I've never seen a nearby recommendation in the widget, despite being around places rich in POIs. The Destinations widget usually tells me how much time it'll take me to drive home, but it doesn't proactively suggest other locations I frequently visit.

Music

The Music widget is an odd one. It displays a grid of what appears to be either recently played music or your all-time most listened albums. The widget doesn't clarify whether it's showcasing albums or individual songs; it uses album artworks with no text labels, and it plays either the most played song from an album, or an entire album starting from the first song.

A nice perk: music starts playing after tapping the widget without opening Apple Music. But it always feels like a lottery.

News

Top Stories from Apple News (shown even if you mute the channel). The widget uses image thumbnails and custom typography matching the bold font of Apple News for headlines.

The best change from iOS 9: news can be disabled by removing the widget.

Notes

A preview of your most recent notes. In compact mode, the widget only shows the last modified note. In expanded mode, you get more notes and buttons to create a new note, a checklist, snap a picture, and create a drawing.

Photos

A collection of Memories created by the new Photos app in iOS 10. Each one can be tapped to view the associated memory in Photos.

Siri App Suggestions

iOS 9's proactive Siri Suggestions are now smaller in scope and they're called Siri App Suggestions. The widget displays 4 app shortcuts (8 in expanded mode), and it doesn't suggest other types of content.

Like News, it can be removed or placed anywhere on the Search screen.

Tips

You'd think that the Tips widget is useless – everyone likes to make fun of Tips – but hear me out. In compact mode, the widget shows a tip's snippet; you can tap it and open the Tips app. Switch to expanded mode, though, and you'll be presented with a custom interface with an explanation of the tip and a large animation at the top to show you the tip in action.

The Tips widget looks great, and it's the most technically impressive one on iOS 10.

Up Next

The old Today Summary widget has been renamed Up Next. It displays a smaller version of your next event without the full UI of the Calendar widget. Alas, the Tomorrow Summary widget is gone from iOS 10.

Weather

Perhaps the best example of how widgets can use compact and expanded modes, Apple's Weather widget shows weather conditions for the current location when compact, and a forecast of the next six hours when expanded.

Weather is the widget I've used the most in the past three months to look up forecasts from the Lock screen in just a couple of seconds.

Slide to Glance

The move to apps as atomic units scattered across the system is everywhere in iOS 10, with widgets being the foremost example.

Noticeably absent from iOS 10's widgets is a push for more proactive recommendations. As we'll see later, Apple has shifted its Proactive initiative to run through the OS and inside apps rather than distilling it into widgets.

3D Touch is another illustrious no-show. While notifications have been overhauled to make good use of 3D Touch, pressing on a widget will result in a disappointing lack of feedback. 3D Touch would be a perfect fit for widgets – imagine previewing a full note or reading the first paragraphs of a news story from the Lock screen.

The new widget design and Search screen placement make an iPhone more useful without having to unlock it. Apple has done a good job with their built-in widgets; it's up to developers now to rethink how their apps can take advantage of them. I'm optimistic that everything will turn out better than two years ago.

I unlock my iPhone less thanks to iOS 10's more capable Lock screen. Raise to Wake, Press to Open, widgets, search, and rich notifications make the entire Lock screen experience drastically superior to iOS 9.

Easier to navigate, better structured, less prone to unwanted unlocks. I wouldn't be able to go back to the old Lock screen.

Notifications

iOS 10's rethinking of apps as granular interactions doesn't stop at widgets. With a new framework that can turn incoming notifications into rich, actionable interfaces, Apple wants users to spend less time jumping between apps.

Notifications in iOS 9 and iOS 10.


Notifications in iOS 10 share the same design principles as widgets. Rather than being grouped in a list of items on top of a dark background, notifications are discrete light cells that can be pressed (with 3D Touch), pulled down (for incoming banners), or swiped and expanded into a floating card preview.

Whether or not an app has been updated for iOS 10, an expanded notification has fixed elements that developers can't control. There's a header bar at the top with the icon and name of the app, and a close button on the right to dismiss the notification. Tapping the icon on the left side will open the app that sent the notification.

The standard look of a notification in iOS 10.


This is true for both iPhones with 3D Touch and devices without it; to expand a notification on an iPad or an older iPhone (or if you don't want to use 3D Touch), you can pull down an incoming notification banner or swipe a notification to the left in Notification Center and tap 'View'.5

New APIs allow developers to take different actions for notifications that have been sent to the user – including ones that have been cleared. First, notifications can be dismissed with a Clear action by swiping on them. Apps can monitor the dismiss action and stop delivering the same notification on other devices.

Additionally, developers can remove, update, and promote notifications that have already been sent. Apple's goal was to prevent Notification Center from being cluttered with old notifications that aren't relevant anymore. If developers implement this API, updating a notification with fresh content should help users see what's changed. Imagine sports scores or live-streaming apps and how they could update notifications. I'm curious to see which services will convert to this behavior instead of spamming users with multiple alerts.
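In the UserNotifications framework, updating maps to re-adding a request with the same identifier (which replaces the delivered alert in place), and removal maps to an explicit call. A sketch, with hypothetical identifiers and a sports-score payload:

```swift
import UserNotifications

let center = UNUserNotificationCenter.current()

// Updating: a new request with the same identifier replaces the
// already-delivered notification in place instead of stacking a new one.
let content = UNMutableNotificationContent()
content.title = "Liverpool 2 – 1 Roma"   // hypothetical score update
content.body = "78': score updated"
let update = UNNotificationRequest(identifier: "match-1234",  // hypothetical ID
                                   content: content,
                                   trigger: nil)               // deliver immediately
center.add(update)

// Removing: clear a delivered notification that's no longer relevant.
center.removeDeliveredNotifications(withIdentifiers: ["match-1233"])
```

To observe the user's Clear swipe, my understanding is that the notification's category must opt in with the `.customDismissAction` option, after which the delegate receives a response whose `actionIdentifier` is `UNNotificationDismissActionIdentifier` – the hook an app can use to stop delivering the same alert on other devices.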

Underneath the header of an expanded notification is the content developers can control, and where the most important changes to notifications are happening.

In iOS 10, notifications can have a title and a subtitle. The title is displayed in a bold font, which helps identify the subject of a notification. In a Reminders notification, the name of a reminder will be the bold title at the top, with its note displayed as text content below it.

The default look of a notification in iOS 10. Expansion is relative to a notification's placement on screen.


Below the title and subtitle, iOS 10 shows a notification's body text content (same as iOS 9) and actionable buttons. In a welcome change from the past, developers can define more than two notification actions, displayed in a list under the notification's card.6 If an app requires a quick reply upon expanding a notification, the input field will sit above the keyboard – it's not attached to the notification like in iOS 9.

Quick replies in iOS 9 and iOS 10.

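Title, subtitle, and the longer action lists all come from the same framework. A sketch registering a hypothetical messaging category with three actions – more than the two iOS 9 allowed:

```swift
import UserNotifications

// A text-input action surfaces the quick-reply field above the keyboard.
let reply = UNTextInputNotificationAction(identifier: "reply",
                                          title: "Reply",
                                          options: [],
                                          textInputButtonTitle: "Send",
                                          textInputPlaceholder: "Message")
let like = UNNotificationAction(identifier: "like", title: "Like", options: [])
let mute = UNNotificationAction(identifier: "mute", title: "Mute Thread", options: [])

// More than two actions are allowed in iOS 10; they're listed under the card.
let messageCategory = UNNotificationCategory(identifier: "message",
                                             actions: [reply, like, mute],
                                             intentIdentifiers: [],
                                             options: [])
UNUserNotificationCenter.current().setNotificationCategories([messageCategory])

// Title renders bold; subtitle and body sit below it.
let content = UNMutableNotificationContent()
content.title = "John Appleseed"          // hypothetical sender
content.subtitle = "Trip Planning"
content.body = "Are we still on for Saturday?"
content.categoryIdentifier = "message"
```

The category identifier ties a delivered notification to its action set, so the same registration covers both local and push notifications.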

Design changes alone, though, wouldn't have sufficed to modernize notifications. To reinvent their feel and capabilities, Apple has created two new extension points for developers in iOS 10: Notification Service and Notification Content.

The Notification Service extension doesn't have an interface and runs in the background. Upon triggering a notification but just before delivering it to the user, an app can call the Notification Service extension to augment or replace its payload. This extension is meant to have a short execution time and it's not designed for long tasks. Possible use cases for Notification Service extensions could be downloading an image or media file from a URL before showing a notification, or decrypting an encrypted payload locally for messaging apps that rely on end-to-end encryption.

The Notification Service extension should come in handy given iOS 10's ability to include a media attachment (images, audio, videos, and even GIFs) in both the notification banner and the expanded notification. If they adopt it, apps like WhatsApp and Telegram could omit the "[Contact] sent you an image" standard notification and display a thumbnail in the notification banner (like iMessage does) and a full image preview in the expanded notification.
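A Notification Service extension for that image use case might look like the sketch below. The `"image-url"` payload key is hypothetical, and a real extension would also pick the right type hint for the downloaded file:

```swift
import UserNotifications

class NotificationService: UNNotificationServiceExtension {

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        guard let content = request.content.mutableCopy() as? UNMutableNotificationContent,
              let urlString = content.userInfo["image-url"] as? String,  // hypothetical key
              let url = URL(string: urlString) else {
            contentHandler(request.content)   // nothing to augment; deliver as-is
            return
        }
        // Download the image and attach it before the notification is shown.
        URLSession.shared.downloadTask(with: url) { location, _, _ in
            if let location = location,
               let attachment = try? UNNotificationAttachment(
                   identifier: "image",
                   url: location,
                   options: [UNNotificationAttachmentOptionsTypeHintKey: "public.jpeg"]) {
                content.attachments = [attachment]
            }
            contentHandler(content)
        }.resume()
    }

    override func serviceExtensionTimeWillExpire() {
        // The system is about to enforce the short execution limit;
        // a real extension would deliver the best content it has so far.
    }
}
```

The same `didReceive` hook is where an end-to-end encrypted messaging app could decrypt a payload locally before display.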

Notification Content extensions are what users are going to see the most in daily usage, and they motivate iOS 10's notification card design.

A notification in iOS 10 can show a custom view between the header and default text content. Custom views can be anything – an embedded map, a message conversation, media, a calendar view, etc. – and they're managed by the Notification Content extension. Custom views are non-interactive: they can't receive touch events7, but they can be updated in-place in response to a task performed from a notification action. Apps can hide the default content of a notification if the custom view is informative enough.
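A Notification Content extension hosting such a custom view might be sketched as follows. The category it serves is declared in the extension's Info.plist (`UNNotificationExtensionCategory`), and the label outlet here is a hypothetical stand-in for a richer view:

```swift
import UIKit
import UserNotifications
import UserNotificationsUI

class NotificationViewController: UIViewController, UNNotificationContentExtension {

    @IBOutlet var summaryLabel: UILabel!   // hypothetical custom view

    // Populate the custom, non-interactive view from the payload.
    func didReceive(_ notification: UNNotification) {
        summaryLabel.text = notification.request.content.body
    }

    // Optionally handle action buttons and update the view in place,
    // keeping the card open instead of dismissing it.
    func didReceive(_ response: UNNotificationResponse,
                    completionHandler completion: @escaping (UNNotificationContentExtensionResponseOption) -> Void) {
        summaryLabel.text = "Handled: \(response.actionIdentifier)"
        completion(.doNotDismiss)
    }
}
```

Returning `.doNotDismiss` is what lets a conversation-style card stay on screen after a reply, as the iMessage example below describes.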

Service and Content extensions, combined with the expanded design, have turned notifications in iOS 10 into a completely new experience. Notifications are no longer just text: they are custom app UIs delivered to you with rich previews and interactions that can live on longer than a couple of seconds. Notifications in iOS 10 are mini apps in and of themselves.

When you receive an iMessage that contains a photo, the incoming notification can be expanded, either with 3D Touch or a swipe. You'll be treated to a full iMessage conversation UI, living inside the notification, with the same transcript, read receipts, and typing indicators you'd see in the Messages app.

To expand a notification, you can pull it down or press on it.

Not only can you send a reply – you can keep the iMessage interface open as you keep a conversation going from the notification. It's a fantastic way to check into a conversation without the constraints of a quick reply.

Scroll up in the transcript to view older messages.

When you're done, swipe down to dismiss the notification, and you'll be back to whatever you were doing.8

Calendar notifications follow the same concept. If an event with a location attached is coming up, the expanded notification will display the default text content at the bottom, but also a preview of the address with a Maps view at the top.

Thanks to actionable buttons, you can open directions in Maps without launching Calendar. If an upcoming event doesn't have a location, you'll see a preview of your agenda inside the notification.

I tested a version of Workflow optimized for iOS 10, which brings improved notification support with the ability to customize the content displayed in a notification card. In addition to a title, you'll be able to embed pictures, videos, GIFs, and even Maps views into a Workflow notification.

Rich notifications created with Workflow.

Pictures are displayed as thumbnails in a notification banner before expanding it; videos can be played inline within the card itself.

And if you often receive messages containing GIFs, iOS 10 will let you preview them directly from a notification.

CARROT Weather has a clever take on rich notifications in iOS 10. The daily digest and severe weather/precipitation alerts can be expanded into dynamic preview cards.

Through a Notification Content extension, the app can embed a custom interface, sounds, and even animations inside the notification card. As a result, viewing CARROT's notifications feels more like using the app rather than reading a plain text summary.


With a new framework and the flexibility granted by extensions, we're going to see a rise in interaction methods fueled primarily by notifications. Of all the places where an app can advertise its functionality on iOS (widgets, keyboards, extensions), a notification is the most direct, contextual way to reach users at an appropriate time.

A notification carries interest and, in many cases, a sense of urgency. iOS 10 transforms notifications from a passive delivery system into an active experience where users engage with an app through UIs, actions, and feedback they're already familiar with. It's a win-win for developers, who can make their apps more useful through richer notifications, and for users, who no longer have to open apps to benefit from their services.

iOS 10's notifications are a new layer on top of apps. They're going to change how we deal with them every day.

The Home Screen

The iPhone 6s brought the first significant adjustment to the iOS Home screen in years – 3D Touch quick actions. With iOS 10, Apple is cautiously expanding the Home screen beyond app shortcuts, but in ways you might not expect.

Search from the Home screen: pull down (left) or swipe right to open the new Search screen.

As in iOS 9, Spotlight search can be accessed from two locations: the Search screen on the left side of the Home screen and by pulling down on app icons. The Search screen on the left mirrors its Lock screen counterpart.

Notification Center has gone through some deeper changes. The segmented control to switch between notifications and widgets at the top is gone, replaced by another set of page indicators. Every time you open Notification Center, iOS 10 will default to showing you notifications in chronological order under a new 'Recent' header – it doesn't remember your position in the two pages. Unfortunately, the option to group notifications by app has also been removed.

Whether out of laziness or deliberate design, there's an abundance of ways to activate Spotlight search in iOS 10. Let's round them up:

  • Search from the Lock screen (above widgets);
  • Open the Search screen (left side of the Home screen) and pull down or tap the search field;
  • Pull down on icons on the Home screen;
  • Swipe down to open Notification Center and tap Search above notifications;
  • Swipe right on Notification Center to open widgets and find Search at the top;
  • Use Command-Space on an iPad with an external keyboard and Spotlight will open modally on top of whatever app you're using without going back to the Home screen;
  • Last, and perhaps most perplexingly, there's a hidden way to open Spotlight modally inside apps on the iPhone 6s. When using an app, swipe down slowly from the status bar until you feel the first haptic tap, then let go. Instead of opening notifications, the cursor will focus in the search field. If you don't let go after the first vibration but keep swiping down, you'll open Notification Center instead. This method only works inside apps – not on the Home screen.

That's seven ways to open Spotlight search on iOS 10.

Six shades of Spotlight search on iPhone.

Being able to access search from everywhere – be it on the Home screen, the Lock screen, or when using an app – is convenient. It makes Spotlight pervasive. As Apple continues to grow their search efforts across native apps, web partnerships, and Proactive suggestions, Spotlight's omnipresence will become a valuable strategic asset.

Apple continues to be a steadfast supporter of the Home screen as a grid of icons. In a potential disappointment for those who hoped to see a major Home screen refresh this year, the biggest new feature is an extension of 3D Touch quick actions and widgets, rolled into one.

Quick actions and widgets on the Home screen.

Apps that offer a compact widget in iOS 10 can display it alongside quick actions when a user presses the app's icon. The widget is the same one used in the Search screen – in fact, there's a button to install it directly from the Home screen.

iPhone Plus models can display quick actions and widgets on the landscape Home screen as well.

I'm not sure I buy into Apple's reasoning for combining widgets and quick actions – at least not yet. The glanceability of widgets finds its raison d'être on the Lock screen and inside apps; on the other hand, I associate going back to the Home screen and pressing an icon with launching, not glancing. Years of iOS usage trained me to see the Home screen as a launchpad for apps, not an information dashboard.

In three months of iOS 10 – and with plenty of glanceable/actionable widgets to test – I've only remembered to use a widget on the Home screen once (it was PCalc). It's not that having widgets alongside quick actions is bad; it's just forgettable. It's the equivalent of two neighbors being forced to live together under the same roof. Having company can be nice sometimes, but everyone would be better off at their own place.

There are other smaller 3D Touch additions to the Home screen in iOS 10. You can press on folders to bring up a Rename action, and apps inside folders that have unread badges will be listed in the folder's quick action menu.

Folders have also received a visual refresh, with a nicer background blur that shows the grid of icons in the current Home screen page.

On the iPad, Apple didn't bring any improvements to the Home screen in iOS 10, but I'm sure you'll be relieved to know that closing an iPad app no longer adjusts the icon's corner radius on the Home screen.

This relates to a deeper change happening to Home screen animations. Apple has rebuilt the entire SpringBoard animation stack with faster, interruptible animations. Along with a shorter animation curve for launching apps (one of the most criticized aspects of iOS 7), you can press the Home button right after tapping an app's icon and the animation will stop, taking you back to the Home screen in an instant.

You can try the same with a folder: tapping outside of it will cancel the animation instantly in mid-flight. The difference with iOS 9's Home screen animations is staggering.

They're not a "feature," but the new animations are the best Home screen change in iOS 10.


It's fair to wonder if Apple will ever desecrate the sanctity of the Home screen and allow users to mix icons and widgets.

Anyone who's ever looked at Android will spot obvious similarities between widgets for Google's platform and what Apple has done with widgets in iOS 10. Apple still believes in the separation of icons and app content; they only added widgets to 3D Touch quick actions and they didn't even allow the iPad Pro's large Home screen to go beyond icons. But for how long?

The iOS Home screen has served us well for years, but as screens keep getting bigger, it's time to do more than display a grid of icons with quick actions. The other side of the fence is closer than ever; a final leap wouldn't be too absurd.

Control Center

Since its introduction in 2013, Control Center has become a staple of iOS, providing users with a panel of commonly accessed shortcuts. iOS 10's Control Center is a radical shift from its origins, and a harbinger of how iOS is changing.

Control Center's design has evolved over the years, from the wireframe-like look of iOS 7 to the friendlier, rounder buttons of iOS 9.

Control Center, however, didn't scale gracefully with the expansion of iOS: cramming more functionality into it turned into a balancing act of prioritizing important controls without sacrificing their purpose.

It was clear that Control Center's original vision couldn't scale with the growing nature of iOS. And so with iOS 10, Apple has torn down Control Center and started from scratch. The single-page mosaic of tiny buttons is no more. The new Control Center breaks up system shortcuts and audio controls into two separate pages, with the addition of a third page for HomeKit (if available). Everything's bigger, spacious, and colorful.

The three pages of Control Center in iOS 10.

You still open Control Center with a swipe from the bottom of the display. In iOS 10, swiping pulls up a card with paginated controls underneath it. The design is familiar, yet unmistakably new. Margins across each side convey the card metaphor; controls are bigger and buttons have more padding; there's more color in every card.

After three years of Control Center, the new version in iOS 10 feels lively and friendly; perhaps even more fun. On the other hand, pagination and bigger controls raise a question: has simplicity come at the expense of efficiency in Control Center?

System Controls

A useful exercise to understand Control Center in iOS 10 is to take stock of how much Apple is leaving behind. Let's compare iOS 9's Control Center to the same screen in iOS 10:

The first page of Control Center in iOS 10 has lost audio playback. Initially, that may feel like a downgrade. But let's swipe left and consider what Control Center has gained by separating system and audio controls:

The difference is striking. Giving audio playback its own space lets Control Center present more information for the media being played. It's also more accessible thanks to bigger text labels, buttons that don't need to be carefully tapped, and hardware controls embedded in the same page.

This won't be easy to accept for iOS power users who cherish dense UIs: Control Center buys into a trend followed by many (but not all) parts of iOS 10 – big, bold controls, neatly laid out, spread over multiple views.

The first beneficiary of such clarity is the system controls page. The first row of toggles at the top has kept iOS 9's iconography and arrangement, but each button is color-matched to the setting it activates when toggled.9

Control Center is bleeding...four colors?

I found colored toggles extravagant at first; now, I like that I can glance at those buttons and know which setting is engaged.

Don't forget about landscape mode.

The brightness slider and the AirPlay, AirDrop, and Night Shift buttons have been enlarged and simplified as well. For one, the slider's puck is more comfortable to grab. The buttons reveal another tendency in iOS 10's semi-refreshed design language: they're actual buttons with rounded borders and they use color to indicate status.

In a change that's reminiscent of Sam Beckett's fantastic concept, you can press on the bottom row of shortcuts to show a list of 3D Touch quick actions. These include three intensity levels for the flashlight, timer options, a shortcut to copy the last Calculator result, and different Camera modes.

As I elaborated before, Control Center was an ideal candidate for 3D Touch actions. However, Apple's implementation in iOS 10 is limited to the bottom row of apps; you can't press on the Bluetooth icon to connect to previously paired devices, nor can you press on the Wi-Fi toggle to connect to a different network. The addition of 3D Touch to the lower end of Control Center shows that Apple recognizes the utility of quick actions for system-wide shortcuts, but they're not fully committed to the idea yet.

Despite some missing features and growing pains to be expected with a redesign, iOS 10's first Control Center page is an improvement. With a sensible reliance on color, a more legible layout, and the first steps toward full 3D Touch support, Control Center's system card is easier to parse, nimble, and intuitive.

"It Also Looks Great on the iPad"

Control Center's design direction has been taken to the extreme on the iPad. Only one page can be used at a time; the AirDrop, AirPlay, and Night Shift buttons are needlessly wide. It doesn't take a design expert to figure out that Apple just wanted to ensure basic compatibility with an iPhone feature instead of designing Control Center around the iPad.

Look at it this way: if Control Center didn't exist on the iPhone and Apple decided to introduce it on the iPad today, would it look like this?

The lack of an iPad-first approach was passable in the old Control Center because of its compact design. But with iOS 10, following the iPhone's model has a detrimental effect. Buttons are too big and little care went into optimizing the UI for the iPad's screen. Apple should reconsider what they're doing with Control Center on the iPad instead of upscaling their iPhone designs.

Music Controls

In iOS 10, managing music and audio playback from Control Center is a richer experience, visually and functionally superior to iOS 9.

The page is split into three areas: audio information and, for the first time, artwork at the top; progress, playback controls, and volume in the middle; hardware accessories at the bottom. This is true for Apple Music and Podcasts as well as third-party apps, which don't need to optimize for iOS 10 to show album artwork.

I was skeptical when I saw that Apple moved audio controls to a separate card. The ubiquitous presence of an audio widget was my favorite aspect of Control Center; adding an extra step to reach it didn't seem like a good idea. After adjusting to Control Center's audio page in the first month of iOS 10, I went back to iOS 9 and controlling music felt limited and bland.

There are two aspects to Apple's design worth noting. First, Control Center remembers the page you were using before dismissing it. If you swipe up, swipe left to open music playback, then close Control Center, the next time you open it, you'll get the Now Playing card instead of being taken back to the first page. Thanks to this, having audio controls on a separate page hasn't been a problem in my experience, but I wonder if Apple should allow reordering pages as an option.

Second, the purpose of the redesign. With artwork and comfortable UI elements, the page feels like a miniaturized music app rather than a cumbersome mishmash of buttons and sliders. It's almost as if Control Center was reimagined around how normal people check what's playing.

From an interaction standpoint, artwork creates a bigger touch target that you can tap to be taken into the app playing audio10; in iOS 9, you had to precisely tap on a song's small title in Control Center. There's a deeper sense of context, too. Previously, it always took me a few seconds to read through a song's information. With iOS 10, I can swipe up and glance at the artwork to see what I'm listening to.

There's a subtle touch I want to mention. When music is playing, artwork is big, it has a drop shadow, and Control Center says 'Now Playing on...' at the bottom with an icon for the device where audio output is happening. Hit pause, and the artwork shrinks, losing the drop shadow, as the 'Now Playing...' message disappears. Tap play again, and the artwork grows bigger with a delightful transition.

Control Center's audio page has two functional problems Apple should address. Song details (title, artist, and album) have been turned into lines of text that don't scroll and get cut off. Try to listen to songs with long titles – say, I've Got a Dark Alley and a Bad Idea That Says You Should Shut Your Mouth (Summer Song) – and you'll be surprised Apple designers didn't consider the issue.

That Says...?

In addition, the ability to "love" songs to train Apple Music has been removed from Control Center (and the Lock screen). I don't understand the decision, as having a dedicated page provides even more room for music controls.

Despite the merits of artwork and more intuitive controls, I don't think Apple added a standalone audio card to Control Center for those reasons alone. To me, the most convincing explanation comes from the hardware menu:

Picking audio accessories in Control Center.

With just a few taps, you can connect to Bluetooth headphones or wireless speakers from anywhere on iOS without opening Settings. There's an obvious subtext: for a device without a headphone jack, an easier way to switch between wireless audio accessories isn't just a nicety – it's a necessity.

Audio playback is the clear winner of the new Control Center in iOS 10. Apple freed themselves from the constraints of iOS 9's tiny audio controls, and, after three years, music is claiming the prime spot it deserves in Control Center. The new audio page brings a more engaging, integrated listening experience that paves the road for what's to come.

HomeKit Controls

You can't use the third page of Control Center unless you've configured at least one HomeKit device. I don't own a lot of HomeKit accessories (I have three Hue lights and a few Elgato sensors), but the new Home page has grown so much on me, I'm no longer using any third-party HomeKit widgets.

Besides requiring at least one HomeKit device, Control Center's Home card only displays accessories and scenes that have been marked as favorites in the new Home app. The page doesn't list every HomeKit accessory, nor does it work with third-party home automation devices that don't support HomeKit.

If you meet these requirements, you'll be able to swipe over the Music card to reveal the Favorite Accessories view.

Accessory buttons carry a name and icon assigned in the Home app, and, if supported, a percentage label for intensity (lights have it, for example). A button in the top right lets you switch between accessories and scenes. To turn them on and off, you just tap a button once.

Buttons can be long-tapped to open a detail screen with more options.11 For my Hue lights, holding a button for a fraction of a second reveals a vertical slider for intensity, which can be adjusted without lifting a finger off the screen.

A second layer of navigation is nested into the detail view. With multicolor lights, you can tap on a Colors button below the intensity slider to modify presets and open a color wheel to pick a different shade. The wheel even has a segmented control to switch between color and temperature – a surprisingly deep level of hierarchy for a Control Center page.

Adjusting colors and temperature for lights inside Control Center.

Unfortunately, accessories that only report basic status messages don't have a useful detail view.

In spite of my limited testing environment, Control Center has become my favorite way to manage HomeKit lights and scenes. It's a testament to Apple's penchant for native integrations: lights turn on immediately because commands don't go through a third-party server, and the entire flow is faster than asking Siri to activate an accessory. I was a heavy user of third-party HomeKit widgets and apps before; on iOS 10, I have no reason to do that anymore thanks to Control Center.

If Apple didn't have big plans for the connected home, they wouldn't have given HomeKit its own section in Control Center. With HomeKit expanding to new accessory lines, I think it's going to be my second most used card after music.

Extended Control

After three years, Control Center is growing up. To make the pendulum swing back towards simplicity, Apple has traded some convenience of the original design for three standalone pages. By unbundling functionality in discrete units, Control Center is more legible, usable, and flexible.

There are missteps. The lack of any kind of user customization is inexcusable in 2016. The bottom row of shortcuts, down to four icons again, still can't be modified to accommodate user-selected apps. And you won't be able to swap toggles at the top for settings you access on a frequent basis.

Half-baked integration with 3D Touch feels like a timid attempt to take Control Center further. The addition of quick actions for apps in the first page is laudable, but why isn't the same true for toggles at the top as well? And if HomeKit accessories can show nested detail views, why can't Apple Music display a lyrics screen, too?

I want to believe that iOS 10's Control Center is foreshadowing the ability for developers to provide their own "app pages" and for users to swap default shortcuts with their favorite ones. More than ever before, Control Center is ripe for extensibility and personalization. Like widgets, I can see a future where we interact with some types of apps primarily through mini interfaces in Control Center.

I wouldn't have expected pagination to be what I wanted, but Apple was right in rethinking Control Center as a collection of pages rather than a complex unified dashboard. The majority of iOS users won't be affected by Apple's design trade-offs; they'll appreciate a screen that doesn't need a manual.

The new Control Center experience isn't a regression; it's a much needed reassessment of its role in the modern iOS.

More 3D Touch

As is evident by now, Apple has increased the presence of 3D Touch in iOS 10. On top of notifications, Control Center, and the Home screen, 3D Touch actions have been brought to more apps and system features.

Notification Center

Like on the Apple Watch, you can press on the Clear button in Notification Center to clear all notifications in one fell swoop. Finally.

Siri App Suggestions

Apps suggested by Siri support 3D Touch to show the same quick actions available on the Home screen.

Apple Music

Among many changes, Apple Music has been given the extended 3D Touch treatment with a contextual menu for selected items and playback controls. Pressing a song or the bottom player brings up a list of options that include adding a song to a library, liking it, saving it to a playlist, or opening lyrics.

Manage Downloads

When downloading apps from the App Store or restoring a device from an iCloud backup, you can press on an in-progress download to pause it, cancel it, or prioritize it over others.

Share Apps

iOS 10 automatically adds a Share button to an app's quick action menu on the Home screen to share its link with friends. Presumably, this is meant to bolster app discovery and sharing among users.

Beta Feedback

Pressing on the icon of a TestFlight beta app shows a shortcut to send feedback to the developer via Mail.


The pervasive use of 3D Touch in iOS 10 proves Apple wants it to be an essential iOS feature. After using iOS 10, going back to iOS 9 feels like missing several layers of interaction.

This creates an even stronger tension between 3D Touch-capable iPhones and devices without it. Right now, Apple is resorting to swipes and long-taps to simulate 3D Touch on iPads and older iPhones; will they always be able to maintain backwards compatibility without making more features exclusive to 3D Touch?

Messages

iMessage is a textbook example of how a feature can turn into a liability over time.

When it was introduced five years ago, iMessage promised to bring a grand unification of SMS and free, unlimited texting with media attachments. iMessage turned Apple's Messages app into a single-stop solution for conversations between iOS users and those who would later be known as green-bubble friends. It was the right move at the time12, and it allowed Apple to have a communication service as a feature of iOS.

Over the last five years, messaging has outgrown texting. Meanwhile, iMessage (the service) and Messages (the app) have remained stuck in their ways.

Services like Facebook Messenger, WhatsApp, LINE, and WeChat haven't only reached (or surpassed) iMessage in terms of users; as mobile-first messaging apps without SMS' technical (and conceptual) debt, they have been able to relentlessly iterate on design, novel messaging concepts, notifications, and app integrations.

These companies, free of past constraints, have envisioned new ways to communicate. They've grown messaging apps into platforms, enabling others to extend them. And maybe some of the current messaging trends will turn out to be fads, but it's hard to argue against Apple's competitors with their numbers, cultural influence, and progressive lock-in. They're no joke, and Apple knows it.

But I wouldn't ascribe iMessage's slow pace of evolution to its SMS legacy alone. Because of its end-to-end encryption and Apple's strict policy on not storing sensitive user information, iMessage is by nature trickier to extend. Apple's efforts in this area are commendable, particularly when you consider how the aforementioned services diminish in functionality once you add encryption.

However, security hurdles shouldn't be an excuse for iMessage's glaring shortcomings. As laudable as Apple's stance is, most users aren't willing to put up with an app that feels old. They want to liven up conversations with rich graphics and apps. They want messaging to be personal. Technologists won't like this, but, ultimately, people just want a modern messaging app that works.

From a user's perspective, it's fair to say that Apple has been too complacent with iMessage. The service is by no means a failure – it serves hundreds of millions of users every day. But those metrics don't matter when stasis yields something worse than numbers alone: cultural irrelevancy. That iMessage, as many see it, "is just for simple texting".

The time has come for iMessage to take the next step. With a willingness to welcome developers into its most important app, and without giving up on its security ideals, Apple is reshaping how users can communicate, express themselves, and share. With iMessage in iOS 10, Apple is ready to embrace change.

App Changes

Before delving into the bigger enhancements to Messages, I want to touch upon changes to the app's interface and some minor features.

The conversation's title bar has been redesigned to embed the recipient's profile picture. Having a photo above a conversation helps identify the other person; the increase in title bar height is a trade-off worth accepting.

There's new artwork for contacts without a profile picture, too.

The profile picture can be tapped to open a person's contact card, and you can press it to bring up a 3D Touch menu – the same one available in Contacts and Phone, with a list of shortcuts to get in touch with that person.

iOS 10 brings a new layout for the bottom conversation drawer. By default, a conversation opens with a narrow text field and three icons next to it – the camera, Digital Touch, and the iMessage app launcher. As you tap into the text field to reply to a message, the three icons collapse into a chevron that can be expanded without dismissing the keyboard.

Apple has also redesigned how you can share pictures and videos. The new media picker consists of three parts: a live camera view to quickly take a picture; a scrollable grid of recent items from your library; and buttons to open the full camera interface or the photo library, accessed by swiping right.

The assumption is that, on iMessage, people tend to share their most recent pictures or take one just before sharing it. The live camera view can be used to snap a photo in a second (you don't even have to tap on the shutter button to take it). Moving the camera and library buttons to the side (hiding them by default) has freed up space for recent pictures: you can see more of them thanks to a compact grid UI.

Some won't like the extra swipe required to open the camera or library, but the live camera view makes it easier to take a picture and send it.

After picking or taking a picture, you can tap on the thumbnail in the compose field to preview it in full screen. You can also tap and hold a picture in the grid to enter the preview screen more quickly.13

Markup inside Messages.

Here, you have two options: you can edit a picture with the same tools as the Photos app (albeit without third-party app extensions) or use Markup to annotate it. You can tap on the Live Photo indicator to send a picture without the Live part, or press on it to preview the Live Photo.

Speaking of photos, iMessage now lets you send images at lower quality, likely to save on cellular usage. You can enable Low Quality Image Mode in Settings -> Messages.

One of the oldest entries of my iOS wish lists is also being addressed in iOS 10: you can choose to enable read receipts on a per-conversation basis.

If you, like me, always keep read receipts turned off but would like to enable them for important threads, you can do so by tapping the 'i' button at the top of a conversation and then 'Send Read Receipts'. The toggle matches the default you have in Settings and it can be overridden in each conversation.

Richer Conversations

While Messages may not look much different from iOS 9 on the surface, the core of the app – its conversation view – has been refreshed and expanded. iMessage conversations have received a host of new features in iOS 10, with a focus on rich previews and whimsical, fun interactions.

In its modernization of iMessage, Apple started from web links. After years of plain, tappable URLs, Messages is adopting rich link previews, which are inspired by iOS 9's link snippets in Notes, but also more flexible and capable.

Rich links aren't a special setting of the app: the first time you receive a link in an iMessage conversation on iOS 10, it'll appear as a 'Tap for Preview' button in the conversation. This is a one-time confirmation that you want to load links as rich previews instead of plain URLs, which also look different from iOS 9.

Loading a rich link for the first time in iOS 10.

Like in Notes (and other services such as Slack and Facebook), rich previews use Open Graph meta tags to determine a link's title, featured image, audio and video file, or description. A web crawler has been built into Messages: as soon as you send a link, the message's bubble will show a spinner, and, depending on the speed of your Internet connection, it'll expand into a rich message bubble after a second, within the conversation.
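As a rough sketch of what such a crawler extracts, here's a regex-based Open Graph lookup in plain Swift. The page markup below is made up, and a production crawler would use a real HTML parser rather than a regular expression; this only illustrates the kind of metadata a rich preview is built from.

```swift
import Foundation

// Illustrative only: pull an Open Graph property out of already-fetched HTML.
// Messages' actual crawler is private; this just shows the metadata
// (og:title, og:image, ...) that a rich preview is assembled from.
func openGraphProperty(_ name: String, in html: String) -> String? {
    let pattern = "<meta[^>]*property=\"og:\(name)\"[^>]*content=\"([^\"]*)\""
    guard let regex = try? NSRegularExpression(pattern: pattern, options: [.caseInsensitive]),
          let match = regex.firstMatch(in: html, options: [], range: NSRange(html.startIndex..., in: html)),
          let range = Range(match.range(at: 1), in: html) else { return nil }
    return String(html[range])
}

// Hypothetical page head for a shared article.
let html = """
<head>
<meta property="og:title" content="An Article">
<meta property="og:image" content="https://example.com/cover.jpg">
</head>
"""

print(openGraphProperty("title", in: html) ?? "no preview")  // An Article
```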

Paste, fetch, expand into rich link.

Rich link previews in Messages use the same technology Apple brought to Notes last year, but they've been designed differently. They're message bubbles with a title and domain subtitle; the upper section, where the featured image of a link is, can grow taller than link snippets in Notes. Web articles tend to have rectangular image thumbnails; podcast episodes shared from overcast.fm are square; and links to iPhone apps shared from the App Store show a vertical screenshot.

Multiple types of shared links in Messages.

Furthermore, the behavior of sharing links differs between Notes and Messages. Allow me to get a bit technical here.

In Notes, only links captured from the share extension are expanded into rich previews; pasting text that contains a link into a note doesn't turn the link into a rich preview.

Notes: rich links and plain URLs.

In Messages, both individual links and a string of text with a link will generate a rich preview. In the latter case, the link has to be either at the beginning or at the end of the sentence. Messages will break up that single string into two pieces: the link's preview, and the string of text without the link. Even sending a picture and a link simultaneously will create two message bubbles – one for the image, another for the link.

The only instance where Messages will resort to the pre-iOS 10 behavior of a rich text (tappable) URL is when the link is surrounded by text:

Unless a link is placed inside a sentence, iOS 10 will never show the full path to the URL – only the root domain. Whether meta tags can't be crawled14 or a link is successfully expanded, the URL stays hidden. If you need to see the full URL of a link in Messages, you can long-tap the link to show it in a contextual menu.
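These splitting rules can be approximated in a few lines of Swift. This is a sketch of the observable behavior, not Apple's parser, and it only recognizes links by their http(s) scheme:

```swift
import Foundation

// Approximates how Messages splits an outgoing text: a URL at the start or
// end of the text becomes its own rich-preview bubble; a URL in the middle
// of a sentence stays inline as a tappable link.
func bubbles(for text: String) -> [String] {
    let words = text.split(separator: " ").map(String.init)
    func isURL(_ s: String) -> Bool { s.hasPrefix("http://") || s.hasPrefix("https://") }
    if let first = words.first, isURL(first), words.count > 1 {
        return [first, words.dropFirst().joined(separator: " ")]
    }
    if let last = words.last, isURL(last), words.count > 1 {
        return [words.dropLast().joined(separator: " "), last]
    }
    return [text]  // single bubble: plain text, a lone link, or a mid-sentence link
}

print(bubbles(for: "Read this https://example.com"))
// ["Read this", "https://example.com"]
print(bubbles(for: "I think https://example.com is great"))
// ["I think https://example.com is great"]
```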

There are multiple types of link previews in iOS 10. The majority of websites with a social presence (including MacStories) have added support for Open Graph meta tags and Facebook/Twitter cards, and their links will appear with a featured image and a title. Alas, Apple hasn't brought Safari View Controller support to Messages, which doesn't make the experience of following links as seamless as it is on Facebook Messenger.

Twitter links have been nicely formatted by Apple: they have a special light blue background and they display a tweet's text, username, media (except GIFs), and avatar.

Twitter links on iMessage.

For Apple Music, the company has created a rich preview that, in addition to artwork, embeds a native play/pause button to listen to songs without leaving Messages. Unlike other web links, you can't peek & pop Apple Music links, suggesting that it's a custom implementation that uses an underlying URL to assemble a special message bubble.

Apple Music links (left) vs. SoundCloud and Spotify.

Third-party companies can't take advantage of this – neither Spotify nor SoundCloud links have a playback UI, and both are treated as webpages with a featured image.

Other Apple apps with the ability to share links don't fare as well as Apple Music. App Store and iTunes links show a title, an icon and screenshot, and app categories; you can't install an app or watch a movie trailer inside Messages. Links to photo albums shared on iCloud.com don't support rich previews in Messages, and shared notes only come with an icon and the title of a note.

YouTube links get expanded into a playable video preview that you can tap once to play, and tap again to pause. There are no additional controls (let alone a progress bar), but it's great to be able to watch a YouTube clip inline without being yanked to the YouTube app.

Playing YouTube videos inside Messages.

Messages will even pause playback if you scroll down in the conversation, and resume it as you focus on the video again. It's a nice touch.


Rich link previews embody the idea of stages of change and how Apple often adds functionality to iOS.

Users will see them as a new feature of Messages, which allows everyone in a thread to see a preview of the destination page. In some cases, message bubbles can even play media. I like how links get expanded inline; plain URLs in old iOS 9 message threads feel archaic already.

Link previews also build upon Apple's work with Universal Links and adoption of open standards such as Open Graph and Schema.org for search. The same technologies Applebot and Spotlight have been using for years now power link previews in iMessage.

I'd like to see Apple open up link previews with more controls for developers in the future, but this is a solid start.

Effects

With iOS 10, even how you send a message can be different. The blue 'Send' button has been replaced by an upward-facing arrow; tapping it once sends a regular iMessage as usual.

Within the arrow lies a secret, though. Press with 3D Touch (tap and hold on the iPad), and you'll bring up a 'Send with effect' screen, which lets you send a message with Bubble and Screen effects.

Let's start with bubbles, as I believe they'll be the more popular ones. There are four types of bubble effects, and they support any type of content you can share in Messages – text, emoji, media, and links.

Slam

Your message flies across the screen and is slammed to the ground, causing an invisible shock wave to ripple through adjacent messages.

Best used when you really want to be heard or make a point. Or for shaming a friend with an ugly selfie from the night before.

Loud

A more polite version of Slam that enlarges the message without affecting nearby bubbles.

The way the text shakes briefly inside the bubble suggests this is appropriate to shout something, either in anger or happiness, without necessarily destroying everything around you.

Gentle

Apple's version of a kind, intimate whisper. Gentle starts with a slightly larger bubble containing small text, which will quickly grow back to normal size as the bubble shrinks down.

Personally, I think Gentle is ideal for dog pictures as well as the "I told you so" moments when you don't want to upset the recipient too much. At least you're being gentle about it.

Invisible Ink

I won't explain the ideal use cases for this one, leaving them up to your imagination. Invisible Ink obfuscates the contents of a message and it's the only interactive bubble of the four.

To reveal text hidden by Invisible Ink, you have to swipe over the bubble to remove the magic dust that conceals it. It can be wiped off from notifications, too. Invisible Ink is automatically re-applied after ~6 seconds.

Invisible Ink gives you the time to make sure no one is looking at your screen. Ingenious.


Bubble effects may not appeal to iOS power users, but they're a lot of fun, they're whimsical, and they add personality to conversations.

Bubble effects in iOS 10.

From a technical standpoint, the implementation of 3D Touch is spot-on: you can hold down on the Send button and scroll to preview each bubble effect before sending it. If you receive a message with a bubble effect, it'll only play once after you open the conversation – they won't be constantly animating. I've been using bubble effects with friends and colleagues, and I like them.

Screen effects are a different story. Unlike bubble effects, they take over the entire Messages UI and they play an animation with sounds that lasts a couple of seconds. Screen effects are deliberately over the top, to the point where they can almost be gaudy if misused. Lasers, for instance, will start beaming disco lasers across a conversation.15 Shooting star will cause a star to fly through the screen with a final "ding" sound, while fireworks will put up celebratory explosions, turning the app's interface dark as you gaze into the virtual New Year's night of iMessage.

Here's what they look like:

Balloons

Confetti

Lasers

Fireworks

Shooting Star

My problem with screen effects is that they can be triggered by certain keywords and phrases without any prior warning. Texting "congrats" will automatically fire off the Confetti effect, which is nice the first time, but gets annoying quickly when you find yourself texting the expression repeatedly and being showered in confetti every time. The same is true for "happy new year" and "happy birthday", which will bring up Fireworks and Balloons without the user's consent.

I use screen effects occasionally to annoy my friends and throw confetti when I feel like it – but the automatic triggering feels almost un-Apple in its opaque implementation. There should be an indicator, or a setting, to control the activation of screen effects, or Apple should abandon the idea altogether, letting screen effects behave like the bubble ones following a user's command.16

Screen effects aren't the most exciting aspect of the new iMessage, but they bring some unexpected quirkiness into the app, which isn't bad either. Just use them responsibly.

Digital Touch and Handwriting

When Apple introduced Digital Touch on watchOS in 2014, it was safe to assume it'd eventually find its way to iOS. Two years later, Digital Touch has been built into Messages in iOS 10, gaining a prominent spot between photos and the new iMessage App Store.

Digital Touch can be activated from the heart icon with two fingers – a reminder of its Apple Watch legacy. Tapping the button turns the lower half of the screen into an interactive pad where you can draw, send taps and heartbeats, and annotate photos and videos.

Digital Touch has three sections: a color picker along the left side (where, like on the Watch, you can long-tap a color to pick another one); a drawing area in the middle; and a rotating set of icons on the right that explain Digital Touch features. At the bottom, a chevron lets you open Digital Touch in expanded mode, taking over the conversation in full-screen.

There isn't much to say about the functionalities adapted from watchOS. Sketches are easier to create thanks to the bigger screen, though I think that, to an extent, the constraints of the Watch incentivized creativity. Also, sketches look like images with a black background pasted into conversations: they're animated, but they don't feel as integrated as they used to be on the Apple Watch. They look like simple image attachments on iOS 10.

Taps and heartbeats are the kind of features someone decided to bring over to iOS so they wouldn't go to waste. Without the haptic feedback of the wrist, and presented on a black background, they fundamentally feel out of place on iOS.

When you receive a tap from someone on the Apple Watch, you feel a tap on your wrist. On iOS 10, taps are just animated images, and there's nothing special about them. The physical connection is lost. Apple could have made taps part of the conversation view, letting them ripple through bubbles like effects do, or used vibration as feedback; instead, they settled on GIFs.

Heartbeats are even more baffling, as they aren't "real" heartbeats due to the lack of a heart rate sensor on iOS. When you hold two fingers on the screen to send your heartbeat on iMessage17, iOS generates a generic animation that isn't a representation of anyone's heartbeat. The sense of intimacy watchOS fostered thanks to Digital Touch and its heart rate sensor – of knowing that the heartbeat animation represented the actual beating heart of a friend or partner – isn't there on iOS.

And don't get me started on the sadness of swiping down with two fingers to send a heartbreak.

Then there's 3D Touch, which is used in Digital Touch to send "fireballs". If you press on the Digital Touch pad, iOS 10 creates a pulsing fireball that will be sent as an animated image.

That's a fireball.

I'm not sure what to make of the fireball – does sending one show you're thinking of someone? That you're upset with them? That you've realized 3D Touch exists in iMessage? Is it a reference to John Gruber? It's an open-ended question I'll leave to the public to resolve.

The standout Digital Touch feature is one that has been built around the iPhone's hardware. Tap the video icon, and you'll bring up a camera UI to sketch on top of what the camera is seeing. You can also add Digital Touch effects in real-time while recording a 10-second video (to take a picture, tap the shutter icon).

Effects with the Digital Touch camera are fun.

The combination of sketches and kisses with videos is fun and highly reminiscent of Snapchat; I've been using it to send short clips with funny/witty comments or sketches drawn on top of them. Apple should add more iOS-only "stamps" or animations to Digital Touch for photos/video without copying what they've done on watchOS.18

Unrelated to Digital Touch, but still aimed at making conversations more personal, is handwriting mode.

Anyone who's familiar with handwritten signatures in Preview and Markup will recognize it: handwriting can be accessed by tapping the ink button on the iPad keyboard or turning the iPhone sideways. It opens an empty area where you can handwrite a message in black ink using your finger (or Apple Pencil). There's a list of default and recent messages at the bottom (which can be deleted by long-tapping them), and no additional controls.

How handwritten messages look in conversations.

I found handwriting mode to be nicer than Digital Touch. Handwritten messages aren't contained in a black image and ink animates beautifully19 into the conversation view, which creates the illusion that someone has written a message for you inside Messages instead of sending an image attachment. It's a better integration than Digital Touch.

Digital Touch on iOS 10 could have used more work. Features that had some reason to exist on watchOS' hardware have been lazily ported to iOS, removing the physical interaction and feedback mechanism that made them unique on the Watch.

I'm not sure the iOS Digital Touch we have today is worth giving up a premium slot as a default iMessage app next to the Camera. It's a "Friends button" scenario all over again. I wouldn't be surprised if that permanent placement becomes customizable next year.

Tapback

iOS 10 brings new options to react to messages, too.

Called Tapback, the feature is, essentially, Apple's take on Facebook's redesigned Like button and Slack's reactions. If you want to tell someone what you're thinking without texting back, you can double tap20 a message – any kind of bubble – to bring up a menu with six reactions: love, thumbs up, thumbs down, ha-ha, exclamation points, and question mark.

Sending a Tapback.

The interaction of Tapback is delightful. Icons animate when you tap on them, and they play a sound effect once attached to a message. You can't create your own reactions by picking any emoji like on Slack, but, looking at a conversation with a bunch of hearts, thumbs-ups, and ha-has, the feeling is the same.

Tapbacks are especially effective in group threads where everyone can "vote" or express their immediate reactions without typing. A Tapback can be changed at any point during a conversation, but you can only leave one reaction per message.

If what happened in my Slack teams over the past year is any indication, Tapback should become a useful way to let someone know you've acknowledged or liked their message without writing anything back.

Emoji

Slack's influence on iMessage has propagated to emoji as well. Messages that only contain emoji (no text) will be sent as big emoji (at 3x), so you can truly appreciate the details that make up Apple's most popular characters.

Regular and big emoji.

I've been a fan of jumbo emoji since Slack rolled them out last year. They're a perfect fit for iMessage. Emoji are expanded in the text field before sending them – I chuckle every time I see a big thinking face about to enter a conversation. Messages will only display up to three big emoji at a time; if you create a message containing four emoji, they'll be sent at normal size.
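The big-emoji rule is easy to model. The sketch below uses Unicode scalar properties to decide whether a character counts as emoji; whatever heuristic Messages actually uses is private, so treat this as an approximation only.

```swift
import Foundation

// Sketch of the "big emoji" rule: a message that contains only emoji,
// three or fewer, renders at 3x. Emoji detection here relies on
// Unicode.Scalar.Properties (Swift 5+) and simplifies Apple's real logic.
func rendersAsBigEmoji(_ message: String) -> Bool {
    let chars = Array(message.filter { !$0.isWhitespace })
    guard !chars.isEmpty, chars.count <= 3 else { return false }
    return chars.allSatisfy { char in
        char.unicodeScalars.contains { $0.properties.isEmojiPresentation }
    }
}

print(rendersAsBigEmoji("😂🤔"))    // true
print(rendersAsBigEmoji("😂😂😂😂"))  // false: four emoji revert to normal size
print(rendersAsBigEmoji("ok 👍"))   // false: contains text
```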

Emoji improvements don't stop there. Apple must have noticed that users like to write messages and replace words inside them with appropriate emoji, and they're introducing an option to automate the process in iOS 10. This innocuous feature (which only works in Messages) may even end up powering Apple's Differential Privacy-based crowdsourced data collection.

If you write a message in iOS 10 and then open the emoji keyboard, the system will scan words you've entered in the text field and try to match them up with emoji. If a related emoji is found, a word will be highlighted in orange. Tap it, and it'll be replaced with the emoji.

Tap the emoji keyboard to replace words with emoji.

If multiple emoji options are available for a single word, tapping it opens a menu to choose one.

Multiple emoji options.

I'm not exactly the target audience for this feature (I either only send emoji or put some next to a word), but I recognize that a lot of people treat emoji as substitutes for words. Apple devised a clever and thoughtful way to "emojify" text, letting the OS compensate for a search box still missing from the emoji keyboard.
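Conceptually, the replacement pass is a word-to-emoji table lookup over the composed text. The table below is a tiny hypothetical sample; Apple's real mapping is curated and refined with crowdsourced data.

```swift
import Foundation

// Sketch of "emojify" matching: scan the words in a composed message
// against a word-to-emoji table and return replacement candidates.
// A word with multiple candidates would show a picker menu in Messages.
let emojiTable: [String: [String]] = [
    "pizza": ["🍕"],
    "dog": ["🐶", "🐕"],
    "love": ["❤️", "😍"],
]

func emojiCandidates(for text: String) -> [(word: String, emoji: [String])] {
    text.lowercased()
        .split(whereSeparator: { !$0.isLetter })
        .compactMap { word in
            emojiTable[String(word)].map { (String(word), $0) }
        }
}

for match in emojiCandidates(for: "I love my dog") {
    print("\(match.word) -> \(match.emoji)")
}
// love -> ["❤️", "😍"]
// dog -> ["🐶", "🐕"]
```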

Under the hood, emoji replacements hinge on a system that has to build up associations and trigger words, follow trends, and adapt for international users and different meanings of the same emoji around the world. Based on what Apple has revealed about Differential Privacy, data on emoji picked by users will be collected in aggregate to improve the accuracy of suggestions.

My understanding is that Apple started from a set of words curated from common expressions and Unicode annotations, and began scaling to millions of users and dozens of languages for over 1800 emoji during the iOS 10 beta stage. In my case, emoji replacements worked well for both English and Italian.

Crowdsourcing this aspect of iMessage makes sense given the popularity and many meanings of emoji. It'll be interesting to see how suggestions will be refined as iOS 10 usage picks up.

iMessage as a Platform

Despite numerous design updates and enhancements to conversations, the most profound change to iMessage isn't the app itself – it's other apps developers will build for it.

Apple is opening iMessage to developers in iOS 10, turning it into a platform that can be extended. The company has created a Messages framework for developers to plug into and build apps, which will be available on the new iMessage App Store.

The stakes are high. For millions of users, their messaging app is a second Home screen – a highly personal, heavily curated gateway to contacts, private conversations, and shared memories. Messaging isn't just texting anymore; it's the touchstone of today's mobile lifestyle, a condensation of everything smartphones have become.

Apple won't pass up this opportunity. Not this time. In opening up their most used app, Apple hopes that developers will take iMessage further with new ways to share and enrich our conversations.

iMessage App Store

Developers can write two types of Messages extensions in iOS 10: sticker packs and iMessage apps. Both can be discovered and installed from the iMessage App Store embedded into the Messages app, and both can be created as standalone apps or as extensions within a containing iOS app.

You can access the iMessage App Store with the apps button next to the input field. Messages will hide the keyboard and bring up a scrollable gallery of all your installed Messages extensions, opening the last used one by default. Apps are organized in pages and you can swipe between them. You can expand the currently selected app with the chevron in the lower right, and browse recent content from all apps via the leftmost page.

Opening the last used iMessage app (left) and the Recents page (right).

There's also a way to view a Home screen of iMessage apps as icons. If you tap on the icon in the bottom left corner, you'll be presented with a grid of oval icons (the shape for iMessage apps) and a '+' button to open the iMessage App Store.

The iMessage app drawer (left) and the new iMessage App Store.

This view has been designed to resemble the iOS Home screen: you can swipe horizontally across apps, you can tap & hold to delete them and rearrange them, and you can even click the Home button to stop wiggling mode.21

I like the idea of an iMessage SpringBoard, but it takes too many taps to open it22, especially if you want to launch an app in a hurry. Apps are tucked away behind three taps, and I wonder how that will impact usability in the long run. Right now, the compact app drawer (with the dots at the bottom) doesn't scale to more than 30 installed apps, and it feels like the equivalent of the Slide Over app picker from iOS 9; there has to be a faster way to navigate and launch iMessage apps.23

Perhaps a Messenger-like design with top launchers embedded above the keyboard would have been a preferable solution.

Stickers

iMessage stickers can be seen as Apple's response to the rise of third-party "emoji" keyboards that offer selections of sticker-like images, usually in collaboration with brands and celebrities. If you've seen the likes of KIMOJI, Justmoji, PetMOJI, Bitmoji, and literally anything -moji on the App Store lately, you know that's an aspect of iOS Apple could improve for both users and developers.

What some third-party companies try to sell as "custom emoji" aren't really emoji: they are images that can be pasted in conversations.24 Developers don't control the availability of emoji in Apple's keyboard, nor can they alter what is defined as emoji in the Unicode specification. By manipulating the public's perception of what an emoji is, and by leveraging custom keyboards to make their "emoji" look like part of iOS, some developers were able to carve themselves a profitable niche on the App Store. Just ask Kanye West how to make a million a minute.25

However, I don't blame developers for trying and riding on the coattails of emoji.26 I'd argue that a lot of companies settled on the "moji" suffix because iMessage was the only big messaging service without native sticker support, and emoji were already in the typical iOS user's vocabulary.

Stickers provide an enormous opportunity for developers and, yes, brands to give users fun ways to express their feelings with a wider array of emotions and contexts than emoji alone. Look at LINE, and the massive, multi-million dollar success of Cony and Brown and revenue from their Creators Market; think about Twitter and how they commissioned a set of stickers to be used on photos.

If every major messaging platform has found stickers to be popular and profitable, there must be something to them that appeals to people. With iOS 10, Apple, too, wants a piece of the action and is letting developers create sticker packs for iMessage. The goal is to entice users to personalize their iMessage conversations with stickers, download additional packs, and spread usage among friends. The company plans to do so with a superior experience to custom keyboards, and with the prospect of a new gold rush for developers.

Stickers live in the standard sticker browser – the compact view that opens after choosing a sticker pack from the app drawer. This area can have a custom background color and it's where you can interact with stickers.

Two sticker packs.

You can tap on a sticker to place it in the input field and send it individually, or you can peel it off the browser and drag it around in a conversation.

Tapping a sticker to send it (left) and peeling it off (right).

The animation for peeling stickers off the browser and re-attaching them is some of Apple's finest OpenGL work in a while.

You can attach stickers to any message bubble in the transcript: you can put one next to a text message, cover a photo with multiple stickers, or even put a sticker atop another one or a GIF. Want to peel off a sticker and use it on an older message? Drag it over the title bar, wait for the conversation to scroll back, and attach it wherever you want. How about covering your friend's eyes with googly eye stickers? You can do that too.

Things can get out of hand quickly.

Once a sticker has been placed in a conversation, you can tap and hold it to open the sticker details. This is also how you view all stickers that cover a message bubble, with buttons to download the complete packs on the iMessage App Store27. Here, you can swipe on a sticker to delete it from the selected message bubble if you no longer want to see it.28

Opening sticker details.

You'll come across two kinds of sticker packs. There are the basic ones, which are a collection of images displayed inside a sticker browser. This will probably be the most popular choice for developers, as creating these packs doesn't require a single line of code. If you're a developer and want to sell a sticker pack on the iMessage App Store, all you need to do is drop some image files into an Xcode sticker pack project, add icons, and submit it to Apple.29

Stickers can also be rotated and enlarged using pinch gestures.

The second kind is sticker packs with a custom sticker browser or other additional features. Technically, these are iMessage apps that use the Messages framework for sticker functionality that goes beyond basic drag & drop. For instance, you may see apps where you can assemble your own stickers, or sticker packs with custom navigation elements and In-App Purchases. The sticker behavior in conversations is the same, but these packs require more work from developers.30
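For those richer packs, stickers are represented by `MSSticker` objects that a custom browser (for example, an `MSStickerBrowserViewController` subclass) vends from its data source. A minimal sketch for an iMessage extension target; the file name and description are made up:

```swift
import Messages

// Load a bundled image as a sticker for a custom browser.
// MSSticker(contentsOfFileURL:localizedDescription:) throws if the file
// isn't a supported image type or exceeds the size limits.
func loadSticker(named name: String) -> MSSticker? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "png") else {
        return nil
    }
    return try? MSSticker(contentsOfFileURL: url, localizedDescription: name)
}
```

A basic pack needs none of this; the custom browser route only pays off when you want dynamic stickers, unlocking mechanics, or bespoke navigation.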

From a user's perspective, stickers make iMessage conversations feel different. More lively and fun, but also busier and messier if overused.

I've been able to test about 30 different iMessage sticker packs from third-party developers in the past couple of months. One of the highlights is The Iconfactory, which leveraged their expertise in icons and illustrations to create some fantastic sticker packs for iMessage.

An example of what The Iconfactory has prepared for iOS 10. (Tap for full size)

From Sunshine Smilies (emoji characters as stickers) and Tabletop RPG (role-playing emoji stickers) to Mystic 9 Ball and Dino, I believe The Iconfactory has found a perfect way to reinvent themselves for the iMessage era. They're a great fit for stickers.

Developer Raul Riera has created what I believe is going to be a popular type of custom sticker app: Emoji Stickers lets you put together your own stickers containing emoji characters.

A custom emoji sticker.

You can create concoctions like a monkey wearing a crown or a pineapple pizza. This is done with a custom sticker-assembling UI and built-in emoji from the open source Emoji One set.

Monstermoji, created by Benjamin Mayo and James Byrd, features beautifully hand-drawn monster characters you can attach to messages.

These stickers are unique, and they show how anyone can easily create a sticker pack and release it.

I also like Anitate, a set of 80+ animated stickers by Raven Yu.

Look at that sad pug with bunny ears.

Anitate's stickers are like animated emoji, redrawn for a flat style with animations. They're fun and I've been using them a lot.

Last, I want to mention Sticker Pals – by far, the most impressive and beautiful sticker pack I've tried. Designed by David Lanham in collaboration with Impending, Sticker Pals features a large collection of animated hand-drawn stickers for various emoji-like objects, symbols, and animals. The illustrations are gorgeous, animations are fun, and there are hundreds of stickers to choose from.

Sticker Pals is a good example of what can be achieved by creating a custom sticker browser with the Messages framework. There are buttons at the top of the browser to switch categories, and each tap corresponds to a different sound effect. Plus, the developers have devised a clever unlocking mechanism for extra stickers with an in-app store and the ability to send stickers as gifts to your friends – all within an iMessage app with a sticker browser.

Judging from the amount of pre-release sticker packs I received during the summer, I have a feeling the iMessage App Store team at Apple is going to be busy over the next few weeks.31


With iMessage stickers, Apple hasn't just created a better way to paste images in conversations. They're stickers in the literal sense – they can be attached anywhere, sometimes with questionable results, but always with a surprising amount of freedom and experimentation. Mixing multiple stickers at once on top of messages could become a new activity of its own32 – I know I've had fun placing them over photos of my friends.

Stickers are often looked down upon by the tech community because they seem frivolous and juvenile. But emoji were met with the same reaction years ago, and they've gone on to reinvent modern communication, trickling into pop culture.

iMessage stickers probably won't have the same global impact as emoji, primarily because they only work in iMessage33 and the service isn't cross-platform. But I also believe that stickers are the perfect addition to iMessage in 2016. Stickers are messaging's lingua franca. Their adoption is going to be massive – bigger than custom keyboards have ever been. Stickers are lighthearted, fun to use, and they make each conversation unique.

Let's check back in a year and see how many sticker packs we have installed.

iMessage Apps

The iMessage platform's opportunity lies in the second type of extensions available to developers: iMessage apps.

Like sticker packs, iMessage apps are installed and managed from the iMessage App Store, they live in the Messages app drawer, and they support compact and expanded mode. They can be standalone apps or extensions within a containing iOS app.

Unlike basic sticker packs, however, iMessage apps have to be programmed. They're actual apps that can present a user interface with their own view controller. iMessage apps can:

  • Offer more control to developers who want to build an interactive sticker browser;
  • Insert text and media files in the input field;
  • Display their custom UI;
  • Access iOS frameworks;
  • Create, send, and update interactive messages.

With iMessage apps, developers can bring their apps' interfaces, data, and experience into Messages.

Examples of iMessage apps.

Because of this, there are no limitations on what an iMessage app can look like. Anything developers can put in a view controller (bearing in mind compact mode and memory constraints) can be an iMessage app. Coming up with a miniaturized app that makes sense in Messages, though, will be just as hard as envisioning Watch apps that are suitable for the wrist.

There are some differences to consider for compact and expanded mode. In compact, apps cannot access the system keyboard and they can't support gestures (horizontal swipes are used to navigate between apps and sticker packs). Only taps and vertical scrolling are available in compact mode.
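
These mode differences surface in code through the extension's presentation-style callbacks. A sketch, assuming a hypothetical `searchField` control; the lifecycle method and presentation styles are from the Messages framework:

```swift
import Messages
import UIKit

// Sketch: adapting an extension's UI to compact vs. expanded mode.
// searchField is a hypothetical control used for illustration.
class MessagesViewController: MSMessagesAppViewController {

    let searchField = UITextField()

    override func didTransition(to presentationStyle: MSMessagesAppPresentationStyle) {
        super.didTransition(to: presentationStyle)
        switch presentationStyle {
        case .compact:
            // No system keyboard and no horizontal gestures in compact mode.
            searchField.isHidden = true
        case .expanded:
            searchField.isHidden = false
        default:
            break
        }
    }

    // Hypothetical: ask Messages to expand before showing a text field.
    func beginSearch() {
        requestPresentationStyle(.expanded)
    }
}
```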

iMessage apps in compact mode.

In expanded mode, both the system keyboard and gestures are supported. Developers can ask users to type information in the expanded layout, they can enable deeper gesture controls, and, generally speaking, they have more freedom in what they present to the user. When running in expanded mode, an iMessage extension that has a container app features an icon in the top left to launch the full app.

iMessage apps in expanded mode.

The other peculiarity of iMessage apps is that they can create interactive messages with special message bubbles. These bubbles are based on a template with some strict limitations. There's only one layout apps can use. An interactive message can display an image, audio, or video file as the main content; the app's icon is always shown in the top left; at the bottom, developers can set textual properties for the bubble's caption bar including title, subtitle, captions, and subcaptions (the caption bar is optional).
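
In the Messages framework, that single template corresponds to `MSMessageTemplateLayout`. A hedged sketch of composing an interactive message and staging it in the input field – the episode artwork, captions, and URL are invented examples:

```swift
import Messages
import UIKit

// Sketch: building an interactive message with the one template layout
// apps can use. The artwork, caption strings, and URL are invented.
func stageMessage(in conversation: MSConversation, artwork: UIImage) {
    let layout = MSMessageTemplateLayout()
    layout.image = artwork                  // main content: image, audio, or video
    layout.caption = "Episode 42"           // caption bar fields are optional
    layout.subcaption = "My Favorite Show"

    let message = MSMessage(session: MSSession())
    message.layout = layout
    message.url = URL(string: "https://example.com/episode/42") // app-specific payload
    message.summaryText = "Shared an episode"

    // Apps can't send on the user's behalf: insert() only places the
    // bubble in the input field; the user taps Send.
    conversation.insert(message) { error in
        if let error = error { print(error) }
    }
}
```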

iMessage apps can't alter the standard layout of an interactive message, nor can they inject buttons around it. Any user interaction must be initiated from the bubble itself. iMessage apps can't send interactive messages on the user's behalf: they can only prepare an interactive message and place it in the input field.

When an interactive message bubble is tapped, an iMessage app can bring up a custom interface to let participants view more information on the shared item or continue a task. Keep in mind, though, that if you tap on an interactive message to open it in full-screen when you don't have the iMessage app to view it on your device, you'll be taken to the iMessage App Store to install it.


The best way to understand what iMessage apps can do is to try some. Since June, I've been able to test over 20 iMessage apps from third-party developers, and I have a general idea of what we should expect throughout the year.

Supertop's podcast client, Castro, will soon let you share your favorite episodes with an iMessage app. Castro loads a list of episodes you've recently listened to; tap one, and it'll turn into a rich bubble embedding artwork and episode title.

Castro's iMessage app.

The best part: you can tap the bubble to open show notes in full-screen (and even follow webpage links inside Messages) and add an episode to your Castro queue. It's a great way to share podcast episodes in iMessage conversations and save them with a couple of taps.

Drafts, Greg Pierce's note-taking app, has added an iMessage app extension to share notes with friends. You can browse all notes from your inbox or switch to Flagged messages.

Drafts' iMessage app.

Drafts places a note's plain text in Messages' input field, ready to be sent. The iMessage app is going to come in handy to share commonly accessed notes and bits of text with colleagues.

Ever wished your GIFwrapped library – carefully curated over the years – was available in iMessage? With iOS 10, you'll be able to paste your favorite GIFs without using a custom keyboard.

Sending GIFs with GIFwrapped on iMessage.

I've been using GIFwrapped's iMessage app to send GIFs of dogs smiling to my mom and girlfriend. They love them.

Alongside a widget and rich notifications, CARROT Weather is coming to iMessage with an app to share weather forecasts in conversations. It's a solid example of the flexibility granted to apps: CARROT for iMessage carries its custom UI and hilarious sound effects, and it displays rich graphics and animations. It can access your current location from Messages, and it even lets you search for locations in expanded mode, where you can browse a full-screen forecast of the upcoming week – all without leaving Messages.

CARROT creates interactive messages that are prepared in the input field by tapping a Share button. These are bubbles with a custom preview graphic and text labels for location, temperature, and current conditions. If you receive one and tap on it, you'll open CARROT's expanded preview.

Developed by Sven Bacia, Couchy is another iMessage app that sends interactive bubbles that present a full-screen UI when tapped. Couchy is a TV show tracker; on iMessage, it displays a list of recently watched and upcoming show episodes. Pick one, and Couchy will assemble a bubble with the series' artwork and name of the episode.

Couchy's iMessage app.

When you tap a Couchy message, you get an expanded preview of the episode with metadata and artwork fetched from trakt.tv, plus the ability to view the episode in the main Couchy app.

ETA, a navigation app I covered on MacStories before, is based on a similar design, using small snippets in compact mode for your favorite locations. Tap one, and the app will calculate travel time on the spot, preparing a message bubble to share with someone.

ETA's iMessage app.

The interactive message can be tapped to view more details about the other person's estimated travel time, as well as get directions to the same address. You can also collaborate on the same travel time and respond with your status (more on collaborative apps below) and search for locations directly from iMessage. ETA is one of the most useful, technically impressive iMessage apps I've tried.

It can get even more advanced than this, though. Snappy, for example, is a web browser for iMessage. You can search Google or paste URLs in a search box, or use search suggestions.

Browse the web inside iMessage with Snappy.

Once you've found a webpage you want to share in a conversation, you can tap a Send button to insert the link in the input field. The link, of course, will expand into a rich preview. Given Messages' lack of Safari View Controller, Snappy can be useful to paste links and view them without leaving the app; it's also a convenient way to look something up on Google while talking to a friend.

Pico, developed by Clean Shaven Apps, can send photos and videos at lower quality with deeper controls than Apple's built-in Low Quality Image Mode for iMessage. After choosing media from the library, Pico opens a dark interface with a preview at the top and quality settings at the bottom. You can choose from four quality presets, compare savings with the original item, and tweak dimensions.

Comparing image savings with Pico.

In addition to downscaling, Pico can remove metadata from media, such as location details. The app remembers settings for each conversation, and, overall, it's a great way to save on cellular data with more options than iMessage's default solution.

Touch ID can be integrated with iMessage apps, and Cipher uses the authentication framework to let you send "secret messages" encrypted with AES-256 that don't appear in the transcript as normal text messages. Instead, Cipher generates custom bubbles that hide your text; on the other end, the recipient will have to authenticate with Touch ID (thus confirming the device isn't being used by someone else) to read your message.

You can also send digitally-signed messages to prove it's really you by typing in Cipher and "signing" with your Touch ID.

These are just a few examples of what developers can build with the Messages framework. Starting today, we're going to see an avalanche of iMessage apps, but the best ones will stand out as intuitive utilities suited for sharing.

Collaborative iMessage Apps

Along with single-user apps, Apple has emphasized the ability for developers to build iMessage apps that let users collaborate on a task inside Messages.

In a collaborative iMessage app, an interactive message can be modified by participants in a conversation. As two or more people interact with a message in the same session and update its content, Messages removes the previous message from the transcript, collapsing it into a succinct summary text (so that outdated messages with old data don't pollute the conversation). Only the new message, with the updated content, is displayed at the bottom as normal.
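
The collapsing behavior hinges on `MSSession`: a new message created with the session of the currently selected message replaces that bubble in the transcript, and `summaryText` is what the superseded bubble collapses into. A sketch, with invented caption text:

```swift
import Messages

// Sketch: updating an in-progress interactive message so earlier bubbles
// collapse into their summary text. Caption strings are invented examples.
func sendUpdate(in conversation: MSConversation) {
    // Reuse the selected message's session; a new session would start
    // a new thread of bubbles instead of updating the existing one.
    let session = conversation.selectedMessage?.session ?? MSSession()

    let layout = MSMessageTemplateLayout()
    layout.caption = "Updated time slots"

    let message = MSMessage(session: session)
    message.layout = layout
    // Shown in the transcript when this bubble is superseded by a newer one.
    message.summaryText = "Time slots were updated"

    conversation.insert(message, completionHandler: nil)
}
```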

Let's work with a fictional example.

Imagine that you're planning a trip with your friends over iMessage. It's always hard to keep track of everyone's available times, so developer Myke Hurley has created 1-2-3 Trip Planner, an iMessage app that looks into a user's calendar, brings up a custom calendar view in Messages, and lets the user choose up to three available slots in their schedule. Once three times are picked, 1-2-3 Trip Planner generates a message bubble with the user's three choices as a title.

Stephen has created an iMessage conversation with two of his friends, and they want to plan a trip together. Stephen brings up 1-2-3 Trip Planner, finds some available slots in his weekend schedule, selects three of them, and sends a message. The interactive message uses "Available Times – Stephen" in the bubble and the days of the week as title.

Stephen creates the first 1-2-3 Trip Planner bubble.

On the other end of the conversation, Christina needs to look at her calendar and pick three available times. When she taps the 1-2-3 Trip Planner bubble, Stephen's choices are displayed alongside her calendar events, and she can agree on a same slot, or pick a different one. She then replies with her preferences, sending another message bubble.

Christina replies with her schedule.

John is the third participant in this conversation. In his iMessage transcript, Stephen's first bubble has been collapsed into a summary text that says "Stephen picked three time slots" and Christina's message says "Stephen and Christina's time slots". John is only seeing the latest message bubble with the choices of both users. When he taps on it, a full-screen interface comes up, showing him a calendar view with his events and the times Stephen and Christina previously picked.

John picks his time slots and the trip is planned.

John can also agree on previously chosen time slots or pick new ones. When he's done, he sends his reply, and the second message bubble from Christina also turns into a summary text. John's third and final bubble has a title that says "Stephen, Christina, and John". At this point, the three participants are looking at one interactive message; they can look at the results and decide on a time that works for everyone.

Right: what collapsing bubbles into summaries looks like.

Stephen, Christina, and John collaborated on a task within Messages without the back and forth of switching between calendars and texting each other's available times. 1-2-3 Trip Planner has allowed multiple users to easily agree on a shared schedule in less than a minute.

There are two additional aspects worth noting. In my imaginary (but technically accurate) example, 1-2-3 Trip Planner accessed the native iOS EventKit framework; I've tried actual iMessage apps that accessed the camera, location, photos, and the clipboard. Also, Apple is very concerned about user privacy and exposing contact information to iMessage apps. For this reason, the Messages framework doesn't allow apps to see any details about the participants in a conversation, but only local identifiers (alphanumeric strings that don't identify a single person).34
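
Those identifiers are exposed as UUIDs on `MSConversation`. As far as I understand the framework, a `$` followed by one of those UUID strings in inserted text is rendered by Messages as the participant's name, without the app ever learning it. A sketch under that assumption:

```swift
import Messages

// Sketch: apps see only opaque per-device UUIDs, never names or numbers.
func greetEveryone(in conversation: MSConversation) {
    let me = conversation.localParticipantIdentifier
    let others = conversation.remoteParticipantIdentifiers

    // "$<uuid>" in inserted text is rendered by Messages as the
    // participant's name; the app never knows who that actually is.
    let mentions = others.map { "$\($0.uuidString)" }.joined(separator: ", ")
    conversation.insertText("Hey \(mentions), it's $\(me.uuidString)!",
                            completionHandler: nil)
}
```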

The framework Apple has built into Messages should, in theory35, allow for the creation of moderately complex collaborative apps. Calendar collaboration is just one possible use case; imagine utilities to split bills, todo apps, photo compositions, and even games.

I tested a couple of straightforward collaborative iMessage apps in the past few weeks. The aforementioned ETA iMessage app lets you respond to a friend's travel time with another interactive message.

ETA's bubbles and summaries.

Another app is ChibiStudio, which lets you assemble "chibi" avatars either by yourself or with a friend choosing from various pieces of clothing and body traits.

Collaborating on character creation on iMessage.

When creating a chibi collaboratively, each person can add new details to the character and send an interactive message back. To keep track of progress, the app tells you which items have been added in the title of the message bubble and it collapses previous messages into summaries. I tested ChibiStudio with John, and it was fun.

Do With Me uses collaboration in iMessage effectively, enabling you to create shared todo lists where other people can add and complete items inside a conversation.

John added items to our shared Do With Me list.

I wouldn't use an iMessage todo app as my only task manager, but I think it's useful to have something like Do With Me as an extension of a full task manager to collaborate with others on specific lists (grocery shopping, homework, etc.).

Finally, it wouldn't be a new App Store without a re-interpretation of tic-tac-toe. In xoxo, you'll be able to challenge friends on the classic game with a collaborative iMessage app that uses bubbles and full-screen views to advance the game.

Sometimes, you need a simple iMessage game to kick back.

The app works surprisingly well, with a good use of summaries in the transcript and captions to describe player moves. It's a nice way to pass the time during a conversation.36


Collaborative iMessage apps are only one part of the story. For single-user iMessage apps, the Messages framework should be enough to create deep, custom experiences unlike anything we've seen before.

The Future of iMessage

When the App Store opened for business, no one could imagine the extent of developers' imagination. No one could predict what the iPhone would become by letting app makers write software for it. And looking back at that moment today, it's evident that our devices are deeply different, and dramatically more powerful, because of apps.

Apple can attain a similar result with the iMessage App Store. iMessage apps create a new avenue for developers to bring any kind of experience into millions of daily conversations. And by plugging into iOS and the App Store, Apple can leverage the scale of an ecosystem other messaging services don't have.

After using iMessage apps for the past three months, I have the same feeling as in the early App Store days. It's a new frontier, it's exciting, and developers are just getting started. Compared to companion App Stores like the Watch and Apple TV ones, I think the iMessage App Store will be a hit among iOS users.

Apple needed to modernize iMessage in iOS 10, but they went beyond mere aesthetic and functional improvements to the Messages app. They've opened the door for apps to reimagine what we share and how we share it.

We're on the brink of a fundamental change to iMessage. If Apple plays its cards right, we could be witnessing the foundation of a second app platform within iOS.

Siri

"A delayed game is eventually good, but a rushed game is forever bad", Nintendo's Shigeru Miyamoto once quipped.

Unlike the console Miyamoto was concerned about, modern software and services can always be improved over time, but Apple knows the damage that can be caused by the missteps and perception of a rushed product. With iOS 10's SiriKit, they've taken a different, more prudent route.

Ever since the company debuted its virtual assistant in 2011, it was clear Siri's full potential – the rise of a fourth interface – could only be unlocked by extending it to third-party apps. And yet, as Siri's built-in functionalities grew, a developer SDK remained suspiciously absent from the roster of year-over-year improvements. While others shipped or demoed voice-controlled assistants enriched by app integrations, Siri retained its exclusivity to Apple's sanctioned services.

As top Apple executives recently revealed, however, work on Siri has continued apace behind the scenes, including the rollout of artificial intelligence that cut error rates in half thanks to machine learning. In iOS 10, Apple is confident that the Siri backend is strong and flexible enough to be opened up to third-party developers with extensions. But at the same time, Apple is in no rush to bring support for any kind of app to Siri in this first release, taking a cautious approach with a few limitations.

Developers in iOS 10 can integrate their apps with Siri through SiriKit. The framework has been designed to let Siri handle natural language processing automatically, so developers only need to focus on their extensions and apps.

At a high level, SiriKit understands domains – categories of tasks that can be verbally invoked by the user. In iOS 10, apps can integrate with 7 SiriKit domains37:

  • Audio and video calling: initiate a call or search the user's call history;
  • Messaging: send messages and search a user's message history;
  • Payments: send and request payments;
  • Photos: search photos or play slideshows;
  • Book rides: shared by Maps and Siri. Siri can book a ride or get the status of a booked ride. Maps can also display a list of available rides for an area;
  • Workouts: start, end, and manage workouts;
  • CarPlay: manage vehicle environment by adjusting settings such as climate control, radio stations, defroster, and more.

Inside a domain, Siri deals with intents. An intent is an action that Siri asks an app to perform. It represents user intention and it can have properties to indicate parameters – like the location of a photo or the date a message was received. An app can support multiple intents within the same domain, and it always needs to ask for permission to integrate with Siri.

Siri permissions.

SiriKit is easy to grasp if you visualize it like a Chinese box with a domain, inside of which there are multiple types of actions to be performed, where each can be marked up with properties. In this structure, Apple isn't asking developers to parse natural language for all the expressions a question can be asked with. They're giving developers empty boxes that have to be filled with data in the right places.

Imagine a messaging app that wants to support Siri to let users send messages via voice. Once SiriKit is implemented, a user would need to say something like "Tell Myke I'm going to be late using [app name]", and the message would be composed in Siri, previewed visually or spoken aloud, and then passed to the app to be sent to Myke.

Craig Federighi with an example of WeChat in Siri.

This basic flow of Siri as a language interpreter and middleman between voice and apps is the same for all domains and intents available in SiriKit. Effectively, SiriKit is a framework where app extensions fill the blanks of what Siri understood.

The syntax required by SiriKit simultaneously shows the rigidity and versatility of the framework. To summon an intent from a particular app, users have to say its name. However, thanks to Siri's multilingual capabilities, developers don't have to build support for multiple ways of asking the same question.

You could say "Hey Siri, send a message to Stephen using WhatsApp" or "message Stephen via WhatsApp", but you could also phrase your request differently, asking something like "Check with WhatsApp if I can message Stephen saying I'll be late". You can also turn an app's name into a verb and ask Siri to "WhatsApp Stephen I'm almost home", and SiriKit will take care of understanding what you said so your command can be turned into an intent and passed to WhatsApp.

If multiple apps for the same domain are installed and you don't specify an app's name – let's say you have both Lyft and Uber installed and you say "Hey Siri, get me a ride to the Colosseum" – Siri will ask you to confirm which app you want to use.

Apple has built SiriKit so that users can speak naturally, in dozens of languages, with as much verbosity as they want, while developers only have to care about fulfilling requests with their extensions. Apple refers to this multi-step process as "resolve, confirm, and handle", where Siri itself takes care of most of the work.
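
For the messaging domain, "resolve, confirm, and handle" maps onto the `INSendMessageIntentHandling` protocol. A compressed sketch – resolution is simplified here, and a real extension would also resolve recipients against its own contact list:

```swift
import Intents

// Sketch: an intent handler for INSendMessageIntent, following the
// resolve → confirm → handle flow Siri drives.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Resolve: make sure Siri captured usable message content.
    func resolveContent(forSendMessage intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue()) // Siri will ask the user for the text
        }
    }

    // Confirm: report whether the app is ready to send.
    func confirm(sendMessage intent: INSendMessageIntent,
                 completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .ready, userActivity: nil))
    }

    // Handle: actually send the message, then tell Siri it succeeded.
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // ...hand off to the app's own sending machinery here...
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```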

Developers are given some control over certain aspects of the implementation. From a visual standpoint, they can customize their experiences with an Intents UI extension, which makes a Siri snippet look and feel like the app it comes from.

Customizing Siri extensions is optional, but I'd bet on most developers adopting it as it helps with branding and consistency. Slack, for instance, could customize its Siri snippet with its channel interface, while a workout extension could show the same graphics as the main app. Intents UI extensions aren't interactive (users can't tap on controls inside the customized snippet), but they can be used for updates on an in-progress intent (like an Uber ride or a workout session).

An app might want to make sure what Siri heard is correct. When that's the case, an app can ask Siri to have the user double-check some information with a Yes/No dialog, or provide a list of choices to Siri to make sure it's dealing with the right set of data. By default, Siri will always ask to confirm requesting a ride or sending a payment before the final step.

Other behaviors might need authentication from the user. Apps can restrict and increase security of their SiriKit extensions (such as when a device is locked) and request Touch ID verification. I'd imagine that a messaging app might allow sending messages via Siri from the Lock screen (the default behavior of the messaging intent), but restrict searching a user's message history to Touch ID or the passcode.

Last, while Siri takes care of natural language processing out of the box, apps can offer vocabularies with specific terms to aid recognition of requests. A Siri-enabled app can provide user words, which are specific to a single user and include contact names (when not managed by Contacts), contact groups, photo tag and album names, workout names, and vehicle names for CarPlay; or, it can offer a global vocabulary, which is common to all users of an app and indicates workout names and ride options. For example, if Uber and Google Photos integrate with SiriKit, this means you'll be able to ask "Show me photos from the Italy Trip 2016 album in Google Photos" or "Get me an Uber Black to SFO".
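
User-specific terms are registered at runtime through the `INVocabulary` API; global vocabularies ship separately in a plist inside the app bundle. A sketch with invented workout names:

```swift
import Intents

// Sketch: registering user-specific terms so Siri can recognize them in
// spoken requests. The workout names are invented examples; a global
// vocabulary would instead be supplied via AppIntentVocabulary.plist.
func registerWorkoutNames() {
    let names: NSOrderedSet = ["beach run", "leg day", "7-minute blast"]
    INVocabulary.shared().setVocabularyStrings(names, of: .workoutActivityName)
}
```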

SiriKit has the potential to bring a completely new layer of interaction to apps. On paper, it's what we've always wanted from a Siri API: a way for developers to expose their app's features to conversational requests without having to build a semantic engine. Precise but flexible, inherently elegant in its constraints, and customizable with native UIs and app vocabularies. SiriKit has it all.

The problem with SiriKit today is that it's too limited. The 7 domains supported at launch are skewed towards the types of apps large companies offer on iOS. It's great that SiriKit will allow Facebook, Uber, WeChat, Square, and others to build new voice experiences, but Apple is leaving out obvious categories of apps that would benefit from it as well. Note-taking, media playback, social networking, task management, calendar event creation, weather forecasts – the nature of these apps precludes integration with SiriKit. We can only hope that Apple will continue to open up more domains in future iterations of iOS.

For this reason, SiriKit might as well be considered a public beta for now: it covers a fraction of what users do on their iPhones and iPads. I've only been able to test one app with SiriKit integration over the past few days – an upcoming update to Airmail. Bloop's powerful email client will include a SiriKit extension for the messaging domain (even if email isn't strictly "messaging") to let you send email messages to people in your Airmail contact list.

SiriKit and Airmail with different levels of verbosity.

In using Airmail and Siri together, I noticed how SiriKit took care of parsing natural language and multiple ways to phrase the same request. The "resolve, confirm, and handle" flow was exemplified by the steps required to confirm pieces of data required by Siri – in Airmail's case, the recipient's email address and message text.

Multiple steps in SiriKit.

As for other domains, I can't comment on the practical gains of SiriKit yet, but I feel like messaging and VoIP apps will turn out to be popular options among iPhone users.

I want to give Apple some credit. Conversational interactions are extremely difficult to get right. Unlike interface elements that can only be tapped in a limited number of ways, each language supported by Siri has a multitude of possible combinations for each sentence. Offloading language recognition to Siri and letting developers focus on the client side seems like the best solution for the iOS ecosystem.

We're in the early days of SiriKit. Unlike Split View or notifications, it's not immediately clear if and how this technology will change how we interact with apps. But what's evident is that Apple has been laying SiriKit's foundation for quite some time now. From the pursuit of more accurate language understanding through AI to the extensibility framework and NSUserActivity38, we can trace back SiriKit's origins to problems and solutions Apple has been working on for years.

Unsurprisingly, Apple is playing the long game: standardizing the richness of the App Store in domains will take years and a lot of patient, iterative work. It's not the kind of effort that is usually appreciated by the tech press, but it'll be essential to strike a balance between natural conversations and consistent behavior of app extensions.

Apple isn't rushing SiriKit. Hopefully, that will turn out to be a good choice.

Safari

Among various minor enhancements, there's one notable addition to Safari in iOS 10 that points at the possible direction of many iPad apps going forward. Effectively, it's the most important iPad-only feature this year.

Safari for iPad now supports in-app split view to open two webpages at once in landscape mode. Apple named this "Safari split view", but it's not related to the namesake system-wide multitasking mode. Opening two webpages in Safari doesn't engage the Slide Over app switcher.

There are multiple ways to invoke Safari split view. You can tap and hold on the tabs icon in the top toolbar and choose 'Open Split View'. This causes Safari to create two views and bring up the Favorites grid on the right side.

Hold the button to show the menu...

...and enter split view.

You can also tap & hold on a link in a webpage, hit 'Open in Split View', and the destination page will load on the right. If split view is already active, holding a link on the right side will offer a similar 'Open on Other Side' option.

If you'd rather tap once to open a webpage in split view, you can perform a two-finger tap on a link to either activate split view (if you're in full-screen) or open a new tab on the other side.

Last, you can drag a tab out of the toolbar and take it to the other side (either left or right). If split view isn't already enabled, the tab will morph into a small preview of the webpage as Safari resizes inwards, showing a gray area that indicates you can drop the page to open it in split view.

It's a polished, fun animation, which also works the other way around to put a tab back on the left and close split view.39

In addition to drag & drop, you can tap and hold the tabs button to merge all tabs in one screen and close split view. Because Safari for iOS 10 supports opening unlimited tabs (both on the iPhone and iPad), this menu also contains an option to close all tabs at once – one of my favorite tweaks in iOS 10.

Close all tabs at once.

Safari split view is aware of iOS' system-wide Split View. If Safari is in split view and you bring in a second app to use alongside the browser, Safari's split view is automatically dismissed by merging all tabs. When you close Split View and go back to Safari in full-screen, the browser's split view resumes where it left off.

There's nothing surprising about the look of Safari split view: using Size Classes (we meet again, old friend), Safari creates two instances of the same view, each independent from the other and carrying the same controls.40

I've long wished for the ability to view and interact with multiple Safari tabs at once on my iPad Pro. Before iOS 10, developers who recognized this gap in Safari's functionality were able to sidestep Apple's limitations with clever uses of Safari View Controller. The new Safari on the iPad obviates the need for those third-party apps with a native solution. The feature is particularly effective on the 12.9-inch iPad Pro, where you can view two full webpages instead of smaller versions scaled to fit. It's the same feeling of upgrading to Split View on the 12.9-inch iPad Pro from the 9.7-inch model.

The 9.7-inch iPad Pro, of course, shows less content than the 12.9-inch model (left). (Tap for full size)

After incorporating Safari split view in my workflow, I wish every document-based iPad app offered a way to split the interface in two panes.

Safari split view is a brilliant showcase of drag & drop to move content across multiple views, too.

Lack of a proper drag & drop framework for iPad apps, especially after the introduction of Split View in iOS 9, is baffling at this point. Multitouch and Split View are uniquely suited to breathe new life into the decade-old concept of drag & drop – just look at macOS and how well the system works even without multitouch. Drag & drop would make more sense on iOS than it ever made on the desktop by virtue of direct content manipulation.

Safari's drag & drop tab behavior is, hopefully, showing a glimpse of the future we deserve. A system-wide drag & drop framework is going to be trickier to pull off than a single browser tab41, but we can keep the dream alive.

More Changes

There are other smaller changes in iOS 10's Safari.

The parsing engine of Safari Reader – Apple's tool to increase the readability of webpages by stripping them of interface elements and ads – has been updated to support display of bylines, publication dates, and article subheads. The extraction of these bits of metadata isn't perfect42, but it's a step up from the previous version.

When Apple introduced Safari View Controller last year, they were adamant about its appearance: because the experience had to be consistent with Safari, developers couldn't modify or style the new in-app web view with their own UI. Third-party apps could set a tint color for the toolbar icons of Safari View Controller to make them match their colors (something we've seen implemented in apps like Overcast and NewsBlur), but that was as far as customization went.

A customized Safari View Controller in Tweetbot for iOS 10, matching the dark theme.

Apple is letting developers customize Safari View Controller in iOS 10 with a tint color for its bar backgrounds. In addition to color tinting for UI controls, the color of the entire toolbar can be set to something other than white. This should make the experience within apps more cohesive and the transition between app and web view less jarring.
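
Both appearance hooks live on `SFSafariViewController` itself. A sketch of presenting a dark-themed web view; the specific colors are invented examples:

```swift
import SafariServices
import UIKit

// Sketch: presenting Safari View Controller with a custom bar background
// and control tint (both new in iOS 10). The colors are invented examples.
func presentWebView(for url: URL, from viewController: UIViewController) {
    let safari = SFSafariViewController(url: url)
    safari.preferredBarTintColor = UIColor(white: 0.1, alpha: 1.0) // dark toolbar
    safari.preferredControlTintColor = .cyan                        // button tint
    viewController.present(safari, animated: true, completion: nil)
}
```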

Speaking of Safari View Controller: Find in Page is now supported in app web views as an action extension.

When hitting Command-T on iOS 10, a new tab opens with the cursor placed in the address bar, ready to start typing. External keyboard users rejoice.

Downloads, a longtime Safari issue, haven't been exactly "fixed" in iOS 10[^43], but Apple has found ways to circumvent old annoyances. First, hitting a link to a file download (such as a .zip file) now displays proper download progress in the address bar. Then, when the download is complete, the file can be saved to iCloud Drive with iOS 10's new Add to iCloud Drive extension.

Saving a downloaded file from Safari to iCloud Drive is now possible with an extension.

We still haven't reached the point where Safari automatically downloads files into an iCloud Drive folder, but the idea doesn't seem so far-fetched anymore.

Another limitation of Safari that has been fixed in iOS 10 is video playback. Thanks to the new playsinline attribute (the unprefixed successor to webkit-playsinline), web developers can specify videos that play inline on the iPhone without opening the full-screen player.

Minimize and expand.

Even if the attribute isn't specified, playback will commence in full-screen, but users can pinch closed on the video (or tap a resize button) to keep playing it inline. Being able to shrink videos down makes the iPhone's browsing experience more pleasant.

Furthermore, Safari in iOS 10 brings support for automatic video playback of videos without audio tracks on page load (you may have seen such videos in this review). The change, outlined on the WebKit blog earlier this year, was motivated by the rising popularity of animated GIFs. As WebKit engineers noted, the GIF format itself can be computationally intensive and it's not energy-efficient – part of the reason why online GIF providers have embraced the <video> element with disabled audio tracks to replace GIFs. This change should also help websites that use muted videos as animated backgrounds, which will display correctly on devices running iOS 10.

Speaking of websites, Safari on iOS 10 enables pinch to zoom by default on all sites – even those that have specifically disabled zooming through meta tags. From an accessibility standpoint, I can only applaud Apple's decision.
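Concretely, both of these WebKit-level changes come down to page markup. A minimal sketch – the attribute names follow the WebKit blog's documented video policies, but treat the exact combination here as illustrative:

```html
<!-- Plays inline on iPhone instead of opening the full-screen player;
     muted + autoplay lets it start on page load, GIF-style -->
<video src="clip.mp4" playsinline autoplay muted loop></video>

<!-- iOS 10's Safari now lets users pinch to zoom even when a page
     tries to disable it with a viewport meta tag like this one -->
<meta name="viewport" content="width=device-width, user-scalable=no">
```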

Moving onto other features, you can search your Favorites and Reading List items by swiping down to reveal a search bar. Reading List doesn't support full-text search, so you'll only be able to search titles and not inside the text of a saved article.

Finally, smarter AutoFill. While iOS 9 could suggest passwords and emails when attempting to fill web forms, iOS 10 takes it a step further and replaces the Passwords button above the keyboard with AutoFill Contact. The new dialog offers multiple options for your own contact card (such as Work and Personal email addresses) with the ability to customize your card's AutoFill without leaving Safari.

Customizing AutoFill.

For the first time, you can also auto-fill any other contact on a webpage by hitting 'Other Contact...' and picking an entry from your address book (other contacts can be customized before auto-filling, too).

Apple is taking advantage of QuickType suggestions to speed up AutoFill: if you don't want to use the AutoFill interface, QuickType can suggest names, multiple email addresses, phone numbers, and other information from a contact's card through predictive shortcuts.

The deeper integration of contacts and AutoFill makes it easier to sign up for web services without having to type details. It's another argument in favor of Safari View Controller for apps: developers of apps with an account component will get more flexible AutoFill features for free if they implement Apple's web view in their signup flows. I know I wouldn't want to type contact information (or use an extension) after testing the convenience of AutoFill in iOS 10.

Even without new headline features like Safari View Controller and Content Blockers, Safari remains Apple's most mature and powerful iOS app. This year's refinements are well thought-out and split view is a boon for web multitasking on the iPad. I have no complaints.

Apple Music

Of all system apps, Apple Music is the one that received the most dramatic makeover in iOS 10.[^44]

With a redesign aimed at placating concerns about an overly complex interface, and with new algorithmic discovery features, iOS 10's Apple Music is streamlined and more powerful. Beneath a veil of astonishing simplification, the new Apple Music packs significant enhancements to the discovery and listening experience.

It's Dangerous to Go Alone

If you had to list the shortcomings of Apple Music since its debut in iOS 8.4, a hodgepodge of features tacked onto a crumbling pre-streaming foundation would easily sit at #1. With iOS 10, Apple wants its music streaming service to be more accessible and intuitive for everyone who's moved past iTunes.

Part of this effort has resulted in a modern design language that does away with most iOS UI conventions in favor of big, bold headlines, larger buttons, and a conspicuous decrease of transparency in the interface. If you're coming from Apple Music on iOS 9 and remember the information-dense layout and translucent Now Playing screen, prepare to be shocked by iOS 10's rethinking of the Music app.

While the bold design has intriguing consequences for the overall consistency of iOS' visuals, its impact on usability is just as remarkable. The new Apple Music is no longer following the interaction paradigms of the iTunes Store and App Store: there's no front page highlighting dozens of top songs and curated recommendations. Instead, it's been replaced by a simplified Browse page with a handful of scrollable featured items at the top and links to explore new music, curated playlists, top charts, and genres.

Removing new releases and charts from the Browse page helped Apple accomplish two goals: give each section more room to breathe; and highlight the most important items (singles, albums, videos, etc.) with a big cover photo that can't be missed.

Being able to discern items with an effective sense of place is the underlying theme of navigation in the new Apple Music. On the one hand, information density has suffered and Apple Music can't show as much content on screen as it used to. On the other, bold headlines, fewer items per page, and larger controls should prevent users (who aren't necessarily well versed in the intricacies of the iTunes Store UI, upon which the old Apple Music was based) from feeling overwhelmed. Apple's new design wants to guide users through Apple Music's vast catalogue, and it mostly succeeds.

Apple Music's For You page on the iPad.

This is evident in the Library page, where Apple has switched from a hidden View menu to a set of vertical buttons displayed underneath the "title bar". If you were confused by the taps needed to browse by Artist or view downloaded music in iOS 9, fear no more – Apple has created customizable buttons for you this time.

Big, customizable buttons.

The same philosophy is shared by every other screen in Apple Music. Radio, search, even For You recommendations – they've all moved on from the contortions of their iTunes-like predecessors to embrace clarity and simplicity through white space, large artworks, and big buttons.

Another prominent change from iOS 9 is the removal of translucent panes of glass in the interface. Transparency effects look great in screenshots, but unpredictable color combinations don't make a good case for legibility and consistency.

Apple is ditching translucency in the Now Playing screen altogether. In iOS 10, they've opted for a plain black-and-white design that lays out every element clearly and keeps text readable at all times.

Legible text, large album artwork.

It's not as fancy as iOS 9, but it's also not as over-engineered for the sake of beauty. Album artwork stands out against a white background; buttons are big and tappable; there's even a nice, Material-esque effect when pausing and resuming a song. Alas, the ability to love a song with a single tap from the Now Playing screen is gone.

The new Apple Music is equal parts appearance and substance. The bottom playback widget[^45] has been enlarged so it's easier to tap, and it also supports 3D Touch.[^46] Pressing on it reveals a redesigned contextual menu with options for queue management, saving a song, sharing, liking (and, for the first time, disliking), and, finally, lyrics.

Apple Music integrates officially licensed lyrics in the listening experience.[^43] Apple has struck deals with rightsholders to make this happen; even though lyrics aren't available for every song on Apple Music yet, coverage seemed to grow during the beta period this summer, and I expect more lyrics to be added regularly for new and old releases.

When lyrics are available, you'll see an additional button in the contextual menu as well as a new section when swiping up on the Now Playing screen. This is where shuffle, repeat, and the Up Next queue live now; I wish Apple had done a better job of hinting that this space exists, because there's no indication that you can swipe up to reveal more options.

Swipe up to reveal repeat, shuffle, Up Next, and lyrics.

Unlike Musixmatch, Apple's lyrics don't follow a song in real time. They're either displayed in a full-screen popup (if opened from the contextual menu) or above Up Next in plain text.

Lyrics are modal on the iPad when opened from the contextual menu.

That's not a deal-breaker, though. As a lyrics aficionado (I've also learned English through the lyrics of my favorite songs), I find this a fantastic differentiator from other streaming services. Not having to Google the lyrics of what I'm listening to, only to be taken to websites riddled with ads and often incorrect transcriptions? I'm not exaggerating when I say that this feature alone might push me to use Apple Music every day again. I was hoping Apple would eventually bring native lyrics to Apple Music, and they delivered with iOS 10.[^47]

Another functional improvement to the Now Playing screen is a built-in menu to control audio output. Taking a cue from Spotify, Apple Music sports a button in the middle of the bottom row of icons to switch between the iPhone's speaker, wired and Bluetooth headphones, and external speakers with just a couple of taps.[^48]

You don't have to open Bluetooth or AirPlay settings to stream music to different devices. This was probably built with the iPhone 7 and AirPods in mind, but it's a feature that makes managing audio faster for everyone.

Available in Settings > Music, a new Optimize Storage option lets iOS automatically remove music you haven't played in a while from your device. Unlike the similar setting for iCloud Photo Library, you can control the minimum number of songs you want to keep downloaded on your device with four options:

  • 4 GB (800 songs)
  • 8 GB (1600 songs)
  • 16 GB (3200 songs)
  • 32 GB (6400 songs)

If you have an older iOS device with limited storage, this should be a useful compromise between available space and offline songs.
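Notably, all four tiers imply the same average footprint of roughly 5 MB per song. A quick back-of-the-envelope check – this is plain arithmetic on Apple's published tiers, not an official figure:

```python
# Apple's Optimize Storage tiers map a storage cap to a song count.
tiers = {4: 800, 8: 1600, 16: 3200, 32: 6400}  # GB -> songs kept offline

for gb, songs in tiers.items():
    mb_per_song = gb * 1024 / songs  # implied average size per song
    print(f"{gb} GB keeps {songs} songs (~{mb_per_song:.2f} MB each)")
# every tier works out to ~5.12 MB per song
```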

Music on iPad

Apple Music for iPad doesn't diverge much from its iPhone counterpart, but there are a few differences worth noting.

The Browse page collects featured items, hot tracks, new albums, playlists, and more within a single view, with buttons to explore individual sections placed in a popover at the top.

Instead of taking over the app in full-screen, playing a song opens a Now Playing sidebar. The view is launched from a tab bar split between icons and the playback widget.

The sidebar feels like having Slide Over within Music: it doesn't behave like Safari or Mail's in-app split view, where you can interact with two views at the same time; instead, it's modal and overlaid on top of content.

It's not immediately clear why Apple didn't stick to a full-screen Now Playing view on the iPad if the sidebar still prevents interactions on the other side. Perhaps they realized giant-sized album artwork didn't make sense on the iPad Pro? Maybe the vertical layout lends itself better to peeking at Up Next and lyrics below playback controls? The sidebar is fine, but I'd rather have a real in-app split view in Music too.

The Split View that Music does have on iOS 10 is the system-wide multitasking one. On the 12.9-inch iPad Pro, Now Playing is a sidebar in both Split View layouts when Music is the primary app, but it turns into a full-screen view when Music is the secondary app in Slide Over.

New Discovery

iOS 10 brings discovery features that pit Apple Music against Spotify's algorithmic playlists and personalized curation.

Apple Music's For You page features two personalized playlists in a carousel at the top – My New Music Mix and My Favorites Mix. Both are automatically refreshed every week and are personalized for each user based on their listening habits and favorite songs.

My New Music Mix, refreshed every Friday, showcases new music Apple Music thinks you'll like; My Favorites Mix is a collection of hit singles, deep cuts, and songs related to your favorite artists that is refreshed every Wednesday.

The idea of a personalized mixtape refreshed on a weekly basis isn't new. Spotify was a pioneer in this field with their excellent Discover Weekly, which recently expanded to Release Radar. Spotify's system is powered by a neural network (its inner workings are utterly fascinating) and, as I previously wrote, it delivers impressive personalized picks that almost feel like another person made a mixtape for you.

It's too early to judge Apple's efforts with personalized playlists in iOS 10. They only rolled out two weeks ago, and, in my experience, such functionalities are best evaluated over a longer span of time after judicious listening and "loving" of songs.

My impression, however, is that Apple has succeeded at launching two great ways to discover new music and re-discover old gems every week. My first two My Favorites Mix playlists have been on point, collecting songs (both hits and lesser-known ones) from artists I knew and liked. Apple Music's first two My New Music Mix playlists weren't as accurate as Spotify's Release Radar, but, to be fair, I have been religiously using Spotify for over 9 months now, whereas I just came back to Apple Music. Accuracy may still be skewed in Spotify's favor given my listening history.

Still, we don't need to wait to argue that algorithmically-generated playlists refreshed weekly are a fantastic addition to Apple Music. As I noted in my story on Spotify's Discover Weekly earlier this year, human curation is inherently limited. Apple has been at the forefront of human-made playlists, but it was missing the smart curation features of Spotify. Apple's two personalized mixes seem more – pardon the use of the term – mainstream than Discover Weekly, but that isn't a downside. Easing people into the idea of personalized playlists made by algorithms and then launching more specific types focused on music aficionados might be a better consumer approach than Spotify's. I'd wager Apple is considering a product similar to Spotify's Discover Weekly – a playlist that highlights back-catalogue songs you might like from artists you're not familiar with.

My New Music Mix and My Favorites Mix already seem very good, and they show that Apple can compete with Spotify when it comes to personalized music curation. As with other algorithmic features launched in iOS 10, Apple's debut is surprisingly capable and well-reasoned.

There are other changes in the For You section. Connect, already an afterthought in iOS 9, has been demoted from standalone view to a sub-section of For You.

Those links don't even open in Apple Music.

Some people must be using Connect (who's leaving those comments?), but I just don't see the incentive for artists to post on it and for users to subscribe. Apple doesn't seem to care about it as a social network, and everyone is better off engaging with fans and following artists on Twitter and Facebook. Unless Apple gives it another try, I don't think Connect can suddenly gain relevancy. Rolling Connect into For You feels like Apple's version of sending it to live on a farm upstate.

Playlists and sections recommended in For You have been redesigned and shuffled around. Every section can now be scrolled horizontally to reveal more content; Recently Played and Heavy Rotation tend to float towards the top for easier access; and there's the usual mix of artist spotlights, essentials (they're not called "Intro To" anymore), human-curated playlists, and a new section called New Releases For You.

The refreshed For You in iOS 10.

If you liked Apple's For You section before, you won't be disappointed by iOS 10's refresh. But I believe My New Music Mix and My Favorites Mix will steal the show for many.

I'm still not sure if I want to give Apple Music another try – I've been truly satisfied with Spotify since I moved back to it in January. Apple's updates in iOS 10 are compelling, though. I got used to the "big and bold" design quickly, and I find it easier to parse and more fun than Spotify's boring black and green. Apple Music may sport lower information density, but, at least for me, it's easier to use than Spotify. Personalized playlists are solid already, and I've been keeping My Favorites Mix synced offline for fast access to a collection of songs I know I'm going to like. And then there's lyrics, which is nothing short of a game changer for me.

Apple's firing on all cylinders against Spotify and others in the music streaming industry. It might be time to take Apple Music for a spin again.

Maps

Without new exploration and location editing modes (transit launched in September 2015, and it's slowly rolling out to more cities; crowdsourced POI collection is still a no-go), Apple is making design and third-party apps the focal points of Maps in iOS 10.

Maps' new look removes UI chrome and enhances usability on large iPhones through lowered controls, intuitive buttons, and more proactive suggestions. There are floating buttons to find your position and open Maps' settings. Apple has gotten rid of the search bar at the top and replaced it with a card at the bottom (a floating "sidebar" on the iPad). You can swipe up the card to reveal suggestions below the search field.

The sense is that Apple wanted to ship a smarter, more conversational search feature, which now offers proactive place suggestions. Instead of a handful of recent addresses, Maps now curates a richer list of locations based on recently viewed and marked places, favorites, places you've been to, and addresses you probably want to go next based on proactive iOS features.

A webpage with an address I was viewing in Safari, proactively suggested by Maps.

Each suggestion is associated with a relevant icon, so they're prettier and easier to identify. You can even swipe on them to remove them from the list or share them with other apps.

Colorful business and landmark icons are used in search results, which are livelier than in iOS 9 and include more Nearby categories. In selected regions, Nearby results can be filtered with sub-categories in a scrollable bar at the bottom of the screen.

Iconography has always been one of the strong suits of Apple Maps, and the company is doubling down on it with iOS 10. Previously, when searching for places that pertained to a specific category such as Restaurants, Maps would drop generic red pins on the map, requiring you to tap on them to open a first popup, then tap again to open a detail view with information about the place. It was a slow, unattractive process that hid useful details from the first step of search results.

iOS 10 improves upon this in two ways. Instead of red pins, multiple search results are dropped on the map with more descriptive pins that suggest what a result is before tapping it. In the restaurant example, you'll end up with icons that contain pizza slices, hamburgers, or a fork and knife, for instance. If two results are close to each other on the current zoom level, they'll be grouped in a numeric orange pin that you can tap to choose one result.

Second, choosing a result uses the iPhone's search panel as a split view to display business information and the map at the same time. As you tap through results, you can preview place details with a card UI at the bottom that shows ratings, distance, and a button to start directions.

The interaction is similar on the iPad. Instead of managing result cards on the vertical axis, they're overlaid horizontally in a sidebar on the left.

By combining these elements with cards that are more comfortable to reach, iOS 10's Maps feels like it's been optimized for humans and nimble exploration. By comparison, the old Maps feels static and arbitrary.

The same philosophy has been brought to navigation. In iOS 10, you can pan freely on the map and re-center navigation with a button.

You can pan around during navigation in iOS 10.

Details for the current trip, such as estimated arrival time and distance, are displayed in a bottom card, which, like results, can be swiped up to access more options. These include audio settings, turn-by-turn details, an overview, and, for the first time, en-route suggestions for places you might want to stop by, like gas stations or coffee shops.

More card-like UIs in Maps' navigation. (Tap for full size)

After selecting a category of suggestions during navigation, Maps will return a list of nearby results and tell you how many minutes each will add to your trip. Select one, confirm that you want to stop by, and Maps will update directions for the new destination. When you're done, you can resume your route to the first destination with a blue banner at the top.

Apple is also going to let developers plug into Maps with extensions. If an app offers ride booking, restaurant reservations, and "other location-related services", it can embed its functionalities in Maps.

Maps extensions, like SiriKit's, are based on intents, and developers can provide custom interfaces with an Intents UI extension. The same extensions that allow users to hail an Uber and track its status with Siri can be used from Maps to get a ride to a selected place.[^49] Maps extensions contained inside iOS apps are disabled by default; they have to be activated from Settings > Maps > Extensions.

OpenTable's Maps extension.

The only Maps extension I've been able to test so far – earlier this week – is OpenTable's, which has limited integration in Rome for a few restaurants. Once enabled, OpenTable's extension adds a button to view more information about a restaurant and make a reservation. You can set the table size, pick available times, and enter special requests in Maps. OpenTable will ask you to continue in the main app to confirm a reservation, but it's nice to have a way to quickly check times and availability without leaving Maps.

I'm curious to see how ride-sharing and other location-based services available in Italy will implement Maps extensions.


The quality of Apple Maps data for my area still isn't comparable to Google Maps. Apple Maps has improved since iOS 6, but I still wouldn't trust it to guide me through a sketchy neighborhood in Rome at night. At the same time, I prefer the design of Apple Maps and its many thoughtful touches to Google Maps. From my perspective, Apple has created a more intuitive, better designed app without the data and intelligence of Google. It's an odd predicament to be in: while I appreciate Apple Maps' look in iOS 10, I also want navigation to be reliable and trustworthy.

There's a lot to like in Maps for iOS 10 and great potential for developers to elevate location-driven apps to a more contextual experience. The revised interface imbued with proactive suggestions is a step forward from iOS 9; the richer presentation of results makes Maps friendlier and informative. Maps in iOS 10 feels like someone at Apple finally sat down and tried to understand how regular people want to use maps on a phone. The redesign is outstanding.

Apple has perfected Maps' interface and interactions, and now they have a developer platform, too. An underlying problem remains: when it comes to data accuracy, your mileage with Apple Maps may vary.

Home

Apple's home automation framework, HomeKit, is ready for prime time in iOS 10. In addition to a dedicated page of shortcuts in Control Center, HomeKit is getting a native app for accessory management. It's also expanding to new types of accessories, including cameras.

Just as iCloud Drive graduated to an app after a framework-only debut, HomeKit gets a dedicated Home app in iOS 10 where all your accessories can be accessed. You won't find the complexity of advanced tools such as Matthias Hochgatterer's unfortunately-named Home in Apple's take. Instead, Apple's Home app will greet you with the same bold look of Apple Music and News.

Customizable edge-to-edge photo backgrounds and large buttons command the interface.

Home works with any HomeKit accessories previously set up on iOS 9. One of the biggest flaws of the old HomeKit implementation – the inability to set up new accessories without an app from the vendor – has been fixed with iOS 10's Home app, which offers a complete setup flow from start to finish.

Rooms are a section of the app, while your favorite accessories and scenes are highlighted in the main Home dashboard. They're the same shortcuts used in Control Center.

Long-tapping an accessory in a room to open its detail screen.

Apple offers a collection of suggested scenes to get started – such as "Good morning" or "I'm home" – but you'll want to create your own scenes, choosing from custom icons[^50] and any accessory action you want.

Most users will only use Home for the initial accessory/scene configuration and to add favorites in Control Center, but there are hidden tricks in the app that are worth exploring (and, like Apple Music, concerning from a discoverability perspective).

You can find a summary of average conditions and statuses – such as humidity, temperature, and door lock status – at the top of the Home page. Tap Details for an accessory overview.

Your home's wallpaper can be modified by tapping the location icon in the top left and choosing a new one. You can do the same for rooms: after picking a room, tap the list icon in the top left, open Room Settings, and assign a new wallpaper.

Custom wallpapers for multiple rooms are a nice touch: they make the Home app look like your home, but I wish they synced with iCloud.

Some of the app's features are too hidden. To navigate between rooms, you can tap the menu at the top, but you can also swipe between rooms. There's no visual cue to indicate that multiple rooms live on the same horizontal pane. The design language shared by Apple Music and Apple News means both apps have this feature discoverability issue in common.

Similarly, buttons can be pressed with 3D Touch or long-tapped to open a modal view with intensity levels and settings for colors and more.

Color options for lights and group settings.

There's no way of knowing that more functionality lies beyond these "special taps". And that's too bad, because this view lets you manage useful options such as accessory grouping[^51] and bridge configuration.[^52]

A front-end HomeKit interface has allowed Apple to bring deeper management features to iOS. First up, sharing: if you want to add members to your home, you can invite other people and give them administrative access to accessories. You can allow editing on a per-user basis, and you can also choose to let them control accessories while inside the house or remotely.

Sharing with HomeKit.

This ties into the Home app's second advanced feature – home hubs. What used to be an opaque, poorly documented option in iOS 9 is now a settings screen: your Apple TV or an iPad can act as a HomeKit hub when you're not at home. As long as the device is plugged into power and connected to Wi-Fi, it serves as a bridge between a remote device and your accessories at home, with no additional configuration required.

Remote control comes in handy when you consider HomeKit's deep integration with iOS in Siri and Control Center. In my tests, I was able to turn on my espresso machine remotely when I was driving home just by talking to Siri. Control Center's Home page works with remote control: I can turn off my lights with one swipe, or I can check the status of my door anywhere on iOS.[^53]

There's also automation. Third-party HomeKit management apps have long offered ways to set up rules and triggers to automate accessories and scenes based on specific conditions. iOS 10's Home app brings a simpler interface to have accessories react to changes at home in four different ways:

  • Your location changes;
  • A time of day occurs;
  • An accessory is controlled;
  • A sensor detects something.

When creating a new automation, you won't be presented with an intimidating workflow UI. Apple has nicely separated the individual steps behind an automation: first you'll choose the accessory or trigger that will start an automation, then you'll be shown a handful of options. If you want to turn off your lights when the door closes, for instance, you first choose from Door: Open/Closed then move onto selecting scenes or lights.

There's no complicated language to learn for automation. (Tap for full size)

I set up some automation rules in the Home app a couple of months ago, and they've been running smoothly since. Every day at 5 AM, lights in my bedroom and kitchen are turned off because I've likely gone to sleep by then. In another automation, my bedroom light turns red if the humidity level rises over 60%.
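Conceptually, each of these automations is just a trigger condition paired with an accessory action. Here's a minimal sketch of my humidity rule – the Light class and humidity_rule function are hypothetical illustrations of the trigger/action model, not HomeKit's actual API:

```python
# Hypothetical model of a Home app automation: a sensor trigger
# ("a sensor detects something") paired with an accessory action.

class Light:
    def __init__(self, color="white"):
        self.color = color

def humidity_rule(light, humidity_percent, threshold=60):
    """Turn the light red when humidity rises over the threshold."""
    if humidity_percent > threshold:
        light.color = "red"

bedroom = Light()
humidity_rule(bedroom, 72)   # sensor reads 72% humidity
print(bedroom.color)         # prints "red": 72 > 60 triggered the action
```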

In the future, I'd like to see the ability to create nested automations with support for presence recognition. Currently, I can't tell the Home app to send me a notification if the main door opens and I'm not at home, or to turn off the lights if it's after sunset and nobody's home.

Last, HomeKit is expanding to new types of accessories. With iOS 10, third-party manufacturers can create:

  • Air accessories: conditioners, heaters, purifiers, and humidifiers.
  • Cameras: they can display live video streams and still images. HomeKit can control settings such as night vision, tilt, zoom, rotation, and mirroring, as well as built-in speaker and microphone preferences.
  • Doorbells: both standard and camera-equipped doorbells are supported. These devices generate an event once the doorbell is pressed, sending a notification to HomeKit. In the case of doorbells with a built-in camera, iOS 10's rich notifications can display a live video stream without opening an app, and they can embed an intercom button to start a two-way conversation with the person outside.

Cameras and doorbells were two highly requested enhancements to HomeKit. Third-party HomeKit cameras aren't available on the market yet – which is unfortunate, as I couldn't test them for this review – but I plan on buying one as soon as possible.


Apple's plan for the connected home is coming together in iOS 10. Platform fragmentation has been a fundamental problem of third-party smart home devices and hubs: we've all heard the tales of devices being unable to talk to each other, being discontinued after a couple of years, or having to support external APIs to bring some communication into the mix.

With HomeKit, Apple's closed and slower approach is paying off in consistency, durability, and integration with the OS. The Elgato sensors I bought nearly two years ago have worked perfectly with iOS 10 since the first beta. I don't have to worry about companies supporting IFTTT, Wink, or other protocols as long as they work with HomeKit.

In Apple's ecosystem, I can always extend my setup. When you consider extra functionalities such as rich notifications, Siri, remote hubs, and Control Center, it's clear that home automation is best experienced as a tightly integrated extension of our smartphones.

I want to believe that the rollout of HomeKit accessories will continue at a steady pace with a Home app front and center in iOS 10. Even if that's going to be a problem for my wallet.

Apps

As is often the case with new versions of iOS, Apple added a variety of improvements to its suite of apps – some of which, for the first time, can also be deleted from the Home screen.

Mail

On the 12.9-inch iPad Pro, Mail has received a three-panel mode that shows a mailbox sidebar next to the inbox and message content in landscape.

Three-panel view on the iPad Pro.

This extended mode is optional; it can be disabled by tapping a button in the top left of the title bar. If you were wondering why iPad apps couldn't show more content like on a Mac, this is Apple's answer. It's the right move, and I'd like it to propagate to more apps.

Conversation threading has also been updated in iOS 10 to resemble macOS' conversation view.

In iOS 10, messages in a thread are shown as scrollable cards. Each message can be swiped to bring up actions, and it can be opened full-screen by tapping its header (or 'See More' at the bottom).

You can control the appearance of conversation view in Settings > Mail (Contacts and Calendars have received their own separate setting screens, too). Mail lets you complete threads (load all messages from a thread even if they've been moved to other mailboxes) and display recent messages on top. Conversation view makes it easier to follow replies without having to squint at quoted text. It's nicer and more readable; I wish more third-party email clients had this feature.

This willingness to make Mail more desktop-like doesn't apply to smart folders, which are still nowhere to be found on iOS. Instead, Apple hopes that filters will help you sift through email overload.

Filtering an inbox.

Filters can be enabled with the icon at the bottom left of the inbox. You can customize them by tapping 'Filtered By' next to the icon. Filters include accounts, unread and flagged messages, messages that are addressed to you or where you're CC'd, and only mail with attachments or from VIPs.

Filters aren't a replacement for smart folders' automatic filing, but they can still provide a useful way to cut down a busy inbox to just the most important messages. I wish it was possible to create custom filters, or that Apple added more of them, such as a filter for Today or the ability to include messages from a specific address (without marking it as VIP).
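The filtering model described above is simple conjunctive matching – every enabled criterion must hold. Here's a rough sketch in Python (an illustrative model, not Apple's implementation):

```python
def filter_inbox(messages, unread=False, flagged=False,
                 with_attachments=False, from_vip=False):
    """Keep only messages that match every enabled criterion,
    mirroring how Mail's filters narrow down a busy inbox."""
    def keep(m):
        if unread and not m.get("unread"):
            return False
        if flagged and not m.get("flagged"):
            return False
        if with_attachments and not m.get("attachments"):
            return False
        if from_vip and not m.get("vip"):
            return False
        return True
    return [m for m in messages if keep(m)]

# Hypothetical inbox used for illustration.
inbox = [
    {"subject": "Invoice", "unread": True, "attachments": True, "vip": False},
    {"subject": "Newsletter", "unread": False, "attachments": False, "vip": False},
    {"subject": "From Sylvia", "unread": True, "attachments": False, "vip": True},
]
```

`filter_inbox(inbox, unread=True)` keeps the first and third messages; stacking `unread=True, with_attachments=True` narrows it to just the invoice.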

Last, like Outlook, Mail now recognizes messages from mailing lists and lets you unsubscribe with one tap without opening Safari.

Tapping the Unsubscribe button sends an unsubscribe request as an email message on your behalf, which you can find in the Sent folder. In my experience, Mail has done a solid job of finding newsletters and putting its Unsubscribe banner at the top.


Compared to apps like Outlook, Airmail, and Google Inbox, Apple is advancing Mail at a deliberately slow pace. You can't send an email message to extensions with the share sheet (more on this problem here); several macOS Mail functionalities are still missing from the iOS app; and Google is way ahead of Apple when it comes to smart suggestions and automatic message categorization.

Mail is a fine client for most people, but it feels stuck between longing for desktop features and adopting what third parties are doing. There's a lot of work left to do.

Look Up and Dictionary

Apple's system dictionary – built into every app via the copy & paste menu – has been overhauled as Look Up, a more versatile interface meshing Spotlight and Safari search suggestions.

The new Look Up in iOS 10.

Look Up still provides dictionary definitions for selected words. The dictionary opens as a translucent full-screen view on the iPhone (a modal window on the iPad) with cards you can tap to read thorough definitions. New in iOS 10, the Italian and Dutch dictionaries can display multilingual translations in English, which I've found useful to expand my vocabulary without opening Google or a third-party dictionary app.

What makes Look Up one of the best additions to iOS 10 is the expansion of available sources. Besides definitions, iOS 10 shows suggestions from Apple Music, Wikipedia, iTunes, suggested websites, web videos, news, Maps, and more. These are the same data providers powering suggestions in Safari and Spotlight, with the advantage of being available from any app as long as you can select text.

Like in iOS 9, some results can be expanded inline, such as Wikipedia summary cards, while others take you to a website in Safari. The presentation style is also the same, with rich snippets and thumbnails that make results contextual and glanceable.

Smart data detectors have also been updated with Look Up integration. If iOS 10 finds a potential result in text, it'll be underlined to suggest it can be tapped to open Look Up.

Look Up triggered from a data detector in an email subject.

In my tests, Look Up suggestions in text showed up in apps like Messages and Mail, and they often matched names of popular artists (e.g. "Bon Iver") or movies.

By plugging into a broader collection of sources, Look Up is more than a dictionary. It's Spotlight for selected text – an omnipresent search engine and reference tool that can take you directly to a relevant result without Google.

I've become a heavy user of Look Up for all kinds of queries. I look up topics on Wikipedia54 from my text editor or Notes without launching Safari. I even use it for restaurant reviews and Maps directions: iOS can pop up a full-screen Maps UI with the location, a button to get directions, and reviews from TripAdvisor. Look Up is a useful, clever addition, and I wish it worked for more types of content. It'd be nice to have POIs from Foursquare and Yelp in Look Up, for example.55

We first saw the potential for deeply integrated search with Spotlight in iOS 9. It's not only a matter of competition between Apple and Google – any suggestion that requires fewer interactions is a better experience for Apple and its users. Look Up makes web search a feature of any app; it's an intelligent continuation of the company's strategy.

Notes

Notes was, together with Safari, the crown jewel of Apple's app updates in iOS 9. This year, Apple is building upon it with subtle refinements and a new sharing feature.

Like Mail, Notes on the 12.9-inch iPad Pro offers a three-panel view. If you spend time moving between folders to manage notes, this should be a welcome change.

Three-panel view in Notes.

When using an external keyboard, you can now indent items in a bulleted list with the Tab key. The same can be done with the copy & paste menu; curiously, Apple labeled the opposite behavior 'Indent Left' instead of 'Outdent'.

When a note refreshes with content added on another device, the new bits are temporarily highlighted in yellow. This makes it easy to see what has changed when syncing with iCloud.

Note sharing is the big change in iOS 10. Arguably the most requested feature since the app's relaunch in iOS 9, collaboration puts Notes on the same playing field as two established competitors – Evernote and OneNote. In pure Apple fashion, collaboration has been kept simple: it's based on CloudKit, and there's an API for developers to implement the same functionality in their apps.

In iOS 10, every note has a button to start collaborating with someone. Tapping it opens a screen to share a note, which is done by sending a link to an app like Messages or Mail (you can also copy a link or send it to a third-party extension). Once you've picked how you want to share the note's link, you can add people by email address or phone number.56 As soon as the recipient opens the iCloud.com link for the note and accepts it, the note will gain an icon in the main list to indicate that it's a shared one.57

Sharing a note with someone on iMessage.

Collaborating with someone else on the same note doesn't look different from normal editing. Unlike more capable collaborative editing environments such as Google Docs, Quip, or Dropbox Paper, there are no typing indicators with participant names and you can't watch someone type in real-time. The experience is somewhat crude: entire sentences simply show up after a couple of seconds (they're also highlighted in yellow).

Apple doesn't view Notes collaboration as a real-time editing service. Rather, it's meant to offer multiple users a way to permanently store information in a note that is accessed regularly.

I believe Notes collaboration will be a hit. I can see families sharing a grocery list or travel itinerary in Notes without having to worry about creating online accounts and downloading apps. Colleagues keeping a collection of screenshots and links, teams sharing sketches and snippets of text – the flexibility of Notes lends itself to any kind of sharing in multiple formats.58

Even without the real-time features of Google and Dropbox (and the upcoming iWork update), Notes collaboration works well and is fast. In my three months of testing, I haven't run into conflicts or prompts to take action.

I was skeptical, but Notes collaboration works. In a post-Evernote world, Notes is still the best note-taking app for every iOS user.

Apple News

Like last year, we're going to have a separate story on Apple News. I wanted to briefly touch upon a few changes, though.

Apple News is the third iOS 10 app to sport a redesign centered on bold headlines, sizeable sections, and a more varied use of color.

The app launches to a For You view that does away with a traditional title bar to show the date and local weather conditions. Top Stories is the first section, highlighting 4-5 stories curated by Apple editors. These stories tend to be general news articles from well-known publications, and there's no way to turn them off even if you mute the channel.

Sections in the main feed are differentiated by color, whether they're curated by Apple (such as Trending or Featured) or collected algorithmically for your interests. Bold headlines don't help information density (on an iPhone 6s Plus, you'll be lucky to see more than four headlines at once), but they don't look bad either. The large, heavy version of San Francisco employed in the app makes it feel like a digital newspaper.

Because of my job and preferences in terms of news readers, I can't use Apple News as a replacement for Twitter or RSS. I want to have fine-grained control over my subscriptions, and the power-user tools offered by services like NewsBlur and Inoreader aren't a good fit for Apple News. There are also editorial choices I don't like: the more I keep muting and disliking certain topics (such as politics and sports), the more they keep coming back from Apple's editors or other publications. Apple's staff believes those are the stories I should care about, but I've long moved past this kind of news consumption. I don't have time for a news feed that I can't precisely control and customize.

As a general-purpose news reader, Apple News does a decent job, and the redesign gives sections and headlines more personality and structure. At the same time, Apple News still feels less inspired than Apple Music; the changes in iOS 10 aren't enough to convince me to try it again.

Clock

The Clock app has been updated with two new features aimed at people who use it at night: a dark theme (it looks nice) and Bedtime.

With Bedtime, Apple wants to give users with a morning routine an easy way to remember when it's time to sleep. Like other sleep trackers on the App Store, Bedtime sends a notification a few minutes before bed, and it wakes you up with gentle melodies of growing intensity (you can choose from 9 of them, with optional vibration). The goal is to be consistent, always go to bed and wake up at the same time every day, and get a regular amount of sleep each night.
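The scheduling behind Bedtime is simple arithmetic: work backwards from the wake-up alarm by the desired amount of sleep, then warn a few minutes before that. A sketch of that logic in Python (the names and the warning offset here are illustrative, not Apple's):

```python
from datetime import datetime, timedelta

def bedtime_schedule(wake_time, hours_of_sleep, warn_minutes=15):
    """Given tomorrow's wake-up time and a sleep goal, compute tonight's
    bedtime and when the go-to-bed reminder should fire."""
    bedtime = wake_time - timedelta(hours=hours_of_sleep)
    reminder = bedtime - timedelta(minutes=warn_minutes)
    return bedtime, reminder

# Hypothetical 7 AM alarm with an eight-hour sleep goal.
wake = datetime(2016, 9, 13, 7, 0)
bed, remind = bedtime_schedule(wake, 8)
```

For a 7 AM alarm and eight hours of sleep, bedtime lands at 11 PM the night before, with the reminder at 10:45 PM.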

Bedtime has a good UI design with a dial you can spin to adjust when you'd like to sleep and wake up, and it's integrated with HealthKit to save data under the Sleep Analysis category.

I can't use Bedtime because, as someone who works from home, I never wake up at the same time every day and I don't have kids to drive to school. Bedtime is too optimistic for my bad habits. I think it's a nice enhancement, though, and I bet it'll be quite popular among iOS users.

Health and Activity

If all you ever wanted from the Activity app was a way to stay motivated by comparing your friends' progress to yours, Apple has you covered in iOS 10.

Sharing is now built into Activity: once you've invited a friend to share data with you, the Sharing tab will display activity circles, total completion, and burned calories. You can tap through to see more details, hide your activity, and mute notifications; at any point, you can start an iMessage conversation with a friend – presumably to taunt or motivate them.

In my defense, I haven't been wearing my Apple Watch for the past few weeks.

The "gamification" of Activity, combined with the Apple Watch, should help users push towards a daily goal and stay active. It's a feature I plan to test more in depth once I get back into my exercise regimen.59 We'll cover more workout and Activity changes in our review of watchOS 3.

As for Health, Apple has overhauled the app's dashboard with four main sections represented by colorful artwork: Activity, Mindfulness, Nutrition, and Sleep. Each of these primary categories has an explanation video, and there's also a general overview video about the Health app that you can watch by tapping a button at the bottom of the Health Data screen.

In an effort to make browsing Health less intimidating, Apple has simplified how you can view statistics recorded by your iPhone and Apple Watch. There's a new Today page with a scrollable calendar ticker; you can tap any day to see all recorded data points as small previews (which support 3D Touch). Tapping one will take you to the category's detail page, unchanged from iOS 9.

In the top right corner of the Health Data and Today pages, you'll find a user icon to quickly access details such as date of birth, sex, and blood type. You can configure wheelchair use here, as well as export your recorded Health data as a .zip archive containing an XML backup. U.S. residents will be able to sign up to become organ donors with Donate Life (previously announced by Apple) in the Medical ID screen.

As I've been arguing for the past couple of years, the Health app will eventually have to find correlations between categories to help users understand how they're living and what they should improve. Going beyond data collection and graphs should be the ultimate goal to turn Health into an assistant rather than a dashboard of data points. Until that's the case, making the app prettier and easier to use is a good call.

Universal Clipboard

Apple is adding another feature as part of the Continuity initiative launched two years ago: clipboard transfer between devices.

The option, dubbed Universal Clipboard, is designed to have (almost) no interface and "just work" in the background. After you've copied something on one device, pasting on another device nearby fetches what you copied on the first. Universal Clipboard works with text, URLs, images, and other data types that can be pasted on iOS.

Like other Continuity functionalities, Universal Clipboard uses Apple IDs and peer-to-peer connectivity (Wi-Fi and Bluetooth) to determine devices in your proximity. Universal Clipboard is only engaged when you paste on a second device – it's not constantly pushing your copied items to iCloud or broadcasting them to all devices nearby. Because Universal Clipboard is meant to quickly switch from one device to another, there's a two-minute timeout on copied items – you won't be able to paste an image you copied two days ago on your iPhone in a message thread on the iPad today.

In my tests, Universal Clipboard worked well. It takes about a second to paste text copied from another device. Pasting a photo was the only case where I came across a "Pasting from..." dialog that loaded for a couple of seconds.

Pasting an image with Universal Clipboard.

Universal Clipboard's no-configuration approach may concern developers who don't want data copied from their apps to propagate across devices. To ease those qualms, iOS 10 includes an API to restrict the pasteboard to the local device or set an expiration timestamp. I suppose AgileBits and makers of other content-sensitive apps will provide settings to control the behavior of Universal Clipboard and disable it permanently.
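Those two restrictions – a local-only flag and an expiration timestamp – can be modeled in a few lines. This is an illustrative Python sketch of the behavior, not Apple's pasteboard API; the two-minute default mirrors Universal Clipboard's timeout:

```python
import time

def make_pasteboard_item(data, local_only=False, lifetime=120.0):
    """A copied item carrying the restrictions iOS 10 exposes:
    local-only and a 120-second expiration (Universal Clipboard's timeout)."""
    return {"data": data, "local_only": local_only,
            "expires_at": time.time() + lifetime}

def can_paste_on_other_device(item, now=None):
    """A nearby device may fetch the item only if it isn't restricted
    to the local device and hasn't expired yet."""
    now = time.time() if now is None else now
    return not item["local_only"] and now < item["expires_at"]
```

A freshly copied link is available on a nearby device; the same item two minutes later, or a password copied with the local-only flag, isn't.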

In the latest update, Workflow lets you configure Universal Clipboard options.

It's not a replacement for dedicated clipboard managers such as Copied and Clips, but Universal Clipboard is ideal if you don't want to think about transferring clipboard contents between devices. When you need it, Universal Clipboard lets you paste a link copied on the iPad into a WhatsApp message on the iPhone, or a photo from the iPhone into Notes on a second device. There are no clipboard entries to organize and no persistent storage of information to worry about. Like opening apps through Handoff, it's a nice option to always have with you.

Calendar

Calendar's new features in iOS 10 are aimed at speed and location.

Data detectors for dates and times in iMessage conversations have been improved so they can pre-fill the event creation screen with details fetched from messages. If you're planning a dinner with friends over iMessage and mention a place and time in the conversation, tapping the data detector should bring up the event creation UI with "dinner" as title and location/time properly assigned. When it works, it's a neat trick to save time in creating events.

When creating an event in the Calendar app, iOS 10 suggests similar events so you can re-add them with one tap.

It's not clear how far back into your history iOS 10 goes looking for old events. Event suggestions are handy – they're not real event templates, but they pre-fill locations and times, too.

Speaking of locations, iOS 10's Calendar can suggest a location to add to an event with one tap. If I had to guess, I'd say that iOS uses old events with the same name and frequent locations to suggest an address. And, if you create an event with travel time, Calendar will use the location of a previous event (not your current location) to calculate how long it'll take you to get there.

Apple's Calendar app isn't as versatile or powerful as Fantastical or Week Calendar. But I'm not a heavy calendar user, and iOS 10's proactive Calendar features have been smart in small and useful increments. I'm going to stick with Apple's Calendar app for a while.

"Removing" Apple Apps

iOS users have long clamored for the removal of pre-installed Apple apps from their devices. Such desire is understandable: Apple has kept adding built-in apps every other year, which hasn't helped the perception that Apple itself is wasting users' storage while also selling 16 GB iPhones as base models.60

iOS 10 adds the ability to remove the majority of pre-installed Apple apps61 from the Home screen. These are:

  • Calculator
  • Calendar
  • Compass
  • Contacts
  • FaceTime
  • Find My Friends
  • Home
  • iBooks
  • iCloud Drive
  • iTunes Store
  • Mail
  • Maps
  • Music
  • News
  • Notes
  • Podcasts
  • Reminders
  • Stocks
  • Tips
  • Videos
  • Voice Memos
  • Watch
  • Weather

There's a catch. By removing an Apple app, you won't delete it from the system entirely – you'll remove the icon and delete user data inside the app, but the core bits of each app – the actual binary – will still live in the operating system. According to Apple, removing every app listed above will only recover 150 MB from your device. If you were hoping to get rid of every Apple app and suddenly gain a couple of GBs of storage, you'll be disappointed.

Deleting a pre-installed Apple app works just like any other app on iOS: tap & hold on the icon, hit Delete, and you're done. For each app, iOS will warn you that you'll either lose your data or access to specific features, such as location sharing with Find My Friends, the Calculator icon in Control Center, or email data stored locally.62

Removing apps based on core system frameworks won't delete data inside them. If you remove Contacts, your contacts won't be deleted; the same applies to Reminders and Calendar, plus other iCloud data and documents. Effectively, you're removing the shell of an app and its settings; deleting Mail, for instance, removes all your accounts from the app. If you remove Reminders and ask Siri to create a reminder, though, it'll still be created and made available to third-party clients via EventKit.
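The distinction between an app's shell and its underlying data can be sketched like this (a simplified Python model of the behavior described above, not how iOS is actually implemented):

```python
def remove_app(device, app):
    """Removing a built-in app hides its icon and clears its settings,
    but the binary and any system-level data (contacts, reminders,
    calendars) are left untouched."""
    device["home_screen"].discard(app)     # icon disappears
    device["app_settings"].pop(app, None)  # accounts/settings are cleared
    # device["binaries"] and device["icloud_data"] are deliberately untouched

# Hypothetical device state used for illustration.
device = {
    "home_screen": {"Mail", "Reminders", "Stocks"},
    "app_settings": {"Mail": ["work account"]},
    "binaries": {"Mail", "Reminders", "Stocks"},
    "icloud_data": {"reminders": ["buy milk"]},
}
remove_app(device, "Reminders")
```

After the call, Reminders is gone from the Home screen, yet its binary is still on the device and the reminders themselves survive – which is why Siri and EventKit clients keep working.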

Restoring previously removed Apple apps could have been more intuitive. You have to open the App Store and search for the name of an app, or look for Apple's developer profile page and tap the download button for each app you want to bring back. It would have been nice to have a dedicated area in Settings to view which apps have been removed with an easier way to restore them.

Restoring Apple apps from the App Store.

Restoring Apple apps is further confirmation of the fact that those apps aren't actually gone – they're just hidden. The download isn't a download: it takes less than a second and it doesn't even show a progress bar. Try this for yourself: remove an Apple app, find Apple's developer page on the App Store, put your device in Airplane Mode, and hit download. The app will reappear on your Home screen without the need for an Internet connection.63

As a company that prides itself on the tight integration of its hardware and software, caving to user pressure on the matter of pre-installed apps must have been, politically and technically, tough for Apple. The result is a compromise: Apple is letting users get rid of those Tips and Stocks apps (among others) that few seem to like, but they also can't completely delete apps because of their ties with the OS.

Some people will complain about this decision. I'm not sure users would like the opposite scenario, where entire frameworks are deleted from iOS (if that's even possible without breaking Apple's code signing), third-party apps lose access to essential APIs, and each download consumes several hundred MBs. Given the architectural complexities involved, the current solution seems the most elegant trade-off.

Photos and Camera

A world-class portable camera is one of the modern definitions of the iPhone. Among many factors, people buy an iPhone because it takes great pictures. And Apple's relentless camera innovation isn't showing any signs of slowing down this year.

But the importance of the iPhone's camera goes deeper than technical prowess. The Camera and its related app, Photos, create memories. Notes, Reminders, Maps, and Messages are essential iOS apps; only the Camera and Photos have a lasting, deeply emotional impact on our lives that goes beyond utility. They're personal. They're us.

iOS 10 strives to improve photography in two parallel directions: the capturing phase and the discovery of memories – the technical and the emotional. Each relies on the other; together, they show us a glimpse of where Apple's hardware and services may be heading next.

Wide Color

Let's start with the technical bits.

The camera capture pipeline has been updated to support wide-gamut color in iOS 10. All iOS devices can take pictures in the sRGB color space; the 9.7-inch iPad Pro and the upcoming iPhone 7 hardware also support capturing photos in wide-gamut color formats.

When viewed on displays enabled for the P3 color space, pictures taken in wide color are displayed with richer, more accurate colors drawn from a wider palette – a depth of color reproduction that wasn't possible on the iPhone until iOS 10 (the 9.7-inch iPad Pro was the only device with a wide color-enabled display on iOS 9).

There are some noteworthy details in how Apple is rolling out wide color across its iOS product line, using photography as an obvious delivery method.

Wide color in iOS 10 is used for photography, not video. JPEGs (still images) captured in wide color fall in the P3 color space; Live Photos, despite the presence of an embedded movie file, also support wide color when viewed on the iPad Pro and iPhone 7 (or the Retina 5K iMac).

Apple has been clever in implementing fallback options for photos displayed on older devices outside of the P3 color space. The company's photo storage service, iCloud Photo Library, has been made color-aware: it can automatically convert pictures to sRGB for devices without wide color viewing support.

More interestingly, wide-gamut pictures shared via Mail and iMessage are converted to an Apple Wide Color Sharing Profile by iOS 10. This color profile takes care of displaying the image file in the appropriate color space depending on the device it's viewed on.
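The fallback behavior amounts to a simple decision: render in P3 only when both the image and the display support it, otherwise fall back to sRGB. A sketch of that choice (a simplified model of what iCloud Photo Library and the sharing profile do, with hypothetical names):

```python
def profile_for_display(image_gamut, display_supports_p3):
    """Pick the color space an image should be rendered in: wide-gamut
    photos fall back to sRGB on displays without P3 support."""
    if image_gamut == "p3" and display_supports_p3:
        return "p3"
    return "srgb"
```

A wide-color photo viewed on an iPhone 7 or 9.7-inch iPad Pro renders in P3; the same file on an older iPhone, or an sRGB photo anywhere, renders in sRGB.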

Even as a tentpole feature of the iPhone 7, wide-gamut photography isn't something most users will care (or know) about. Wide color is relevant in the context of another major change for iOS photographers and developers of photo editing software – native RAW support.

RAW Capture and Editing

Apple used an apt and delicious analogy to describe RAW photo capture at WWDC: it's like carrying around the ingredients for a cake instead of the fully baked product. Like two chefs can use the same ingredients to produce wildly different cakes, RAW data can be edited by multiple apps to output different versions of the same photo.

RAW stores unprocessed scene data: it contains more bits because no compression is involved, which leads to heavier file sizes and higher performance required to capture and edit RAW. On iOS 10, RAW capture is supported on the iPhone SE, 6s and 6s Plus, 7 and 7 Plus (only when not using the dual camera), and 9.7-inch iPad Pro with the rear camera only, and it's an API available to third-party apps (Apple's Camera app doesn't capture in RAW).

To store RAW buffers, Apple is using the Adobe Digital Negative (DNG) format; among the many proprietary RAW formats used by camera manufacturers, DNG is as close to an open, publicly available standard as it gets.64

At a practical level, the upside of RAW capture is the ability to reverse and modify specific values in post-production to improve photos in a way that wouldn't be possible with processed JPEGs. On iOS 10, RAW APIs allow developers to create apps that can tweak exposure, temperature, noise reduction, and more after having taken a picture, giving professionals more creative control over photo editing.

Things are looking pretty good in terms of performance, too. On iOS devices with 2 GB of RAM or more, the system can edit RAW files up to 120 megapixels; on devices with 1 GB of RAM, or if accessed from an editing extension inside Photos (where memory is more limited), apps can edit RAW files up to 60 megapixels.
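Those editing ceilings reduce to a small rule based on available memory. A sketch (the function name is mine; the thresholds are the ones stated above):

```python
def max_raw_megapixels(ram_gb, in_photo_extension=False):
    """iOS 10's stated RAW editing ceilings: 120 MP with 2 GB of RAM or
    more, 60 MP with 1 GB or inside a Photos editing extension, where
    memory is more limited."""
    if in_photo_extension or ram_gb < 2:
        return 60
    return 120
```

An iPad Pro app with 2 GB of RAM can edit up to 120-megapixel RAW files, but the same code running as an editing extension inside Photos is capped at 60.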

Native RAW support opens up an opportunity for developers to fill a gap on the App Store: desktop-class photo editing and management apps for pros. If adopted by the developer community, native RAW capture and editing could enable workflows that were previously exclusive to the Mac. Imagine shooting RAW with a DSLR, or even an iPhone 7, and then sitting down with an iPad Pro to organize footage, flag pictures, and edit values with finer, deeper controls, while also enjoying the beauty and detail of wide-gamut images (which RAW files inherently are).

Shooting RAW in Obscura.

I tested an upcoming update to Obscura, Ben McCarthy's professional photo app for iOS, with RAW support on iOS 10. RAW can be enabled from the app's viewfinder; after shooting with Obscura, RAW photos are saved directly into iOS' Photos app.

Editing RAW in Snapseed.

Google's Snapseed photo editor imported RAW files shot in Obscura without issues, and I was able to apply edits with Snapseed's RAW Develop tool, saving changes back to Photos. I'm not a professional photographer, but I was still impressed by the RAW workflows now possible with third-party apps and iOS 10.

On the other hand, while Apple has improved developer tools for RAW capture and editing, hurdles remain in terms of photo management. iCloud Photo Library, even at its highest tier, only offers 2 TB of storage; professional photographers have libraries that span decades and require several TBs. The situation is worse when it comes to local storage on an iPad, with 256 GB being the maximum capacity you can buy for an iPad Pro today. Perhaps Apple is hoping that these limitations will push users to rely on cloud-based archival solutions that go beyond what's offered by iCloud and iOS' offline storage. However, it's undeniable that it's still easier for a creative professional to organize 5 TB of RAW footage on a Mac than an iPad.

I have no reason to doubt that companies like Adobe will be all over Apple's RAW APIs in iOS 10. I'm also curious to see how indie developers will approach standalone camera apps for RAW capture and quick edits. There's still work to be done, but the dream of a full-featured photo capture, editing, and management workflow on iOS is within our grasp.

Live Photos

Apple isn't altering the original idea behind Live Photos with iOS 10: they still capture the fleeting moment around a still image, which roughly amounts to 1.5 seconds before a picture is taken and 1.5 seconds after. Photos have become more than still images thanks to Live Photos, and there are some nice additions in iOS 10.

Live Photos now use video stabilization for the movie file bundled within them. This doesn't mean that the iPhone's camera generates videos as smooth as Google's Motion Stills, but they're slightly smoother than iOS 9. Another nice change: taking pictures on iOS 10 no longer stops music playback.

Furthermore, editing is fully supported for Live Photos in iOS 10. Apps can apply filters to the movie inside a Live Photo, with the ability to tweak video frames, audio volume, and size.65 To demonstrate the new editing capabilities, Apple has enabled iOS' built-in filters to work with Live Photos, too.
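Conceptually, the new editing model is a per-frame filter applied to the movie inside a Live Photo, plus adjustments like audio volume. Here's an illustrative Python sketch of that shape (the real API is a frame-processor callback; these names are mine):

```python
def process_live_photo(frames, filter_fn, volume=1.0):
    """Apply a filter to every frame of a Live Photo's embedded movie
    and clamp the audio volume to [0, 1]."""
    return {"frames": [filter_fn(f) for f in frames],
            "volume": max(0.0, min(1.0, volume))}

# Stand-in for a real image filter, for illustration only.
sepia = lambda frame: f"sepia({frame})"

edited = process_live_photo(["frame0", "frame1", "frame2"], sepia, volume=0.5)
```

Every frame passes through the filter, so the still image and the surrounding movie stay visually consistent – which is how iOS' built-in filters now work with Live Photos.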

The key advantage of Apple's Live Photos is integration with the system Camera, which can't be beaten by third-party options. I'd like to see higher frame rates in the future of Live Photos; for now, they're doing a good enough job at defining what capturing a moment feels like.

The photos on our devices are more than files in a library. They're tiny bits of our past. The places we went to; the people we were; those we met. Together, they're far more powerful than memory alone. Photos allow us to ache, cherish, and remember.

Without tools to rediscover and relive memories, none of that matters. A camera that's always with us has enabled us to take a picture for every moment, but it created a different set of issues. There's too much overhead in finding our old selves in a sea of small thumbnails. And what purpose does a photo serve if it's never seen again?

Apple sees this as a problem, too, and they want to fix it with iOS 10. With storage, syncing, and 3D Touch now taken care of, the new Photos focuses on a single, all-encompassing aspect of the experience:

You.

Computer Vision

Apple's rethinking of what Photos can do starts with a layer of intelligence built into our devices. The company refers to it as "advanced computer vision", and it spans elements such as recognition of scenes, objects, places, and faces in photos, categorization, relevancy thresholds, and search.

Second, Apple believes iOS devices are smart and powerful enough to handle this aspect of machine learning themselves. The intelligence-based features of Photos are predicated on an implementation of on-device processing that doesn't transmit private user information to the cloud – not even Apple's own iCloud (at least not yet).

Photos' learning is done locally on each device by taking advantage of the GPU: after a user upgrades to iOS 10, the first backlog of photos will be analyzed overnight when a device is connected to Wi-Fi and charging; after the initial batch is done, new pictures will be processed almost instantaneously after they're taken. Photos' deep learning classification is encrypted locally, it never leaves the user's device, and it can't be read by Apple.

As a Google Photos user, I was more than doubtful when Apple touted the benefits of on-device intelligence with iOS 10's Photos app. What were the chances Apple, a new player in the space, could figure out deep learning in Photos just by using the bits inside an iPhone?

You'll be surprised by how much Apple has accomplished with Photos in iOS 10. It's not perfect, and, occasionally, it's not as eerily accurate as Google Photos, but Photos' intelligence is good enough, sometimes great, and it's going to change how we relive our memories.

Memories

Of the three intelligence features in Photos, Memories is the one that gained a spot in the tab bar. Memories creates collections of photos automatically grouped by people, date, location, and other criteria. They're generated almost daily depending on the size of your library, quantity of information found in photos, and progress of on-device processing.

Browsing Memories in iOS 10.

The goal of Memories is to let you rediscover moments from your past. There are some specific types of memories. For instance, you'll find memories for a location, a person, a couple, a day, a weekend, a trip spanning multiple weeks, a place, or "Best Of" collections that highlight photos from multiple years.

In my library, I have memories for my trip to WWDC (both "Great Britain and United States" and "Myke and Me"), pictures taken "At the Beach", and "Best of This Year". There's a common thread in the memories Photos generates, but they're varied enough and iOS does a good job at bringing up relevant photos at the right time.

Different types of memories.

Behind the scenes, Memories are assembled with metadata contained in photos or recognized by on-device intelligence. Pieces of data like location, time of the day, and proximity to points of interest are taken into consideration, feeding an engine that also looks at aspects such as faces.

Scrolling Memories feels like flipping through pages of a scrapbook. Cover images are intelligently chosen by the app; if you press a memory's preview, iOS brings up a collage-like peek with buttons to delete a memory or add it to your favorites.

Tapping a memory transitions to a detail screen where the cover morphs into a playable video preview at the top. Besides photos, Memories generates a slideshow movie that you can save as a video in your library. Slideshows combine built-in soundtracks (over 80), pictures, videos, and Live Photos to capture an extended representation of a memory that you can share with friends or stream for the whole family on an Apple TV.

Choosing styles for Memories' slideshows.

Each video comes with quick adjustment controls and deeper settings reminiscent of iMovie. In the main view, there's a scrollable bar at the bottom to pick one of eight "moods", ranging from dreamy and sentimental to club and extreme. Photos picks a neutral mood by default, which is a mix of uplifting and sentimental; moods affect the music used in the slideshows, as well as the cover text, selection of media, and transitions between items. You can also change the duration of a movie (short, medium, and long); doing so may require Photos to download additional assets from iCloud.

Deeper movie settings.

To have finer control over Memories' movies, you can tap the editing button in the bottom right (the three sliders). Here, you can customize the title and subtitle with your own text and different styles, enter a duration in seconds, manually select photos and videos from a memory, and replace Apple's soundtrack with your favorite music.66

Below the slideshow, Memories displays a grid of highlights. Both in the grid and the slideshow, Photos applies de-duplication, removing photos similar to each other.67 Apple's Memories algorithm tends to promote pictures that are well-lit, or where people are smiling, to a bigger size in the grid. In Memories, a photo's 3D Touch peek menu includes a 'Show Photos from this Day' option to jump to a specific moment.

As you scroll further down a memory's contents, you'll notice how Photos exposes some of the data it uses to build Memories with People and Places.

The memories you see in the main Memories page are highlights – the best memories recommended for you. In reality, iOS 10 keeps a larger collection of memories generated under the hood. For example, every moment (the sub-group of photos taken at specific times and locations) can be viewed as a memory. In each memory, you'll find up to four suggestions for related memories, where the results are more hit-and-miss.

In many ways, Apple's Memories are superior to Google Assistant's creations: they're not as frequent and they truly feel like the best moments from your past. Where Google Photos' Assistant throws anything at the wall to see what you might want to save, I can't find a memory highlighted by Photos that isn't at least somewhat relevant to me. iOS 10's Memories feel like precious stories made for me instead of clever collages.68

Memories always bring back some kind of emotion. I find myself anticipating new entries in the Memories screen to see where I'll be taken next.

People and Groups

Available for years on the desktop, Faces have come to Photos on iOS with the ability to browse and manage people matched by the app.

There are multiple ways to organize people recognized in your photo library. The easiest is the People view, a special album with a grid of faces that have either been matched and assigned to a person or that still need to be tagged.

Like on macOS, the initial tagging process is manual: when you tap on an unnamed face, photos from that person have an 'Add Name' button in the title bar. You can choose one of your contacts to assign the photos to.

Adding John as a recognized contact.

As you start building a collection of People, the album's grid will populate with more entries. To have quicker access to the most important people – say, your kids or partner – you can drag faces towards the top and drop them in a favorites area.69

Another way to deal with faces is from a photo's detail view. In iOS 10, you can swipe up on a photo (or tap 'Details' in the top right) to locate it on a map, show nearby photos, view related memories (again, mostly chosen randomly), and see which people Photos has recognized.

Swipe up to view details of a photo, including people.

This is one of my favorite additions to Photos.70 Coalescing location metadata and faces in the same screen is an effective way to remember a photo's context.

No matter how you get to a person's photos, there will always be a dedicated view collecting them all. If there are enough pictures, a Memories-like slideshow is available at the top. Below, you get a summary of photos in chronological order, a map of where photos were taken, more related memories, and additional people. When viewing people inside a person's screen71, iOS will display a sub-filter to view people and groups. Groups help you find photos of that person and yourself together.

Due to EU regulations on web photo services, I can't use Google Photos' face recognition in Italy, so I can't compare the quality of Google's feature with Photos in iOS 10. What I have noticed, though, is that local face recognition in Photos isn't too dissimilar from the functionality that existed in iPhoto. Oftentimes, Photos gets confused by people with similar facial features such as beards; occasionally, it can't tell that a photo of someone squinting belongs to a person it has already recognized. But other times, Photos' face recognition is surprisingly accurate, correctly matching photos of the same person across the years through different hairstyles, beards, hair colors, and more. It's inconsistently good.

Despite some shortcomings, I'd rather have face recognition that needs to be trained every couple of weeks than not have it at all.

You have to train face recognition when things go wrong.

You can "teach" Photos about matched people in two ways: you can merge unnamed entries that match an existing person (just assign the same name to the second group of photos and you'll be asked to merge them), or you can confirm a person's additional photos manually. You can find the option at the bottom of a person's photos.

The biggest downside of face support in iOS 10 is the lack of iCloud sync. Photos runs its face recognition locally on each device, populating the Faces album without syncing sets of people via iCloud Photo Library. The face-matching algorithm is the same between multiple devices, but you'll have to recreate favorites and perform training on every device. I've ended up managing and browsing faces mostly on my iPhone to eschew the annoyance of inconsistent face sets between devices. I hope Apple adds face sync in a future update to iOS 10.

Confirming faces in Photos is a time-consuming, boring process that nonetheless yields a good return on investment. It's not compulsory, but you'll want to remember to train Photos every once in a while to help face recognition. In my first training sessions, suggestions were almost hilariously bad – going so far as to suggest that pictures of Myke Hurley and me were of the same person. After some good laughs and taps, Photos' questions have become more pertinent, stabilizing suggestions for new photos as well.

Face recognition in iOS 10's Photos is not a dramatic leap from previous implementations in Apple's Mac clients, but it's good enough, and it can be useful.

Places

Display of location metadata has never been Photos' forte, which created a gap for third-party apps to fill. In iOS 10, Apple has brought MapKit-fueled Places views to, er, various places inside the app.

If Location Services were active when taking a picture, a photo's detail view will have a map to show where it was taken. The map preview defaults to a single photo. You can tap it to open a bigger preview, with buttons to show photos taken nearby in addition to the current one.

Viewing nearby photos.

When in full-screen, you can switch from the standard map style to hybrid or satellite (with and without 3D enabled). The combination of nearby photos and satellite map is great to visualize clusters of photos taken around the same location across multiple years. When you want to see the dates of all nearby photos, there's a grid view that organizes them by moment.

Nearby photos make 3D maps useful, too. I seldom use Flyover on iOS, but I like to zoom into a 3D map and view, for instance, photos taken around the most beautiful city in the world.

You can view all places at once from a special Places album. By default, this album loads a zoomed-out view of your country, but you can move around freely (like in Nearby) and pan to other countries and continents. It's a nice way to visualize all your photos on a map, but it can also be used to explore old photos you've taken at your current location thanks to the GPS icon in the bottom left.

As someone who's long wanted proper Maps previews inside Photos, I can't complain. Nearby and Places are ubiquitous in Photos and they add value to the photographic memory of a picture. Apple waited until they got this feature right.

Proactive suggestion of memories and faces only solves one half of Photos' discovery problem. Sometimes, you have a vague recollection of the contents of a photo and want to search for it. Photos' content search is where Apple's artificial intelligence efforts will be measured against Google's admirable natural language search.

Photos in iOS 10 lets you search for things in photos. Apple is tackling photo search differently than Google, though. While Google Photos lets you type anything into the search field and see if it returns any results, iOS 10's Photos search is based on categories. When typing a query, you have to tap on one of the built-in categories for scenes and objects supported by Apple. If there's no category suggestion for what you're typing, it means you can't search for it.

Intelligent search in Photos.

The search functionality is imbued with categories added by Apple, plus memories, places, albums, dates, and people – some of which were already supported in iOS 9. Because of Apple's on-device processing, an initial indexing will be performed after upgrading to iOS 10.72

The range of categories Photos is aware of varies. There are macro categories, such as "animal", "food", or "vehicle", to search for families of objects; mid-range categories that include generic types like "dog", "hat", "fountain", or "pizza"; and there are fewer, but more specific, categories like "beagle", "teddy bear", "dark glasses", or, one of my favorites, the ever-useful "faucet".

Examples of categories in Photos' search. (Tap for full size)

Apple's goal was to provide users with a reasonable set of common words that represent what humans take pictures of. The technology gets all the more impressive when you start concatenating categories with each other or with other search filters. Two categories like "aircraft" and "sky" can be combined in the same search query and you'll find the classic picture taken from inside a plane. You can also mix and match categories with places and dates: "Beach, Apulia, 2015" shows me photos of the beach taken during my vacation in Puglia last year; "Rome, food" lets me remember the many times I've been at restaurants here. I've been able to concatenate at least four search tokens in the same query; more may be possible.

Search token concatenation for more precise results.
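Conceptually, concatenated tokens behave like an AND filter over per-photo metadata: each token narrows the result set. Here's a minimal, hypothetical Python sketch of the idea – the data structures and field names are invented for illustration and don't reflect Apple's actual implementation:

```python
# Hypothetical sketch: each search token narrows the result set (AND logic).
# The field names ("scenes", "place", "year") are invented for illustration.
photos = [
    {"id": 1, "scenes": {"beach", "sky"}, "place": "Apulia", "year": 2015},
    {"id": 2, "scenes": {"food"}, "place": "Rome", "year": 2016},
    {"id": 3, "scenes": {"beach"}, "place": "Rome", "year": 2015},
]

def search(library, scenes=frozenset(), place=None, year=None):
    """Return ids of photos matching every supplied token."""
    results = []
    for photo in library:
        if scenes and not scenes <= photo["scenes"]:
            continue  # photo must contain every requested scene category
        if place is not None and photo["place"] != place:
            continue
        if year is not None and photo["year"] != year:
            continue
        results.append(photo["id"])
    return results

print(search(photos, scenes={"beach"}, place="Apulia", year=2015))  # [1]
```

A query like "Beach, Apulia, 2015" maps to one call with three tokens, while dropping tokens widens the results – the same behavior you see when removing filters from Photos' search field.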

All this may not be shocking for tech-inclined folks who have used Google Photos. But there are millions of iOS users who haven't signed up for Google's service and have never tried AI-powered photo search before. To have a similar feature in a built-in app, developed in a privacy-conscious way, with a large set of categories to choose from – that's a terrific change for every iOS user.

Apple isn't storing photos' content metadata in the cloud to analyze them at scale – your photos are private and indexing/processing are performed on-device, like Memories (even if you have iCloud Photo Library with Optimize Storage turned on). It's an over-simplification, but, for the sake of the argument, this means that iOS 10 ships with a "master algorithm" that contains knowledge of its own and indexes photos locally without sending any content-related information to the cloud. Essentially, Apple had to create its computer vision from scratch and teach it what a "beach" looks like.

In everyday usage, Photos' scene search is remarkable when it works – and a little disappointing when it doesn't.

When a query matches a category and results are accurate, content-aware search is amazing. You can type "beach" and Photos will show you pictures of beaches because it knows what a beach is. You can search for pictures of pasta and suddenly feel hungry. Want to remember how cute your dog was as a puppy? There's a category for that.

I've tested search in Photos for the past three months, and I've often been able to find the photo I was looking for thanks to query concatenation and mid-range descriptions, such as "pasta, 2014" or "Rome, dog, 2016". Most of the time, what Apple has achieved is genuinely impressive.

On a few occasions, Photos' categories didn't contain results I was expecting to be in there, or they matched a photo that belonged to a different category (such as my parents' border collie, recognized as a "bear", or fireworks tagged as "Christmas tree").

That's one adorable bear.

Understandably, Apple's first take on scene search with computer vision isn't perfect. These issues could be remedied if there was a way to fix false positives and train recognition on unmatched photos, but no such option is provided in iOS 10. The decision to omit manual intervention hinders the ability to let users help Photos' recognition, and it makes me wonder how long we'll have to wait for improvements to the algorithm.

Compared to Google Photos' search, Apple's version in iOS 10 is already robust. It's a good first step, especially considering that Apple is new to this field and they're not compromising on user privacy.

An Intelligent Future

What's most surprising about the new Photos is how, with one iOS update, Apple has gone from zero intelligence built into the app to a useful, capable alternative to Google Photos – all while taking a deeply different approach to image analysis.

Admittedly, iOS 10's Photos is inspired by what Google has been doing with Google Photos since its launch in May 2015. 200 million monthly active users can't be wrong: Google Photos has singlehandedly changed consumer photo management thanks to automated discovery tools and scene search. Any platform owner would pay attention to a third party asking users to delete photos from their devices and archive them in a different cloud.

Apple has a chance to replicate the success of Google Photos at a much larger scale, directly in the app that millions of users open every day. It isn't just a matter of taking a page from Google for the sake of feature parity: photos are, arguably, the most precious data for iPhone users. Bringing easier discovery of memories, new search tools, and emotion into photo management yields loyalty and, ultimately, lock-in.

This isn't a fight Apple is willing to give up. In their first round, Apple has shown that they can inject intelligence into Photos without sacrificing our privacy. Let's see where they go from here.

Design

While iOS 10 hasn't brought a sweeping UI redesign, changes sprinkled throughout the interface underscore how Apple has been refining the iOS 7 design language. On the other hand, a new direction for some apps appears to hint at something bigger.

Bold Typography, Larger Controls

Apple Music epitomizes a strikingly different presentation of full-screen views, content grids, and affordances that hint at user interaction.

In iOS 10, Apple Music eschews the traditional title bar in favor of large, bold headlines for first-level views such as For You and Browse.

A new look for title bars in Apple Music.

The use of San Francisco bold in lieu of a centered title bar label is similar to a newspaper headline. The heavy typeface sticks out as an odd choice initially, but it clarifies the structure and increases the contrast of Apple Music – two areas in which the company was criticized over the past year.

The evolution of Apple's Music app. (Tap for full size)

To group content within a view, or to label sub-views in nested navigation, Apple relies on Dynamic Type to scale text at different sizes. Dynamic Type doesn't affect headlines.

Dynamic Type and Apple Music's new design.

The text-based back button at the top of a sub-view isn't gone, but titles are always displayed in bold next to the content the user is viewing. An album's name, for instance, isn't centered in the title bar anymore; instead, it sits atop the artist's name.

Album titles no longer sit in the title bar – they're part of the content itself.

The combination of multiple font weights, color, and thicker labels provides superior hierarchy for content displayed on a page, separating multiple types of tappable items. By doing less, Apple ends up with a set of stronger affordances.

The visual statement is clear: when you see a black headline or sub-title, it can't be tapped. You'll have to tap on the content preview (artwork, photos) or colored label (artist names, buttons) to continue navigation or perform a task.

This goes beyond fonts. To further limit confusion, Apple Music now displays fewer items per page. Every element – whether it's an album, a text button, or a collection of playlists – is also larger and more inviting to the touch.

Fewer, bigger touch targets.

The trade-off is reduced information density and the perception that Apple is babysitting their users with albums and buttons that get in the way too much. It's a case of over-shooting in the opposite direction of last year's button-laden Music app; Apple has a history of introducing new design languages and intentionally exaggerating them in the first version. The new Apple Music is a reset of visual expectations.

This is best exemplified by the Now Playing widget at the bottom of the screen: besides being taller (and hence more tappable), the contextual menu it opens blurs the background and is filled with large, full-width buttons that combine text and icons.

It's impossible to misunderstand what each of these does, and selecting them doesn't feel like playing a tap lottery, as was the case with the old contextual menu of iOS 9. Apple doesn't appear too worried about breaking design consistency with other share dialogs on iOS as long as Apple Music's works better.

The company's newfound penchant for big titles and explaining functionalities ahead of interaction doesn't stop at Apple Music. Apple News makes plenty of use of bold headlines for article titles (where they feel like an appropriate fit) and multiple colors to distinguish sections.

Another non-traditional title bar in Apple News.

The Home app adheres to similar principles. There's no fixed title bar at the top of the screen; rather, a customizable background extends to the top of a view, with a large title indicating which room is being managed.

Home has no real title bars either.

We can also look outside of apps for a manifestation of Apple's bold design sentiment. In Control Center, splitting features across three pages lets functionality stand out more with bigger, comfortable buttons that aren't constrained by a single-page design. This is evident in the music control page, where album artwork can be tapped to open the app that is currently playing audio.

Finally, let's consider the Lock screen. In addition to redesigned notifications and widgets (which can be simply pressed for expansion), Apple is using thicker fonts and expanded audio controls.

The evolution of audio controls on the Lock screen. (Tap for full size)

Bigger song information, larger buttons, and a volume nub that can be grabbed effortlessly. I see these as improvements over the iOS 9 Lock screen.

Buttons

Ah, buttons. The much-contested, much-derided aspect of the iOS 7 design isn't officially changing with iOS 10. According to the iOS Human Interface Guidelines, this is still the default look of a system button in iOS 10:

Across iOS 10, however, we can see signs of Apple moving back to eye-catching buttons with borders and filled states in more apps.

Let's start with Apple Music again. In the iOS 10 version, there are numerous instances of button-y buttons that weren't there in iOS 9.

Buttons that don't have borders or a filled state are still present, but most of them have been redrawn with a thicker stroke to increase contrast with the app's white background.

Messages has an interesting take on buttons. Most of them are consistent with iOS 9, but the two buttons to open the Camera or pick a photo from the library are displayed as icons inside a square with rounded corners.

Those are two big buttons.

These replace the textual buttons of the iOS 9 photo picker, where one of them could be confused as the label of the scrollable gallery shown above it.

The same look is used for HomeKit accessories. Device icons are contained in a square that shows the accessory's name, its icon, and intensity level.

The use of highlights and colors helps discern on/off states for devices that are turned off (black text, translucent button) and on (colored icon, white-filled button).

Filled circles with glyphs are a recurring button type in iOS 10. They're used in a few places:

Circular buttons in iOS 10.

  • Spotlight: search shortcuts to FaceTime, message, or call a contact;
  • Camera on iPad: the HDR, timer, and iSight buttons have been updated with the new circular design;
  • Contact cards: this is a notable change given the ability to add third-party messaging and VoIP app shortcuts for contacts. Apple has moved the buttons to get in touch with a user to the top of the card;
  • Maps: the detail card of a point of interest/address has new buttons to call, share, mark as favorite, and add to Contacts.

Other examples of buttons redesigned for iOS include the back button in the status bar to return to an app (it's got a new icon) and variations of 'Get Started' buttons for apps like Calendar and Apple News, which are now filled rectangles.

Updates to buttons in iOS 10 may indicate that Apple heard the feedback that many users don't realize text labels can be tapped to initiate an action, but we'll have to wait until next year for further proof.

Cards and Stacked Views

In Apple Music, Messages, and Maps, the company has rolled out new types of views that could be interesting if ported to other apps.

First, stacked views. In Music's Now Playing screen and the iMessage App Store, iOS 10 features stacked panels that open on top of the current view, keeping a tiny part of it visible in the background. There's a nice animation for the view that recedes and shrinks from the status bar.

Stacked views.

Stacked views are an intriguing way to show nested navigation. I wonder if more full-screen views that use a back button in the top left could be redesigned with this layout, perhaps using a vertical swipe to dismiss the foreground panel.

There are plenty of card-like interfaces being used in iOS 10 to supplant full-screen views, popups, and other kinds of panels.

From Maps' search suggestions that slide up from the bottom of the map to Control Center's pages, the Apple TV (and AirPods) setup card, and, in a way, expanded notifications, it feels like Apple has realized it's time to take advantage of bigger iPhone displays to drop modal popups and full-screen views.

Cards in iOS 10.

Cards enhance usability with definite boundaries and a concise presentation of content. I like where this is going.

Animations

Contextual animations and transitions have always been part of iOS' visual vocabulary. iOS 10 brings several improvements on this front, including APIs that allow for interactive and interruptible animations.

If developers support the object-based animation framework added to UIKit with iOS 10, they'll be able to have deeper control over interrupting animations and linking them with gesture-based responses. These improved animations (based on UIViewPropertyAnimator) can be paused and stopped, scrubbed (moved forward and back), and reversed at any point in their lifecycle. In short, it means apps no longer have to finish an entire animation if they want to react to other changes.

Apps can feel more responsive – and faster – with interruptible animations. It's not a major change per se, but it's a welcome response to iOS 7's indulgent animation curve.

Emoji Updates

New emoji from the Unicode 9.0 spec aren't available in iOS 10.0 (they'll likely be added in a point release in the near future), but Apple still found a way to ship notable emoji updates that will entice users to upgrade.

Several emoji have been redesigned with less gloss, more details, and new shading. This is most apparent in the Faces category where characters have a more accentuated 3D look. They remind me of emoticons from the original MSN Messenger, redrawn for the modern age.

Redesigned Emoji in iOS 10. (Tap for full size)

Redesigned Emoji in iOS 10. (Tap for full size)

Apple has implemented the ZWJ technique to add more gender-diverse emoji. Technically, these are combinations of multiple existing characters (codepoints) joined in a ZWJ sequence. To users, they'll look like new emoji added to the system keyboard, and Apple didn't miss the opportunity to announce them with a press release.
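To illustrate, a ZWJ sequence is just existing codepoints glued together with U+200D (the zero-width joiner). The short Python sketch below composes one such sequence – the codepoints are real Unicode values, but whether the result renders as a single glyph depends on the platform's emoji font:

```python
# Composing a gender-variant emoji as a ZWJ sequence:
# U+1F3C3 RUNNER + U+200D ZWJ + U+2640 FEMALE SIGN + U+FE0F VS-16
ZWJ = "\u200d"         # ZERO WIDTH JOINER: glues the pieces together
FEMALE = "\u2640"      # FEMALE SIGN
VS16 = "\ufe0f"        # VARIATION SELECTOR-16: request emoji presentation
RUNNER = "\U0001F3C3"  # RUNNER

woman_running = RUNNER + ZWJ + FEMALE + VS16

# Four codepoints in the string, but systems that support the sequence
# draw them as one glyph; unsupported systems fall back to separate glyphs.
print(len(woman_running))  # 4
```

This fallback behavior is why ZWJ sequences could ship as "new emoji" without new codepoints: older systems simply show the constituent characters side by side.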

Alas, Apple's emoji keyboard still doesn't have a search field. If emoji suggestions fail to bring up the emoji you're looking for, you'll still want to keep Gboard installed as a fast way to search for emoji.

Sound Design

In addition to visual tweaks, Apple did some work in the aural department as well.

The keyboard in iOS 10 has distinctive new "pop" sounds for different kinds of keys, including letters, the delete key, and the space bar.73 Some people will find these sounds distracting and cartoon-ish; I think they add an extra dimension to typing on the software keyboard. Because the keyboard has multiple layers of "popping bubbles", you can now hear what you type besides seeing it. I'm a fan.

It is the lock sound, though, that takes the crown for the most surprising sound effect of iOS 10. I still can't decide what it is, but I like it.

Sound design is often underrated. An intelligent use of sound effects can augment the visual experience with context, personality, and just the right amount of whimsy. Whoever is behind the sound-related changes in iOS 10, I want to hear more from them.

A State of Flux?

Apple is continuing to iterate on the design language they introduced two years ago, but they're doing so inconsistently across the system, experimenting with new ideas without fully committing to them. There are multiple design languages coexisting in iOS 10. At times, it's hard to reconcile them.

The most notable changes mentioned above – the bold look of Apple Music and the revised look of buttons – aren't new guidelines for a system-wide refresh. They're isolated test-drives scattered throughout the system without a common thread.

Music, News, and Home have little in common from a functional standpoint, and yet they share the same aesthetic. Does Apple consider these apps the baseline of iOS interfaces going forward? Or should we prepare for an increasingly diversified constellation of Apple apps, each built around a design specifically tailored for it? What types of apps should adopt the "big and bold" style? Should developers read the tea leaves in Apple's app redesigns this year and prepare for updated guidelines twelve months from now?

Taken at face value, what we have in iOS 10 is a collection of design refinements. We also have a clique of apps that look different from the rest of Apple's portfolio, which may portend future change.

Ultimately, we're left asking: where do we go from here?

Proactive

iOS' Proactive assistant, introduced last year as a set of suggested shortcuts for apps based on user habits and context, is expanding to locations and contacts in iOS 10, and gaining a foothold in the system keyboard.

If you're in an iMessage conversation and someone asks you for a contact's phone number or email address, iOS will automatically put that suggestion in the QuickType keyboard for one-tap insertion. It doesn't have to be a reply to an existing message: if you compose a new email and type "[Name]'s phone number is", QuickType will also proactively suggest the phone number from your address book.

Even more impressively, if someone asks "Where are you?" on iMessage, QuickType will show a button to send your current location. Tap it, and a Maps bubble will be sent; the other person can tap it to open a full-screen preview and get directions.74

Sharing your current location from QuickType in iMessage.

NSUserActivity plays a role in proactive suggestions, too. Apps can push out activities for places and have them appear as suggestions in other apps.

A Yelp suggestion in Maps.

A restaurant listing from Yelp, for example, can be suggested in Maps' search view automatically; an app that displays hotel reviews can mark the location the user is viewing, and if the user switches to a travel planning app, that address can be proactively suggested without the need to search for it again.

Recently viewed places can even be suggested as shortcuts when switching between apps to open directions in Maps.

Maps shortcuts for places viewed in third-party apps.

The system is an ingenious spin on NSUserActivity – a framework that developers were asked to start supporting last year for Spotlight search and Siri Reminders. By leveraging existing APIs and work developers have already put into their apps, iOS 10 can be smarter and use location-based activities as dynamic "bookmarks" in the system keyboard.

When these suggestions work, they're impressive and delightfully handy. In my tests, I received suggestions for addresses listed on webpages in Safari (and properly marked up with schema.org tags) and Yelp inside Maps; iOS 10 suggested addresses for stores and restaurants when I was switching between Yelp, Safari, Maps, and Messages, and it removed suggestions after I closed the webpages in Safari or the listings in Yelp.

I've found other QuickType suggestions to be more inconsistent. When chatting in English on iMessage, QuickType pre-filled address suggestions after trigger sentences such as "Let's meet at" or "We're going to", because I had been viewing a location in Maps or Yelp. I couldn't get the same suggestions for different phrases like "Let's have dinner at" or "See you in 10 minutes at".

I couldn't get proactive QuickType suggestions to work in Italian at all. This is an area where Apple's deep learning tech should understand how users share addresses and contact information with each other. I'd expect Proactive to gain more predictive capabilities down the road, such as Calendar or Apple Music integration.

There are more instances of Proactive suggestions in iOS 10 that are subtle, but useful. When searching in Spotlight, QuickType will offer suggestions for locations and other content as soon as you start typing. Previous searches are listed at the bottom of Siri suggestions (and I haven't found a way to disable them, which could be problematic).

Proactive shortcuts and previous searches in Spotlight.

If you're already looking at a location in Maps or in apps that mark up addresses correctly, you can invoke Siri and say "get me there" to open directions to the address you're viewing. ETA uses this feature to start directions to a place you're viewing in the app.

Opening directions from ETA with Siri.

It's no Google Now on Tap, but it's easy to see how Apple could soon replicate some of that functionality through various types of NSUserActivity.75

Apple is moving towards making Proactive more than a standalone page of shortcuts. Rather, Proactive is becoming an underlying feature of iOS, connecting an invisible web of activities when and where they make the most sense.

Keyboards

While last year's software keyboard improvements focused on iPad productivity, iOS 10 brings pleasant enhancements that will benefit every iOS user.

Multilingual Keyboard

The most unexpected change in iOS 10 will be as important as copy & paste for millions of international users. iOS 10 adds support for multilingual typing without switching between keyboards.

The annoyance of alternating keyboards isn't an issue everyone can relate to. Most single-language speakers only deal with emoji as a separate "keyboard" that requires switching from the QWERTY layout. Those users probably don't even see emoji as an additional keyboard but just as a special mode of the main system one. Millions of people have never seen iOS' old keyboard system as a problem.

How most English speakers deal with the system keyboard.

For speakers of multiple languages, the experience couldn't be more different. As soon as a third keyboard is added to iOS, the emoji face turns into a globe button to switch between keyboards. Tapping it repeatedly cycles between all keyboards; alternatively, holding the globe button brings up a list of them.

How international users switch between keyboards.

Anyone who uses an iOS device to hold conversations in multiple languages is subject to a slower experience. When you're constantly jumping between iMessage conversations, Twitter replies, Facebook, email, Slack, and Notes, and when you're staying in touch with friends in multiple languages, and after you've been doing it every day for years, those seconds spent cycling through keyboards add up. Millions of people see this as one of the biggest flaws of iOS.

In iOS 10, Apple is taking the first steps to build a better solution: you can now type in multiple languages from one keyboard without having to switch between international layouts. You don't even have to keep multiple keyboards installed: to type in English and French, leaving the English one enabled will suffice. Multilingual typing appears to be limited to selected keyboards, but it works as advertised, and it's fantastic.

The idea is simple enough: iOS 10 understands the language you're typing in and adjusts auto-correct and QuickType predictions on the fly from the same keyboard. Multilingual typing supports two languages at once; it doesn't work with dictation, but it can suggest emoji in the QuickType bar for both languages.

Switching between English and Italian from the English keyboard.

I started testing multilingual typing on my iPhone and iPad Pro on the first beta of iOS. The best part is that there's very little to explain: suggestions retain the predictive nature of QuickType based on the context of the app or conversation, and you can even switch between languages within the same sentence. There's no training or configuration involved: it's as if two keyboards were rolled into one and gained the dynamic context-switching of a multilingual person.

Knowing which languages can work with multilingual typing is a different discussion. Apple hasn't updated their iOS feature availability page with details on multilingual typing yet. My understanding is that only keyboards with support for QuickType predictive suggestions and with a traditional QWERTY layout support multilingual typing. You should be able to mix and match Italian and English, or Dutch and French, or German and Spanish, for instance, but not Chinese and English within the same keyboard due to differences in the alphabet and characters.

I've been having conversations with my family in Italian while talking to colleagues and readers in English. I'm impressed with iOS 10's ability to detect languages on a word-by-word basis. I assumed the system could be confused easily, particularly with typos or words that are similar between two languages, but that only happened a couple of times over three months. Switching mid-sentence between Italian and English (as I often do when talking about work stuff with my girlfriend, for example) is fast and accurate.
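To make the word-by-word idea concrete, here's a toy sketch (Python, entirely illustrative; the word lists are made up and this is not Apple's implementation): each word is matched against two small dictionaries, ambiguous or unknown words inherit the last detected language, and the active language would in turn drive which auto-correct dictionary is consulted.

```python
# Toy word-by-word language guesser. Real systems use far richer models;
# this only shows how the active language can flip mid-sentence.
ENGLISH = {"the", "meet", "at", "dinner", "tonight", "ok"}
ITALIAN = {"ci", "vediamo", "stasera", "a", "cena", "va", "bene"}

def detect_languages(sentence, default="en"):
    current = default
    langs = []
    for word in sentence.lower().split():
        if word in ITALIAN and word not in ENGLISH:
            current = "it"
        elif word in ENGLISH and word not in ITALIAN:
            current = "en"
        # ambiguous/unknown words keep the current language
        langs.append(current)
    return langs

# A mid-sentence switch, as when mixing English and Italian:
print(detect_languages("ok ci vediamo a cena"))
# → ['en', 'it', 'it', 'it', 'it']
```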

Most new iOS features take some time to get used to; multilingual typing isn't one of them. After years spent fighting the keyboard switcher and auto-correct with multiple languages, multilingual typing is a huge relief. It's an elegant solution to a difficult problem, and it makes conversations flow naturally. I'm happy to see Apple catering to users who speak multiple languages with a feature that others will never (understandably) care about.

Multilingual typing has already become an essential feature of my iOS experience. I love it.

Emoji Suggestions

Apple's improvements to typing and QuickType don't stop at text and Proactive – they include emoji as well.

Emoji suggestions in QuickType. (Tap for full size)

If you've typed a word or expression that iOS 10 associates with an emoji, such as "pizza" or "not sure", a suggested emoji will appear in QuickType. You can either put the emoji next to the word you've typed (by putting a space after the word and then tapping the emoji) or replace the word with the emoji itself (don't add a space and tap the emoji). If emoji suggestions don't immediately appear in an app, try inserting at least 5 emoji from the emoji keyboard first.76

In my tests, emoji suggestions have been good, often impressive. I've received emoji suggestions in both English and Italian, for a variety of common expressions (like "yum", "love you", or "I'm fine") and with up to three suggestions for a single word (such as "lol"). Popular emoji like the thumbs up/down, clapping hands, and high five can be suggested if you know the trigger word/expression. From this point of view, emoji suggestions are visual text replacements – for instance, I now type "great" and replace the word whenever I want to insert a thumbs up in a message.
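Since emoji suggestions behave like visual text replacements, the mechanics boil down to a lookup table keyed by trigger words. A minimal Python sketch (the table below is a toy; the trigger words come from the examples above, but the exact emoji Apple maps to each word are my guesses):

```python
# Toy emoji-suggestion table; a word can map to several candidates,
# as "lol" does in iOS 10 (up to three suggestions per word).
SUGGESTIONS = {
    "pizza": ["🍕"],
    "great": ["👍"],
    "lol": ["😂", "🤣", "😆"],
}

def suggest(word):
    return SUGGESTIONS.get(word.lower(), [])

def replace_with_emoji(word, choice=0):
    # Replacing the word mimics tapping a suggestion with no trailing space.
    candidates = suggest(word)
    return candidates[choice] if candidates else word

assert replace_with_emoji("pizza") == "🍕"
assert replace_with_emoji("hello") == "hello"   # no suggestion: keep the word
```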

However, because the predictive engine is young and there are so many different ways to describe an emoji, the dictionary is still growing. Italian doesn't support as many suggestions as English ("think", for instance, brings up the Thought Balloon emoji in English; the Italian equivalent, "penso", doesn't – but the infinitive form, "pensare", does); some expressions don't show an obvious emoji suggestion (try with "blue heart" or "friends").77

According to Apple, their Differential Privacy technology will be used to understand how iOS users type emoji. Hopefully, such a system can learn and improve its emoji definitions over time as it looks at how people in aggregate use emoji in the real world. If it works, it's going to make one of the best tweaks to iOS even better.

Custom Keyboards

Despite the creativity shown by developers, third-party keyboards haven't received much love from Apple since their debut in iOS 8. Even without meaningful improvements to the API, two small adjustments in iOS 10 make using custom keyboards slightly better than iOS 9.

The first change sounds like an Apple engineer remembered a bug and found the time to fix it. In iOS 10, custom keyboards transition on screen with the same bottom-up slide as Apple's keyboard. Thanks to this, opening a custom keyboard isn't as jarring as before.

Furthermore, iOS 10 lets third-party keyboards display the system keyboard switcher (the globe key menu) with the same options you get in the Apple keyboard.78

Gboard (left) with a custom keyboard switcher; TextExpander updated for iOS 10 (right) with the new system one.

I still don't think Apple is particularly invested in the idea of custom keyboards (the lack of any new features is telling), but at least they've done the bare minimum to ensure that a third-party keyboard can be used as a primary one without too much struggle. Apple must have recognized the value of some custom keyboards for accessibility purposes, languages iOS doesn't support, and sharing features for messaging apps that aren't iMessage.

That the likes of Google and Microsoft benefit from these improvements is the kind of trade-off Apple will have to accept as they keep opening up iOS for everyone.

The iPad

iPad users who were craving the same attention the device received last year will be disappointed by iOS 10's scarcity of iPad-only features. There are some iPad changes, but none of them have the impact of Split View or Picture in Picture.

As mentioned before, there are new three-panel modes for Mail and Notes, a Now Playing sidebar in Apple Music, and in-app split view for Safari. There's also a different look for alarms in the Clock app. Everything else is a basic adaptation of iPhone layouts or a refinement of the same views in iOS 9.

Apple brought a few tweaks to the Camera viewfinder in iOS 10. On the iPhone, the camera flip button has been moved to the bottom, which makes it easier to switch between the rear and front-facing cameras, as you don't have to reach to the top of the screen. On the iPad, most of the interface has been redrawn with circular buttons on the right and a persistent zoom slider on the left.

The bigger iPad-only interface changes in iOS 10 can be collected in a single gallery:

Moving on to other features, Spotlight search invoked from an external keyboard with Command-Space will now open on top of the app(s) you're currently using without exiting back to the Home screen. When in Split View, this can be used as a quicker app switcher for the primary app on the left side.

Spotlight now opens modally on top of the apps you're using.

It's nice to use a Spotlight that behaves more like the Mac. Unfortunately, apps (including Apple's Messages, Mail, and Notes) don't restore cursor position after dismissing Spotlight. If you're typing a message, open Spotlight, and then close it, you'll have to tap the screen to focus the cursor on the last active app and continue typing.

There are more external keyboard enhancements that are steps in the right direction. A Home screen icon has been added to the Command-Tab app switcher, so you can return to the Home screen without having to use Command-H. And, Command-tilde (~) can now move backwards in the app switcher, like on macOS.79 Last, you can take a screenshot with Command-Shift-3, which will be saved in the Photos app.

A Home screen shortcut has been added to the Command-Tab app switcher.

I'd be remiss if I didn't mention Playgrounds. Apple hasn't brought a full Xcode suite to the iPad, but the more streamlined Playgrounds environment feels like a better solution to introduce a new generation of iOS users to programming. Playgrounds isn't a built-in app (it's available from the App Store), and it's got some surprising innovations in terms of code interactions and in-app multitasking. It's also more powerful than you imagine if you know your way around Swift and native iOS frameworks. We'll have a separate story on Playgrounds later this week.

On Hold

The lack of deeper iPad improvements in iOS 10 amplifies problems Apple still hasn't fixed.

On the 12.9-inch iPad Pro, the Home screen is still a wasteland of icons that don't take advantage of the space offered to them. This year, the contrast is especially harsh given how iPhones with 3D Touch have received Home screen widgets in addition to quick actions.

Managing multiple files at once across different apps is still a task that will test the endurance of the most patient users. The Open In menu, untouched in iOS 10, continues to be limited to moving one file at a time from one app to another. The new 'Add to iCloud Drive' extension doesn't help when even a basic task such as saving multiple attachments from an email message isn't supported.

More importantly, it's obvious that Split View could be so much more. Having the clipboard and extensions as the sole data sharing mechanisms between two apps feels too limited when iOS is clearly suited for a system drag & drop framework. And that's not to mention the Slide Over app picker – unchanged from last year and in desperate need of a redesign.

Apple says that "there are great iPad features" in iOS 10, but that's not accurate. There are great iOS features in this update, and, sure, they also work on the iPad, but the iPad-only changes are minor and sparse – with the sole exception of Safari. iOS 10 doesn't share the same commitment to the iPad as iOS 9, when Apple was willing to reinvent the device's most fundamental aspects. In many ways, this feels like a regression to the days of iOS being barely "optimized" for the iPad.

iOS 10 is by no means "bad" on the iPad; it's just not particularly exciting or what the platform deserves right now. If Apple is planning their own tick-tock schedule for iOS releases going forward, the iPad's tock had better be a good one.

More Extensions

Following last year's focus on iPad, built-in apps, and performance, iOS 10 marks Apple's return to opening up the platform to developers with extensions. After Messages, Maps, and Siri, iOS 10 has got a few more extensibility tricks up its sleeve that are also significant.

Markup

After its debut in Mail with iOS 9, Apple's Preview-like annotation tool has graduated to a system extension for images and documents.

Using Markup in Photos.

The tools available in Markup haven't changed. You can draw colored lines of varying thickness80, add magnification loupes, and place text annotations. Notably, Markup can be used in Photos as an editing extension; it doesn't offer the advanced tools of Annotable, but it should be enough for most users.

Add to iCloud Drive

Following iOS 9's inconsistent use of an iCloud Drive extension (which was only available for attachments in Mail), iOS 10 makes "Add to iCloud Drive" a system-wide option that can be used anywhere, for any file.

Add to iCloud Drive is an action extension that copies a file passed to it into iCloud Drive. It works for individual files shared from apps as well as media from Photos.

Unfortunately, the extension is hindered by questionable design decisions. When saving a file, the dialog box shows every folder and sub-folder in your iCloud Drive without a way to collapse them. There's no quick way to open a specific destination: you'll have to scroll a seemingly endless list of folders every time you want to save a file. There are no recent locations, no bookmarks, no search. No person who deals with documents on iOS would ever want to save them with an interface like this.

I appreciate Apple making iCloud Drive a system extension, but its design is amateur hour. It makes me wonder if anyone at Apple has ever used iCloud Drive with more than a handful of folders. It's such an obvious misstep, it almost looks like a joke.

VoIP Apps and CallKit

Apple is granting third-party developers access to another part of the OS through extensions: telephony.

For years, VoIP apps for audio and video calling have been relegated to a second-class experience. Apple created an API six years ago to bless VoIP apps with background execution privileges, but without a framework to integrate calls with the rest of the system, apps still needed to maintain their own contact lists and use standard push notifications for incoming calls.

iOS' old VoIP calling experience.

It was too easy to miss a call from apps like Skype or WhatsApp; accepting a call from a third-party app was also slow and confusing (why would you pick up a call from a banner alert?). Plus, developers couldn't get access to functionalities such as blocked contacts, which remained exclusive to Apple's Phone app.

All this is changing with CallKit, a framework that elevates third-party VoIP apps to a front-seat spot on iOS, allowing them to plug into advanced controls that have complemented Apple's Phone and FaceTime services for years.

The CallKit framework permits an incoming call from a third-party VoIP app to take over everything else (including the Lock screen) with a full-screen view, just like Apple's Phone and FaceTime apps. In a prime example of dogfooding, Apple itself has adopted CallKit in all of their telephony services.

CallKit's interface and behavior are consistent with Phone and FaceTime calls on iOS, with some differences. The calling UI is the same as Apple's, with a label that describes which app the call is happening with, and the icon of the app replacing the dialer button. Tapping the icon takes users directly to the app for additional features. Developers can customize the in-call UI with a camera icon that indicates whether an app supports video calling or not.

Like Phone and FaceTime, CallKit boosts the priority of third-party VoIP apps. Other apps can't interrupt a call during a CallKit session; routing for Accessibility features, CarPlay, and Bluetooth connections is handled by the system automatically without developers having to optimize for them.

A demo CallKit app on iOS 10.

CallKit's integration with iOS' calling infrastructure goes beyond a shared UI. VoIP apps built with CallKit get access to the same block list and Do Not Disturb settings used by Apple's apps, they can support switching between multiple calls, and they can even appear in Contacts via the Recents and Favorites views.

Apple doesn't seem to be religious about pushing users to FaceTime anymore. If iOS 10 sees that the same contact is also registered with other VoIP services, buttons to initiate calls through third-party apps will be embedded in the contact card.81 Users only need to give an app permission to be used as a Service Provider, and it'll be promoted to a first-class calling experience by iOS 10.82

Apple's embrace of third-party services with CallKit isn't an admission of defeat. Rather, it's a recognition of the fact that millions of people use iPhones to communicate with their friends and families through apps that aren't FaceTime – that the App Store has reinvented communications beyond FaceTime and iMessage.

As platform owners, Apple understands that they have to help customers who are seeking alternative calling services. With CallKit, they've created a secure and consistent framework that takes advantage of every feature that makes an iPhone the ultimate communication device.

Using VoIP apps through CallKit feels and works like any other normal phone call. It's refreshing to see this happen, and it's a testament to the power of Apple's extensibility APIs. I'm looking forward to seeing WhatsApp, Skype, and others update their apps for CallKit.

Call Directory

Call Directory is a surprising inclusion in the CallKit framework. With this extension type, apps can label phone numbers for incoming calls on the Lock screen.

Apple described the use case for call directory extensions at WWDC: spam calls. According to the company, robo-callers and spam calls are particularly problematic in China (though I can vouch for their annoyance in Italy, too), and they've set out to address this problem by letting developers maintain a database of phone numbers known to be spam.

Craig will tolerate no spam.

In Apple's examples, a company like Tencent could build a call directory extension. When a call from a spam number comes in, the extension could add a label that identifies it as potential spam so the user can decide to reject the call without answering it.
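The core of such an extension is just a lookup against a sorted directory of numbers at call time. A toy sketch (Python, illustrative only; the numbers and labels are fabricated, and this says nothing about how a real extension is packaged):

```python
import bisect

# Toy call directory: a sorted list of (number, label) pairs maintained
# by the app, queried when a call comes in. All numbers are fabricated.
SPAM_NUMBERS = [
    (14045550100, "Telemarketing"),
    (18005550123, "Suspected robo-call"),
    (39065550177, "Potential spam"),
]

def label_for(number):
    keys = [n for n, _ in SPAM_NUMBERS]
    i = bisect.bisect_left(keys, number)
    if i < len(keys) and keys[i] == number:
        return SPAM_NUMBERS[i][1]
    return None  # unknown caller: show the number as usual

assert label_for(18005550123) == "Suspected robo-call"
assert label_for(15555550000) is None
```

Keeping the directory sorted lets the lookup stay fast even with a large database, which matters when the check has to happen in the moment between the call arriving and the Lock screen lighting up.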

Call Directory is another instance of Apple letting developers take over key bits of iOS in areas where the company doesn't want to be involved.

Everything Else

With the breadth and depth of iOS, it's impossible to list every single change or new feature. Whether it's a setting, a briefly documented API, or a subtle visual update, there are plenty of details and tidbits in iOS 10.

Differential Privacy

Differential privacy is a branch of cryptography and mathematics, so I want to leave a proper discussion of Apple's application of it to folks who are better equipped to talk about it (Apple is supposed to publish a paper on the subject in the near future). See this great explanation by Matthew Green and 'The Algorithmic Foundations of Differential Privacy' (PDF link), published by Cynthia Dwork and Aaron Roth.

Here's my attempt to offer a layman's interpretation of differential privacy: it's a way to collect user data at scale without personally identifying any individual. Differential privacy, used in conjunction with machine learning, can help software spot patterns and trends while also ensuring privacy with a system that goes beyond anonymization of users. It can't be mathematically reversed. iOS 10 uses differential privacy in specific ways; ideally, the goal is to apply this technique to more data-based features to make iOS smarter.

From Apple's explanation of differential privacy:

Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.

If Apple's approach works, iOS will be able to offer more intelligent suggestions at scale without storing identifiable information for individual users. Differential privacy has the potential to give Apple a unique edge on services and data collection. Let's wait and see how it'll play out.
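The classic "randomized response" scheme gives a flavor of how mathematical noise protects individuals while aggregate patterns stay recoverable. This is a textbook illustration, not Apple's actual algorithm:

```python
import random

# Randomized response: each user reports their true bit with probability
# 1/2; otherwise they report a fair coin flip. Any single report is
# deniable, yet the population rate can still be estimated.
def report(true_bit, rng):
    if rng.random() < 0.5:
        return true_bit
    return rng.random() < 0.5

def estimate_rate(reports):
    # E[report] = 0.5 * p + 0.25, so invert the bias:
    observed = sum(reports) / len(reports)
    return 2 * observed - 0.5

rng = random.Random(42)
true_rate = 0.3
reports = [report(rng.random() < true_rate, rng) for _ in range(100_000)]
print(round(estimate_rate(reports), 1))  # prints 0.3
```

No individual report reveals whether that user's true bit was set, but with enough users the collector recovers the overall rate to within sampling error – the same trade-off Apple describes, at a much smaller scale.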

Speech recognition

iOS has offered transcription of spoken commands with a dictation button in the keyboard since the iPhone 4S and iOS 5. According to Apple, a third of all dictation requests comes from apps, with over 65,000 apps using dictation services per day for the 50 languages and dialects iOS supports.

iOS 10 introduces a new API for continuous speech recognition that enables developers to build apps that can recognize human speech and transcribe it to text. The speech recognition API has been designed for those times when apps don't want to present a keyboard to start dictation, giving developers more control.

Speech recognition uses the same underlying technology as Siri and dictation. Unlike dictation in the keyboard, though, speech recognition also works for recorded audio files stored locally in addition to live audio. After feeding audio to the API, developers are given rich transcriptions that include alternative interpretations, confidence levels, and timing details. None of this is exposed to the microphone button in the keyboard, and it can be implemented natively in an app's UI.

Same API, different interfaces.

There are some limitations to keep in mind. Speech recognition is free, but not unlimited. There's a limit of 1 minute for audio recordings (roughly the same as dictation) with per-device and per-day recognition limits that may result in throttling. Also, speech recognition usually requires an Internet connection. On newer devices (including the iPhone 6s), speech recognition is supported offline, too. User permission will always be required to enable speech recognition and allow apps to transcribe audio. Apple itself is likely using the API in their new voicemail transcription feature available in the Phone app.

Voicemail transcription in iOS 10, possibly using speech recognition as well.

I was able to test speech recognition with new versions of Drafts and Just Press Record for iOS 10. In Drafts, my iPhone 6s supported offline speech recognition and transcription was nearly instantaneous – words appeared on screen a fraction of a second after I spoke them. Greg Pierce has built a custom UI for audio transcription inside the app; other developers will be able to design their own and implement the API as they see fit. In Just Press Record, transcripts aren't displayed in real-time as you speak – they're generated after an audio file has been saved, and they are embedded in the audio player UI.

I'm looking forward to podcast clients that will let me share an automatically generated quote from an episode I'm listening to.

Do Not Disturb gets smarter

Do Not Disturb has a setting to always allow phone calls from everyone while every other notification is being muted.

An Emergency Bypass toggle has been added to a contact's editing screen for Ringtone and Text Tone. When enabled, it'll allow sounds and vibrations from that person even when Do Not Disturb is on. If you enable Emergency Bypass, it'll be listed as a blue button in the contact card to quickly edit it again.

Tap and hold links for share sheet

Apple is taking a page from Airmail (as I hoped) to let you tap & hold a link and share it with extensions – a much-needed time saver.

Parked car

I couldn't test this because I don't have a car with a Bluetooth system (yet), but iOS 10 adds a proactive Maps feature that saves the location of your car as soon as it's parked. iOS sends you a notification after you disconnect from your car's Bluetooth, dropping a special pin in Maps to remind you where you parked. The feature also works with CarPlay systems.

Spotlight search continuation

Searches for app content that began in Spotlight can now continue inside an app with the tap of a button.

Drafts uses the new Spotlight search continuation API to let users continue looking for content on the app's own search page. Maps has also implemented search continuation to load places in the app.

Better clipboard detection

iOS 10 brings a more efficient way for apps to query the system pasteboard. Instead of reading clipboard data, developers can now check whether specific data types are stored in the pasteboard without actually reading them.

For example, a text editor can ask iOS 10 if the clipboard contains text before offering to import a text clipping; if it doesn't, the app can stop the task before reading the pasteboard altogether. This API should help make clipboard data detection more accurate for a lot of apps, and it's more respectful of a user's privacy.
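The pattern is easy to model: expose a cheap type check next to the actual read, so apps only pay the privacy (and performance) cost of a read when the check succeeds. A toy sketch in Python (illustrative only; the type identifiers mimic UTI-style strings, and the class is not Apple's API):

```python
# Toy pasteboard: apps can ask *whether* a data type is present
# without reading the contents; only read() touches the data.
class Pasteboard:
    def __init__(self):
        self._items = {}          # type identifier -> data
        self.reads = 0            # counts actual content reads

    def set(self, type_id, data):
        self._items = {type_id: data}

    def has_type(self, type_id):  # cheap check, no content access
        return type_id in self._items

    def read(self, type_id):      # full read, tracked
        self.reads += 1
        return self._items.get(type_id)

pb = Pasteboard()
pb.set("public.image", b"\x89PNG...")
# A text editor checks for text first and never touches the image data:
if pb.has_type("public.plain-text"):
    clipping = pb.read("public.plain-text")
assert pb.reads == 0
```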

Print to PDF anywhere

A hidden feature of iOS 9 was the ability to 3D Touch the print preview screen to pop into a PDF version of a document and export it. iOS 10 makes this available on every device (with or without 3D Touch): pinch open the print preview to view it in Quick Look.

Videos cellular playback quality settings

If you use Apple's Videos app to stream movies and TV shows, you can now choose between Good and Best Available quality settings for cellular playback. I wish this also affected the playback quality of YouTube embeds in Safari.

HLS and fragmented MP4 files

Apple's HTTP Live Streaming framework (HLS) has added support for fragmented MP4 files. In practical terms, this means more flexibility for developers of video player apps that want to stream movie files encoded in MPEG-4.

I tested a version of ProTube – the most powerful third-party YouTube client – with HLS optimizations for iOS 10. The upcoming update to ProTube will introduce streaming of videos up to 4K resolution (including 1440p) and 60fps playback thanks to changes in the HLS API.

If your favorite video apps use HLS and deal with MP4 files, expect to see some nice changes in iOS 10.

Touch ID for Apple ID settings

Settings > iTunes & App Store > View Apple ID no longer requires you to type a password. You can view and manage your account with Touch ID authentication. This one deserves a finally.

No more App Store password prompts after rebooting

In a similar vein, the App Store will no longer ask you for a password to download a new app after rebooting your device. You can just use Touch ID instead.

Continuity Keyboard for Apple TV

If your iPhone is paired with an Apple TV, you'll get a notification whenever the Apple TV brings up a text field (such as search on the tvOS App Store).

You can press (or swipe down) the notification on iOS to start typing in the quick reply box and send text directly to tvOS. A clever and effective way to reduce tvOS keyboard-induced stress.

App Store categories, iPad, and search ads

The App Store's Explore section, launched with iOS 8 and mostly untouched since, has been discontinued in iOS 10. Categories are back in the tab bar, with the most popular ones (you can count on Games always being there) available as shortcuts at the top.

Apple had to sacrifice the Nearby view for discovering apps popular around you, but categories (each with curated sections) seem like the most sensible choice after years of experimentation.

On the iPad, the App Store now supports Split View so you can browse and search apps while working in another app.

This has saved me a few minutes every week when preparing the App Debuts section for MacStories Weekly.

Apple is also launching paid search ads on the App Store. Developers will be able to bid for certain keywords and buy paid placements in search results. Ads are highlighted with a subtle blue background and an 'Ad' label, and they're listed before the first actual search result – like on Google search.

App Store ads. I'm not sure about these.

It's too early to tell how beneficial App Store ads will be for smaller studios and indie developers that can't afford to be big spenders in search ad bids. Apple argues that the system is aimed at helping app discovery for large companies and small development shops alike, but I have some reservations.

As a user, I would have liked to see Apple focus on more practical improvements to App Store search, but maybe the company is right and all kinds of developers will benefit from search ads. We'll follow up on this.

New 'Add to Favorites' UI

Similar to the 3D Touch menu for a contact card, the view for adding a contact to your favorites has been redesigned with icons and expandable menus.

More responsive collection views

Expect to see nice performance improvements in apps that use UICollectionView. iOS 10 introduces a new cell lifecycle that pre-fetches cells before displaying them to the user, holding onto them a little longer (pre-fetching is opt-out and automatically disabled when the user scrolls very fast). In daily usage, you should notice that some apps feel more responsive and don't drop frames while scrolling anymore.
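For the curious, adopting the new behavior is a matter of implementing the `UICollectionViewDataSourcePrefetching` protocol. Here's a rough Swift sketch, where `ImageLoader` is a hypothetical async image cache:

```swift
import UIKit

// Hypothetical async image cache used by the cells.
class ImageLoader {
    func startLoading(at indexPath: IndexPath) { /* begin download */ }
    func cancelLoading(at indexPath: IndexPath) { /* abort download */ }
}

class FeedViewController: UIViewController, UICollectionViewDataSourcePrefetching {
    let imageLoader = ImageLoader()

    // UIKit calls this ahead of the scroll position, so expensive
    // work (e.g. image downloads) can start before cells appear.
    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        indexPaths.forEach { imageLoader.startLoading(at: $0) }
    }

    // Called when items are no longer expected to appear,
    // so the app can avoid wasted work.
    func collectionView(_ collectionView: UICollectionView,
                        cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        indexPaths.forEach { imageLoader.cancelLoading(at: $0) }
    }
}
// Enabled by assigning: collectionView.prefetchDataSource = self
```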

Security recommendations for Wi-Fi networks, connection status

If you connect to a public Wi-Fi network (such as a restaurant hotspot), iOS 10 will show you recommendations to stay secure and keep your wireless traffic safe. There's also better detection of poor connectivity with an orange "No Internet Connection" message in the Wi-Fi settings.

Accessibility: Magnifier and Color Filters

There are dozens of Accessibility features added to iOS every year. I want to highlight three of them.

A new Magnifier app allows you to use the iPhone's camera to magnify what's around you and zoom into objects or text. The Magnifier isn't another Apple app on the Home screen: if enabled in the Settings, a triple-click on the Home button will launch Magnifier as a custom app (it even shows up in the multitasking switcher) with options to control zoom level, color filters, color inversion, and turn on the camera flash. You can opt to adjust brightness and contrast automatically based on ambient light.

iOS 10's new Magnifier app.

While in Magnifier, you can move the camera around and apply filters in real-time. If you don't want to hold up your iPhone for more than a few seconds, you can capture a still frame to zoom into the image and adjust colors.

iOS 10's Magnifier is technically impressive, and it's going to help millions of people with vision impairments. I'd suggest everyone keep it enabled as a quick way to use the iPhone's camera as a magnifier – it's incredibly well done and convenient.

Under Display Accommodations, a Color Filters menu can help users who are color blind or have difficulty reading text on the display. Apple has included filters for grayscale, protanopia, deuteranopia, tritanopia, and color tint. It's also a good reminder for developers that not all users see an app's interface the same way.

Finally, you can now define custom pronunciations to be used when iOS reads text aloud. Available in Settings > Accessibility > Speech > Pronunciations, you'll be able to type a phrase and dictate or spell how you want it to be pronounced by the system voice.

Dictating a pronunciation is remarkable as iOS automatically inserts it with the phonetic alphabet after recognizing your voice. You can then choose to apply a custom pronunciation to selected languages, ignore case, and pick which apps need to support it.

10

iOS 10 is characterized by an intrinsic duality: an acknowledgement of the platform's maturity; and a relentless, yet disciplined pursuit of what's next. Both depend on each other, and they're the lens through which iOS 10 is best explained.

The iMessage App Store, SiriKit, rich notifications, CallKit, and Maps extensions are a display of Apple's willingness to let apps be more than disconnected silos. iOS 10 is continuing what iOS 8 started: third-party apps are becoming system features.

It's not just a matter of nurturing developer goodwill: the App Store ecosystem can be leveraged to increase the functionality of iOS, building features that appeal to how people want to use their iPhones and iPads. For Apple, such effort is a nod to the App Store's strengths and progress. For developers and users, it means apps can have ramifications in the most important parts of iOS.

At the same time, allowing apps to reach further into iOS shows how the concept of "app" itself is evolving.

When different features of an app can be experienced throughout the system, the app becomes more of a collection of services, broken into atomic units. They're pervasive. Providing apps with more extensibility hooks results in moving more interactions away from the traditional app experience and into single-purpose mini interfaces. Whether it's an interactive notification, a widget, an iMessage app, or a SiriKit extension, iOS 10 has a clear vision of apps as contextual helpers in addition to being standalone utilities. It's only reasonable to expect Apple to follow this path going forward.

Signs of maturity include fixing what isn't working, too. The redesigned Apple Music makes the case for a simplified streaming interface that addresses what many found confusing in its debut release. The pagination of Control Center is a welcome enhancement to its capabilities as much as it's an admission of its original complexity. I'd argue that letting users remove Apple apps falls under the same category.

Alas, not every glaring problem has been remedied by iOS 10. File management continues to feel like a chore due to cumbersome document providers, and Apple managed to ship an incomprehensible iCloud Drive extension that doesn't help at all. Mail is lagging behind a competition that is shipping useful integrations and modernized email features. The Slide Over app picker – one of the worst design decisions of iOS 9 – is still with us.

The most disappointing aspect of iOS 10, in fact, is the treatment the iPad received, with uninspired adaptations of iPhone UIs and a lack of attention that's in stark contrast with last year. In iOS 10, the iPad feels like a second-class citizen again, left in the backseat, waiting for resources to be devoted to it. Perhaps all this will be resolved once Apple's plans for iPad updates are revealed, but we can't know yet. Today, iOS 10 isn't the big milestone for iPad users that iOS 9 was.

An acceptance of iOS' grown-up status – and the responsibility that comes with it – isn't the sole driver of its advancements. iOS 10 demonstrates how, at a fundamental level, change is the only constant in Apple's software. Ironically, the company's approach to change is what hasn't changed at all: it's iterative, divisive, farsighted, often surprising, and, frankly, never boring.

Looking at iOS 10's features in isolation, we can spot every shade of change that has steered Apple so far. The need to make iMessage a platform and rethink Control Center. The patient expansion of the extensibility framework, done gradually – some might say too slowly – to ensure good performance and security. The first steps towards AI as a feature of our devices, built in a unique Apple way around privacy and laying the groundwork for the future.

But these changes are more than discrete improvements. They're no islands. As the tenth anniversary of the iPhone and its software draws closer, it's time we take a holistic view of what iOS has become. iOS' changes are simply a reflection of our own changes – whether it's how much time we spend messaging with friends, how many pictures we take, or the sensors we put in our homes. The memories we cherish, our conversations, the songs we listen to.

Apple understands that, beyond technology, to improve iOS is to recognize how much our lifestyles have changed. Software, after all, is nothing but an extension of ourselves. From that perspective, iOS is never quite finished – it can only be relevant.

And even at its tenth version, iOS is still forging ahead.

Credits

This review wouldn't have been possible without the help, feedback, and existence of the following people, animals, beverages, and pieces of software:

  • My girlfriend Silvia, for her patience, love, and design skills
  • My two dogs, who are adorable
  • Alessandro Vendruscolo, who squashed many bugs and brought this web layout to life
  • John Voorhees
  • Graham Spencer
  • Brett Terpstra
  • Myke Hurley
  • Stephen Hackett
  • Frank Towers, who created 1-2-3 Trip Planner (don't tell developer Myke Hurley, though)
  • CGP Grey
  • Jeremy Burge
  • _David Smith
  • Casey Liss
  • John Gruber
  • Diego Petrucci
  • Workflow, Pythonista, Scrivener, iThoughts, and Editorial – essential apps that helped me create this story
  • Sketch and Meng To's Angle Mockups
  • Every app developer who sent me betas
  • Every engineer at Apple who always makes reviewing iOS each summer fun
  • @TiccisEspresso, for a daily dose of energy
  • Every Club MacStories member
  • And finally, every MacStories reader, for allowing me to do what I love. Thank you.

  1. For instance, deleting a Mail message from a notification on the Lock screen requires the user to authenticate with Touch ID or passcode. ↩︎
  2. It has a thicker font in iOS 10, a subtle but noticeable change from iOS 9. ↩︎
  3. A detail you can't miss: swiping up on the Search screen will make the clock move to the status bar, next to the padlock. Simple and tasteful. ↩︎
  4. There is one technical aspect I wish Apple handled differently. There's no way for apps to request a temporary exception to programmatically expand a widget when it would be appropriate, collapsing it again when a task is finished.

    As an example, consider the Workflow widget and running a workflow from compact mode. If the workflow needs to display a longer list of items while it's executing, it won't be possible for the app to ask iOS to temporarily expand the widget until the workflow is complete, reverting to compact at the end. The user will have to tap a workflow, notice that the list is being cut off by compact mode, and manually toggle expanded mode. As of iOS 10.0, compact and expanded modes aren't dynamic, and I think Apple could add some flexibility to the API without taking control away from the user. ↩︎
  5. On iPads, older iPhones, and if you have 3D Touch disabled in Settings, notification banners have a "handle" to suggest you can drag them downwards to expand them. ↩︎
  6. In iOS 10, an action can also invoke the system keyboard to type a response. ↩︎
  7. Except Messages notifications (where the conversation transcript can be scrolled) or the play button of videos embedded in notifications. ↩︎
  8. Some actions, such as replying with media or opening an iMessage app, require the notification to launch the Messages app. In the API, apps can specify which actions can be performed from a notification, and which ones need to be managed from the main app. ↩︎
  9. Rotation Lock is the only toggle that doesn't carry the color of the screen in Settings where it belongs, but it looks nice and stands out nevertheless. ↩︎
  10. If no audio is playing, Control Center shows a button for the last app that played audio. ↩︎
  11. Interestingly, Apple isn't using 3D Touch to expand accessories into detail views. A long tap on a button triggers haptic feedback and pops into a view, but you can't apply multiple levels of force to watch the button expand and shrink (like you can in the first page of Control Center). It's not the only case of Apple coupling haptic feedback (once exclusive to 3D Touch) with long taps in iOS 10, though. ↩︎
  12. Our original iMessage review mentions BBM as a competing service. That's a long time ago. ↩︎
  13. You can even hold a photo, drag it around, and drop it in a conversation to send it. ↩︎
  14. In which case, the link preview will be smaller and only display the domain name. ↩︎
  15. It's kind of harsh on the iPad as lasers don't affect the message list, which remains white. ↩︎
  16. The only way I've found to disable screen effects is to enable Reduce Motion, which unfortunately deactivates other animations throughout the system (including bubble effects). When Reduce Motion is turned on and someone iMessages you with an effect, you'll get a separate text message saying "Sent with [effect name]". My friend Stephen does this ironically with fake effect names sometimes. ↩︎
  17. You can also tap with two fingers to send kisses, which weren't available on watchOS before. ↩︎
  18. Perhaps even open up selfie effects to developers? ↩︎
  19. A detail that I love: look closely at how ink spreads out on the "page" once it's absorbed. Realistic. ↩︎
  20. You can also tap & hold a message to show Tapback plus Copy and More buttons. ↩︎
  21. Which is a nice use of a private API by Apple (the same is true when deleting recent handwritten messages in handwriting mode). Third-party apps can't override the standard behavior of the Home button, which always exits an app when clicked once. There's another Apple precedent for this: clicking the Home button while configuring Touch ID won't exit Settings, but it'll show you a message instead. ↩︎
  22. A long tap on the apps icon next to the input field would be a nicer way to open the grid. ↩︎
  23. In testing over 50 iMessage apps, I also ran into performance issues with the iMessage app drawer dropping frames when swiping between pages and other visual glitches. I don't think installing a lot of iMessage apps will be an edge case given their novelty factor. ↩︎
  24. A simple way to prove this: real emoji can coexist with text in the same string because emoji are Unicode characters. You can't send text and a KIMOJI in the same message on iOS because text and images are two separate entities. ↩︎
  25. They did. ↩︎
  26. I'm sorry, Jeremy↩︎
  27. If someone sends you a sticker from a pack you don't have installed, Messages will show a 'From' button underneath it to take you to the iMessage App Store. This should help discovery of sticker packs as they propagate across users. ↩︎
  28. Note: deleted stickers do not sync across devices with iCloud. ↩︎
  29. Stickers can be static images or animated illustrations: iOS 10 supports PNG, APNG, JPEG, and GIF. Stickers can be displayed at three sizes (the default being medium at 136x136 points), they can't be smaller than 100x100 points, and they have a maximum file size of 500 KB. These options should give developers plenty of room for experimentation. ↩︎
  30. Developers can also include a sticker pack extension inside an existing iOS app. For example, KIMOJI could continue to ship their standalone app and offer both a custom keyboard and an iMessage sticker pack as separate extensions inside it. I'd expect apps that already offer custom "emoji" keyboards to go down this route. ↩︎
  31. I'm curious to see how Apple will handle the inevitable copyright claims for sticker packs featuring popular characters. ↩︎
  32. It's already catching on↩︎
  33. Sticker lock-in. It's a thing. ↩︎
  34. In our case, 1-2-3 Trip Planner wouldn't know about anyone's name – it'd only see their identifiers and the interactive message from the current session.

    Identifiers are unique to each user's device and they are scoped to the iMessage app currently in use; if John removes 1-2-3 Trip Planner from his device and reinstalls it, the app will attribute a different set of identifiers to each participant. Apps can store these identifiers and Messages will match them to local contact names – that's how 1-2-3 Trip Planner can use labels such as "Stephen's Available Times". The "Stephen" part is a decoded identifier. ↩︎
  35. In practice, the use of local identifiers and the fact that apps don't see the contents of individual messages but only a representation of objects in a session could hinder the feasibility of collaborative apps. We'll have to see if Apple's privacy-conscious approach will allow developers to program collaborative environments spread across multiple devices and instances of the same conversation. ↩︎
  36. It reminds me of when I was in high school and didn't pay attention in my physics class, playing tic-tac-toe with a friend on my notebook. It's a time-filler. ↩︎
  37. There's actually an eighth domain – restaurant reservations – but it requires official support from Apple. It also works with Maps in addition to Siri. Intents for restaurant reservations include the ability to check available times, book a table, and get information for reservations and guests. Apple has set up a webpage to apply for inclusion here↩︎
  38. Which is used to continue an intent inside an app if it can't be displayed with a Siri snippet, such as playing a slideshow in a photo app. ↩︎
  39. If split view is active and you initiate drag & drop, a tab won't expand to the preview as you cross over the separator in the middle of the screen. You can, however, slide it horizontally and watch as existing tabs move on the X-axis to show you where the new tab will be placed. This also works to rearrange tabs when split view isn't active. ↩︎
  40. The only action that takes over the other side is the share sheet, which can only be active in one Safari view at a time. ↩︎
  41. Such system would have to provide a unified UI for moving content across apps in Split View, consistently offer feedback to the user, and handle conversion of data formats between apps (say, dragging rich text into a plain text editor or a photo into Safari's address bar). It's probably nothing that Apple hasn't already figured out at least since the days of Drag Manager on System 7. ↩︎
  42. I came across articles where Reader couldn't fetch the author's name, for example. ↩︎
  43. There's still no download manager like on macOS. ↩︎
  44. I'll refer to the app as Apple Music, even if it can be used without the streaming service, because I've been streaming music for years and I no longer have a local music library to manage. ↩︎
  45. The only place where translucency lives on, given how content scrolls under it. ↩︎
  46. Unfortunately, Apple removed the progress bar from the bottom widget, which makes it harder to see your position in a song at a glance (you have to either use Control Center or open Now Playing). ↩︎
  47. I'm nitpicking, but I've found some mistakes in Apple's lyrics. Nothing major – things like inverted prepositions or abbreviated verbs that shouldn't have been – but worth mentioning. I noticed it about 10 times out of hundreds of songs I tried. It's probably something Apple can't fix because they're licensing lyrics (I'd love to know from whom). ↩︎
  48. When recording an iOS device's screen with QuickTime on macOS, the menu says "System Capture". ↩︎
  49. The user retains the ability to preview an Uber car arriving at their location on the map and make a payment with Apple Pay. ↩︎
  50. Props to Apple for including cartoonish versions of the party and heart emoji. ↩︎
  51. New in iOS 10, this allows you to group multiple accessories together as if they were a single accessory. ↩︎
  52. If accessories require an additional wireless bridge, such as Philips' Hue lights, you'll be able to quickly open bridge settings, assign it to a room, and exclude it from favorites because it has no user-facing features of its own. ↩︎
  53. Depending on the speed of your Internet connection, iOS might need a few seconds to ping a remote HomeKit hub. ↩︎
  54. I've noticed that Wikipedia results aren't always suggested, even for topics that are available on Wikipedia. My understanding is that Look Up (and other search suggestions on iOS) attempt to find the most popular/relevant result for the current query. For instance, Look Up will suggest an artist, but not always an artist's album or single. ↩︎
  55. These sources would have to be sanctioned by Apple. ↩︎
  56. Oddly enough, Family Sharing hasn't been baked into Notes collaboration at all. ↩︎
  57. There's a yellow badge in the note's list to tell you that a shared note has been modified since you last opened it. ↩︎
  58. Shared notes can't be locked with passcode or Touch ID. ↩︎
  59. The summer's really not a good time to test new Activity functionalities for me. ↩︎
  60. Which isn't the case anymore with the iPhone 7↩︎
  61. The Game Center app, on the other hand, is gone for good. With iOS 10, Game Center is only a framework apps can use – you'll see Game Center appear inside games as a system feature. I wouldn't know how to explain Apple's decision other than that they never really paid much attention to Game Center and its many technical woes. ↩︎
  62. If you try to remove the Apple Watch app and an Apple Watch is currently paired to your iPhone, iOS will tell you to unpair the Watch first. ↩︎
  63. Apple has already confirmed that, due to iOS' security model, you won't be able to update individual apps through the App Store, as some claimed earlier this year. Updates to system apps will be bundled with OS updates, as it's always been. ↩︎
  64. In addition to unprocessed RAW capture, iOS 10 supports simultaneous delivery of RAW and processed images in JPEG. ↩︎
  65. The duration and timing of a Live Photo can't be edited by apps – drastic modifications to the nature of a moment aren't what Apple wants third-party apps to make. ↩︎
  66. From what I've been able to try, any Apple Music track downloaded for offline listening should be supported in Memories. ↩︎
  67. You can turn this off in the grid view to show every item assigned to the memory by toggling Show All/Summary. ↩︎
  68. I do wish Apple's Photos could proactively generate animations and collages like Google does, but it's nothing that can't be added in the future. ↩︎
  69. You can also 3D Touch a face and swipe on the peek to favorite/unfavorite or hide it from the People album. ↩︎
  70. If you'd rather not use the Photos app to browse faces, you can ask Siri to "show me photos of [person]", which will open search results in Photos. These are the same results you'd get by typing in Photos' search field and choosing a person-type result. ↩︎
  71. We need to go deeper. ↩︎
  72. According to Apple, Memories, Related, People, and Scene search are not supported on 32-bit devices – older iPhones and iPads that don't meet the hardware requirements for image indexing. ↩︎
  73. Scrolling date pickers have received a subtle new sound effect, too. ↩︎
  74. In the future, Apple could allow apps to mark activity types such as "topic", "song", "actor", "movie", and more to let Siri look up content displayed on screen. ↩︎
  75. My interpretation is that iOS wants to make sure it'll suggest emoji in an app where you typically use them. I guess you wouldn't want emoji suggestions in your bank's iPhone app. ↩︎
  76. What's interesting about emoji suggestions is that Apple isn't only relying on Unicode names and annotations. They're maintaining their own list of definitions and expressions, which is likely the product of years of refinement. I'd love to see a full list of emoji trigger words and check how frequently it'll be updated. ↩︎
  77. Apple is advising developers to position the globe key in the same spot as the default keyboard. They've also noticed that developers include a button to manage settings directly in the keyboard, and they're suggesting to put it where the system dictation key would be. I expect every custom keyboard to be updated with revised layouts after iOS 10. ↩︎
  78. If you're European and don't have a tilde character on your keyboard, try it on one of the original, American-based Smart Keyboards for the iPad Pro. ↩︎
  79. When drawing in Markup, you can press (3D Touch) on the screen for thicker lines, though there's no haptic feedback to accompany the increase in pressure. ↩︎
  80. Based on the same Intents framework used by SiriKit. ↩︎
  81. Apps that have requested permission will be displayed under Settings -> Phone. ↩︎

Want more from MacStories?

Club MacStories offers exclusive access to extra MacStories content, delivered every week.

Club MacStories will help you discover the best apps for your devices and get the most out of your iPhone, iPad, and Mac. Plus, it's made in Italy.

Join Now


Apple's Data Collection in iOS 10

Permalink - Posted on 2016-08-02 19:50

Ina Fried, writing for Recode, got more details from Apple on how the company will be collecting new data from iOS 10 devices using differential privacy.

First, it sounds like differential privacy will be applied to specific domains of data collection new in iOS 10:

As for what data is being collected, Apple says that differential privacy will initially be limited to four specific use cases: New words that users add to their local dictionaries, emojis typed by the user (so that Apple can suggest emoji replacements), deep links used inside apps (provided they are marked for public indexing) and lookup hints within notes.

As I tweeted earlier this week, crowdsourced deep link indexing was supposed to launch last year with iOS 9; Apple's documentation mysteriously changed before the September release, and it's clear now that the company decided to rewrite the feature with differential privacy behind the scenes. (I had a story about public indexing of deep links here.)

I'm also curious to know what Apple means by "emoji typed by the user": in the current beta of iOS 10, emoji are automatically suggested if the system finds a match, either in the QuickType bar or with the full-text replacement in Messages. There's no way to manually train emoji by "typing them". It'll be interesting to see how Apple tackles this – perhaps they'll look at which emoji are not suggested and need to be inserted manually by the user?

I wonder if the decision to make more data collection opt-in will make it less effective. If the whole idea of differential privacy is to glean insight without being able to trace data back to individuals, does it really have to be off by default? If differential privacy works as advertised, part of me thinks Apple should enable it without asking first for the benefit of their services; on the other hand, I'm not surprised Apple doesn't want to do it even if differential privacy makes it technically impossible to link any piece of data to an individual iOS user. To Apple's eyes, that would be morally wrong. This very contrast is what makes Apple's approach to services and data collection trickier (and, depending on your stance, more honest) than other companies'.
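As an illustration of the underlying idea – not Apple's actual implementation, which is far more sophisticated – here's a toy Swift sketch of randomized response, one of the classic differential privacy techniques. Each device reports the truth only with some probability, so any single report is deniable, yet the aggregate bias is known and can be corrected server-side:

```swift
import Foundation

// Each device answers truthfully with probability p, and lies
// otherwise. An individual report reveals little about the user.
func randomizedResponse(truth: Bool, p: Double = 0.75) -> Bool {
    return Double.random(in: 0..<1) < p ? truth : !truth
}

// Server-side: recover an estimate of the true fraction of "yes"
// answers from the noisy reports. If t is the true fraction, the
// observed fraction is p*t + (1-p)*(1-t), which we invert here.
func estimateTrueFraction(reports: [Bool], p: Double = 0.75) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return (observed - (1 - p)) / (2 * p - 1)
}
```

With p = 0.75 and tens of thousands of reports, the aggregate estimate lands within a couple of percentage points of the true fraction, even though any single report is wrong a quarter of the time.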

Also from the Recode article, this bit about object and scene recognition in the new Photos app:

Apple says it is not using iOS users’ cloud-stored photos to power the image recognition features in iOS 10, instead relying on other data sets to train its algorithms. (Apple hasn’t said what data it is using for that, other than to make clear it is not using its users photos.)

I've been thinking about this since the keynote: if Apple isn't looking at user photos, where do the original concepts of "mountains" and "beach" come from? How do they develop an understanding of new objects that are created in human history (say, a new model of a car, a new videogame console, a new kind of train)?

Apple said at the keynote that "it's easy to find photos on the Internet" (I'm paraphrasing). Occam's razor suggests they struck deals with various image search databases or stock footage companies to train their algorithms for iOS 10.




Enhanced eBook: Storytelling Through Dance

Permalink - Posted on 2016-08-02 19:48

I've been a fan of Keone and Mari's dance videos since my girlfriend introduced me to their YouTube channel a while back (she's a dancer, and an avid follower of their work). I often watch choreography videos, but what Keone and Mari create is exceptionally unique. In addition to being masters of their craft, every video they produce blends choreography and story – revealing a deeper meaning to the song they are dancing to. Truly, Keone and Mari's videos are works of art.

Now, Keone and Mari have set out to produce a multimedia eBook to combine "dance, writing, music, film, design, photography, and technology to tell a story". I'm not a dancer myself, but, knowing their work, I'm intrigued:

We’re best known for our work as dancers and choreographers, yet we’ve always had a dream to use dance as a medium between different/collaborative art forms. With Mari’s creative writing degree and our love for various art forms we never saw dance being completely independent of the different crafts. Our dream is this: An enhanced eBook with storytelling through dance.

Enhanced eBooks are digital books that include immersive, interactive, and interesting features like video, music, audio narration, animation, photography, and more. While dance videos online have become a norm in the millennial age, we hope to give dance a new home within this enhanced ebook. Imagine following a movement-driven story, that’s accompanied by originally produced music, partnered with interactivity - like flipping through photos, learning a dance, or potentially dictating where in the story you’d like to go next. The imaginative possibilities are truly there. The hope for this creative and visual novel is to have it available on your devices to download or stream.

I would love to see this project happen. Keone and Mari have launched a Kickstarter campaign seeking $45,000 in funding to cover costs for production, artists, and design. With 11 days left, over $24,000 has been pledged, and there are some great rewards for dancers, such as tutorials, Q&As, and even private dance lessons.

If you're interested, you can contribute to the campaign and check out more details on their Kickstarter page.

→ Source: kickstarter.com