Apple news, app reviews, and stories by Federico Viticci and friends.
Permalink - Posted on 2018-06-23 11:38
It wasn’t long after Apple changed the mechanisms of its MacBook keyboards that reports of sticky keys and other problems surfaced. Over time, as anecdotal evidence mounted, it became apparent that the problem was widespread, but of course, only Apple knew exactly how common the issues were.
Now, in response to the keyboard problems, Apple has begun a keyboard service program to fix or replace keyboards with faulty butterfly switch mechanisms. From Apple’s support page about the program:
Apple has determined that a small percentage of the keyboards in certain MacBook and MacBook Pro models may exhibit one or more of the following behaviors:
- Letters or characters repeat unexpectedly
- Letters or characters do not appear
- Key(s) feel "sticky" or do not respond in a consistent manner
The program covers MacBook and MacBook Pro models from 2015 onward. Service is free of charge for four years after the first retail sale of the computer. To check if your model is covered, visit Apple’s support page for a complete list of eligible models.
My MacBook Pro’s keyboard hasn’t failed, but I know several people whose keyboard has, and I’ve had a few occasions where keys would become sticky for a short period. If my keyboard ever fails, I expect it will be at the most inopportune time, but at least that hassle and frustration won’t come with a big price tag too.
Club MacStories offers exclusive access to extra MacStories content, delivered every week; it's also a way to support us directly.
Club MacStories will help you discover the best apps for your devices and get the most out of your iPhone, iPad, and Mac. Plus, it's made in Italy.
Join Now
Permalink - Posted on 2018-06-21 13:04
Documents by Readdle has been on the App Store a long time. Before Apple released its Files app, Documents filled the gap with features that made it indispensable for accessing files on iOS devices and doing things like unzipping an archived folder. Although the stock Files app has taken over many of my day-to-day needs for file handling, Documents continues to evolve and adapt, providing tools that aren’t in Files.
Today, for instance, Readdle added WiFi file transfers between a Mac and iOS device to Documents. The system is easy to use and more flexible than AirDrop, making it something to keep in mind, especially when you are moving large numbers of files between a Mac and iOS device.
The WiFi transfer functionality is located in the Services tab of Documents. Tapping that option opens a camera view with instructions to open docstransfer.com on your computer. There, you’ll find a QR code. Scan it, and Documents creates a local WiFi connection between your computer and iOS device. To maintain the connection, Documents must remain open on your iOS device.
In your browser, you’ll see all the files you have stored in Documents on your iOS device. Drag in files to add them to your iOS device or select one or more files in your browser to download them to your computer. With Chrome, you can also drag whole folders into Documents.
The system works with a Mac or PC, and over a Lightning cable as well as WiFi. In fact, Documents can also transfer files between two iOS devices.
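The underlying idea — pairing a desktop browser with a device over the local network, then moving files over HTTP — can be sketched in a few lines of Python. This is not Readdle's implementation; the `pairing_url` and `serve` names and the plain-HTTP approach are assumptions purely for illustration:

```python
import http.server
import socketserver
from functools import partial


def pairing_url(host, port):
    """The URL a desktop browser opens to reach this device.
    In Documents, a URL like this is what the scanned QR code
    effectively establishes (hypothetical stand-in)."""
    return f"http://{host}:{port}/"


def serve(directory, host, port=8000):
    """Serve `directory` over HTTP so any browser on the same
    WiFi network can list and download its files."""
    handler = partial(http.server.SimpleHTTPRequestHandler,
                      directory=directory)
    with socketserver.TCPServer(("", port), handler) as httpd:
        print("Open", pairing_url(host, port), "in a browser on the same network")
        httpd.serve_forever()
```

In the real app, of course, the pairing happens by scanning a QR code at docstransfer.com rather than by typing a URL, and the connection supports uploads as well as downloads.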
In my tests over a couple of days, I found Documents’ WiFi transfer functionality to be fast and reliable. To test the speed of WiFi transfers, I used it to move 16 RAW images from my Mac to my iPhone X, which together were about 250 MB of data. The process took roughly one minute with Documents. I tried the same files with AirDrop too, which was a little faster, but not much.
With AirDrop available on iOS devices and Macs, you may wonder what the benefit of Documents’ WiFi transfer is. The primary advantage of Documents, as with so many of its other features, is flexibility. Unlike AirDrop, Documents can move files over a cable, it works with PCs, and it can handle different file types at the same time.
That last point is the one I foresee using Documents for the most. If you try AirDropping a collection of mixed file types, the action fails with an error. In the same situation, Documents works like a champ, moving any collection of files you want.
I’ve become a regular user of Files and find it does the trick most of the time, but I’m glad utilities like Documents are available for those occasions when my needs fall outside the mainstream that Apple’s app covers. If you’ve ever found yourself frustrated by the process of moving files around on iOS, take a look at Documents’ new WiFi transfer system; it’s very well done.
The Documents update, which is scheduled for release today, is available on the App Store as a free download.
Permalink - Posted on 2018-06-20 22:26
Instagram announced a video service today that is available as a standalone app called IGTV. The new service will be available soon from a button in the top right-hand corner of the Instagram app’s main screen too.
IGTV features vertical video that is longer than is available in Instagram’s Stories feature. Currently, channels created by new accounts and ones with fewer followers are limited to uploading videos that are 15 seconds to 10 minutes long, but TechCrunch reports that eventually all accounts will be able to upload videos up to one hour long.
When you first launch the app, it opens to a ‘For You’ section of videos from people you follow on Instagram along with a selection of popular content. The currently-selected video dominates the top two-thirds of the screen. The bottom third of the screen is a horizontally-scrolling, tabbed thumbnail interface for picking other videos. The included tabs are ‘For You,’ ‘Following,’ ‘Popular,’ and ‘Continue Watching,’ which are self-explanatory. You can also swipe between videos in a tab the same way you would in Instagram Stories.
Swiping down dismisses the thumbnails and other UI, so the video dominates the screen. A tap on the video reveals play/pause controls, a scrubber to advance or rewind the video, and buttons to mark videos as favorites, comment, share it with other Instagram contacts, copy a link to the video, report it, or hide it. Tapping the title of the video displays its description, which can include URLs that open in Safari View Controller. TechCrunch says users will be able to subscribe to channels, though that doesn’t seem to be implemented in this initial release.
Although there is currently no advertising in the app, that is coming based on Instagram CEO Kevin Systrom’s comments during the event today. According to the TechCrunch report on the event:
“There’s no ads in IGTV today,” says Systrom, but he says it’s “obviously a very reasonable place [for ads] to end up.” He explained that since creators are investing a lot of time into IGTV videos, he wants to make that sustainable by offering them a way to monetize in the future.
Overall, I like what I’ve seen in the short time I’ve been using IGTV. Only a couple of the accounts I follow have posted videos so far, but I expect that will change as creators experiment with this new outlet. One big disappointment from a design standpoint, though, is that the app does not support full-screen iPhone X video.
Permalink - Posted on 2018-06-19 15:00
If you've ever used Anchor to make a podcast, you know just how easy it is. That ease of use, however, has historically meant sacrificing the editing tools that most podcasters need. Today that changes, as Anchor introduces basic editing tools as part of the debut of its iPad app.
Anchor for iPad resembles its iPhone counterpart in most ways: the basic recording workflows and tools are all available here, so existing Anchor users won't have any trouble navigating the app. One major change is that the iPad version loses access to the full database of Anchor podcasts for listening. While I'd prefer that the iPad app retain the full functionality of the iPhone version, the message behind this move is clear: the iPad app is all about content creation, not consumption.
This different focus between iPad and iPhone versions extends to the fact that the new editing tools are currently iPad-exclusive. Anchor says it may expand them in the future to its web and iPhone apps, but for now at least, if you want to edit your Anchor show without needing an external app, you'll need an iPad on hand.
Anchor's editing tools are extremely simple, which is fitting for an app that prides itself on simplicity. When assembling an episode, you can tap the '...' icon next to each clip to see two editing options: the first enables trimming the clip to the exact start and end time you prefer, while the second lets you split recordings at several points into multiple clips. The interface for these features is intuitive and user-friendly, with clearly labeled buttons and a nice pinch to zoom feature for making the most minute changes possible. There's not much more that needs saying about these tools, except that they should lead to higher quality podcasts, while drastically reducing the number of times Anchor users need to re-record something.
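Conceptually, both tools are simple operations on an audio buffer. Anchor's actual implementation isn't public; the sketch below, with `trim` and `split_clip` as illustrative names and clips modeled as plain sample arrays, is only meant to show the shape of the two edits:

```python
def trim(clip, start, end):
    """Keep only the samples from `start` up to (but not
    including) `end` — the 'set exact start and end' edit."""
    return clip[start:end]


def split_clip(clip, points):
    """Cut a clip at each offset in `points`, yielding the
    consecutive sub-clips — the 'split into multiple clips' edit."""
    bounds = [0] + sorted(points) + [len(clip)]
    return [clip[a:b] for a, b in zip(bounds, bounds[1:])]
```

A real editor would operate on buffers of audio samples (or on time ranges mapped to them), but the list-slicing version captures the idea: trimming discards the ends, splitting re-partitions the same data without discarding anything.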
Anchor for iPad arrives with all the bells and whistles that iPad professionals expect from quality apps: support for Split View, drag and drop, and all sizes of iPad Pro, along with great UI layout in both landscape and portrait orientations.
Drag and drop can be used to easily drop in recordings stored in places like the Files app, making for convenient import of audio you recorded in another app. You can also use drag and drop as a quick way to add recordings from your Anchor library to just the right spot in an episode you're working on. Finally, it's put to use inside the episode builder for quick clip rearrangement. All of these touch-powered options make podcast creation feel more intuitive and accessible than ever for a novice like myself.
The iPad app does a great job adapting to your current screen setup, providing easy access to all necessary tools even in cramped conditions. Even in 50/50 Split View on the 12.9-inch iPad Pro, you're able to access all panels from the full screen app and work with no compromises. The only situation where Anchor doesn't work well is as the smaller app in a 70/30 split – you can't access the episode builder or your profile unless Anchor is given more screen real estate.
Podcasting from the iPad has always been difficult due to various system limitations, and despite years of major new iOS updates, there haven't been any meaningful improvements on that front. What Anchor has built feels like the way iPad podcasting is meant to be done – no fuss, no frills, just an easy to use tool that gives you just enough power to make a great podcast. Anchor's iPhone app was already a solid way to record and publish shows, but the iPad feels like a much more fitting work environment for when you want to take the time to create something special.
Anchor is available as a free download on the App Store.
Permalink - Posted on 2018-06-19 12:00, modified at 12:01
On this week's episode of AppStories, we discuss the new and updated apps announced at WWDC 2018.
→ Source: appstories.net
Permalink - Posted on 2018-06-18 20:48
When writing my review, I needed a way to navigate between the different sections, and all of the subheadings I had created. I had developed an action to navigate to each of the markdown headers, which I was happy with at the time. It was nice to have that functionality to switch around where I was in my review.
Well, I’m happy to say that I have been Sherlocked.
Drafts 5.2 came out while I was in San Jose for WWDC, and I've been meaning to check out the new features since I started getting back into a normal routine. Tim Nahumck, of course, has a great overview of the changes in this version of Drafts, along with some useful examples you can download.
As Tim points out, the ability to navigate headers of a Markdown document through a dedicated "section popup" is a terrific addition to Drafts. Few text editors designed for people who write in Markdown get this right; one of the reasons I still keep Editorial on my iOS devices is that it lets me navigate longer pieces with a header navigation tool. However, the implementation in Drafts 5 is more powerful and modern, and it can be controlled with the keyboard (you can invoke the switcher with ⌘\ and, just like Things, dismiss it with ⌘. without ever leaving the keyboard).
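The core of any header-navigation action like this is just scanning the document for Markdown ATX headers and recording where they are. A minimal sketch (this is not Drafts' scripting API, just the general idea, and it ignores edge cases like headers inside code fences):

```python
import re


def markdown_headers(text):
    """Return (line_number, level, title) for every ATX header —
    the raw material for a jump-to-section list."""
    headers = []
    for i, line in enumerate(text.splitlines(), start=1):
        m = re.match(r"(#{1,6})\s+(.*)", line)
        if m:
            headers.append((i, len(m.group(1)), m.group(2).strip()))
    return headers
```

An editor action would present these titles (indented by level) in a picker, then move the cursor to the chosen line.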
→ Source: nahumck.me
Permalink - Posted on 2018-06-18 19:18
Recovered from WWDC, Myke previews his summer's work, Federico shares what he knows about Shortcuts and Stephen gets super nerdy about Dark Mode in macOS Mojave.
On the latest episode of Connected, I go into more detail on Shortcuts and discuss my initial plans and goals for this year's iOS review. You can listen here.
→ Source: relay.fm
Permalink - Posted on 2018-06-18 17:00, modified on 2018-06-19 14:21
WhenWorks is a new iOS app (and web service) from John Chaffee, creator of BusyCal. While it's still related to calendars, it serves an entirely different purpose than BusyCal: making it easy for people to schedule time with you for meetings, lunches, podcasts, or anything else for which you'd usually do the back-and-forth availability dance.
In WhenWorks, you create an event type, which serves as a template for scheduling the same type of event with multiple people (or repeatedly with the same person). You define the meeting type, the duration, your default availability, and various other settings that ensure you're getting the meeting you need. Then you get a link that you can send to the other person, and they get a list of available times. They pick one, the meeting is added to your calendar, and each participant receives a confirmation email with an invite to add to their own calendar.
WhenWorks integrates directly with your calendar, automatically blocking off times when you're already busy. You can even define buffer times around meetings to ensure that you have some prep time before appointments (and downtime after).
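The scheduling logic described above — offering only the start times that clear every busy block plus its surrounding buffer — can be sketched like this. WhenWorks' actual algorithm isn't public; `available_slots` and its parameters are assumed names for illustration:

```python
from datetime import datetime, timedelta


def available_slots(day_start, day_end, busy, duration, buffer):
    """Return meeting start times that fit between busy blocks,
    keeping `buffer` free on both sides of every busy block.
    (A real scheduler would offer finer-grained start times;
    stepping by `duration` keeps the sketch simple.)"""
    # Expand each busy block by the buffer on both sides.
    blocked = [(s - buffer, e + buffer) for s, e in busy]
    slots = []
    t = day_start
    while t + duration <= day_end:
        # A slot is free if it ends before, or starts after, every block.
        if all(t + duration <= s or t >= e for s, e in blocked):
            slots.append(t)
        t += duration
    return slots
```

With a 10:00–10:30 meeting on the calendar and a 15-minute buffer, a 30-minute slot at 9:30 is rejected even though it doesn't overlap the meeting itself — exactly the prep-time behavior described above.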
You can also add a longer description, questions to ask the guest while they're scheduling, and a note that gets added to the event in the calendar. Guests also get links for rescheduling and canceling appointments.
If you're familiar with services like Calendly, this will all sound familiar. I've actually been using Calendly for quite a while to schedule my guests on my podcast Systematic. WhenWorks provides every bit of the functionality of a Calendly pro account at a lower price. What sets WhenWorks apart from other services and the reason I'm excited to switch to it is the fact that it's an iOS app with access to native calendaring:
Only the event host (you, the user) needs to have WhenWorks installed. Invited guests get access to a web interface to schedule, no app or account needed.
WhenWorks uses a freemium business model. All features are available for 14 days; after that, the only limitation is that you can schedule up to five events per month. Upgrading to Pro (US$5/month or $50/year) gives you unlimited events.
Permalink - Posted on 2018-06-18 15:34
At its core, SaneBox is about making sure that only your most important messages hit your inbox. Other messages are safely stored in automated folders like the @SaneLater, @SaneBulk, and @SaneNews folders for reviewing later.
But email sorting is just the tip of the iceberg. With custom folders, custom snooze settings, and @SaneReminders, SaneBox takes email management to the next level.
Set up a custom folder and train it by dragging in a few messages. SaneBox will then send all messages from those senders to your new folder. It’s a painless way to organize messages for a special project.
SaneSnooze folders can be customized to defer messages anywhere from hours to weeks. SaneBox comes with default snooze folders like @SaneTomorrow and @SaneNextWeek, but adding custom snooze folders lets you set with precision when messages reappear in your inbox.
SaneReminders are a great way to keep on top of tasks. Send yourself a reminder to do something later, or get a reminder that someone hasn’t responded to a message. For example, blind copy firstname.lastname@example.org and the message will show up back in your inbox only if the recipient doesn’t reply within 3 days.
Also, don’t forget that SaneBox works on top of your existing email setup. There's no app to download or new email account to set up. You can use any email client you want.
Sign up today for a free 14-day SaneBox trial to take back control of your email. MacStories readers can receive a special $25 credit automatically by using this link to sign up.
Our thanks to SaneBox for sponsoring MacStories this week.
Permalink - Posted on 2018-06-18 11:32
Apple has announced a new emergency calling feature for iOS 12, which wasn’t revealed at WWDC two weeks ago. The new functionality will provide automatic, precise location information to first responders when iOS 12 users call 911 in the United States. According to Apple’s press release:
Approximately 80 percent of 911 calls today come from mobile devices, but outdated, landline-era infrastructure often makes it difficult for 911 centers to quickly and accurately obtain a mobile caller’s location. To address this challenge, Apple launched HELO (Hybridized Emergency Location) in 2015, which estimates a mobile 911 caller’s location using cell towers and on-device data sources like GPS and WiFi Access Points.
Apple today announced it will also use emergency technology company RapidSOS’s Internet Protocol-based data pipeline to quickly and securely share HELO location data with 911 centers, improving response time when lives and property are at risk. RapidSOS’s system will deliver the emergency location data of iOS users by integrating with many 911 centers’ existing software, which rely on industry-standard protocols.
The FCC has mandated that mobile phone carriers locate callers within 50 meters 80% of the time by 2021. According to Apple’s press release, Apple’s HELO technology is capable of meeting and exceeding those standards today, and with the adoption of RapidSOS’s protocol this fall in iOS 12, those benefits will be enjoyed by 911 call centers too.
Permalink - Posted on 2018-06-16 16:46
I should start with the obvious. 3D Touch is broken! The user experience is far from great. Apple introduced 3D Touch, along with its related Peek and Pop interactions, in 2015. It’s been almost three years since its introduction, yet people don’t know about or use 3D Touch. Why would they? Even tech-savvy users don’t know which buttons offer 3D Touch, let alone regular users.
What would happen if we decided to make all links the same color and style as the regular text? People would not know what to click on, right? Why would 3D Touch be any different? We rely on our vision to determine actionability before anything else. If you can’t distinguish buttons that support 3D Touch from those that don’t, how are you supposed to know you can press them? Look at this screenshot and see if you can tell which of the buttons can be 3D Touched.
I couldn't agree more with the idea of "decorating" buttons with 3D Touch visual cues.
Here's the thing: I use 3D Touch a lot, and I love the fact that it's the modern equivalent of a contextual click, but, anecdotally speaking, I've never seen any of my friends or relatives use it. Not the quick actions on the Home screen, not peek and pop. It's like 3D Touch just isn't there for them. It's hard to say whether the very concept of 3D Touch is flawed or if iOS' design prevents discovery of this unique interaction. However, the argument that an interface with little depth doesn't lend itself well to a gesture built around pressing into UI elements is a compelling one. It'll be interesting to see what happens with future iPads and iPhones, too.
→ Source: medium.com
Permalink - Posted on 2018-06-15 18:11
In a brief press release today, Apple announced a multi-year deal with Oprah Winfrey to produce original video content. From the press release:
Together, Winfrey and Apple will create original programs that embrace her incomparable ability to connect with audiences around the world.
The project joins more than a dozen others that have been signed by Apple for video content in the past year or so. There is no word yet on when the programming might be released, but CNN Money reports that Oprah is expected to have an onscreen role as a host and interviewer.
Competition among Apple, Netflix, and Amazon for original video content continues to heat up. Against the backdrop of consolidation among traditional media companies and telecommunications companies like the recently-closed AT&T/Time Warner merger, the stage seems set for major shifts in the video entertainment industry.
Permalink - Posted on 2018-06-15 16:25
I've witnessed a slow but encouraging evolution take place over the past six years that has transformed WWDC for the better. When I first flew to San Francisco in 2013, WWDC was a self-contained event. Other than the Thursday night bash, the conference happened entirely within the fortress-like hulk of Moscone West. Developers and others in town for the week gathered outside the convention center in restaurants, bars, and hotel lobbies, but there were few organized activities if you didn't have a ticket. That's changed.
The shift was driven initially by the conference's popularity. In 2013, demand for tickets was so high that they sold out in just two minutes. That left a lot of disappointed developers, but many made the trip to San Francisco anyway to catch up with old friends and meet new ones.
Over the next couple of years, events sprung up to cater to people in San Francisco without a WWDC ticket. One of the first was AltConf, which was first known as AltWWDC, a name that likely didn't endear it to Apple. Companies like Twitter also began holding keynote viewing parties at nearby offices.
Although Apple promotes events like AltConf now, that wasn't the case in 2013. That year, CocoaConf planned a parallel conference at a hotel near Moscone that was canceled at the last minute because the hotel decided the event was in conflict with WWDC and a contract between it and Apple. It isn’t clear whether Apple had a hand in the cancellation, but the incident had a chilling effect on events held during WWDC.
Two years later, Apple butted heads with AltConf over its plans to stream the WWDC keynote and State of the Union presentations to about 1,000 attendees. The free conference had streamed the keynote in 2013 and 2014, but 2015 was the first year an optional $300 ticket to the event could be purchased. Apple threatened legal action, but ultimately allowed AltConf to stream the sessions.
In retrospect, the streaming dispute seems to have been a turning point that forced Apple to rethink its approach to the broader community surrounding WWDC. Things began to change in 2016 leading up to what was the last WWDC held in San Francisco. That year, Apple reversed course, promoting conferences like Layers and AltConf as well as social events like the live recording of The Talk Show. With demand for WWDC tickets and interest in the event as high as it has ever been, Apple has continued to promote alternative conferences, fundraisers, live podcast recordings, and other events since the conference moved to San Jose last year.
This year, Apple went a step further, reaching out to the developer community in a new way. WWDC keynotes aren't just for developers. The presentations are also targeted at a broader consumer and media audience, which has felt alienating at times to the developers in the room. It was a small thing, but including cameos by Craig Hockenberry and Jim Dalrymple during the opening video this year and spotlighting developers’ families during the closing video personalized the keynote in a new way. It sent the message that Apple understands the developer community and cares about it.
The sense of community has been helped by moving WWDC to San Jose too. It feels like nearly everyone you see in downtown San Jose is there for WWDC. It’s easier to run into friends and meet new people, and the atmosphere feels more relaxed.
The move to San Jose has made it easier for Apple to spread out beyond the convention center too. This year, the company incorporated its focus on health and fitness with WWDC Run, a series of morning runs led by Nike Run Club coaches, and bootcamps led by renowned personal trainer Kayla Itsines. More than ever before, WWDC is about more than sitting in darkened rooms watching presentations about code.
Apple’s approach to WWDC displays newfound confidence that, in retrospect, wasn’t present in earlier years. It’s as though the company recognized that it no longer needs to be in a big city to capture the attention of the world. Nor does it need to be the only show in town. The McEnery Convention Center remains at the center of WWDC, but it feels like part of a broader event in a way that San Francisco never did.
Those early WWDCs I attended were fantastic. I met people who became close friends, but I’ve managed to meet even more people and have more fun in a far more relaxed atmosphere in the past two years, which is why I hope WWDC stays in San Jose and Apple continues to find ways to embrace the broader community as part of its annual conference.
Permalink - Posted on 2018-06-15 04:36
Apple has debuted a series of four videos on its YouTube channel called Behind the Mac that focus on the Mac as a tool to unleash creativity.
Three of the videos profile individuals. Recording artist Grimes is interviewed about how she uses a MacBook to create music, Peter Kariuki explains how he built an app to monitor driver safety in Africa, and Bruce Hall, a legally blind photographer, shows how he uses a Mac to process his photographs.
The fourth video is a montage that includes clips of each of the creators along with other people using Macs to make art. Interestingly, the spot includes a wide range of Mac laptops, from the white polycarbonate MacBook to current MacBook Pros. Each of the videos closes with the prompt ‘Make something wonderful.’
What I like best about these videos is their focus on the work of each creator. Like many users, I’ve been frustrated by the lack of updates to parts of Apple’s Mac lineup and issues with its laptops’ keyboards, but I still enjoy seeing what even older-model Macs can help people create. I expect we’ll see more of these videos debut in the coming weeks.
Permalink - Posted on 2018-06-14 19:09, modified at 19:22
Apple updated its iWork suite for iOS today to version 4.1, bringing a variety of small improvements along with one centerpiece feature: support for audio recordings.
Pages, Keynote, and Numbers have all added the ability to record audio in-app that is saved inside your document. The process for this is simple: hit the plus icon to open the content menu, then select the tab that looks like a photograph, after which you'll see a new 'Record Audio' option. There's a special UI that pops up from the bottom of the screen whenever you're recording an audio snippet, which at first only includes a record button. After you've stopped recording, you'll have the ability to preview your recording, add more audio to what you've already recorded, and edit clips by trimming, deleting, or recording over portions of them. Once you're done with a recording, hit the 'Insert' button in the top-right corner and you'll see your audio presented as a small, round icon matching iOS' system volume icon. This icon can be picked up and moved anywhere inside your document, working especially well as an elegant partial overlay on top of existing content. Tapping the icon will engage audio playback.
While the in-app recording system is fine, I'm thankful that iWork also supports importing existing audio files using either drag and drop or the Files UI picker. You can also export audio recorded in an iWork app using drag and drop, or by selecting the file and hitting the Share button in the contextual menu.
Outside of the new audio-focused features, today's update brings only a handful of smaller improvements. Keynote users will be glad to hear that you can finally edit or create new master slides on iOS, something the Mac version has long been able to do; the app also lets you export presentations as a movie or images. Pages' Smart Annotation feature has been upgraded with the ability to stretch and wrap as you make edits, and the app now enables adding colored backgrounds to documents. Following the introduction of support for LaTeX and MathML notation in Pages 3.1, now Keynote and Numbers gain that support too. If you prefer using the Apple Pencil as a navigation tool for scrolling, all iWork apps now have an option in Settings you can toggle for that behavior. There are also new shapes available in all three apps, new chart design options, and more.
iWork for Mac has been updated as well with most of the minor improvements of the iOS version, but the audio recording capabilities are not included.
iWork 4.1 isn't a major update, but it's a welcome one. Apple continues iterating on Pages, Keynote, and Numbers regularly, with a special emphasis on iOS improvements, and I've definitely seen my iWork use increase as a result.
Permalink - Posted on 2018-06-14 15:56, modified on 2018-06-21 16:23
Last month, we looked at the Power Mac G4 line, a series of computers that defined the professional workstation for OS X users for many years.
In June 2003 — 15 years ago this month — Steve Jobs took the wraps off its successor, the Power Mac G5.
In every way, the Power Mac G5 was more than the G4 it replaced. It was bigger, more expandable, and faster.
As his "One More Thing" at WWDC 2003, Jobs spoke about the chip, the system, then the product, building a story of power and capability unrivaled in the PC world.
At the heart of this new Mac was the PowerPC G5 processor. Created in partnership with IBM, it was the first 64-bit processor to be put in a personal computer. Clocked at up to 2 GHz, it was the fastest 64-bit processor ever shipped, and with a 1 GHz front-side bus and support for up to 8 GB of memory out of the box, it blew away the system built around the G4.
The G5 processor was designed to be used in multi-processor machines, which Apple sold at the upper end of its product range.
In practice, these numbers meant that the Power Mac G5 ran circles around the G4, but Apple took the opportunity to improve more than just CPU performance. The original G5 used 400 MHz RAM, which sported twice the bandwidth of the memory in the outgoing G4.
On the board, Apple supported PCI-X for its slots, and used SATA for connecting spinning hard drives and the SuperDrive. This gave users faster access to their files and media, making all of OS X feel smoother.
The new case would be dubbed the "Cheese Grater" due to the large number of holes in the front and back. With the capacity for dual processors, 8 GB of RAM and multiple PCI cards, the system could generate massive amounts of heat. Those openings were key to keeping air moving through the case.
The cooling system was far more complex than a bunch of little holes, though. The Power Mac G5 was home to nine fans.
"You might think, 'Oh my God! Nine fans means it's going to be nine times louder!' No! It turns out, the opposite is true. By putting the fans precisely where they're needed and independently controlling them all, we can make it a lot quieter."
The enclosure was divided into four separate thermal zones, and those fans did their job. The G5 was twice as quiet as the "Wind Tunnel" Mirror Drive Doors G4 it replaced.
The G5 split its I/O between the front and back panel, with a USB and FireWire port joining a headphone jack on the front of the tower and everything else out the back.
The case was made entirely out of aluminum, immediately making the G4s look small and outdated. Apple retained the handles at the top and bottom of the case, but they were noticeably less comfortable to use than the curved plastic ones on the G4.
Like the Power Macs from the previous several years, the side of the G5 opened to reveal its internal components. The side didn't swing down like before, but simply lifted off the case when unlocked. It was paired with a clear plastic air router that also lifted out of place with minimal effort, granting a user access to their machine's internals.
Over the next two years, Apple kept the Power Mac G5 updated on a regular basis.
In June 2004, Apple gave the G5 a speed bump, with a dual 2.5 GHz configuration at the top of the line.1
As clever as Apple's computer-controlled fan system was, it was no match for this machine, and Apple rolled out its first liquid-cooled Mac.
Even now, 14 years later, this still blows my mind. Apple did all it could to hide the intricacies of the radiator and hose system behind nicely-designed metal panels:
Note that not every G5 sold after this point was liquid cooled. Apple only used the technology when it was needed, and only three SKUs came with a radiator:
Unfortunately, the first two liquid-cooled models were prone to leaks, which could lead to power supply, CPU and motherboard damage. Apple never issued a specific repair bulletin about these issues, but by the time I was a Mac Genius in 2007 or so, leaky G5s were taken very seriously. Bringing them into the repair system included a lengthy safety interview with the owner, and repairs were more or less always just "taken care of," regardless of warranty status.
That antifreeze-colored hiccup aside, the Power Mac G5 saw modest improvements during its lifetime, but the last one is worth mentioning.
Announced in October 2005, the last generation of Power Mac G5s included PCI Express and, at the top end, a "quad-core" configuration, made up of two dual-core G5 CPUs. This was a beast of a machine, and was noticeably faster at multi-threaded tasks than its siblings.
It's easy to spot a "G5 Quad" by looking at the power cable it used. These machines sported massive power supplies and the oddball IEC320 C19 power connector instead of the standard one still found on things like the iMac today.
The PowerPC G5 chip was, in hindsight, doomed to failure.
Signs of trouble started in 2004, when Apple was unable to keep a promise made at the G5's original introduction: that within 12 months, there would be a 3 GHz Power Mac G5 for sale.
On stage at WWDC 2004, Jobs addressed the broken promise, saying that moving to anything smaller than 90 nanometer fabs had proved difficult for IBM. However, he was quick to point out that their partner had fared better in this regard than Intel.
Apple would never ship a 3 GHz G5, topping out with a dual 2.7 GHz system in early 2005.
As time went on, another problem became apparent: the lack of a PowerBook G5.
At the heart of the problem were the power required by the G5 chip and the heat it generated. Apple fans widely poked fun at this shortcoming, and forum posts on the subject were often accompanied by the same joke image.
This problem was underscored when Apple announced the switch to Intel processors, and again when the first MacBook Pro was introduced. Simply put, there was no way the G5 could be squeezed into such a small portable device.
That leaves the Power Mac G5 in a weird place, historically. It was the fastest PowerPC Mac ever built, but in a way, that's like being the most deadly dinosaur to roam the Earth. Being on top only matters until the landscape changes out from underneath you.
Permalink - Posted on 2018-06-13 18:07, modified at 18:09
Twitter has begun introducing a series of features aimed at highlighting the news and events of the day. The company has also updated how Moments are displayed in the official Twitter app. According to Twitter, the goal of the changes is to make it easier for users to follow the news without having to know which accounts, hashtags, and Moments to follow.
Current events is the primary focus of the new Twitter features, many of which will not be rolled out for weeks or months according to a Twitter blog post. The Explore tab now includes breaking news stories displayed as captioned image banners across the top of the section. Tapping into a story opens a collection of images, video, and tweets in a horizontally scrolling narrative.
Below the highlighted story, Explore is divided into separate sections by topic. My sections include ‘Trends for you’ and ‘Today’s Moments,’ followed by topical tabs like Software Engineers, Gal Gadot, Technology Journalists, and Indie Game Developers. The quality of each section’s content is hit or miss. As you can see from the screenshot below, Twitter’s definition of ‘Software Engineer’ is loose, and I got a section full of tweets about Gal Gadot because, as Twitter helpfully explains, I liked an MKBHD tweet that mentioned her.
In the coming months, Twitter plans to add breaking and personalized news at the top of users’ feeds, similar to the sports news feature that the company introduced in 2017. Twitter has said that it plans to start sending users push notifications based on their interests in the coming weeks too. From Twitter’s blog post, it appears these will be turned on by default, requiring anyone who doesn’t want to see the notifications to turn them off:
Now we’re experimenting with sending notifications to you based on your interests (like who you follow and what you Tweet about), so you won’t miss a beat. You can always turn off these notifications by going to your recommendations settings and toggling to not see news.
Twitter is also changing Moments to scroll vertically like your timeline and adding a dedicated World Cup page.
None of these changes has a meaningful impact on my Twitter use because I use a third-party client, but they still bother me. I prefer to manage what I see on Twitter myself. Twitter may think it knows what I want to see, but judging from the suggestions in my Explore tab today, its ability to do that is questionable. Also, the addition of notifications that are turned on by default strikes me as tone deaf considering the current efforts of companies like Google and Apple to help users better manage notifications.
For now, the changes are contained mainly in the Explore tab. It will be interesting to see how users react when the changes spread to targeted news in their timelines and they begin receiving push notifications about raccoons climbing skyscrapers.
Permalink - Posted on 2018-06-13 14:27, modified at 14:35
In my Future of Workflow article from last year (published soon after the news of Apple's acquisition), I outlined some of the probable outcomes for the app. The more optimistic one – the "best timeline", so to speak – envisioned an updated Workflow app as a native iOS automation layer, deeply integrated with the system and its built-in frameworks. After studying Apple's announcements at WWDC, talking to developers at the conference, and hearing other details about Shortcuts firsthand, it appears that the brightest scenario is indeed coming true in a matter of months.
On the surface, Shortcuts the app looks like the full-blown Workflow replacement heavy users of the app have been wishfully imagining for the past year. But there is more going on with Shortcuts than the app alone. Shortcuts the feature, in fact, reveals a fascinating twofold strategy: on one hand, Apple hopes to accelerate third-party Siri integrations by leveraging existing APIs as well as enabling the creation of custom SiriKit Intents; on the other, the company is advancing a new vision of automation through the lens of Siri and proactive assistance from which everyone – not just power users – can reap the benefits.
While it's still too early to comment on the long-term impact of Shortcuts, I can at least attempt to understand the potential of this new technology. In this article, I'll try to explain the differences between Siri shortcuts and the Shortcuts app, as well as answer some common questions about how much Shortcuts borrows from the original Workflow app. Let's dig in.
There's an important difference between shortcuts and the Shortcuts app. As a system functionality, a shortcut is a convenient way to reopen or interact with a key feature of an app that the user has previously seen or completed. For instance, in iOS 12 you may see shortcuts for ordering a coffee on your way to work or playing a podcast playlist in a third-party app as you're driving back home. App shortcuts are not workflows; they are the equivalent of a "point of interest" in an app that you can easily find again.
In Apple's parlance, shortcuts are "donated" by apps to the system after a user performs an action in an app. Then, iOS 12 suggests shortcuts in Spotlight search results (where they appear as tappable items featuring the app's icon and labeled action) and on the Lock screen as notification-like bubbles. In watchOS 5, you'll see suggested shortcuts on the Siri watch face.
Shortcut suggestions are generated on-device by taking into account contextual triggers such as the time of day and day of week, the user's location, and detected motion (such as walking or driving). Apple has been developing and refining their so-called Proactive technologies for years now1, but the Shortcuts project marks the company's first foray into deep-linked app actions that react to user patterns and environmental triggers.
There are two types of app shortcuts. The first kind is a basic shortcut that opens a specific piece of content or section in an app. These simpler shortcuts are based on NSUserActivity, an API that Apple first introduced in iOS 8 to enable third-party apps to hand off activity to other devices and later expanded in iOS 9 to offer search results in Spotlight. The same API, with minimal tweaks on the developers' side, is used in iOS 12 to provide users with shortcuts that launch apps into specific screens or activities. I expect developer adoption of shortcuts based on NSUserActivity to be massive when iOS 12 launches later this year; especially for apps that do not require execution of tasks inline within Siri/Spotlight/Lock screen, NSUserActivity should be enough.
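As a sketch of what that minimal adoption might look like, here is a hypothetical NSUserActivity donation for a coffee-ordering app; the activity type string, userInfo key, and function name are illustrative, not taken from any real app:

```swift
import UIKit
import Intents

// Hypothetical sketch: donating an NSUserActivity-based shortcut when
// the user views an order screen. The activity type and keys are
// made up for illustration.
func makeViewOrderActivity(orderID: String) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.coffee.view-order")
    activity.title = "View Coffee Order"
    activity.userInfo = ["orderID": orderID]
    activity.isEligibleForSearch = true
    // New in iOS 12: opt the activity into Siri's shortcut predictions
    // (Spotlight, Lock screen suggestions, the Siri watch face).
    activity.isEligibleForPrediction = true
    return activity
}

// In a view controller, assigning the activity makes it current and
// donates it to the system:
//     self.userActivity = makeViewOrderActivity(orderID: "1234")
```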
According to Apple, while NSUserActivity enables basic integrations with apps, SiriKit Intents provide the best experience for users who want to accomplish more with shortcuts. This is not a new API either, but the way it's surfaced throughout the system is.
SiriKit Intents have been around since iOS 10 as a way for developers to integrate their apps natively with the Siri voice experience and perform actions inline within the assistant. Apple launched SiriKit as a domain-based API designed for specific kinds of apps, and slightly improved it last year with the addition of visual code, list, and note-taking apps. Just as it was becoming clear that Apple's narrow domain-specific approach couldn't scale to thousands of apps that can't be easily categorized, the company is turning SiriKit on its head.
In iOS 12, developers can now create their own custom intents based on built-in semantic templates; furthermore, existing SiriKit Intents can break out of the Siri UI and also work as shortcuts in other places such as Spotlight, the Lock screen, and even the Siri watch face. Apple's approach isn't surprising: if iOS apps can have the ability to perform tasks with custom interfaces and responses outside of the main app environment (as is currently possible with SiriKit Intents), why not expand the same functionality to other types of proactive assistance? With shortcuts, any essential, repeatable feature of an app can become an action that can be executed from anywhere on the system without launching the full app.
The idea of frequent usage and user routine is what separates intent-based shortcuts from traditional SiriKit voice interactions. For example, iOS 12 may learn that, on the way back home from work on Thursdays, you like to order pizza and have it delivered at 8 PM. Or that on an average workday around 1 PM, you open Things into your 'Office' project. These are repeatable actions that developers can turn into shortcuts with custom interfaces2 using the same underlying Intents technology first launched in iOS 10.
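A donation for an intent-based shortcut might look roughly like this. `OrderPizzaIntent` here stands in for a class Xcode would generate from an app's custom intent definition; it is not a built-in SiriKit type, and the invocation phrase is only a suggestion:

```swift
import Intents

// Hypothetical sketch: donating a custom intent after the user
// completes an order, so iOS can suggest it as a shortcut later.
func donateOrderShortcut() {
    let intent = OrderPizzaIntent() // placeholder for a generated intent class
    intent.suggestedInvocationPhrase = "Order my usual"

    // Wrapping the intent in an INInteraction and donating it is the
    // same mechanism SiriKit has used since iOS 10.
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Shortcut donation failed: \(error.localizedDescription)")
        }
    }
}
```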
Developers who are planning to integrate with SiriKit in iOS 12 will have to consider whether users may want to execute actions from their apps elsewhere on the system; those who have shied away from integrating with SiriKit so far should probably look into custom intents now.
This new feature allows any app to offer a custom interface and custom responses that are used when the intent is invoked via Siri or shortcuts. To create a custom intent, developers can choose from building blocks that include verbs such as "do" or "order" and other templates; these actions essentially define how Siri talks about the task it's executing. I'm excited about the prospect of any app becoming eligible for Siri integration; going forward, I expect Apple to continue expanding its custom intent technology as it may open up Siri to hundreds of thousands of new app integrations.
Even though the opposite may seem true, the shortcut features I've described so far do constitute a form of automation. Arguably, suggested shortcuts are system automations – actions to trigger a specific function that are conveniently presented at the best available time or location to anticipate users' needs. Some old-school power users may disagree with me on this, but, more broadly, I consider Apple's Proactive technologies – whether in surfacing a calendar event in an email message or ordering coffee while driving to work – a new kind of automation. Only time and developer adoption will tell if Apple's bet is successful; conceptually speaking, I see suggested shortcuts as an effortless, almost invisible way to get users accustomed to the idea of actions that are automatically surfaced by the OS.
The line between system and user automation gets blurry once we start considering the second layer of Apple's Shortcuts initiative: the ability for users to create custom phrases to launch shortcuts.
Available in Settings ⇾ Siri & Search, iOS 12 features an option for users to define their own phrases for launching specific shortcuts via voice. This is done by speaking a custom phrase into a Siri recording UI that transcribes the command and creates a shortcut that can be invoked at any time. The Settings app automatically suggests recently used app shortcuts as well as other shortcuts that were previously "donated" by apps. Both recording a custom shortcut phrase and launching the phrase via Siri require an active Internet connection. Once given a custom phrase, user-configured shortcuts appear under the My Shortcuts section in Settings.
There are a few details worth noting about adding custom shortcut phrases to Siri. In their apps, third-party developers can embed messages and buttons (which they can design) to bring up the Siri UI to record a shortcut phrase. This means we'll start seeing apps populate important screens or actions with suggestions and buttons to record a shortcut phrase. Moreover, in the Siri recording UI, developers can include a phrase suggestion, but it's up to the user to decide what they want to record.
More importantly, users always have to create personalized shortcut phrases through direct interaction: apps cannot automatically fill the 'My Shortcuts' page in Settings with shortcuts and custom phrases. The user has to associate a custom phrase to a shortcut first.
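That in-app recording flow can be sketched with iOS 12's IntentsUI framework. This is a minimal example, assuming an NSUserActivity-based shortcut; the activity type is illustrative, and a real implementation would also set a delegate to receive the recorded phrase or an error:

```swift
import UIKit
import IntentsUI

// Sketch of presenting Apple's "Add to Siri" recording UI so the user
// can record their own invocation phrase for a shortcut.
func presentAddToSiri(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.view-order")
    activity.title = "View Coffee Order"
    activity.isEligibleForPrediction = true

    // Wrap the activity in an INShortcut and hand it to the system UI;
    // the user, not the app, decides what phrase to record.
    let shortcut = INShortcut(userActivity: activity)
    let addVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    viewController.present(addVC, animated: true)
}
```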
The more I think about it, the more I see custom shortcut phrases as the next big step in making Siri a more personal assistant that is unique to each user. As would happen with an actual assistant, shortcut phrases allow users to form their own language over time, creating a personalized set of instructions that only their assistant can interpret and act upon. It's the equivalent of jargon in a group of friends, but applied to Siri and app actions. The potential accessibility perks are tremendous too: Apple now enables everyone to create custom Siri phrases that can be however long or short they want; this removes the need to find actions nested in apps, multiple levels deep into their navigation stack.
Here's why I believe Apple and the Workflow (now Shortcuts) team have been incredibly smart in reframing the concept of user automation around Siri and voice: when you think about it, custom phrases aren't too dissimilar from keyboard shortcuts. However, spoken phrases are easier to remember – they don't feel like dark magic to regular users who have never bothered with "automation" before, and, most of all, they are natively supported across the entire spectrum of Apple products, from iPhones and AirPods to HomePods and Watches.3
I strongly believe that personalized phrases are the first step towards changing the fundamental Siri experience, which is going to evolve into a personal command log – from one Siri to a million Siris, each uniquely tailored to the user who customized it. Furthermore, custom phrases reveal the third (and, for now, final) layer of Apple's automation and Siri personalization strategy: the brand new Shortcuts app.
The Shortcuts app, announced at WWDC last week, is the new version of Workflow. The app will not be part of iOS 12 itself; instead, it'll be available on the App Store. In conversations I had last week, it appears that Apple's goal is to offer full compatibility with existing workflows previously created in the Workflow app. My understanding is that Apple is very much aware of the fact that a sizable portion of the pro/prosumer community relies on Workflow to enhance their iOS experience in key ways; they don't want to change that relationship for the worse. Very little, if anything, should break in the transition to the Shortcuts app; in fact, I'm optimistic about the prospect of retaining all the actions from the original Workflow plus new ones created specifically for Shortcuts.
At first glance, Shortcuts looks like a cleaner, more intuitive version of Workflow designed for the modern iOS 12 design language. The app lets you create workflows – now referred to as "custom shortcuts" – with an editor that, just like the Workflow app, supports drag and drop to move actions from a library (organized in content types) into the shortcut editor. In the Shortcuts app, Apple removed the swipe gesture to navigate between the action library and editor in favor of a bottom panel that is reminiscent of Apple Maps and Stocks in iOS 12. A search field is always available at the bottom of the editor; tap on it, and you'll be able to view all the actions Shortcuts offers. Abundant design refinements and new action panel aside, the editor's core structure looks just like Workflow's.
Despite the close resemblance, Shortcuts isn't just a redesigned version of Workflow. And it's also more than a glorified utility for people who want to geek out on an iOS device. As the "third layer" after suggested shortcuts and custom phrases, custom shortcuts are the most versatile tool for every iOS user who wants to deeply personalize Siri, automate sequences of actions, and, yes, even augment their productivity. The Shortcuts app aims to be a powerful blend of Workflow, app shortcuts, and Siri all rolled into one as a new take on personal assistants and iOS automation. It's a bold idea that keeps what made Workflow unique while also opening it up to a broader user base and deeper system integrations.
There are a few key elements to consider. First, app shortcuts – the aforementioned actions donated by developers with NSUserActivity or SiriKit Intents – can be part of a custom shortcut created in the Shortcuts app. These shortcuts4 should either appear under Siri Suggestions or Apps in the action library. The inclusion of these actions in the Shortcuts app is a big deal: for the first time, users can create chains of actions that execute native third-party app commands without launching apps through URL schemes. Whether you want to launch an activity in an app or perform an action, these native actions won't require you to write any code or talk to any web API – and in the case of intent-based actions, they will run inline within the Shortcuts app itself.
In a way, this is the first step toward the WorkflowKit framework I imagined last year – a solution for apps to be supported in an automation environment without the limitations and security concerns of URL schemes. What I couldn't foresee was that Apple would reuse SiriKit for this. I don't think these new integrations will obviate the need for more customizable URL scheme actions just yet (more on this below), but it's a move in the right direction.
Perhaps more impressively, it seems that, when assembling a custom shortcut, users will be able to choose whether native third-party app actions display their interface via a new 'Show When Run' toggle. If I were to guess, I'd say this setting applies both to shortcuts run in the Shortcuts app and to the execution of multiple steps in Siri.
Which brings me to the second notable trait of the Shortcuts app: custom phrases. Just like shortcuts provided by apps to the system can be assigned a custom Siri invocation phrase, custom shortcuts from the Shortcuts app can be assigned a phrase to quickly trigger them from Siri.5 This ties into another key functionality of custom shortcuts: whenever possible, Siri on iOS 12 will try to run the multiple steps that comprise a custom shortcut inline, without launching the Shortcuts app; it'll do so simply by moving down the sequence of actions and confirming results to the user.
This was demonstrated by Apple's Kimberly Beverett at last week's keynote: with a custom shortcut, Siri was able to send an iMessage to a contact, set a HomeKit scene, play audio in a third-party radio app, and open directions in Maps – all in a single request that ran contextually inside Siri. The demo showcased two powerful aspects of Shortcuts' integration with Siri: background execution and the ability to skip confirmation steps in a series of automated actions. Just like in a traditional "workflow", Siri completed multiple actions in a row, displayed a summary message, and only launched an app at the very end. I cannot even imagine all the advanced custom shortcuts I could build by mixing background execution with web APIs6, native app shortcuts, and Siri in the same sequence of actions.
Execution of multiple background tasks inside Siri feels to me like the final piece of the Workflow acquisition puzzle. If you consider all the layers I covered above – app shortcuts, custom phrases, and custom shortcuts – it's clear that Apple wants to transform Siri into an assistant capable of not only handling a variety of app-related requests but, more importantly, bundles of sequential requests that are routinely requested by users. Of all the Shortcuts features I've seen so far, running whole sequences of steps in Siri is the one I'm most curious about.
Below, I've assembled a collection of details about shortcuts and the Shortcuts app that I was able to put together by rewatching the WWDC videos, as well as having lots of interesting conversations last week.
Magic Variables and Scripting actions are in. Two of Workflow's most advanced functionalities will continue to be supported in Shortcuts and will likely play an essential role in the creation of complex chains of actions.
Magic Variables, introduced in Workflow 1.7, enable actions to continuously create variables behind the scenes, letting users dynamically convert them between types using the Content Graph engine. Magic Variables are unique to Workflow and they have dramatically changed how advanced users can chain actions together. Judging from Apple's screenshots of the Shortcuts app, Magic Variables will be supported in custom shortcuts and users will also be able to include them as part of a response read aloud by Siri.
I'm also glad to see the inclusion of the Scripting category of actions in Shortcuts. Currently, the Scripting section of Workflow features options such as conditional blocks, repeat loops, getting data types and setting file names, and even fetching device details like battery and network information. All signs are pointing to Shortcuts retaining the same functionalities.
Shortcuts has an action extension for the share sheet. One of Workflow's most powerful (and flexible) system integrations is the action extension that lets users run a workflow contextual to the app they're in and the item they're sharing. From what I hear, Apple plans to keep the same action extension for the Shortcuts app. As someone who regularly uses dozens of workflows activated from the extension in apps like Safari, Ulysses, and Photos, I look forward to adapting them to the Shortcuts extension.
The Gallery stays, but questions remain about public sharing. As confirmed by Apple, the Shortcuts app will have a built-in Gallery for users to browse and download hundreds of pre-made custom shortcuts for different tasks. Shortcuts will be organized in categories and curated by Apple on a regular basis. The Gallery isn't new to the Workflow app, and it's unclear if public sharing of user-created custom shortcuts will be part of it. Shortly after Apple acquired Workflow, the company removed the ability for users to share workflows directly to the gallery with public user profiles; it seems like Shortcuts will follow suit, with a Gallery limited to custom shortcuts created and promoted by Apple.
I am confident that the Shortcuts app will continue to offer ways for users to share custom shortcuts with each other, but I believe sharing won't be tied to the Gallery, at least initially. That said, I would love to see user sharing return to the Gallery in a future update as a curated marketplace of custom shortcuts created by the community and vetted by Apple. I see tremendous potential in letting the iOS community natively extend Siri and iOS apps.
HomeKit integration. The Shortcuts app will support HomeKit; during the keynote, Apple demonstrated how a custom shortcut could set a thermostat to a specific temperature and toggle a fan on and off. We haven't actually seen how HomeKit actions can be set up in Shortcuts yet though, so it's not clear if HomeKit actions will let you configure individual characteristics of accessories with granular options. I wouldn't be surprised if, for simplicity's sake, the first version of Shortcuts only supports triggering existing HomeKit scenes.
There is a new Show Result action for Siri output. Judging from the slides shown at WWDC, this is the action that will let users craft custom Siri responses in the Shortcuts app. The Show Result action can be filled with Magic Variables and arbitrary plain text; when a custom shortcut is run via Siri, the assistant will speak the text contents of the action.
I can already imagine the possibilities this action opens up – such as the ability to end up with different Siri responses depending on the result of a conditional block in Shortcuts. I'm curious to know what happens to Show Result when a custom shortcut is run outside of the Siri UI though.
No support for passing input to a custom shortcut from Siri. My understanding is that triggering a custom shortcut from Siri won't allow you to pass along a message as input text. For instance, if you have a custom shortcut that sends the input text to Ulysses and have associated the "Ulysses append" phrase to it, you won't be able to say "Ulysses append Buy Nintendo Switch" to Siri and expect the "Buy Nintendo Switch" part to be sent to the Ulysses app.
From what I've been able to gather so far, Siri in iOS 12 doesn't currently support the ability to pass an input message to a custom shortcut activated with a user phrase, and it's unlikely to gain such functionality in the first version of iOS 12. This means that Siri will only be a launcher for custom shortcuts, not an actual interface that can pass user commands for specific parameters at runtime. There's a lot of complexity involved in this, and I assume it is one of the next big features in the pipeline for Shortcuts.
Native app shortcuts don't support custom input, output, or customizable fields. On a similar note, native app shortcuts based on SiriKit Intents that execute within the Shortcuts app can't receive a custom input from previous actions. They also can't set Magic Variables as custom output and don't have customizable parameters. As shown in Apple's screenshots of the Shortcuts app, native app actions are disconnected from every other step in a custom shortcut, which is likely going to limit their flexibility for advanced users.
URL scheme actions for third-party apps should remain available. In addition to manually launching URL schemes with the 'Open URLs' and 'Open x-callback-url' actions, Workflow has long offered a selection of built-in third-party app actions that are based on URL schemes but abstract that complexity with a visual module. For example, Bear, Things, and Ulysses come with native Workflow actions that can pass along custom parameters when launching the respective apps.
After talking to several developers at WWDC, it sounds like there's a good chance third-party app actions powered by URL schemes should remain in the Shortcuts app as well. Personally, I think these will continue to be solid workarounds until SiriKit Intents are powerful and customizable enough to offer the same functionality of URL scheme actions. Ideally, in a future version of Shortcuts, these actions should be replaced by visual SiriKit Intents that can be customized with multiple variables and parameters by users. For now, it seems like traditional URL scheme actions will still allow for deeper customization and control than native app shortcuts.
Shortcuts has a widget with limited user interaction. As shown on Apple's website, the Shortcuts app will keep the same widget that lets Workflow users run workflows from outside the app.
The widget is one of Workflow's most peculiar features: it supports remote execution of workflows with basic interactivity, but it kicks you back to the main app if it comes across an action that can't be completed from the widget, such as entering text with the keyboard or editing an image. The idea of running the same sequence of actions in different, more constrained environments brings me to...
Running custom shortcuts in Siri and audio-only contexts. Because Shortcuts has a widget, and because Apple said custom shortcuts will be offered on iOS, watchOS, HomePod7, and even CarPlay, I can then infer that the ability for the same shortcut to run in different contexts is expanding to audio and the watch face's limited UI. Just like the widget, I assume this means a custom shortcut will completely execute within Siri (whether on HomePod or the Watch) unless an action requires manual user input. In that case, Siri would probably ask you to continue running the shortcut on your iPhone.
If this is the case (and I think my explanation is mostly accurate), I can imagine that custom shortcuts that embed actions such as 'Choose from List' or 'Ask for Input' will require jumping from Siri to the Shortcuts app. I would be completely okay with this as a first version. Generally speaking though, I'd love for Siri to give me multiple options to choose from a list, allow me to enter input via voice, and interact with a shortcut while it is executing in an audio-only context.
The new Play Media intent. Finally, speaking of audio: SiriKit in iOS 12 supports a new INPlayMedia intent which, as the name suggests, allows the assistant to play audio and video content from third-party apps. The intent can launch apps in the background (such as the radio station demo from the keynote) and supports playing audio on HomePod as well as suggestions on the Lock screen.
In terms of app adoption and shortcut integrations, this intent should make it possible for the likes of Spotify and Overcast to offer users a way to play their favorite content via Siri just like they can with Apple Music and Podcasts. Overcast and other third-party podcast apps could, for instance, offer Siri shortcut buttons in various places of their UIs to let users record phrases such as "Play my Overcast queue" or "Let's listen to Connected"; playback would then kick off immediately in the background and play through the device's speakers, HomePod, or other devices. If my interpretation of this is correct, the combination of shortcuts and the new Media intent may alleviate a lot of the annoyances typically involved with using Siri and third-party media services.
In iOS 12, Apple is providing users with a path from simple shortcuts to custom automation; both revolve around the guiding principle of letting users choose how they want Siri to get things done on their behalf. There is a progressive disclosure of automation happening from system shortcuts to the Shortcuts app: whether you've never dabbled with app shortcuts before or are a Workflow expert, iOS 12 wants to help you spend less time using your phone – a recurring theme this year – and let shortcuts do the work for you.
Shortcuts are going to be the feature for developers to adopt this summer. It was the talk of WWDC 2018 and, based on my conversations, developers are excited and optimistic about embedding SiriKit and shortcuts within their apps. In particular, custom phrases and custom SiriKit intents seem to be the most attractive proposition for developers who want to let users conveniently open and interact with their apps.
Shortcuts and custom phrases feel like the future of Siri and apps: they're supported in every permutation of Siri and, most importantly, they let users develop their own language to access commonly used actions instead of forcing them to remember a syntax made of app names and verbs. Shortcuts, phrases, and custom intents feel like an app- and user-centric response to Alexa skills that Apple is uniquely positioned to build.
While shortcuts are a way to "sell" the idea of lightweight automation to regular users, the Shortcuts app is shaping up to be the automation powerhouse we were hoping to see following the Workflow acquisition. From what I've seen so far, the Shortcuts team has been able to build a cohesive narrative around basic shortcuts and custom ones, going far beyond what Workflow could have achieved as an independent app. I'm optimistic that heavy Workflow users won't be disappointed by Shortcuts.
Even more than feature parity with Workflow though, I see integration of custom shortcuts with Siri as the next frontier for making automation accessible to more people. I believe this is what's going to push automation forward as something more than "scripting" and other old preconceptions. Giving users the tools to create automations with drag and drop and easily trigger them with their voice is a remarkably powerful idea; it can turn Siri into a truly personal, programmable assistant capable of performing entire series of actions with just one request. I don't think I've ever seen anything even remotely similar to Apple's demo of the Shortcuts app and Siri integration on other platforms.
Some people, however, may argue that this isn't real support for multiple commands in Siri; after all, you still have to create a custom shortcut with your favorite actions and manually set it up for Siri. And maybe the Shortcuts app is a way for Apple to circumvent the fact that Siri, unlike Google Assistant, isn't capable of handling multiple questions in the same sentence yet. Ultimately however, it all goes back to whether you see the beauty and potential of user automation or not. With custom shortcuts, you won't even have to speak entire sentences containing multiple requests every time you want to execute them; you can just tell Siri a short custom phrase and it'll fire off multiple steps on its own.
From my perspective, this is exactly what automation is about: making a computer more useful, accessible, and faster so that we can save time to focus on something else. Custom shortcuts and the Shortcuts app show that not only is this vision still very much alive inside Apple, but it's evolving with the times too.
Permalink - Posted on 2018-06-13 12:32
In this very special live episode, Stephen is joined by Jason Snell and Serenity Caldwell to talk about macOS Mojave and Screen Time before going over the Happy-o-meter results and talking about Shortcuts with Myke and Federico.
Recording this episode of Connected last week was one of my highlights from WWDC. If you still don't know the results of our Happy-o-meter, now's a great time to catch up. You can listen here.
→ Source: relay.fm
Permalink - Posted on 2018-06-12 20:31
In an update rolled out last week, Apple fixed two of my longstanding annoyances with Apple Music: there is a new screen that lists popular albums coming soon, and every upcoming album now features an actual release date.
Apple appears to be rolling out a series of updates for Apple Music today, including a small but useful new section called "Coming Soon," which allows subscribers to check out new albums about to be released over the next few weeks.
In another addition, Apple is now making it possible to easily see album launch dates on their respective pages on iOS and macOS. In the Editors' Notes section, following the traditional encouragement to add the pre-release album to your library, there's a new line that begins "Album expected..." followed by the album's specific release date. Some albums not listed in Coming Soon still have a release date specified on their pages, so this update appears to be a bit more wide-ranging.
As someone who likes to keep up with new music, I'm glad to see Apple pushing these small but needed improvements to the service.
Furthermore, as noted by AppleInsider, the iOS 12 version of Apple Music features the ability to search for songs by lyrics. I've been using the beta on my iPhone and iPad for the past week, and lyrics search has already saved me a few minutes I would have otherwise spent looking for songs on Google. Built-in lyrics differentiate Apple Music from Spotify, so it's good to see Apple expanding support throughout the app.
→ Source: macrumors.com
Permalink - Posted on 2018-06-12 19:43, modified on 2018-06-13 12:02
Speaking of smaller features I wouldn't have expected to see at last week's WWDC, Bryan Gaz, writing for Digital Photography Review, has noticed some welcome improvements to camera import and RAW files in iOS 12:
Now, when you plug in Apple’s SD card to Lightning adapter (or camera connection kit), the Photos app will show up as an overlay on whatever app you’re using. This comes as a much less invasive method than previously used in iOS 11, wherein whatever app you were in would be switched over to the full-screen Photos app for importing. It also means you can multitask more efficiently, importing photos while getting other stuff done.
Now, when photos are detected on a card, iOS 12 will automatically sort through the content and determine if any of the photos have already been imported. If they have, they will be put in a separate area so you don’t accidentally import duplicates. Another new feature is a counter on the top of the screen that lets you know how many photos are being displayed and how much space they take up on the memory card. This should help alleviate the guesswork involved when trying to determine whether or not you have enough storage on your iOS device.
I've never imported photos on my iPad using the Lightning to SD Card Camera Reader because I don't have a camera, but I know that the import process is one of the pain points for photographers who want to use an iPad in their workflows. The idea of having Photos show up automatically in Slide Over upon connecting an external device is interesting; it perfectly ties into the iPad's focus on drag and drop for multitasking and file transfers. It seems like this approach would work nicely for importing files from external USB devices if only Apple decided to add support for those too.
Update: After looking into this more closely, it appears that Photos only appears automatically upon connecting an SD card if it's already in Slide Over mode. This isn't as convenient as DPReview's original report suggested, but at least all the other improvements mentioned in the story are indeed part of iOS 12.
→ Source: dpreview.com
Permalink - Posted on 2018-06-12 18:36
When you've followed Apple for several years, there are certain kinds of announcements you come to expect from the company: iterative refinements that make existing products better, exciting surprise features you never would have thought of yourself, and new hardware that seems like something straight out of the future. There are other kinds of announcements, however, that you're confident will never come to fruition, perhaps because they seem like something Apple wouldn't do, or because the company doesn't seem to really care about them.
Every now and then, to our surprise and delight, those unexpected things come about after all. Looking back on last week's news from WWDC, there are several big and small announcements Apple made that hit me as totally unexpected.
I knew that Apple must have had a plan for Workflow before they acquired the app and its team, but I never allowed myself to hope for an outcome as exciting as Shortcuts. While there are clearly a ton of possibilities opened up by the new Shortcuts app, the Shortcuts announcement that surprised me most is that we will soon be able to set up our own custom phrases and actions for Siri. Apple has long touted how it strives to make Siri interactions more human and natural by supporting a wide range of phrases for various commands. There's no more natural way to interact with Siri, though, than using the exact phrase you've personally programmed it to accept. Just like a real-life assistant receives training to understand your needs, Shortcuts mean you'll be able to train Siri to assist you in exactly the ways you want to be assisted. With this one announcement, Siri has the potential to move from being useful in a limited number of domains to having virtually limitless capabilities. Needless to say, Shortcuts were a very pleasant surprise.
One of the reasons I moved on from the Mac in 2015 is that Apple – both in word and deed – declared the iPad "the future of personal computing." iOS has been the more exciting of Apple's software platforms for years now, while the Mac has been left languishing. Though it has never been entirely left for dead, its lack of substantial updates and improvements has left many constantly wondering when that day would come. And while it's possible that at some point Apple was planning to sunset the Mac, this year's WWDC showed that's absolutely not the case anymore. In fact, I think the Mac announcements made last week are the clearest indication so far of the Mac's path forward into the future.
Apple continues making the Mac more iOS-like, as it has for years, but now the end goal is clearer: keeping the Mac focused on professional users while the iPad satisfies the mass market. The definition of "professional" is, of course, up to the users themselves – and there will always be some people who prefer a Mac to the iPad not because of their type of work, but because they're simply more comfortable with traditional forms of computing. For everyone else, however, the Mac of the future seems like it could become a best-of-both-worlds option for professional needs: iOS-inspired features and the thriving iOS app ecosystem, paired with the power and system flexibility that Mac users value so much. It's telling how many of macOS Mojave's new features are focused on making professional users' lives better, from Dark Mode to Continuity Camera to Finder improvements like Quick Actions.
I've been a very content iPad Pro user for nearly three years, but now, to my great surprise, I'm once again excited about and intrigued by the Mac.
With Music, TV, and Podcasts as precedent, I had no doubt the new Apple Books app would look just like Apple's other media apps, with only minor differences as needed. What we actually got, however, is a new design unlike anything else on iOS. Books is a beautiful app, crafted in a way that's meant to make book-lovers feel at home. Its use of a serif font for all header text, alongside extensive use of drop shadows and other UI elements seldom found in other Apple software, makes it almost seem like a third-party app. While I personally would have been fine with Books looking just like the other iOS media apps, it's nice to see Apple's designers experiment with diverse designs on the platform.
Count this as one surprise that wasn't very pleasant. I fully expected there would be some change to either the iPad or the iPhone X's gesture system this year, but I expected that to apply primarily to Control Center. My hope was that the iPhone X would move Control Center into a joined view with the app switcher, like the iPad introduced in iOS 11. Just like the iPhone's app switcher currently slides in from the left edge of the screen, I think it would work well to have Control Center slide in from the opposite side. While this would have been my first choice, a second option I thought possible is that the iPad would adopt the iPhone X's Control Center placement, and that's it. What we got instead was a full adoption of the iPhone X's gestures in a way that makes things like accessing the dock more difficult than ever. I know this is likely due to a Home button-less iPad Pro coming soon, but here's hoping the system sees some refinements over the summer beta season.
The new Password Manager API in iOS 12 enables apps like 1Password to take advantage of the single best feature iCloud Keychain previously had going for it: one-tap autofill of login information in Safari. Maybe this is the real reason iOS 11.3 started requiring user input before Keychain populated Safari login forms; in any case, that minor inconvenience was well worth it to enable quick access to my 1Password vaults.
— 1Password (@1Password) June 5, 2018
I don't have a vehicle with CarPlay, nor do I use third-party navigation apps very often, but like the Password Manager API, this just felt like one of those things Apple didn't need to do, so I expected they wouldn't. CarPlay is a nice benefit to being in the Apple ecosystem, but it can also potentially be a secret weapon for pushing users to give Apple Maps a legitimate try. Now Google Maps users will be able to continue avoiding Apple's navigation service with little inconvenience.
Perhaps more so than any other announcement I've mentioned, if you had told me ahead of WWDC that Apple would be adding a full web browsing experience to the Apple Watch, there's no way I would have believed you. Fortunately, I would have been wrong. While a full-fledged Safari app for the Watch would be a little much, I love what Apple's done to enable viewing web content sent through messages or email. It's just another small way the Watch is becoming truly independent from the iPhone.
As much as I appreciate the often predictable nature of Apple, surprises like these are always nice to have too. They make following the company more interesting, and leave a lot to look forward to when all the new software updates reach the public this fall.
Permalink - Posted on 2018-06-12 17:47
Football fans know that the World Cup kicks off later this week, and ahead of that event Apple today shared a press release highlighting how it plans to cover the event using a host of its services and apps. World Cup support will include the following:
While I'm not a football fan (unless we're talking American football), Apple's World Cup coverage is exciting to me simply because it shows the potential for future integrated efforts around topics I do care about.
As Apple moves deeper into focusing not just on tech, but on media as well, it will have an increasing number of opportunities to use its apps and services to supplement a user's experience of big events, such as other major sporting events, election seasons, and more. Applying the Apple ecosystem's unified media and editorial services to the area of pop culture may seem like a small move, but it could become a product differentiator that users grow to love.
Permalink - Posted on 2018-06-12 16:01
One of the marquee features that Apple showed off for macOS Mojave at WWDC is Dark Mode. As the company demonstrated during the WWDC keynote, Dark Mode is a far more ambitious feature than the dark theme added to macOS Yosemite in 2014. The new look extends much deeper into the system, affecting everything from app chrome to window shadows and Desktop Tinting.
There is a lot more to Dark Mode than you might assume. To help developers navigate when and how to implement Dark Mode, Apple has published guidelines, which Stephen Hackett covers on 512 Pixels:
The biggest is that not all apps should always follow the Appearance that has been set by the user. As before, Apple believes that media-focused tools should be dark at all times. I don’t foresee something like Final Cut Pro X gaining a light theme anytime soon.
Apple has also given developers the ability to use the Light Appearance in sections of their applications. One example is Mail, which can use the Light Appearance for messages, but the Dark Appearance for its window chrome, matching the system[.] This lets text and attachments be viewed more easily for some users. I think it’s a nice nod to accessibility for text-heavy apps, and I hope third-party developers take advantage of this ability.
Hackett also covers Accents – an adaptation and expansion of what are currently called Appearances, which affect the look of things like drop-down menus – and how Accessibility features affect Dark Mode.
I like the look of Dark Mode a lot and hope third-party developers adopt it quickly. I expect the pressure to add Dark Mode to existing apps will rapidly increase as more and more third parties begin to use it and hold-out apps become bright, glaring reminders among a sea of muted windows.
→ Source: 512pixels.net
Permalink - Posted on 2018-06-12 13:28, modified on 2018-06-18 11:15
On this week's episode of AppStories, we discuss the apps on our current iPhone X Home screens.
→ Source: appstories.net
Permalink - Posted on 2018-06-11 16:03, modified on 2018-06-12 16:59
LumaFusion is the most powerful multi-track video editor ever created for iOS devices. Used by journalists, filmmakers, music video artists, and professional video producers to tell compelling visual stories, LumaFusion delivers affordable power and flexibility in an intuitive mobile interface.
The functionality packed into LumaFusion is remarkable. The app is full of features that match or exceed expensive desktop apps like Final Cut Pro X and Adobe Premiere. LumaFusion delivers because it’s built by professionals who understand video editing. Terri Morgan and Chris Demiris, who met at Avid and founded Luma Touch in 2013, have had long careers in the video industry, which made them the perfect pair to bring sophisticated video editing to iOS.
Since its introduction in 2016, LumaFusion has attracted a loyal following among video artists. The app allows users to create complex multi-track stories with up to three video/audio tracks in resolutions up to 4K, up to three additional audio tracks, and dozens of transitions. Titler functionality allows you to create unlimited layers of text, shapes, and imported graphics too.
Users have quick, direct access to color correction, multiple key-framed effects on every clip, and professional features like Insert and Overwrite mode, Clip Linking for sync management, and much more. There’s support for audio channel mapping too, which allows users to shoot with separate background and voice microphones and then separate the stereo file into two mono tracks for independent control of each track.
This past spring, LumaFusion introduced a custom integration with GNARBOX, which allows users with large amounts of media to wirelessly preview and edit media from GNARBOX right inside LumaFusion while importing media that’s added to a timeline in the background. Later this summer, LumaFusion will introduce a Pro I/O Pack with professional features like external monitor support, batch export, and .xml project export.
For more details about LumaFusion, visit the Luma Touch website and watch their video tutorials. For a limited time, you can get started with LumaFusion for just $19.99 (regularly $39.99), which is a small fraction of the cost of comparable desktop apps.
Our thanks to LumaFusion for sponsoring MacStories this week.
Permalink - Posted on 2018-06-06 22:12
Now that people have had a chance to dig deeper into macOS Mojave, a number of smaller features have been discovered that didn’t get mentioned during the keynote on Monday and weren’t included in our initial overview of the updated OS that will be released in the fall. Here are a few of our favorite discoveries:
macOS Updates in System Preferences. What Apple didn’t explain when it updated the Mac App Store is that macOS updates have been moved from the Mac App Store to System Preferences.
Software Update is now in Settings instead of the Mac App Store! pic.twitter.com/4qWoeTUbxC
— Ian McDowell @ WWDC (@ian_mcdowell) June 5, 2018
HomeKit Support for Siri. Among the iOS apps ported to macOS as part of the upcoming release of Mojave is Home. The app does not currently support AirPlay 2, but control of HomeKit devices is not limited to the Home app itself; Siri can also be used to control devices.
System-Wide Twitter and Facebook Support Removed. In High Sierra, users could log into Twitter and Facebook from the Internet Accounts section of System Preferences and share content using the share button in apps like Safari. Following iOS 11's lead, the Mojave beta removes system-level support for sharing content via Twitter and Facebook.
The Final Version to Support 32-Bit Apps. During the State of the Union presentation, Apple confirmed that Mojave will be the last version of macOS to support 32-bit apps. When a user tries to open a 32-bit app, Mojave currently displays a one-time warning that the app will not work in future versions of macOS.
Favicon Support in Safari Tabs. Unlike Google’s Chrome browser, macOS doesn’t currently support favicons in Safari tabs. According to an article by John Gruber last summer, that led a significant number of people to use Chrome and third-party solutions like Faviconographer, which overlaid favicons on Safari’s tabs. When Mojave ships, Safari will add support for tab favicons, which are coming to iOS too.
Apple Mail Stationery Removed. According to the release notes for the macOS Mojave beta, Stationery, the HTML email feature that allowed users to choose from pre-built email templates, has been removed from the app.
Permalink - Posted on 2018-06-06 16:47
Sonos just announced that AirPlay 2 is coming to “newer” Sonos speakers in July. Unlike using Apple Music on the HomePod, it will stream music from your phone instead of directly over the internet. However, unlike the HomePod you will be able to control some of the AirPlay 2 music with Alexa. You can launch music on your iOS device in all the normal ways, including with Siri.
Essentially, Sonos’ software system is aware of what is playing on your speakers, no matter the source. It’s a clever way to make AirPlay 2 a little more useful. Once the music is playing via AirPlay 2, you can use Alexa to pause, go to the next track, and even ask what’s playing.
For the platform-agnostic user – the exact user Sonos has focused on pitching its products to lately – this kind of blending together of different assistants and ecosystems may carry a lot of appeal. Since Alexa is the sole voice service currently available on Sonos speakers, the ability to control AirPlay 2 playback with Amazon's assistant is key. I do wonder, though, if mixing and matching different services might be overly confusing for the average user. With AirPlay 2 support, you'll be able to use Siri on your iPhone to start streaming audio to a Sonos speaker, but you can't start that playback with Alexa. Once audio's already playing, though, that's when Alexa steps in. I appreciate the variety of options, but it sounds like those options bring with them a lot of restrictions to remember.
As for hardware compatibility of AirPlay 2, it will be available on a limited number of Sonos devices:
AirPlay 2 will work with the Sonos One, (second generation) Play 5, and Playbase (and, ahem, “future products”). But if you have older speakers, owning any of those newer ones will make AirPlay 2 work with all of them.
That last line is intriguing, though unclear. Older devices can't actually become AirPlay 2 speakers; otherwise they would appear in the Home app as HomeKit devices. However, it makes sense that an existing HomeKit device that talks to older Sonos devices could serve as a translator of sorts, relaying AirPlay 2 commands over Sonos-native protocols.
We'll see how it all works when AirPlay 2 support arrives next month.
→ Source: theverge.com
Permalink - Posted on 2018-06-06 16:19
It's tough selling a paid up front app on the App Store. Users have no way of knowing ahead of time whether an app will fit their needs or not, and no one wants to spend money on an app only to find that it wasn't what they expected. Fortunately, App Store review guidelines have been updated this week to address that problem. Matthew Humphries reports for PCMag:
The updated guidelines state that, "Non-subscription apps may offer a free time-based trial period before presenting a full unlock option by setting up a Non-Consumable IAP item at Price Tier 0 that follows the naming convention: "14-day Trial." Prior to the start of the trial, your app must clearly identify its duration, the content or services that will no longer be accessible when the trial ends, and any downstream charges the user would need to pay for full functionality."
So users will know before they start using an app that it will cost money, but only after X days of free use. The upfront transparency should prevent any user frustration, but it could also greatly improve the quality of content in apps because the developer really needs the user to reach the end of the free trial wanting to pay to continue using/playing.
This isn't necessarily a change of policy, but more an explicit clarification of something that's already been allowed. The Omni Group, for example, began switching its entire suite of apps to this sales model in September 2016: free downloads, with In-App Purchases unlocking full functionality after 14-day trial periods. Since then, however, very few apps have followed the same path – likely in part due to continued uncertainty about what's officially allowed. The updated review guidelines should lead to a welcome increase in paid up front apps transitioning to free downloads with In-App Purchases, enabling more ubiquitous free trials across the App Store.
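For context on how the trial mechanics work in practice: because the Price Tier 0 IAP is simply a free purchase, enforcing the 14-day window is up to the app itself, which records the purchase date and compares it against the trial length before unlocking or re-locking functionality. A minimal sketch of that check (the function and dates below are illustrative assumptions, not part of Apple's guidelines):

```python
from datetime import datetime, timedelta

TRIAL_LENGTH = timedelta(days=14)  # matches the "14-day Trial" naming convention

def trial_active(purchase_date: datetime, now: datetime) -> bool:
    """True while the free-trial purchase is still within its 14-day window."""
    return now - purchase_date < TRIAL_LENGTH

purchased = datetime(2018, 6, 1)
print(trial_active(purchased, datetime(2018, 6, 10)))  # True: day 9 of 14
print(trial_active(purchased, datetime(2018, 6, 20)))  # False: trial expired
```

In a real app, the purchase date would come from the App Store receipt rather than a local clock, so the trial can't be reset by simply reinstalling.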
→ Source: pcmag.com
Permalink - Posted on 2018-06-05 19:51, modified on 2018-06-12 19:41
Every year when Apple introduces the latest versions of its software platforms at WWDC, information streams out in two major phases: we get the biggest, most important announcements during the opening keynote, then afterward, once the new beta builds are in the hands of developers, we find out all the additional details not meriting on-stage attention. In that vein, here's a roundup of all the smaller details we've discovered so far in iOS 12 and watchOS 5 that weren't covered in our initial overviews.
Chapter Support in Podcasts. Apple Podcasts has long had support for AAC chapter markers, but most podcasts – particularly lots of tech-focused podcasts – are formatted as MP3s, so their chapters wouldn't be available inside Apple Podcasts. That changes in iOS 12 though, as Podcasts now fully supports MP3 chapters for the first time.
— Ryan Christoffel (@iryantldr) June 5, 2018
The new Podcasts app also lets you adjust durations for skipping forward and back, and car and headphone controls can be configured to skip forward and back rather than changing podcasts altogether.
Richer Notifications. Similar to watchOS 5's new ability to load web content in things like Mail notifications, now rich notifications in iOS will automatically do the same. With Mail notifications this means that rather than seeing a plain text preview of a message, you'll be able to see a more accurate rendering of the message's contents.
Re-Trigger Face ID. While I'm a huge fan of Face ID, there's no denying I get more failed authentication attempts with it than with Touch ID. At first when this would happen, I would hit the side button to put the iPhone to sleep, then Raise to Wake to give it a second try. Fortunately, I later learned you could re-trigger an authentication simply by leaning the device away from you, then bringing it back where the TrueDepth camera can see you. This is a much better alternative, but it turns out Apple has an even better solution in iOS 12: when Face ID fails to authenticate, you can trigger another attempt simply by swiping up from the bottom of the screen, just like you do to go Home.
Expanded Markup Colors. The set of Markup tools used throughout much of iOS for screenshot annotating and more has received a welcome improvement: there's now a much wider selection of color options. The five core colors from previous versions of iOS are still front and center, but you can now open a new color picker window containing a whopping 120 additional colors.
iPhone X Safari Tabs. Despite a display size that actually measures larger diagonally than Plus-sized iPhones, the iPhone X adopts the size class model of 4.7-inch iPhones, which causes it to lose some UI benefits, particularly in landscape mode. One of those drawbacks is rectified in iOS 12, however: Safari will display an iPad-like tab view when your iPhone X is being used in landscape.
Force-Quitting Apps on iPhone X. In iOS 11, force-quitting apps was more difficult on the iPhone X than any other iOS device, because it required first pressing and holding on an app in the app switcher to activate force-quit mode, after which you could swipe up on apps to force-quit them. iOS 12 does away with the extra step, causing the iPhone X to fall back in line with other iOS devices with a simple swipe up to close an app.
New Wallpaper. As of beta 1, there is only a single new wallpaper in iOS 12, and it's available on both iPhone and iPad.
Automatic System Updates. Joining the already-present setting to turn on automatic updates for apps, you can now activate a similar option for system updates. Turn on automatic updates by visiting Settings ⇾ General ⇾ Software Update.
Search Lyrics in Apple Music and Tweaked Artist Profiles. Apple Music is largely unchanged in iOS 12, except for a couple tweaks: you can now use lyrics as a search term to find songs, and artist profiles feature larger artwork and a new button to shuffle the artist's songs.
Customizable Control Center. After iOS 11 introduced a fully customizable Control Center, Apple Watch owners immediately put that feature on their watchOS wishlists, and fortunately we didn't have to wait long. While Control Center in watchOS 5 isn't quite as customizable as its iOS counterpart – you can't disable any of its options – it does allow you to rearrange all its toggles to your heart's content thanks to a new Edit button.
Overnight Updates. Several years into its life, the Apple Watch is still the most painful Apple device to download updates for. One new feature in watchOS 5 that will hopefully help alleviate the pain is overnight updates. Little is known about this feature because it doesn't appear to be present in the current beta; however, the word cloud doesn't lie.
Walkie-Talkie Details. Though the full Walkie-Talkie app isn't available in the beta of watchOS 5 just yet, The Verge published an article that shared some interesting details behind how the app works: essentially, Walkie-Talkie communications take place over FaceTime audio, only a special branch of FaceTime audio that automatically mutes the line at all the appropriate times.
Change Wi-Fi Network. In watchOS 5, you can switch to a different Wi-Fi network directly from the Watch by visiting Settings ⇾ Wi-Fi.
With the beta season just beginning, it's likely that even more details of changes in iOS and watchOS will be discovered in the coming days and weeks, particularly as future beta versions ship. It's going to be a fun summer.