There’s a bug in iOS that disables Wi-Fi connectivity when devices join a network that uses a booby-trapped name, a researcher disclosed over the weekend.
By connecting to a Wi-Fi network that uses the SSID “%p%s%s%s%s%n” (quotation marks not included), iPhones and iPads lose the ability to join that network or any other networks going forward, reverse engineer Carl Schou reported on Twitter.
After joining my personal WiFi with the SSID “%p%s%s%s%s%n”, my iPhone permanently disabled it’s WiFi functionality. Neither rebooting nor changing SSID fixes it :~) pic.twitter.com/2eue90JFu3
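The SSID looks like a C-style format string, and the likely culprit (an assumption; Apple has not confirmed the internals) is some iOS component passing the network name to a printf-style formatting routine without sanitizing it. A minimal Python sketch of the same bug class, illustrating only the failure pattern:

```python
ssid = "%p%s%s%s%s%n"  # the booby-trapped network name

def log_unsafe(name):
    # BUG: untrusted input is used *as* the format string, so the
    # formatter tries to interpret specifiers and consume arguments
    # that were never supplied.
    return ("Joined Wi-Fi network: " + name) % ()

def log_safe(name):
    # Fix: the untrusted input is passed as *data* and never interpreted.
    return "Joined Wi-Fi network: %s" % (name,)

try:
    log_unsafe(ssid)
except (TypeError, ValueError) as err:
    print("formatter crashed:", err)

print(log_safe(ssid))  # the SSID is rendered literally, no crash
```

In C, where `%n` writes to memory, the same mistake can corrupt state rather than merely raise an error, which is consistent with the persistent breakage Schou observed.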
Apple has been under a mountain of scrutiny lately from legislators, developers, judges, and users. Amidst all that, CEO Tim Cook sat with publication Brut. to discuss Apple’s strategy and policies. The short but wide-ranging interview offered some insight into where Apple plans to go in the future.
As is so common when Tim Cook speaks publicly, privacy was a major focus. His response to a question about its importance was the same one we’ve heard from him many times: “we see it as a basic human right, a fundamental human right.” He noted that Apple has been focused on privacy for a long time.
The UK’s competition watchdog will take a deep dive into Apple and Google’s dominance of the mobile ecosystem, it said today — announcing a market study which will examine the pair’s respective smartphone platforms (iOS and Android); their app stores (App Store and Play Store); and web browsers (Safari and Chrome).
The Competition and Markets Authority (CMA) is concerned that the mobile platform giants’ “effective duopoly” in those areas might be harming consumers, it added.
The study will be wide-ranging, with the watchdog concerned about the nested gateways created by the pair’s dominance of the mobile ecosystem — intermediating how consumers can access a variety of products, content and services (such as music, TV and video streaming; fitness tracking; shopping and banking, to cite some of the examples provided by the CMA).
“These products also include other technology and devices such as smart speakers, smart watches, home security and lighting (which mobiles can connect to and control),” it went on, adding that it’s looking into whether their dominance of these pipes is “stifling competition across a range of digital markets”, saying too that it’s “concerned this could lead to reduced innovation across the sector and consumers paying higher prices for devices and apps, or for other goods and services due to higher advertising prices”.
The CMA further confirmed the deep dive will examine “any effects” of the pair’s market power over other businesses — giving the example of app developers who rely on Apple or Google to market their products to customers via their smart devices.
The CMA said both those existing investigations are examining issues that fall within the scope of the new mobile ecosystem market study but that its work on the latter will be “much broader”.
It added that it will adopt a joined-up approach across all related cases — “to ensure the best outcomes for consumers and other businesses”.
It’s giving itself a full year to examine Gapple’s mobile ecosystems.
It is also soliciting feedback on any of the issues raised in its statement of scope — calling for responses by 26 July. The CMA added that it’s also keen to hear from app developers, via its questionnaire, by the same date.
That earlier market study has been feeding the UK government’s plan to reform competition rules to take account of the market-deforming power of digital giants. And the CMA suggested the new market study, examining ‘Gapple’s’ mobile muscle, could similarly help shape UK-wide competition law reforms.
Last year the UK announced its plan to set up a “pro-competition” regime for regulating Internet platforms — including by establishing a dedicated Digital Markets Unit within the CMA (which got going earlier this year).
The legislation for the reform has not yet been put before parliament but the government has said it wants the competition regulator to be able to “proactively shape platforms’ behavior” to avoid harmful behavior before it happens — saying too that it supports enabling ex ante interventions once a platform has been identified as having so-called “strategic market status”.
Germany already adopted similar reforms to its competition law (early this year), which enable proactive interventions to tackle large digital platforms with what is described as “paramount significance for competition across markets”. And its Federal Cartel Office has, in recent months, wasted no time in opening a number of proceedings to determine whether Amazon, Google and Facebook have such a status.
The CMA also sounds keen to get going to tackle Internet gatekeepers.
Commenting in a statement, CEO Andrea Coscelli said:
“Apple and Google control the major gateways through which people download apps or browse the web on their mobiles – whether they want to shop, play games, stream music or watch TV. We’re looking into whether this could be creating problems for consumers and the businesses that want to reach people through their phones.
“Our ongoing work into big tech has already uncovered some worrying trends and we know consumers and businesses could be harmed if they go unchecked. That’s why we’re pressing on with launching this study now, while we are setting up the new Digital Markets Unit, so we can hit the ground running by using the results of this work to shape future plans.”
The European Union also unveiled its own proposals for clipping the wings of big tech last year — presenting its Digital Markets Act plan in December which will apply a single set of operational rules to so-called “gatekeeper” platforms operating across the EU.
The clear trend in Europe on digital competition is toward increasing oversight and regulation of the largest platforms — in the hopes that antitrust authorities can impose measures that will help smaller players thrive.
Critics might say that’s just playing into the tech giants’ hands, though — because it’s fiddling around the edges when more radical intervention (break ups) are what’s really needed to reboot captured markets.
Apple and Google were contacted for comment on the CMA’s market study.
A Google spokesperson said: “Android provides people with more choice than any other mobile platform in deciding which apps they use, and enables thousands of developers and manufacturers to build successful businesses. We welcome the CMA’s efforts to understand the details and differences between platforms before designing new rules.”
According to Google, the Android App Economy generated £2.8BN in revenue for UK developers last year, which it claims supported 240,000 jobs across the country — citing a Public First report that it commissioned.
The tech giant also pointed to operational changes it has already made in Europe, following antitrust interventions by the European Commission — such as adding a choice screen to Android where users can pick from a list of alternative search engines.
Earlier this month it agreed to shift the format underlying that choice screen from an unpopular auction model to free participation.
The announcement of new iPad software at this year’s WWDC conference carried abnormally large expectations. The iPad lineup, especially the larger iPad Pro, has kept up an impressively frantic pace of hardware innovation over the past few years. In that same time frame, the software of the iPad, especially its ability to let users work in multiple apps at once and its onramps for professional software makers, has come under scrutiny for an apparently slower pace.
This year’s announcements about iOS 15 and iPadOS 15 seemed designed to counter that narrative with a broad set of quality-of-life improvements to multitasking, as well as a suite of system-wide features that nearly all come complete with their own developer-facing APIs to build on. I had the chance to speak to Bob Borchers, Apple’s VP of Worldwide Product Marketing, and Sebastien (Seb) Marineau-Mes, VP, Intelligent System Experience at Apple, about the release of iPadOS 15 and a variety of these improvements.
Marineau-Mes works on the team of Apple software SVP Craig Federighi and was pivotal in the development of this new version.
iPad has a bunch of new core features including SharePlay, Live Text, Focus modes, Universal Control, on-device Siri processing and a new edition of Swift Playgrounds designed to double as a prototyping tool. Among the most hotly anticipated for iPad Pro users, however, are improvements to Apple’s multitasking system.
If you’ve been following along, you’ll know that the gesture-focused multitasking interface of iPadOS has had its share of critics, including me. Though it can be useful in the right circumstances, the undiscoverable gesture system and confusing hierarchy of app combinations made it an awkward affair to use correctly for an adept user, much less a beginner.
Since the iPad stands alone as pretty much the only successful tablet on the market, Apple is in a unique position to determine which paradigms become standard. It’s a rare opportunity to say: hey, this is what working on a device like this feels like, looks like, should be.
So I ask Borchers and Marineau-Mes about multitasking — specifically, about Apple’s philosophy in the design of multitasking in iPadOS 15 and the change from the old version, which required a lot of finger acrobatics and a strong sense of spatial awareness of objects hovering off the edges of the screen.
“I think you’ve got it,” Borchers says when I mention the spatial gymnastics, “but the way that we think about this is that the step forward in multitasking makes it easier to discover, easier to use and even more powerful. And, while pros I think were the ones who were using multitasking in the past, we really want to take it more broadly because we think there’s applicability to many, many folks. And that’s why the discovery and the ease of use I think were critical.”
“You had a great point there when you talked about the spatial model, and one of our goals was to actually make the spatial model more explicit in the experience,” says Marineau-Mes, “where, for example, if you’ve got a split view and you’re replacing one of the windows, we kind of open the curtain and tuck the other app to the side. You can see it — it’s not a hidden mental model, it’s one that’s very explicit.
“Another great example of it is when you go into the app switcher to reconfigure your windows: you’re actually doing drag and drop as you rearrange your new split views, or dismiss apps and so on. So it’s not a hidden model, it’s one where we really try to reinforce a spatial model with an explicit one for the user through all of the animations and all of the kinds of affordances.”
Apple’s goal this time around, he says, was to add affordances for the user to understand that multitasking was even an option — like the small series of dots at the top of every app and window that now allows you to explicitly choose an available configuration, rather than the app-and-dock-juggling method of the past. He goes on to say that consistency was a key metric for them in this version of the OS: the appearance of Slide Over apps in the same switcher view as all other apps, for instance, or the way that you can choose configurations of apps via the button or by drag and drop in the switcher and get the same results.
In the dashboard, Marineau-Mes says, “you get an at a glance view of all of the apps that you’re running and a full model of how you’re navigating that through the iPad’s interface.”
This ‘at a glance’ map of the system should be very welcome to advanced users. Even as an aggressive pro user myself, Slide Over apps had become more of a nuisance than anything because I couldn’t keep track of how many were open and when to use them. The ability to combine them in the switcher itself is one of those things that Apple has wanted to get into the OS for years but is just now making its way onto iPads. Persistence of organization, really, was the critical problem to tackle.
“I think we believe strongly in building a mental model where people know where things are [on iPad],” says Marineau-Mes. “And I think you’re right — when it comes to persistence, I think it also applies to, for example, the home screen. People have a very strong mental model of where things are on the home screen, as well as all of the apps that they’ve configured. And so we try to maintain that mental model, and also allow people to reorganize again in the switcher.”
He goes on to explain the new ‘shelf’ feature that displays every instance or window that an app has open within itself. They implemented this as a per-app feature rather than a system-wide one, he says, because associating the shelf with a particular app fit the overall mental model they’re trying to build. The value of the shelf may come into sharper relief when professional apps that keep a dozen documents or windows open and active during a project ship later this year.
Another nod to advanced users in iPadOS 15 is the rich keyboard shortcut set offered across the system. The interface can be navigated by arrow keys now, many advanced commands are there and you can even move around on an iPad using a game controller.
“One of the key goals this year was to make basically everything in the system navigable from the keyboard,” says Marineau-Mes, “so that if you don’t want to, you don’t have to take your hands off the keyboard. All of the new multitasking affordances and features you can do through the keyboard shortcuts. You’ve got the new keyboard shortcut menu bar where you can see all the shortcuts that are available. It’s great for discoverability. You can search them and we even, you know, and this is a subtle point, but we made a very conscious effort to rationalize the shortcuts across Mac and iPadOS. So that if you’re using Universal Control, for example, you’re going to go from one environment to the other seamlessly. You want to ensure that consistency as you go across.”
The gestures, however, are staying as a nod to consistency for existing users who may be accustomed to them.
To me, one of the more interesting and potentially powerful developments is the introduction of the Center Window and its accompanying API. A handful of Apple apps like Mail, Notes and Messages now allow items to pop out into an overlapping window.
“It was a very deliberate decision on our part,” says Marineau-Mes about adding this new element. “This really brings a new level of productivity where you can have, you know, this floating window. You can have content behind it. You can seamlessly cut and paste. And that’s something that’s just not possible with the traditional [iPadOS] model. And we also really strove to make it consistent with the rest of multitasking, where that center window can also become one of the windows in your split view, or full size, and then go back to being a center window. We think it’s a cool addition to the model and we really look forward to third parties embracing it.”
Early reception of the look Apple gave at iPadOS 15 still has an element of reservation about it, given that many of the most powerful creative apps are made by third parties that must adopt these technologies for them to be truly useful. But Apple, Borchers says, is working hard to make sure that pro apps adopt as many of these new paradigms and technologies as possible, so that come fall, the iPad will feel like a more hospitable host for the kinds of advanced work pros want to do there.
One of the nods to this multi-modal universe that the iPad exists in is Universal Control. This new feature uses Bluetooth beaconing, peer-to-peer WiFi and the iPad’s touchpad support to allow you to place your devices close to one another and — in a clever use of reading user intent — slide your mouse to the edge of a screen and onto your Mac or iPad seamlessly.
CUPERTINO, CALIFORNIA – June 7, 2021: Apple’s senior vice president of Software Engineering Craig Federighi showcases the ease of Universal Control, as seen in this still image from the keynote video of Apple’s Worldwide Developers Conference at Apple Park. (Photo Credit: Apple Inc.)
“I think what we have seen and observed from our users, both pro and otherwise, is that we have lots of people who have Macs and they have iPads and iPhones, and we believe in making these things work together in ways that are powerful,” says Borchers. “And it just felt like a natural place to go and extend our Continuity model, so that you could make use of this incredible platform that is iPadOS while working with your Mac right next to it. And I think the big challenge was: how do you do that in kind of a magical, simple way? And that’s what Seb and his team have been able to accomplish.”
“It really builds on the foundation we made with Continuity and Sidecar,” adds Marineau-Mes. “We really thought a lot about how do you make the experience — the set up experience — as seamless as possible. How do you discover that you’ve got devices side by side?
“The other thing we thought about was: what are the workflows that people want to have, and what capabilities will be essential for that? That’s where things like the ability to seamlessly drag content across the platforms, or cut and paste, we felt to be really, really important. Because I think that’s really what brings the magic to the experience.”
Borchers adds that it makes all the Continuity features that much more discoverable. Continuity’s shared clipboard, for instance, is an always-on but invisible presence. Expanding that to visual, mouse-driven models made natural sense.
“It’s just like, oh, of course I can drag that all the way across here,” he says.
“Bob, you say ‘of course,’” Marineau-Mes laughs. “And yet for those of us working in platforms for a long time, the ‘of course’ is technically very, very challenging. Totally non-obvious.”
Another area where iPadOS 15 is showing some promising expansionary behavior is in system-wide activities that let you break out of the box of in-app thinking. These include embedded recommendations that seed themselves into various apps; SharePlay, which makes an appearance wherever video calls are found; and Live Text, which turns all of your photos into indexed archives searchable with a keyboard.
Another is Quick Note, a system extension that lets you swipe from the bottom corner of your screen wherever you are in the system.
“There are, I think, a few interesting things that we did with Quick Note,” says Marineau-Mes. “One is this idea of linking. So that if I’m working in Safari or Yelp or another app, I can quickly insert a link to whatever content I’m viewing. I don’t know about you, but it’s something that I certainly do a lot when I do research.
“The old way was, like, cut and paste, and maybe take a screenshot, create a note and jot down some notes. And now we’ve made that very, very seamless and fluid across the whole system. It even works the other way, where if I’m now in Safari and I have a note that refers to that page, you’ll see it revealed as a thumbnail at the bottom right of the screen. So we’ve really tried to bring the notes experience to be something that just permeates the system and is easily accessible from everywhere.”
Many of the system-wide capabilities that Apple is introducing in iPadOS 15 and iOS 15 have an API that developers can tap into. That is not always the case with Apple’s newest toys, which in years past have often been left to linger in the private section of its list of frameworks rather than be offered to developers as a way to enhance their apps. Borchers says that this is an intentional move that offers a ‘broader foundation of intelligence’ across the entire system.
This broader intelligence includes Siri moving a ton of commands to its local scope. This involved having to move a big chunk of Apple’s speech recognition to an on-device configuration in the new OS as well. The results, says Borchers, are a vastly improved day-to-day Siri experience, with many common commands executing immediately upon request — something that was a bit of a dice roll in days of Siri past. The removal of the reputational hit that Siri was taking from commands that went up to the cloud never to return could be the beginning of a turnaround for the public perception of Siri’s usefulness.
The on-device weaving of the intelligence provided by the Apple Neural Engine (ANE) also includes the indexing of text across photos in the entire system, past, present and in-the-moment.
“We could have done Live Text only in Camera and Photos, but we wanted it to apply anywhere we’ve got images, whether it be in Safari or Quick Look or wherever,” says Marineau-Mes. “One of my favorite demos of Live Text is actually when you’ve got that long, complicated password for a Wi-Fi network. You can just bring it up within the keyboard, take a picture of it, get the text in it and copy and paste it into the field. It’s one of those things that’s just kind of magical.”
On the developer front of iPadOS 15, I ask specifically about Swift Playgrounds, which adds the ability to write, compile and ship apps to the App Store for the first time completely on iPad. It’s not the native Xcode some developers were hoping for, but, Borchers says, Playgrounds has moved beyond just ‘teaching people how to code’ and into a real part of many developer pipelines.
“I think one of the big insights here was that we also saw a number of kind of pro developers using it as a prototyping platform, and a way to be able to be on the bus, or in the park, or wherever. If you wanted to get in and give something a try, this was a super accessible and easy way to get there and could be a nice adjunct to ‘hey, I want to learn to code.’”
“If you’re a developer,” adds Marineau-Mes, “it’s actually more productive to be able to run that app on the device that you’re working on, because you really get great fidelity. And with the open project format, you can go back and forth between Xcode and Playgrounds. So, as Bob said, we can really envision people using this for a lot of rapid prototyping on the go, without having to bring along the rest of their development environment. We think it’s a really, really powerful addition to our development tools this year.”
I can’t be the only developer wondering just how much of my apps I could shoehorn into Swift Playgrounds now that it can build apps, just so I can tweak them on the go Especially since it supports UIKit development, not just SwiftUI, and Swift packages pic.twitter.com/vpsEMlVigs
Way back in 2018 I profiled a new team at Apple that was building out a testing apparatus to make sure it was addressing real-world use cases and workflows involving machines like the (at the time unrevealed) new Mac Pro, iMacs, MacBooks and iPads. One of the demos that stood out at the time was a deep integration with music apps like Logic that would allow the iPad’s input models to complement the core app: tapping out a rhythm on a pad, or brightening and adjusting sound more intuitively with the touch interface. More of Apple’s work these days seems aimed at letting users move seamlessly back and forth between its various computing platforms, taking advantage of the strengths of each (raw power, portability, touch, etc.) to complement a workflow. A lot of iPadOS 15 appears to be geared this way.
Whether it will be enough to turn the corner on the perception of the iPad as a work device held back by its software, I’ll reserve judgement until it ships later this year. But in the near term, I am cautiously optimistic: the set of enhancements that break out of the ‘app box’, the clearer affordances for multitasking both in and out of single apps, and the dedication to API support all point toward an expansionist mentality on the iPad software team. A good sign in general.
Apple went big on privacy during its Worldwide Developer Conference (WWDC) keynote this week, showcasing features from on-device Siri audio processing to a new privacy dashboard for iOS that makes it easier than ever to see which apps are collecting your data and when.
While typically vocal about security during the Memoji-filled, two-hour-long(!) keynote, the company also quietly introduced several new security and privacy-focused features during its WWDC developer sessions. We’ve rounded up some of the most interesting — and important.
Passwordless login with iCloud Keychain
Apple is the latest tech company taking steps to ditch the password. During its “Move beyond passwords” developer session, it previewed Passkeys in iCloud Keychain, a method of passwordless authentication powered by WebAuthn, and Face ID and Touch ID.
The feature, which will ultimately be available in both iOS 15 and macOS Monterey, means you no longer have to set a password when creating an account on a website or app. Instead, you’ll simply pick a username, then use Face ID or Touch ID to confirm it’s you. The passkey is stored in your keychain and synced across your Apple devices using iCloud — so you don’t have to remember it, nor do you have to carry around a hardware authenticator key.
“Because it’s just a single tap to sign in, it’s simultaneously easier, faster and more secure than almost all common forms of authentication today,” said Garrett Davidson, an Apple authentication experience engineer.
While it’s unlikely to be available on your iPhone or Mac any time soon — Apple says the feature is still in its ‘early stages’ and it’s currently disabled by default — the move is another sign of the growing momentum behind eliminating passwords, which are prone to being forgotten, reused across multiple services, and — ultimately — phishing attacks. Microsoft previously announced plans to make Windows 10 password-free, and Google recently confirmed that it’s working towards “creating a future where one day you won’t need a password at all”.
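The shape of WebAuthn-style authentication is a challenge-response: the server stores a verifier at registration, sends a fresh random challenge at login, and checks a signature produced on-device after biometric approval, so no reusable secret ever crosses the wire. The sketch below substitutes an HMAC over a device-held secret for the asymmetric signature, purely to stay dependency-free; all names are illustrative assumptions, not Apple's implementation (real passkeys store only a public key server-side):

```python
import hashlib
import hmac
import secrets

# Device-held secret stands in for the passkey private key.
device_secret = secrets.token_bytes(32)

# Registration: the server stores a verifier. (With real passkeys it
# would store only the public half of an asymmetric key pair.)
server_store = {"alice": device_secret}

def server_challenge():
    # A fresh random challenge defeats replay of old responses.
    return secrets.token_bytes(16)

def device_sign(challenge, secret):
    # On iOS this step would happen only after Face ID / Touch ID approval.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def server_verify(user, challenge, signature):
    expected = hmac.new(server_store[user], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

ch = server_challenge()
sig = device_sign(ch, device_secret)
print(server_verify("alice", ch, sig))  # True: no password ever sent
```

Because each login proves possession of the key rather than knowledge of a phrase, there is nothing for a phishing site to capture and reuse.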
Microphone indicator in macOS
macOS has a new indicator to tell you when the microphone is on. (Image: Apple)
Since the introduction of iOS 14, iPhone users have been able to keep an eye on which apps are accessing their microphone via a green or orange dot in the status bar. Now it’s coming to the desktop too.
In macOS Monterey, users will be able to see which apps are accessing their Mac’s microphone in Control Center, MacRumors reports, which will complement the existing hardware-based green light that appears next to a Mac’s webcam when the camera is in use.
Secure Paste in iOS 15
iOS 15, which will include a bunch of privacy-bolstering tools from Mail Privacy Protection to App Privacy Reports, is also getting a feature called Secure Paste that will help to shield your clipboard data from other apps.
This feature will enable users to paste content from one app to another without the second app being able to read the clipboard until you actually paste. This is a significant improvement over iOS 14, which would notify you when an app took data from the clipboard but did nothing to prevent it from happening.
“With secure paste, developers can let users paste from a different app without having access to what was copied until the user takes action to paste it into their app,” Apple explains. “When developers use secure paste, users will be able to paste without being alerted via the [clipboard] transparency notification, helping give them peace of mind.”
While this feature sounds somewhat insignificant, it’s being introduced following a major privacy issue that came to light last year. In March 2020, security researchers revealed that dozens of popular iOS apps — including TikTok — were “snooping” on users’ clipboard without their consent, potentially accessing highly sensitive data.
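The access model behind this can be pictured as a pasteboard that lets apps learn that content exists (say, to enable a Paste button) without revealing it until an explicit paste action. The toy class below is an illustration of that deferred-access idea only; it is not Apple's UIPasteboard API, and all names are invented:

```python
class DeferredPasteboard:
    """Toy model of deferred clipboard access (illustrative only)."""

    def __init__(self):
        self._content = None
        self.reads = 0  # how many times content was actually exposed

    def copy(self, content):
        self._content = content

    def has_strings(self):
        # Apps may learn that *something* is available without the
        # content being exposed or a transparency alert firing.
        return self._content is not None

    def paste(self):
        # Content is released only on an explicit user paste action.
        self.reads += 1
        return self._content

board = DeferredPasteboard()
board.copy("one-time password 123456")
print(board.has_strings())  # True, yet the content was not exposed
print(board.reads)          # 0: nothing has read the clipboard so far
print(board.paste())        # content revealed only at paste time
```

The clipboard-snooping apps described above relied on the iOS 14 model, where any foreground app could call the equivalent of `paste()` silently; gating the read on a user action closes that off.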
Advanced Fraud Protection for Apple Card
Payments fraud is more prevalent than ever as a result of the pandemic, and Apple is looking to do something about it. As first reported by 9to5Mac, the company has previewed Advanced Fraud Protection, a feature that will let Apple Card users generate new card numbers in the Wallet app.
While details remain thin — the feature isn’t live in the first iOS 15 developer beta — Apple’s explanation suggests that Advanced Fraud Protection will make it possible to generate new security codes (the three-digit number you enter at checkout) when making online purchases.
“With Advanced Fraud Protection, Apple Card users can have a security code that changes regularly to make online Card Number transactions even more secure,” the brief explainer reads. We’ve asked Apple for some more information.
‘Unlock with Apple Watch’ for Siri requests
As a result of the widespread mask-wearing necessitated by the pandemic, Apple introduced an ‘Unlock with Apple Watch’ option in iOS 14.5 that enabled users to unlock their iPhone and authenticate Apple Pay payments using an Apple Watch instead of Face ID.
The scope of this feature is expanding with iOS 15, as the company has confirmed that users will soon be able to use this alternative authentication method for Siri requests, such as adjusting phone settings or reading messages. Currently, users have to enter a PIN, password or use Face ID to do so.
“Use the secure connection to your Apple Watch for Siri requests or to unlock your iPhone when an obstruction, like a mask, prevents Face ID from recognizing your face,” Apple explains. “Your watch must be passcode protected, unlocked, and on your wrist close by.”
Decoupled security patches for iOS
To ensure iPhone users who don’t want to upgrade to iOS 15 straight away stay up to date with security fixes, Apple is going to start decoupling patches from feature updates. When iOS 15 lands later this year, users will be given the option to update to the latest version of iOS or to stick with iOS 14 and simply install the latest security fixes.
“iOS now offers a choice between two software update versions in the Settings app,” Apple explains (via MacRumors). “You can update to the latest version of iOS 15 as soon as it’s released for the latest features and most complete set of security updates. Or continue on iOS 14 and still get important security updates until you’re ready to upgrade to the next major version.”
This feature sees Apple following in the footsteps of Google, which has long rolled out monthly security patches to Android users.
‘Erase all contents and settings’ for Mac
Wiping a Mac has long been a laborious task that required you to erase your device completely and then reinstall macOS. Thankfully, that’s going to change: Apple is bringing the “erase all contents and settings” option that’s been on iPhones and iPads for years to macOS Monterey.
The option will let you factory reset your MacBook with just a click. “System Preferences now offers an option to erase all user data and user-installed apps from the system, while maintaining the operating system currently installed,” Apple says. “Because storage is always encrypted on Mac systems with Apple Silicon or the T2 chip, the system is instantly and securely ‘erased’ by destroying the encryption keys.”
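The reason the wipe can be instant is cryptographic erasure: because user data only ever hits storage encrypted, destroying the key makes the data unrecoverable without overwriting a single block. A dependency-free sketch of the idea (the hash-derived keystream here is illustrative only, not a production cipher and not Apple's implementation):

```python
import hashlib
import secrets

def keystream(key, n):
    # Derive n bytes of keystream from the key (illustrative only).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_bytes(data, stream):
    # XOR-combine data with the keystream (encrypts and decrypts).
    return bytes(a ^ b for a, b in zip(data, stream))

key = secrets.token_bytes(32)
plaintext = b"user documents, settings, keychain"
ciphertext = xor_bytes(plaintext, keystream(key, len(plaintext)))

# Normal operation: holding the key lets the system decrypt on demand.
assert xor_bytes(ciphertext, keystream(key, len(ciphertext))) == plaintext

# "Erase all contents and settings": destroy the key, not the data.
# Without the key the ciphertext is computationally unrecoverable, so
# the wipe takes the same instant regardless of how full the disk is.
key = None
```

This is why the feature is gated to Macs with Apple Silicon or the T2 chip: those are the machines where storage encryption is always on, so there is always a key to destroy.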
Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest in mobile OS news, mobile applications and the overall app economy.
The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spend in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV. Currently, the average American watches 3.7 hours of live TV per day, but now spends four hours per day on their mobile devices.
Apps aren’t just a way to pass idle hours — they’re also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies — a figure that’s up 27% year-over-year.
This week, our series takes a deep dive into the key announcements impacting app developers from WWDC 21.
Apple’s WWDC went virtual again this year, but it didn’t slow down the pace of announcements. This week, Apple introduced a slate of new developer tools and frameworks, changes to iOS that will impact how consumers use their devices and new rules for publishing on its App Store, among other things. We don’t have the bandwidth to dig into every dev update — and truly, there are better places to learn about, say, the new concurrency capabilities of Swift 5.5 or what’s new with SwiftUI.
But after a few days of processing everything new, here’s what’s jumping out as the bigger takeaways and updates.
Apple’s development IDE, Xcode 13, now includes Xcode Cloud, a built-in continuous integration and delivery service hosted on Apple’s cloud infrastructure. Apple says the service, birthed out of its 2018 Buddybuild acquisition, will help to speed up the pace of development by combining cloud-based tools for building apps along with tools to run automated tests in parallel, deliver apps to testers via TestFlight and view tester feedback through the web-based App Store Connect dashboard. Beyond the immediate improvements to the development process (which developers are incredibly excited about, judging by #WWDC21 tweets), Xcode Cloud represents a big step by Apple further into the cloud services space, where Amazon (AWS), Google and Microsoft have dominated. While Xcode Cloud may not replace solutions designed for larger teams with more diverse needs, it’s poised to make app development easier — and deliver a new revenue stream to Apple. If only Apple had announced the pricing!
Swift Playgrounds got a notable update in iPadOS 15: it will now allow developers to build iPhone and iPad apps right on their iPad and submit them to the App Store. In Swift Playgrounds 4, coming later this year, Apple says developers will be able to create the visual design of an app using SwiftUI, see a live preview of their app’s code while building and run their apps full screen to test them out. App projects can also be opened and edited with either Swift Playgrounds or Xcode.
While it’s not the Xcode on iPad system some developers have been requesting, it will make app building more accessible because of iPad’s lower price point compared with Mac. It could also encourage more people to try app development, as Swift Playgrounds helps student coders learn the basics then move up to more challenging lessons over time. Now, they can actually build real apps and hit the publish button, too.
I had to save for a couple of months – 8-10 months before buying my first Mac. Swift playgrounds on iPad is a very welcome move!
Antitrust pressure swirling around Apple has contributed to a growing sentiment among some developers that Apple doesn’t do enough to help them grow their businesses — and therefore, is undeserving of a 15%-30% cut of the revenues the developers themselves worked to gain. The new App Store updates may start to chip away at that perception.
Soon, developers will be able to create up to 35 custom product pages targeted toward different users, each with its own unique URL for sharing and its own analytics for measuring performance. The pages can include different preview videos, screenshots and text.
Image Credits: Apple
Apple will also allow developers to split traffic between three treatments of the app’s default page to measure which ones convert best, then choose the percentage of the App Store audience that will see one of the three treatments.
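For illustration, a deterministic split across three page treatments might look something like the sketch below — the function, the hashing scheme and the weights are all assumptions for the example, not Apple's implementation:

```swift
// Hypothetical sketch: deterministically assign each visitor to one of
// several product-page treatments using a stable hash of their ID, so
// the same visitor always sees the same treatment. Weights are the
// percentage of the audience each treatment should receive.

func treatment(for visitorID: String, weights: [Int]) -> Int {
    precondition(weights.reduce(0, +) == 100, "weights must sum to 100%")
    // Stable FNV-1a hash; String.hashValue is randomized per process,
    // so it can't be used for a split that must persist across sessions.
    var hash: UInt64 = 0xcbf29ce484222325
    for byte in visitorID.utf8 {
        hash ^= UInt64(byte)
        hash = hash &* 0x100000001b3
    }
    // Map the hash into a 0..99 bucket, then walk the weight ranges.
    var bucket = Int(hash % 100)
    for (index, weight) in weights.enumerated() {
        if bucket < weight { return index }
        bucket -= weight
    }
    return weights.count - 1
}
```

The stable-hash detail matters: a purely random assignment would show returning visitors a different page each time, muddying the conversion data.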
Meanwhile, the App Store will begin to show customers in-app events taking place inside developers’ apps — like game competitions, fitness challenges, film premieres and more — effectively driving traffic to apps and re-engaging users. Combined, Apple is making the case that its App Store can drive discovery beyond just offering an app listing page.
Beyond the App Store product itself, Apple overhauled its App Store policies to address the growing problem of scam apps. The changes give Apple permission to crack down on scammers by removing offenders from its Developer Program. The new guidelines also allow developers to report spam directly to Apple, instead of, you know, relying on tweets and press.
Apple has historically downplayed the scam problem. It noted how the App Store stopped over $1.5 billion in fraudulent transactions in 2020, for example. Even if it’s a small percentage of the App Store, scam apps with fake ratings not only can cheat users out of millions of dollars, they also reduce consumer trust in the App Store and Apple itself, which has longer-term consequences for the ecosystem’s health. What’s unclear, however, is why Apple is seemingly trying to solve App Review issues with forms — to report fraud (and now, to appeal rulings, too) — when it’s becoming apparent that Apple needs a more systematic way of keeping tabs on the app ecosystem beyond the initial review process.
The App Store discovery updates mentioned above also matter more because developers may need to reduce their reliance on notifications to send users back into their apps. Indeed, iOS 15 users will be able to choose which apps they don’t need to hear from right away — these will be rounded up into a new Notification Summary that arrives on a schedule they configure, where Siri intelligence helps determine which apps get a top spot. If an app was already struggling to re-engage users through push notifications, getting relegated to the end of a summary is not going to help matters.
And users can “Send to Summary” right from the Lock Screen notification itself, alongside the existing options to deliver a notification quietly or turn it off. That means any ill-timed push could be an app developer’s last.
Image Credits: Apple
Meanwhile, the clever new “Focus” modes let iOS users configure different quiet modes for work, play, sleeping and more, each with their own set of rules and even their own home screens. But making this work across the app ecosystem will require developer adoption of four “interruption levels,” ranging from passive to critical. A new episode of a favorite show should be a “passive” notification, for example. “Active” is the default setting — which doesn’t get to break into Focus. “Time sensitive” notifications should be reserved for more urgent matters, like a delivery that’s arrived on your doorstep or an account security update. These may be able to break through Focus, if allowed.
Image Credits: Apple
“Critical” notifications would be reserved for emergencies, like severe weather alerts or local safety updates. While there is a chance developers may abuse the new system to get their alert through, they risk users silencing their notifications entirely or deleting the app. Focus mode users will be power users and more technically savvy, so they’ll understand that an errant notification here was a choice and not a mistake on the developer’s part.
Image Credits: Apple
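The four interruption levels and their relationship to Focus can be sketched as a simple model — the type and function names below are illustrative stand-ins, not Apple's actual notification API:

```swift
// Sketch of iOS 15's four notification interruption levels and whether
// each can break through an active Focus mode: passive and active wait
// for the user; time-sensitive breaks through only if the user allows
// it for that app; critical (emergencies) always alerts.

enum InterruptionLevel {
    case passive, active, timeSensitive, critical
}

func breaksThroughFocus(_ level: InterruptionLevel,
                        allowsTimeSensitive: Bool) -> Bool {
    switch level {
    case .passive, .active: return false              // held until Focus ends
    case .timeSensitive:    return allowsTimeSensitive // user-controlled exception
    case .critical:         return true               // emergencies always alert
    }
}
```

The key design point is that the exception for time-sensitive alerts belongs to the user, not the developer — which is why abusing the level risks users silencing an app entirely.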
Apple has been steadily pushing out more tools for building augmented reality apps, but this WWDC it just introduced a huge update that will make it easier for developers getting started with AR. With the launch of RealityKit 2, Apple’s new Object Capture API will allow developers to create 3D models in minutes using only an iPhone or iPad (or a DSLR or drone if they choose).
Apple explains that this addresses one of the most difficult parts of making great AR apps: the process of creating 3D models. Before, this could take hours and cost thousands of dollars — now, developers with just an iPhone and a Mac can participate. The impact of this update will be seen in the months and years ahead, as developers adopt the new tools for things like AR shopping, games and other AR experiences — including ones we may not have seen yet, but that are enabled by more accessible AR technology tools and frameworks.
This update is unexpected and interesting, despite missing what would have been an ideal launch window: mid-pandemic back in 2020. With SharePlay, developers can bring their apps into what Apple is calling “Group Activities” — or shared experiences that take place right inside FaceTime.
If you were co-watching Hulu with friends during the pandemic, you get the idea. But Apple isn’t tacking on some co-viewing system here. Instead, it’s introducing new APIs that let users listen to music, stream video or screen share with friends, in a way that feels organic to FaceTime. There was a hint of serving the locked-down COVID-19 pandemic crowd with this update, as Apple talks about making people feel as if they’re “in the same room” — a nod to those many months where that was not possible. And that may have inspired the changes, to be sure. Similarly, FaceTime’s support for Android and scheduled calls — a clear case of Zoom envy — feels like a case of playing catch-up on Apple’s part.
Image Credits: Apple
The immediate demand for these sorts of experiences may be dulled by a population that’s starting to recover from the pandemic — people are now going out and seeing others in person again thanks to vaccines. But the ability to use apps while FaceTime’ing has a lifespan that extends beyond the COVID era, particularly among the iPhone’s youngest users. Those growing up with smartphones at ever-younger ages don’t place phone calls — they text and FaceTime. Some argue Gen Z even prefers the latter.
Image Credits: Apple
With its immediate support for Apple services like Apple Music and Apple TV+, SharePlay will hit the ground running — but it will only fully realize its vision with developer adoption. Such a system seems possible only because of Apple’s tight control over its platform. It also gives a default iOS app a big advantage over third parties.
Not to be outdone by WWDC (ha), Google this week launched Android 12 beta 2. This release brings users more of the new features and design changes that weren’t yet available in the first beta, which debuted at Google I/O. These include the new privacy dashboard; mic and camera indicators that show when an app is using those features; an indication when an app is reading from the clipboard; and a new panel that makes it easier to switch between internet providers or Wi-Fi networks.
Google also this week released its next Pixel feature drop which brought new camera and photo features, privacy features, Google Assistant improvements and more. Highlights included a way to create stargazing videos, a car crash detection feature and a way to answer or reject calls hands-free.
Pinterest wants to get more users clicking “buy.” The company this week added a new Shopping List feature which automatically organizes your saved Product Pins for easier access.
Google discontinued its AR-based app Measure, which had allowed users to measure things in the real world using the phone’s camera. The app had seen some stability and accuracy issues in the past.
Facebook’s Messenger app added Venmo-like QR codes for person-to-person payments inside its app in the U.S. Users can scan the codes to send or request a payment, even if they’re not Facebook friends with the other party. Payments are sent over Facebook Pay, which is backed by a user’s credit card, debit card or PayPal account.
Downloads of fintech apps are up 132% globally YoY, according to an AppsFlyer marketing report.
Twitter and Square CEO Jack Dorsey said Square is thinking about adding a bitcoin hardware wallet to its product lineup. The exec detailed some of the thinking behind the plan in a Twitter thread.
Square is considering making a hardware wallet for #bitcoin. If we do it, we would build it entirely in the open, from software to hardware design, and in collaboration with the community. We want to kick off this thinking the right way: by sharing some of our guiding principles.
Instagram head Adam Mosseri said Facebook will help creators get around Apple’s 30% cut. While any transactions that take place in iOS will follow Apple’s rules, Mosseri said Facebook will look for other ways to help creators make a living without giving up a portion of their revenue — like connecting brands and creators offline, or affiliate deals.
Related to this, Instagram announced during its Creator Week event it will start testing a native affiliate tool that will allow creators to recommend products and then earn commissions on those sales. Creators can also now link their merch shops to personal profiles instead of just business profiles, and by year-end, will be able to partner on merch and drops with companies like Bravado/UMG, Fanjoy, Represent and Spring.
Image Credits: Instagram
Instagram also rolled out a new “badge” for live videos which lets viewers tip creators, similar to Facebook’s Stars. Facebook also said paid online events, fan subscriptions, badges and its upcoming news products will remain free through 2023. And it rolled out new features and challenges to help creators earn additional payouts for hitting certain milestones.
Finally, Instagram in a blog post explained how its algorithm works. The post details how the app decides what to show users first, why some posts get more views than others, how Explore works and other topics.
Giphy’s Clips (GIFs with sound) are now available in the Giphy iMessage app, instead of only on the web and in its iOS app. That means you can send the…uh, videos (??)…right from your keyboard.
Image Credits: Tinder
Match-owned dating app Tinder added a way for users to block contacts. The feature requires users to grant the app access to the phone’s contacts database, which is a bit privacy-invasive. But then users can go through their contacts and check those they want to block on Tinder. The benefit is that this lets people block exes and abusers. On the downside, it could enable cheating, as users can block partners and anyone who might spot them and report back.
Streaming & Entertainment
YouTube will allow creators to repurpose audio from existing YouTube videos as its “Shorts” product — basically, its TikTok competitor — rolls out to more global markets.
Roblox is generating estimated revenue of $3.01 million daily on iPhone, according to data from Finbold. Clash of Clans, Candy Crush Saga, Pokémon GO and others follow. A good thing, if it has to pay up over that music usage lawsuit.
Image Credits: Finbold
Apple-owned weather app Dark Sky, whose technology just powered a big iOS 15 revamp of Apple’s stock weather app, is not shutting down just yet. The company announced its iOS app, web app and API will remain online through the end of 2022, instead of 2021 as planned.
Microsoft’s Outlook email app for iOS now lets you use your voice to write emails and schedule meetings. The feature leverages Cortana, and follows the launch of a Play My Emails feature inside Outlook Mobile.
Government & Policy
President Biden revoked and replaced Trump’s actions which had targeted Chinese apps, like TikTok and WeChat. The president signed a new executive order that requires the Commerce Dept. to review apps with ties to “foreign adversaries” that may pose national security risks. Trump had previously tried to ban the apps outright, but his order was blocked by federal courts.
Google has agreed to show more mobile search apps for users to choose from on new Android phones following feedback from the European Commission. The company had been showing a choice screen where app providers bid against each other for the slot, and pay only if users download apps. DuckDuckGo and others complained the solution has not been working.
Security & Privacy
Security flaws were found in Samsung’s stock mobile apps impacting some Galaxy devices. One could have allowed for data theft through the Secure Folder app. Samsung Knox security software could have been used to install malicious apps. And a bug in Samsung Dex could have scraped data from notifications. There are no indications users were impacted and the flaws were fixed.
An App Store analysis published by The Washington Post claims nearly 2% of the top grossing apps on one day were scam apps, which cost people $48 million. They included several VPN apps that told users their iPhones were infected with viruses, a QR code reader that tricked customers into a subscription for functionality that comes with an iPhone, and apps that pretend to be from big-name brands, like Amazon and Samsung.
Multiple apps were removed from the Chinese app store for violating data collection rules, Reuters reported. The apps hailed from Sogou, iFlytek and others, and included virtual keyboards.
Funding and M&A
Mexican payments app Clip raised $250 million from SoftBank’s Latin American Fund and Viking Global Investors, valuing the business at $2 billion. The app offers a Square-like credit card reader, among other devices, and has begun to offer cash advances to clients.
Shopify acqui-hired the team from the augmented reality home design app Primer. The app, which will be shut down, had allowed users to visualize what tile, wallpaper or paint would look like on surfaces inside their home.
Singapore-based corporate services “super app” Osome raised $16 million in Series A funding. The app offers online accounting and other business services for SMBs. Investors include Target Global, AltaIR Capital, Phystech Ventures, S16VC and VC Peng T. Ong.
Chinese grocery delivery app Dingdong Maicai, backed by Sequoia and Tiger Global, has filed for a U.S. IPO. To date, the company has raised $1 billion.
San Francisco-based MaintainX raised $39 million in Series B funding led by Bessemer Venture Partners for its mobile-first platform for industrial and frontline workers to help track maintenance, safety and operations.
Berlin’s Ada Health raised $90 million in Series B funding in a round led by Leaps by Bayer, the impact investment arm of Bayer AG. The app lets users monitor their symptoms and track their health and clinical data.
Photo app Dispo confirmed its previously leaked Series A funding, which earlier reports had pegged as being around $20 million. The app had been rebranded from David’s Disposable and dropped its association with YouTuber David Dobrik, following sexual assault allegations regarding a member of the Vlog Squad. Spark Capital severed ties with Dispo as a result. Seven Seven Six and Unshackled Ventures remained listed as investors, per Dispo’s press release, but the company didn’t confirm the size of the round.
Brazilian fintech Nubank raised a $750 million extension to its Series G (which was $400 million last year) led by Berkshire Hathaway. The company offers a digital bank account accessible from an app, a debit card, payments, loans, insurance and more. The extension brings the total Series G round to $1.15 billion.
Seattle-based tutoring app Kadama raised $1.7 million in seed funding led by Grishin Robotics. The app, which offers an online tutoring marketplace aimed at Gen Z, rode the remote learning wave to No. 2 in the Education category on the App Store.
Mark Cuban-backed banking app Dave, which helps Americans build financial stability, is planning to go public via a SPAC launched by Chicago-based Victory Park Capital called VPC Impact Acquisition Holdings III. The deal also includes a $210 million private investment from Tiger Global Management.
Mobile game publisher Voodoo acquired Tel Aviv-based marketing automation platform Bidshake for an undisclosed sum. Launched in January 2020, Bidshake combines data aggregation and analytics with campaign and creative management. It will continue to operate independently.
Turntable — tt.fm
Image Credits: tt.fm on iPhone/Brian Heater
Newly launched music social network tt.fm is a Turntable.fm rival that lets you virtually hang out with friends while listening to music. To be clear, the app is not the same as Turntable.fm, which shut down in 2013 but then returned during the pandemic as people looked to connect online. While that Turntable was rebirthed by its founder Billy Chasen, Turntable – tt.fm hails from early Turntable.fm employee, now tt.fm CEO, Joseph Perla. But as live events come back, the question may not be which Turntable app to choose, but whether the Turntable.fm experience has missed the correct launch window…again.
The art app SketchAR previously offered artists tools to draw with AR, turn photos into AR, create AR masks for Snapchat, play games and more. With its latest update, artists can now turn their work into NFTs directly inside the app and sell it. The app, now with nearly 500,000 users, will select a “Creator of the Week” whose work will be minted as an NFT on OpenSea. Others can create and auction their art as NFTs on demand.
Today is Launch Day
Introducing OldOS — iOS 4 beautifully rebuilt in SwiftUI.
* Designed to be as close to pixel-perfect as possible. * Fully functional, perhaps even usable as a second OS. * Fully open source for all to learn, modify, and build on. pic.twitter.com/K0JOE2fEKM
As Apple’s annual WWDC conference wraps up, we have a whole week of developer sessions and press briefings to look back on, plus a bunch of bullet points on Apple’s various feature pages to sort through.
The result? There are a bunch of interesting features coming to iPhones in iOS 15 that Apple didn’t highlight during its public-facing keynote event on Monday.
We’re not going to list them all, as there are far too many little changes in the upcoming software updates. If you want to review the complete list, Apple has published detailed feature pages on its site.
Apple incorporated the announcement of this year’s Apple Design Award winners into its virtual Worldwide Developer Conference (WWDC) online event, instead of waiting until the event had wrapped, like last year. Ahead of WWDC, Apple previewed the finalists, whose apps and games showcased a combination of technical achievement, design and ingenuity. This evening, Apple announced the winners across six new award categories.
In each category, Apple selected one app and one game as the winner.
In the Inclusivity category, winners supported people from a diversity of backgrounds, abilities and languages.
This year, winners included U.S.-based Aconite’s highly accessible game, HoloVista, where users can adjust various options for motion control, text sizes, text contrast, sound, and visual effect intensity. In the game, users explore using the iPhone’s camera to find hidden objects, solve puzzles and more. (Our coverage)
Image Credits: Aconite
Another winner, Voice Dream Reader, is a text-to-speech app that supports more than two dozen languages and offers adaptive features and a high level of customizable settings.
Image Credits: Voice Dream LLC
In the Delight and Fun category, winners offer memorable and engaging experiences enhanced by Apple technologies. Belgium’s Pok Pok Playroom, a kids’ entertainment app that spun out of Snowman (Alto’s Adventure series), won for its thoughtful design and use of subtle haptics, sound effects and interactions. (Our coverage)
Image Credits: Pok Pok
Another winner was the U.K.’s Little Orpheus, a platformer that combines storytelling, surprises and fun, and offers a console-like experience in a casual game.
Image Credits: The Chinese Room
The Interaction category winners showcase apps that offer intuitive interfaces and effortless controls, Apple says.
The U.S.-based snarky weather app CARROT Weather won for its humorous forecasts, unique visuals, and entertaining experience, which is also available as Apple Watch faces and widgets.
Image Credits: Brian Mueller, Grailr LLC
Canada’s Bird Alone game combines gestures, haptics, parallax and dynamic sound effects in clever ways to bring its world to life.
Image Credits: George Batchelor
The Social Impact category doled out awards to Denmark’s Be My Eyes, which enables people who are blind or have low vision to identify objects by pairing them — via the phone’s camera — with sighted volunteers from around the world. Today, it supports over 300K users who are assisted by over 4.5M volunteers. (Our coverage)
Image Credits: S/I Be My Eyes
U.K.’s ustwo games won in this category for Alba, a game that teaches about respecting the environment as players save wildlife, repair a bridge, clean up trash and more. The game also plants a tree for every download.
Image Credits: ustwo games
The Visuals and Graphics winners feature “stunning imagery, skillfully drawn interfaces, and high-quality animations,” Apple says.
Belarus-based Loóna offers sleepscape sessions which combine relaxing activities and atmospheric sounds with storytelling to help people wind down at night. The app was recently awarded Google’s “best app” of 2020.
Innovation winners included India’s NaadSadhana, an all-in-one, studio-quality music app that helps artists perform and publish. The app uses A.I. and Core ML to listen and provide feedback on the accuracy of notes, and generates a backing track to match.
Image Credits: Sandeep Ranade
Riot Games’ League of Legends: Wild Rift (U.S.) won for taking a complex PC classic and delivering a full mobile experience that includes touchscreen controls, an auto-targeting system for newcomers, and a mobile-exclusive camera setting.
Image Credits: Riot Games
The winners this year will receive a prize package that includes hardware and the award itself.
“This year’s Apple Design Award winners have redefined what we’ve come to expect from a great app experience, and we congratulate them on a well-deserved win,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations, in a statement. “The work of these developers embodies the essential role apps and games play in our everyday lives, and serve as perfect examples of our six new award categories.”
An example of Dark Sky’s maps from the desktop web version. Image Credits: Dark Sky
A new blog post from the developers of Apple-owned, hyperlocal weather app Dark Sky has announced that the iOS and web versions of the app, as well as the Dark Sky API, will sunset at the end of 2022.
Here’s the exact wording from the blog post:
Support for the Dark Sky API service for existing customers will continue until the end of 2022. The iOS app and Dark Sky website will also be available until the end of 2022.
Dark Sky’s developers initially said the API would shut down at the end of 2021, but this new end-of-2022 target obviously moves things back a bit. This is the first time we’ve heard about an end date for the iOS app, though.
If you’ve ever bought a subscription inside an iOS app and later decided you wanted to cancel, upgrade or downgrade, or ask for a refund, you may have had trouble figuring out how to go about making that request or change. Some people today still believe that they can stop their subscription charges simply by deleting an app from their iPhone. Others may dig around unsuccessfully inside their iPhone’s Settings or on the App Store to try to find out how to ask for a refund. With the updates Apple announced in StoreKit 2 during its Worldwide Developers Conference this week, things may start to get a little easier for app customers.
StoreKit is Apple’s developer framework for managing in-app purchases — an area that’s become more complex in recent years, as apps have transitioned from offering one-time purchases to ongoing subscriptions with different tiers, lengths, and feature sets.
Image Credits: Apple
Currently, users who want to manage or cancel subscriptions can do so from the App Store or their iPhone Settings. But some don’t realize the path to this section from Settings starts with tapping their Apple ID (their name and profile photo at the top of the screen). They may also get frustrated if they’re not familiar with how to navigate their Settings or the App Store.
Meanwhile, there are a variety of ways users can request refunds on their in-app subscriptions. They can dig in their inbox for their receipt from Apple, then click the “Report a Problem” link it includes to request a refund when something went wrong. This could be useful in scenarios where you’ve bought a subscription by mistake (or your kid has!), or where the promised features didn’t work as intended.
Apple also provides a dedicated website where users can directly request refunds for apps or content. (When you Google for something like “request a refund apple” or similar queries, a page that explains the process typically comes up at the top of the search results.)
Still, many users aren’t technically savvy. For them, the easiest way to manage subscriptions or ask for refunds would be to do so from within the app itself. For this reason, many conscientious app developers tend to include links to point customers to Apple’s pages for subscription management or refunds inside their apps.
But StoreKit 2 is introducing new tools that will allow developers to implement these sorts of features more easily.
One new tool is a Manage subscriptions API, which lets an app developer display the manage subscriptions page for their customer directly inside their app — without redirecting the customer to the App Store. Optionally, developers can choose to display a “Save Offer” screen that presents the customer with a discount of some kind to keep them from cancelling, or an exit survey that asks the customer why they decided to end their subscription.
When implemented, the customer will be able to view a screen inside the app that looks just like the one they’d visit in the App Store to cancel or change a subscription. After cancelling, they’ll be shown a confirmation screen with the cancellation details and the service expiration date.
If the customer wants to request a refund, a new Refund request API will allow the customer to begin their refund request directly in the app itself — again, without being redirected to the App Store or another website. On the screen that displays, the customer can select which item they want refunded and check the reason they’re making the request. Apple handles the refund process and will send either an approval or a decline notification back to the developer’s server.
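The flow — customer picks an item and a reason in-app, Apple decides, and the developer's server gets notified of the outcome — could be modeled roughly like this; the types and the notification format below are hypothetical illustrations, not Apple's actual server-notification schema:

```swift
// Hypothetical model of the in-app refund flow: the app collects the
// item and reason, Apple adjudicates, and the developer's server
// receives an approved/declined notification for that request.

struct RefundRequest {
    let productID: String
    let reason: String
}

enum RefundDecision { case approved, declined }

// Stand-in for the server-side notification the developer would
// receive; real notifications are signed payloads, not log lines.
func serverNotification(for decision: RefundDecision,
                        request: RefundRequest) -> String {
    let status = decision == .approved ? "REFUND" : "REFUND_DECLINED"
    return "\(status) product=\(request.productID) reason=\(request.reason)"
}
```

The important property is that the developer is only a recipient of the outcome — the decision itself stays with Apple, which is exactly the point of contention quoted below from developers who want programmatic control.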
However, some developers argue the changes don’t go far enough. They want to be in charge of managing customer subscriptions and handling refunds themselves, through programmatic means. Plus, it can take up to 48 hours for the customer to receive an update from Apple on their refund request, which can be confusing.
“They’ve made the process a bit smoother, but developers still can’t initiate refunds or cancellations themselves,” notes RevenueCat CEO Jacob Eiting, whose company provides tools to app developers to manage their in-app purchases. “It’s a step in the right direction, but could actually lead to more confusion between developers and consumers about who is responsible for issuing refunds.”
In other words, because the forms are now going to be more accessible from inside the app, the customer may believe the developer is handling the refund process when, really, Apple continues to do so.
People could write in after uninstalling the app. Or have subscribed on a device they no longer have. There are so many cases that this ignores.
This is only a marginal improvement over opening the manage subscriptions URL.
Some developers pointed out that there are other scenarios this process doesn’t address. For example, if the customer has already uninstalled the app or no longer has the device in question, they’ll still need to be directed to other means of asking for refunds, just as before.
For consumers, though, subscription management tools like this mean more developers may begin to put buttons to manage subscriptions and ask for refunds directly inside their app, which is a better experience. In time, as customers learn they can more easily use the app and manage subscriptions, app developers may see better customer retention, higher engagement, and better App Store reviews, notes Apple.
The new additions to SwiftUI, been wanting Pull to Refresh and to improve accessibility. Then surprisingly StoreKit 2, especially for users requesting refunds and managing subscriptions seamlessly (also great for Buffer).
The StoreKit 2 changes weren’t limited to APIs for managing subscriptions and refunds.
Developers will also gain access to a new Invoice Lookup API that allows them to look up the in-app purchases for the customer, validate their invoice and identify any problems with the purchase — for example, if there were any refunds already provided by the App Store.
A new Refunded Purchases API will allow developers to look up all the refunds for the customer.
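One way this kind of refund lookup surfaces in StoreKit 2 is by walking the user’s transaction history, where a refunded purchase carries a revocation date. This is a hedged sketch of that general approach, not necessarily how the Invoice Lookup or Refunded Purchases endpoints are named in code:

```swift
import StoreKit

// Sketch: collect product IDs for purchases the App Store has since
// refunded or revoked. In StoreKit 2, a refunded transaction has a
// non-nil `revocationDate`.
func refundedProductIDs() async -> [String] {
    var refunded: [String] = []
    for await result in Transaction.all {
        // Only trust transactions that pass StoreKit's verification.
        if case .verified(let transaction) = result,
           transaction.revocationDate != nil {
            refunded.append(transaction.productID)
        }
    }
    return refunded
}
```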
Sounds like StoreKit can validate receipts on device and allow users to request refunds. Didn’t have those on my bingo card but wow those are some nice quality-of-life updates!
And a new Renewal Extension API will allow developers to extend the renewal data for paid, active subscriptions in the case of an outage — like when dealing with customer support issues when a streaming service went down, for example. This API lets developers extend the subscription up to twice per calendar year, each up to 90 days in the future.
Another change will help customers when they reinstall apps or download them on new devices. Before, users would have to manually “restore purchases” to sync the status of the completed transactions back to that newly downloaded or reinstalled app. Now, that information will be automatically fetched by StoreKit 2 so the apps are immediately up-to-date with whatever it is the user paid for.
Very interesting changes to StoreKit. Apps will now be able to automatically restore and sync purchases, so users won't need to restore when the app is reinstalled or downloaded on a new device! https://t.co/WSs6L9rTvK
While, overall, the changes make for a significant update to the StoreKit framework, Apple’s hesitancy to allow developers more control over their own subscription-based customers speaks, in part, to how much it wants to control in-app purchases. This is perhaps because it got burned in the past when it tried allowing developers to manage their own refunds.
As The Verge noted last month while the Epic Games-Apple antitrust trial was underway, Apple had once provided Hulu with access to a subscription API, then discovered Hulu had been offering a way to automatically cancel subscriptions made through the App Store when customers wanted to upgrade to higher-priced subscription plans. Apple realized it needed to take action to protect against this misuse of the API, and Hulu later lost access. It has not since made that API more broadly available.
On the flip side, having Apple, not the developers, in charge of subscription management and refunds means Apple takes on the responsibilities around preventing fraud — including fraud perpetrated by both customers and developers alike. Customers may also prefer that there’s one single place to go for managing their subscription billing: Apple. They may not want to have to deal with each developer individually, as their experience would end up being inconsistent.
These changes matter because subscription revenue accounts for a sizable share of Apple’s lucrative App Store business. Ahead of WWDC 21, Apple reported the sale of digital goods and services on the App Store grew to $86 billion in 2020, up 40% over the year prior. Earlier this year, Apple said it has paid out more than $200 billion to developers since the App Store launched in 2008.
With the upcoming release of iOS 15 for Apple mobile devices, Apple’s built-in search feature known as Spotlight will become a lot more functional. In what may be one of its bigger updates since it introduced Siri Suggestions, the new version of Spotlight is becoming an alternative to Google for several key queries, including web images and information about actors, musicians, TV shows and movies. It will also now be able to search across your photo library, deliver richer results for contacts, and connect you more directly with apps and the information they contain. It even allows you to install apps from the App Store without leaving Spotlight itself.
Spotlight is also more accessible than ever before.
Years ago, in iOS 7, Spotlight moved from its dedicated page to the left of the Home screen to a swipe-down gesture available from the middle of any screen, which helped grow user adoption. Now, it’s available with the same swipe-down gesture on the iPhone’s Lock Screen, too.
Apple showed off a few of Spotlight’s improvements during its keynote address at its Worldwide Developer Conference, including the search feature’s new cards for looking up information on actors, movies and shows, as well as musicians. This change alone could redirect a good portion of web searches away from Google or dedicated apps like IMDb.
For years, Google has been offering quick access to common searches through its Knowledge Graph, a knowledge base that allows it to gather information from across sources and then use that to add informational panels above and to the side of its standard search results. Panels on actors, musicians, shows and movies are available as part of that effort.
But now, iPhone users can just pull up this info on their home screen.
The new cards include more than the typical Wikipedia bio and background information you may expect — they also showcase links to where you can listen to or watch content from the artist, actor, movie or show in question. They include news articles, social media links and official websites, and even direct you to where the searched person or topic may be found inside your own apps. (For example, a search for “Billie Eilish” may direct you to her tour tickets inside SeatGeek, or a podcast where she’s a guest.)
Image Credits: Apple
For web image searches, Spotlight now also allows you to search for people, places, animals and more from the web — eating into another search vertical Google provides today.
Image Credits: iOS 15 screenshot
Your personal searches have been upgraded with richer results, too, in iOS 15.
When you search for a contact, you’ll be taken to a card that does more than show their name and how to reach them. You’ll also see their current status (thanks to another iOS 15 feature), as well as their location from Find My, your recent conversations in Messages, your shared photos, calendar appointments, emails, notes, and files. It’s almost like a personal CRM system.
Image Credits: Apple
Personal photo searches have also been improved. Spotlight now uses Siri intelligence to allow you to search your photos by the people, scenes, and elements in your photos, as well as by location. And it’s able to leverage the new Live Text feature in iOS 15 to find the text in your photos and return relevant results.
This could make it easier to pull up photos where you’ve screenshotted a recipe, a store receipt, or even a handwritten note, Apple said.
Image Credits: Apple
A couple of features related to Spotlight’s integration with apps weren’t mentioned during the keynote.
Spotlight will now display action buttons on the Maps results for businesses that will prompt users to engage with that business’s app. In this case, the feature is leveraging App Clips, which are small parts of a developer’s app that let you quickly perform a task even without downloading or installing the app in question. For example, from Spotlight you may be prompted to pull up a restaurant’s menu, buy tickets, make an appointment, order takeout, join a waitlist, see showtimes, pay for parking, check prices and more.
The feature will require the business to support App Clips in order to work.
Image Credits: iOS 15 screenshot
Another under-the-radar change — but a significant one — is the new ability to install apps from the App Store directly from Spotlight.
This could prompt more app installs, as it reduces the steps from a search to a download, and makes querying the App Store more broadly available across the operating system.
Developers can additionally choose to add a few lines of code to their app to make data from the app discoverable within Spotlight and customize how it’s presented to users. This means Spotlight can work as a tool for searching content from inside apps — another way Apple is redirecting users away from traditional web searches in favor of apps.
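The “few lines of code” refers to the Core Spotlight framework, which has long powered in-app indexing on iOS. A minimal sketch — the item title, description and identifiers below are hypothetical placeholders:

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Sketch: make one piece of in-app content searchable from Spotlight.
func indexRecipe() {
    let attributes = CSSearchableItemAttributeSet(contentType: .text)
    attributes.title = "Lemon Pasta"                      // hypothetical content
    attributes.contentDescription = "A 20-minute weeknight recipe."

    let item = CSSearchableItem(
        uniqueIdentifier: "recipe-42",                    // hypothetical ID
        domainIdentifier: "recipes",
        attributeSet: attributes
    )
    // Add the item to the on-device index Spotlight searches.
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error { print("Indexing failed: \(error)") }
    }
}
```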
However, unlike Google’s search engine, which relies on crawlers that browse the web to index the data it contains, Spotlight’s in-app search requires developer adoption.
Still, it’s clear Apple sees Spotlight as a potential rival to web search engines, including Google’s.
“Spotlight is the universal place to start all your searches,” said Apple SVP of Software Engineering Craig Federighi during the keynote event.
Spotlight, of course, can’t handle “all” your searches just yet, but it appears to be steadily working towards that goal.
Apple in 2018 closed its $400 million acquisition of music recognition app Shazam. Now, it’s bringing Shazam’s audio recognition capabilities to app developers in the form of the new ShazamKit. The new framework will allow app developers — including those on both Apple platforms and Android — to build apps that can identify music from Shazam’s huge database of songs, or even from their own custom catalog of pre-recorded audio.
Many consumers are already familiar with the mobile app Shazam, which lets you push a button to identify what song you’re hearing, and then take other actions — like viewing the lyrics, adding the song to a playlist, exploring music trends, and more. Having first launched in 2008, Shazam was already one of the oldest apps on the App Store when Apple snatched it up.
Now the company is putting Shazam to broader use than a standalone music identification utility. With the new ShazamKit, developers will be able to leverage Shazam’s audio recognition capabilities to create their own app experiences.
There are three parts to the new framework: Shazam catalog recognition, which lets developers add song recognition to their apps; custom catalog recognition, which performs on-device matching against arbitrary audio; and library management.
Shazam catalog recognition is what you probably think of when you think of the Shazam experience today. The technology can recognize the song that’s playing in the environment and then fetch the song’s metadata, like the title and artist. The ShazamKit API will also be able to return other metadata like genre or album art, for example. And it can identify where in the audio the match occurred.
When matching music, Shazam doesn’t actually match the audio itself, to be clear. Instead, it creates a lossy representation of it, called a signature, and matches against that. This method greatly reduces the amount of data that needs to be sent over the network. Signatures also cannot be used to reconstruct the original audio, which protects user privacy.
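In ShazamKit terms, an app feeds audio buffers into a session; the framework builds those signatures internally and reports matches to a delegate. A minimal sketch — the audio capture pipeline (e.g., an AVAudioEngine tap) is assumed to exist elsewhere in the app:

```swift
import ShazamKit
import AVFAudio

// Sketch: match incoming audio against the Shazam catalog.
final class Matcher: NSObject, SHSessionDelegate {
    // A session with no custom catalog matches against Shazam's catalog.
    let session = SHSession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Feed PCM buffers; ShazamKit converts them to signatures internally,
    // so raw audio never leaves the device.
    func consume(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        session.matchStreamingBuffer(buffer, at: time)
    }

    // Called when the catalog returns a match.
    func session(_ session: SHSession, didFind match: SHMatch) {
        let item = match.mediaItems.first
        print("Matched: \(item?.title ?? "?") by \(item?.artist ?? "?")")
    }
}
```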
The Shazam catalog comprises millions of songs and is hosted in the cloud and maintained by Apple. It’s regularly updated with new tracks as they become available.
When a customer uses a developer’s third-party app for music recognition via ShazamKit, they may want to save the song in their Shazam library. This is found in the Shazam app, if the user has it installed, or it can be accessed by long pressing on the music recognition Control Center module. The library is also synced across devices.
Apple suggests that apps make their users aware that recognized songs will be saved to this library, as there’s no special permission required to write to the library.
Image Credits: Apple
ShazamKit’s custom catalog recognition feature, meanwhile, could be used to create synced activities or other second-screen experiences in apps by recognizing the developer’s own audio rather than audio from the Shazam music catalog.
This could allow for educational apps where students follow along with a video lesson, where some portion of the lesson’s audio could prompt an activity to begin in the student’s companion app. It could also be used to enable mobile shopping experiences that pop up as you watch a favorite TV show.
ShazamKit is currently in beta on iOS 15.0+, macOS 12.0+, Mac Catalyst 15.0+, tvOS 15.0+, and watchOS 8.0+. On Android, ShazamKit comes in the form of an Android Archive (AAR) file and supports music and custom audio, as well.
At its Worldwide Developer Conference, Apple announced a significant update to RealityKit, its suite of technologies that allow developers to get started building AR (augmented reality) experiences. With the launch of RealityKit 2, Apple says developers will have more visual, audio, and animation control when working on their AR experiences. But the most notable part of the update is how Apple’s new Object Capture technology will allow developers to create 3D models in minutes using only an iPhone.
Apple noted during its developer address that one of the most difficult parts of making great AR apps was the process of creating 3D models. These could take hours and thousands of dollars.
With Apple’s new tools, developers will be able to take a series of pictures using just an iPhone (or iPad or DSLR, if they prefer) to capture 2D images of an object from all angles, including the bottom.
Then, using the Object Capture API on macOS Monterey, it only takes a few lines of code to generate the 3D model, Apple explained.
Image Credits: Apple
To begin, developers would start a new photogrammetry session in RealityKit that points to the folder where they’ve captured the images. Then, they would call the process function to generate the 3D model at the desired level of detail. Object Capture allows developers to generate the USDZ files optimized for AR Quick Look — the system that lets developers add virtual, 3D objects in apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
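A hedged sketch of that flow on macOS Monterey follows; the input folder and output path here are hypothetical placeholders, and the chosen detail level is just one of the options:

```swift
import RealityKit

// Sketch: turn a folder of 2D captures into a USDZ model (macOS Monterey).
let imagesFolder = URL(fileURLWithPath: "/path/to/captures")  // hypothetical
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")   // hypothetical

// Start a photogrammetry session pointing at the captured images.
let session = try PhotogrammetrySession(input: imagesFolder)

// Request a single model file at reduced detail, suitable for AR Quick Look.
try session.process(requests: [
    .modelFile(url: outputURL, detail: .reduced)
])

// Progress and results arrive asynchronously on the session's output stream.
Task {
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputURL.path)")
        }
    }
}
```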
Apple said developers like Wayfair, Etsy and others are using Object Capture to create 3D models of real-world objects — an indication that online shopping is about to get a big AR upgrade.
Wayfair, for example, is using Object Capture to develop tools for its manufacturers so they can create virtual representations of their merchandise. This will allow Wayfair customers to preview more products in AR than they can today.
Image Credits: Apple (screenshot of Wayfair tool)
In addition, Apple noted developers including Maxon and Unity are using Object Capture for creating 3D content within 3D content creation apps, such as Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine tune the look and feel of AR objects; dynamic loading for assets; the ability to build your own Entity Component System to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, scale and explore AR worlds in RealityKit-based games.
One developer, Mikko Haapoja of Shopify, has been trying out the new technology (see below) and shared on Twitter some real-world tests in which he shot objects using an iPhone 12 Max.
Developers who want to test it for themselves can leverage Apple’s sample app and install Monterey on their Mac to try it out.
Apple's Object Capture on a Pineapple. One of my fav things to test Photogrammetry against. This was processed using the RAW detail setting.
Last month, Apple announced it would soon add lossless audio streaming and Spatial Audio with support for Dolby Atmos to its Apple Music subscription at no extra charge. That upgrade has now gone live, Apple announced this morning — though many noticed the additions actually rolled out yesterday, following the WWDC keynote.
The entire Apple Music catalog of 75+ million songs will support lossless audio.
The lossless tier begins at CD quality — 16 bit at 44.1 kHz, and goes up to 24 bit at 48 kHz, Apple previously said. Audiophiles can also opt for the high-resolution lossless that goes up to 24 bit at 192 kHz. Apple has said you’ll need to use an external, USB digital-to-analog converter to take advantage of the latter — simply plugging in a pair of headphones to an iPhone won’t work.
Apple Music subscribers will be able to enable the new lossless option under Settings > Music > Audio quality. Here, you’ll be able to choose the different resolutions you want to use for different connections, including Wi-Fi, cellular, and download.
When you make your selection in Settings, iOS warns that lossless files will use “significantly more space” on your device, as 10 GB of storage would allow you to store approximately 3,000 songs at high quality, 1,000 songs with lossless, or 200 songs with high-res lossless.
Image Credits: Apple
Meanwhile, Spatial Audio will be enabled by default on hardware that supports Dolby Atmos, like Apple’s AirPods and Beats headphones with an H1 or W1 chip. The latest iPhone, iPad, and Mac models also support Dolby Atmos. Spatial Audio on Apple Music will also be “coming soon” to Android devices, Apple said.
To kick off launch, Apple Music is today rolling out new playlists designed to showcase Spatial Audio. These include:
Apple is also adding a special guide to Spatial Audio on Apple Music, which will help music listeners hear the difference. This will include tracks from artists like Marvin Gaye and The Weeknd, among others. And Apple will air a roundtable conversation about Spatial Audio featuring top sound engineers and experts, hosted by Zane Lowe at 9 am PT today on Apple Music.
Because songs have to be remastered for Dolby Atmos specifically, these guides and playlists will help music fans experience the new format without having to hunt around. Apple says it’s working with artists and labels to add more new releases and the best catalog tracks in Spatial Audio. To help on this front, Apple notes there are various initiatives underway — including doubling the number of Dolby-enabled studios in major markets, offering educational programs, and providing resources to independent artists.
Apple also said it will build music-authoring tools directly into Logic Pro. Later this year, the company plans to release an update to Logic Pro that will allow any musician to create and mix their songs in Spatial Audio for Apple Music.
Just after the release of iOS 12 in 2018, Apple introduced its own built-in screen time tracking tools and controls. It then began cracking down on third-party apps that had implemented their own screen time systems, saying they had done so via technologies that risked user privacy. What wasn’t available at the time? A Screen Time API that would have allowed developers to tap into Apple’s own Screen Time system and build experiences that augmented its capabilities. That’s now changed.
At its Worldwide Developer Conference on Monday, Apple introduced a new Screen Time API that gives developers access to frameworks for building parental control experiences that also maintain user privacy.
The company added three new Swift frameworks to the iOS SDK that will allow developers to create apps that help parents manage what a child can do across their devices and ensure those restrictions stay in place.
The apps that use this API will be able to set restrictions like locking accounts in place, preventing password changes, filtering web traffic, and limiting access to applications. These sorts of changes are already available through Apple’s Screen Time system, but developers can now build their own experiences where these features are offered under their own branding and where they can then expand on the functionality provided by Apple’s system.
ScreenTime API looks great, I sincerely hope someone provides me a way to bulk change stuff for my kids. If I had known I would have to tweak each kids ScreenTime individually like I do today, I might have had less children. #WWDC21
Developers’ apps that take advantage of the API can also be locked in place, so they can only be removed from the device with a parent’s approval.
The apps can authenticate the parents and ensure the device they’re managing belongs to a child in the family. Plus, Apple said the way the system will work lets parents choose the apps and websites they want to limit, without compromising user privacy. (The system returns only opaque tokens instead of identifiers for the apps and website URLs, Apple told developers, so the third-parties aren’t gaining access to private user data like app usage and web browsing details. This would prevent a shady company from building a Screen Time app only to collect troves of user data about app usage, for instance.)
The third-party apps can also create unique time windows for different apps or types of activities, and warn the child when time is nearly up. When time runs out, the app can lock down access to websites and apps, and perhaps remind the child it’s time to do their homework — or whatever other experience the developer has in mind.
And on the flip side, the apps could create incentives for the child to gain screen time access after they complete some other task, like doing homework, reading or chores, or anything else.
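The new Swift frameworks behind this are FamilyControls, ManagedSettings and DeviceActivity. The sketch below shows one way shielding apps might look, with the assumption that the opaque app tokens come from Apple’s family activity picker (so the developer never learns which apps were chosen):

```swift
import FamilyControls
import ManagedSettings

// Sketch: apply restrictions chosen by a parent (iOS 15).
let store = ManagedSettingsStore()

func applyRestrictions(selection: FamilyActivitySelection) {
    // The parent must authorize the app for family controls first.
    AuthorizationCenter.shared.requestAuthorization { result in
        switch result {
        case .success:
            // Shield the chosen apps; the tokens are opaque and never
            // reveal app identities or usage to the developer.
            store.shield.applications = selection.applicationTokens
            // Keep the parental-control app itself from being deleted.
            store.application.denyAppRemoval = true
        case .failure(let error):
            print("Authorization failed: \(error)")
        }
    }
}
```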
Developers could use these features to design new experiences that Apple’s own Screen Time system doesn’t allow for today, by layering their own ideas on top of Apple’s basic set of controls. Parents would likely fork over their cash to make using Screen Time controls easier and more customized to their needs.
Other apps could tie into Screen Time too, outside of the “family” context — like those aimed at mental health and wellbeing, for example.
Of course, developers have been asking for a Screen Time API since the launch of Screen Time itself, but Apple didn’t seem to prioritize its development until the matter of Apple’s removal of rival screen time apps was brought up in an antitrust hearing last year. At the time, Apple CEO Tim Cook defended the company’s decision by explaining that apps had been using MDM (mobile device management) technology, which was designed for managing employee devices in the enterprise, not home use. This, he said, was a privacy risk.
Apple has a session during WWDC that will detail how the new API works, so we expect to learn more soon as the developer info becomes public.
During the WWDC conference today, Apple unveiled the new macOS 12 Monterey. A major feature in the macOS update is Universal Control, which builds upon the Continuity features first introduced in OS X Yosemite. For years, it’s been possible to open a news article on your iPhone and keep reading it on your MacBook, or to copy and paste a link from your iPad to your iMac. But Universal Control takes these features further.
With Universal Control, you can use a single mouse and keyboard to navigate across multiple Apple devices at once. This functionality works across more than two devices – in the demo video, the feature is used to seamlessly move across an iPad, MacBook, and iMac. Users can drag and drop files across multiple devices at once, making it possible, for example, to use a multi-screen setup while editing video on Final Cut Pro.
What’s possible in Universal Control isn’t necessarily new — third-party apps have made this possible before. Plus, in 2019, Apple debuted Sidecar, which allowed users to connect their iPad as a second monitor for their MacBook or iMac. But Universal Control improves upon Sidecar — and maybe renders it obsolete — by allowing users to link any Apple devices together, not just an iPad. Though this update may not be groundbreaking, it’s a useful upgrade to existing features.
Apple didn’t announce the rumored device that would combine the Apple TV set-top box with a HomePod speaker during its WWDC keynote, but it did announce a few features that will improve the Apple TV experience — including one that involves a HomePod Mini. Starting this fall, Apple said you’ll be able to select the HomePod Mini as the speaker for your Apple TV 4K. It also introduced a handful of software updates for Apple TV users, including a new way to see shows everyone in the family will like, and support for co-watching shows through FaceTime.
The co-watching feature is actually a part of a larger FaceTime update, which will let users stream music, TV, and screen share through their FaceTime calls. The Apple TV app is one of those that’s supported through this new system, called SharePlay. It will now include a new “Shared with You” row that highlights the shows and movies your friends are sharing, as well.
Another feature called “For All of You” will display a collection of shows and movies based on everyone’s interests within Apple TV’s interface. This is ideal if you’re planning to watch something as a family — for movie night, for example. And you can fine-tune the suggestions based on who’s watching.
A new Apple TV widget is also being made available, which now includes iPad support.
And the new support for HomePod Mini will help deliver “rich, balanced sound” and “crystal clear dialog” when you’re watching Apple TV with the Mini set up as your speaker, Apple said.
Among many updates coming to iOS 15, Apple Maps will receive a number of upgrades that will bring more detailed maps, improvements for transit riders, AR experiences and other changes to the platform. The improvements build on the new map Apple began rolling out two years ago, which focused on offering richer details and — in response to user feedback and complaints — more accurate navigation.
Since then, Apple Maps has steadily improved.
The new map experience has since launched in the U.S., U.K., Ireland and Canada and will now make its way to Spain and Portugal, starting today. It will then arrive in Italy and Australia later this year, Apple announced during its keynote address at its Worldwide Developer Conference on Monday.
Image Credits: Apple
In addition, Apple said iOS 15 Maps will include new details for commercial districts, marinas, buildings, and more. Plus, Apple has added things like elevation, new road colors and labels, as well as hundreds of custom designed landmarks — for example, for places like the Golden Gate Bridge.
Apple also built a new nighttime mode for Maps with a “moonlit glow,” it said.
For drivers, Apple added new road details to the map to help them better see and understand important things like turn lanes, medians, and bus and taxi lanes as they move through a city. The changes are competitive with some of the updates Google has recently made to its own Google Maps platform, which brought street-level details to select cities. These allowed people — including those navigating on foot, in a wheelchair, on a bike or on a scooter, for example — to better see things like sidewalks and intersections.
Apple is now catching up, saying it, too, will show features like crosswalks and bike lanes.
It will also render things like overlapping complex interchanges in 3D space, making it easier to see upcoming traffic conditions or what lane to take. These features will come to CarPlay later in the year.
Image Credits: Apple
For transit riders, meanwhile, Maps has made improvements to help users find nearby stations.
Users can now pin their favorite lines to the top, and even keep track on their Apple Watch so they don’t have to pull out their phone. The updated Maps app will automatically follow your transit route and notify you when it’s time to disembark, making the app more competitive with third-party apps often favored by transit riders, like Citymapper.
Image Credits: Apple
When you exit your station, you can now hold up your iPhone to scan the buildings in the area, and Maps will generate an accurate position, offering directions in augmented reality. This is similar to the Live View AR directions Google announced last year.
This feature is launching in select cities in 2021 with more to come in the year ahead, Apple said.
As part of its FaceTime update in iOS 15, Apple introduced a new set of features designed for shared experiences — like co-watching TV shows or TikTok videos, listening to music together, screen sharing and more — while on a FaceTime call. The feature, called SharePlay, enables real-time connections with family and friends while you’re hanging out on FaceTime, Apple explained, by integrating access to apps from within the call itself.
Image Credits: Apple
Apple demonstrated the new feature during its Worldwide Developer Conference keynote this afternoon, showing how friends could press play in Apple Music to listen together, as the music streams to everyone on the call. Shared playback controls also let anyone on the call play, pause or jump to the next track.
The company also showed off watching video from its Apple TV+ streaming service, where the video was synced in real-time between call participants. This was a popular trend during the pandemic, as people looked to virtually watch movies and TV with family and friends, prompting services like Hulu and Amazon Prime Video to add native co-watching features.
But Apple’s SharePlay goes much further than streaming music and video from just Apple’s own services.
The company announced a set of launch partners for SharePlay including Disney+, Hulu, HBO Max, NBA, Twitch, TikTok, MasterClass, ESPN+, Paramount+, and Pluto TV. It’s also making an API available to developers so they can integrate their own apps with SharePlay.
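That API is the GroupActivities framework: an app defines an activity and activates it on the call, and the system invites everyone and keeps playback in sync. A sketch under stated assumptions — the activity name and title below are hypothetical:

```swift
import GroupActivities

// Sketch: a SharePlay activity for co-watching video (iOS 15).
struct WatchTogether: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"    // hypothetical title
        meta.type = .watchTogether
        return meta
    }
}

// When the user taps a SharePlay button, activate the activity;
// everyone on the FaceTime call is invited to join the group session.
func startWatching() async {
    do {
        _ = try await WatchTogether().activate()
    } catch {
        print("Could not start SharePlay: \(error)")
    }
}
```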
Image Credits: Apple
Users can screen share via SharePlay, too, so you can do things like browse Zillow listings together or show off mobile gameplay, Apple suggested.
“Screen sharing is also a simple and super effective way to help someone out and answer questions right in the moment, and it works across Apple devices,” noted Apple SVP of Software Engineering, Craig Federighi.