The UK’s competition watchdog will take a deep dive look into Apple and Google’s dominance of the mobile ecosystem, it said today — announcing a market study which will examine the pair’s respective smartphone platforms (iOS and Android); their app stores (App Store and Play Store); and web browsers (Safari and Chrome).
The Competition and Markets Authority (CMA) is concerned that the mobile platform giants’ “effective duopoly” in those areas might be harming consumers, it added.
The study will be wide-ranging, with the watchdog concerned about the nested gateways created as a result of the pair’s dominance of the mobile ecosystem — intermediating how consumers can access a variety of products, content and services (such as music, TV and video streaming; fitness tracking, shopping and banking, to cite some of the examples provided by the CMA).
“These products also include other technology and devices such as smart speakers, smart watches, home security and lighting (which mobiles can connect to and control),” it went on, adding that it’s looking into whether their dominance of these pipes is “stifling competition across a range of digital markets”, saying too that it’s “concerned this could lead to reduced innovation across the sector and consumers paying higher prices for devices and apps, or for other goods and services due to higher advertising prices”.
The CMA further confirmed the deep dive will examine “any effects” of the pair’s market power over other businesses — giving the example of app developers who rely on Apple or Google to market their products to customers via their smart devices.
The watchdog already has an open investigation into Apple’s App Store, following a number of antitrust complaints by developers.
It is investigating Google’s planned deprecation of third-party tracking cookies too, after complaints by adtech companies and publishers that the move could harm competition. (And just last week the CMA said it was minded to accept a series of concessions offered by Google that would enable the regulator to stop it turning off support for cookies entirely if it believes the move will harm competition.)
The CMA said both those existing investigations are examining issues that fall within the scope of the new mobile ecosystem market study but that its work on the latter will be “much broader”.
It added that it will adopt a joined-up approach across all related cases — “to ensure the best outcomes for consumers and other businesses”.
It’s giving itself a full year to examine Gapple’s mobile ecosystems.
It is also soliciting feedback on any of the issues raised in its statement of scope — calling for responses by 26 July. The CMA added that it’s also keen to hear from app developers, via its questionnaire, by the same date.
Taking on tech giants
The watchdog has previously scrutinized the digital advertising market — and found plenty to be concerned about vis-a-vis Google’s dominance there.
That earlier market study has been feeding the UK government’s plan to reform competition rules to take account of the market-deforming power of digital giants. And the CMA suggested the new market study, examining ‘Gapple’s’ mobile muscle, could similarly help shape UK-wide competition law reforms.
Last year the UK announced its plan to set up a “pro-competition” regime for regulating Internet platforms — including by establishing a dedicated Digital Markets Unit within the CMA (which got going earlier this year).
The legislation for the reform has not yet been put before parliament but the government has said it wants the competition regulator to be able to “proactively shape platforms’ behavior to avoid harmful behavior before it happens” — saying too that it supports enabling ex ante interventions once a platform has been identified as having so-called “strategic market status”.
Germany already adopted similar reforms to its competition law (early this year), which enable proactive interventions to tackle large digital platforms with what is described as “paramount significance for competition across markets”. And its Federal Cartel Office has, in recent months, wasted no time in opening a number of proceedings to determine whether Amazon, Google and Facebook have such a status.
The CMA also sounds keen to get going on tackling Internet gatekeepers.
Commenting in a statement, CEO Andrea Coscelli said:
“Apple and Google control the major gateways through which people download apps or browse the web on their mobiles – whether they want to shop, play games, stream music or watch TV. We’re looking into whether this could be creating problems for consumers and the businesses that want to reach people through their phones.
“Our ongoing work into big tech has already uncovered some worrying trends and we know consumers and businesses could be harmed if they go unchecked. That’s why we’re pressing on with launching this study now, while we are setting up the new Digital Markets Unit, so we can hit the ground running by using the results of this work to shape future plans.”
The European Union also unveiled its own proposals for clipping the wings of big tech last year — presenting its Digital Markets Act plan in December which will apply a single set of operational rules to so-called “gatekeeper” platforms operating across the EU.
The clear trend in Europe on digital competition is toward increasing oversight and regulation of the largest platforms — in the hopes that antitrust authorities can impose measures that will help smaller players thrive.
Critics might say that’s just playing into the tech giants’ hands, though — because it’s fiddling around the edges when more radical interventions (break-ups) are what’s really needed to reboot captured markets.
Apple and Google were contacted for comment on the CMA’s market study.
A Google spokesperson said: “Android provides people with more choice than any other mobile platform in deciding which apps they use, and enables thousands of developers and manufacturers to build successful businesses. We welcome the CMA’s efforts to understand the details and differences between platforms before designing new rules.”
According to Google, the Android App Economy generated £2.8BN in revenue for UK developers last year, which it claims supported 240,000 jobs across the country — citing a Public First report that it commissioned.
The tech giant also pointed to operational changes it has already made in Europe, following antitrust interventions by the European Commission — such as adding a choice screen to Android where users can pick from a list of alternative search engines.
Earlier this month it agreed to shift the format underlying that choice screen from an unpopular auction model to free participation.
The announcement of new iPad software at this year’s WWDC conference carried abnormally large expectations. The iPad lineup, especially the larger iPad Pro, has kept up an impressively frantic pace of hardware innovation over the past few years. Over that same time frame, the iPad’s software, especially its ability to let users work in multiple apps at once and its onramps for professional software makers, has come under scrutiny for an apparently slower pace.
This year’s announcements about iOS 15 and iPadOS 15 seemed designed to counter that narrative with a broad set of quality-of-life improvements to multitasking, as well as a suite of system-wide features that nearly all come complete with their own developer-facing APIs to build on. I had the chance to speak to Bob Borchers, Apple’s VP of Worldwide Product Marketing, and Sebastien (Seb) Mariners-Mes, VP of Intelligent System Experience at Apple, about the release of iPadOS 15 and a variety of these improvements.
Mariners-Mes works on the team of Apple software SVP Craig Federighi and was pivotal in the development of this new version.
iPad has a bunch of new core features including SharePlay, Live Text, Focuses, Universal Control, on-device Siri processing and a new edition of Swift Playgrounds designed to be a prototyping tool. Among the most hotly anticipated for iPad Pro users, however, are improvements to Apple’s multitasking system.
If you’ve been following along, you’ll know that the gesture-focused multitasking interface of iPadOS has had its share of critics, including me. Though it can be useful in the right circumstances, the un-discoverable gesture system and the confusing hierarchy of different kinds of app combinations made it an awkward affair for even an adept user to use correctly, much less a beginner.
Since the iPad stands pretty much alone as the only successful tablet device on the market, Apple is in a unique position to determine which paradigms become established as standard. It’s a rare opportunity to say: hey, this is what working on a device like this feels like, looks like, should be.
So I ask Borchers and Mariners-Mes to talk a little bit about multitasking. Specifically Apple’s philosophy in the design of multitasking on iPadOS 15 and the update from the old version, which required a lot of acrobatics of the finger and a strong sense of spatial awareness of objects hovering out off the edges of the screen.
“I think you’ve got it,” Borchers says when I mention the spatial gymnastics, “but the way that we think about this is that the step forward in multitasking makes it easier to discover, easier to use and even more powerful. And, while pros I think were the ones who were using multitasking in the past, we really want to take it more broadly because we think there’s applicability to many, many folks. And that’s why the discovery and the ease of use I think were critical.”
“You had a great point there when you talked about the spatial model, and one of our goals was to actually make the spatial model more explicit in the experience,” says Mariners-Mes, “where, for example, if you’ve got a split view and you’re replacing one of the windows, we kind of open the curtain and tuck the other app to the side, you can see it — it’s not a hidden mental model, it’s one that’s very explicit.
“Another great example of it is when you go into the app switcher to reconfigure your windows, you’re actually doing drag and drop as you rearrange your new split views, or you dismiss apps and so on. So it’s not a hidden model, it’s one where we really try to reinforce the spatial model with an explicit one for the user through all of the animations and all of the affordances.”
Apple’s goal this time around, he says, was to add affordances for the user to understand that multitasking was even an option — like the small series of dots at the top of every app and window that now lets you explicitly choose an available configuration, rather than the app-and-dock-juggling method of the past. He goes on to say that consistency was a key metric for this version of the OS: the appearance of Slide Over apps in the same switcher view as all other apps, for instance, or the way that you can choose configurations of apps via the button or by drag and drop in the switcher and get the same results.
In the dashboard, Mariners-Mes says, “you get an at a glance view of all of the apps that you’re running and a full model of how you’re navigating that through the iPad’s interface.”
This ‘at a glance’ map of the system should be very welcome to advanced users. Even as a very aggressive pro user myself, I found Slide Over apps became more of a nuisance than anything because I couldn’t keep track of how many were open and when to use them. The ability to combine them in the switcher itself is one of those things that Apple has wanted to get into the OS for years and that is just now making its way onto iPads. Persistence of organization, really, was the critical problem to tackle.
“I think we believe strongly in building a mental model where people know where things are [on iPad],” says Mariners-Mes. “And I think you’re right when it comes to persistence; I think it also applies to, for example, the home screen. People have a very strong mental model of where things are on the home screen, as well as all of the apps that they’ve configured. And so we try to maintain that mental model, and also allow people to reorganize again in the switcher.”
He goes on to explain the new ‘shelf’ feature that displays every instance or window that an app has open within itself. They implemented this as a per-app feature rather than a system-wide one, he says, because associating the shelf with a particular app fit the overall mental model they’re trying to build. The value of this shelf may jump into higher relief later this year, when more professional apps ship that may have a dozen documents or windows open and active at once during a project.
Another nod to advanced users in iPadOS 15 is the rich keyboard shortcut set offered across the system. The interface can be navigated by arrow keys now, many advanced commands are there and you can even move around on an iPad using a game controller.
“One of the key goals this year was to make basically everything in the system navigable from the keyboard,” says Mariners-Mes, “so that if you don’t want to, you don’t have to take your hands off the keyboard. All of the new multitasking affordances and features, you can do through the keyboard shortcuts. You’ve got the new keyboard shortcut menu bar where you can see all the shortcuts that are available. It’s great for discoverability. You can search them and we even, you know, and this is a subtle point, but we even made a very conscious effort to rationalize the shortcuts across Mac and iPadOS. So that if you’re using universal control, for example, you’re going to go from one environment to the other seamlessly. You want to ensure that consistency as you go across.”
The gestures, however, are staying as a nod to consistency for existing users who may be used to them.
To me, one of the more interesting and potentially powerful developments is the introduction of the Center Window and its accompanying API. A handful of Apple apps like Mail, Notes and Messages now allow items to pop out into an overlapping window.
“It was a very deliberate decision on our part,” says Mariners-Mes about adding this new element. “This really brings a new level of productivity where you can have, you know, this floating window. You can have content behind it. You can seamlessly cut and paste. And that’s something that’s just not possible with the traditional [iPadOS] model. And we also really strive to make it consistent with the rest of multitasking, where that center window can also become one of the windows in your split view, or full size, and then go back to being a center window. We think it’s a cool addition to the model and we really look forward to third parties embracing it.”
Early reception of the look Apple gave at iPadOS 15 still has an element of reservation about it, given that many of the most powerful creative apps are made by third parties that must adopt these technologies in order for them to be truly useful. But Apple, Borchers says, is working hard to make sure that pro apps adopt as many of these new paradigms and technologies as possible, so that come fall, the iPad will feel like a more hospitable host for the kinds of advanced work pros want to do there.
One of the nods to this multi-modal universe that the iPad exists in is Universal Control. This new feature uses Bluetooth beaconing, peer-to-peer WiFi and the iPad’s touchpad support to allow you to place your devices close to one another and — in a clever use of reading user intent — slide your mouse to the edge of a screen and onto your Mac or iPad seamlessly.
“I think what we have seen and observed from our users, both pro and otherwise, is that we have lots of people who have Macs and they have iPads and iPhones, and we believe in making these things work together in ways that are powerful,” says Borchers. “And it just felt like a natural place to be able to go and extend our Continuity model so that you could make use of this incredible platform that is iPadOS while working with your Mac, right next to it. And I think the big challenge was, how do you do that in kind of a magical, simple way. And that’s what Seb and his team have been able to accomplish.
“It really builds on the foundation we made with Continuity and Sidecar,” adds Mariners-Mes. “We really thought a lot about how do you make the experience — the set-up experience — as seamless as possible. How do you discover that you’ve got devices side by side?
“The other thing we thought about was what are the workflows that people want to have and what capabilities will be essential for that. That’s where things like the ability to seamlessly drag content across the platforms, or cut and paste, felt really, really important. Because I think that’s really what brings the magic to the experience.”
Borchers adds that it makes all the Continuity features that much more discoverable. Continuity’s shared clipboard, for instance, is an always-on but invisible presence. Expanding that to visual and mouse-driven models made natural sense.
“It’s just like, oh, of course, I can drag that all the way across here,” he says.
“Bob, you say, of course,” Mariners-Mes laughs. “And yet for those of us working in platforms for a long time, the ‘of course’ is technically very, very challenging. Totally non-obvious.”
Another area where iPadOS 15 is showing some promising expansionary behavior is in system-wide activities that let you break out of the box of in-app thinking. These include embedded recommendations that seed themselves into various apps; SharePlay, which makes an appearance wherever video calls are found; and Live Text, which turns all of your photos into indexed archives searchable with a keyboard.
Another is Quick Note, a system extension that lets you swipe from the bottom corner of your screen wherever you are in the system.
“There are, I think, a few interesting things that we did with Quick Note,” says Mariners-Mes. “One is this idea of linking. So that if I’m working in Safari or Yelp or another app, I can quickly insert a link to whatever content I’m viewing. I don’t know about you, but it’s something that I certainly do a lot when I do research.
“The old way was, like, cut and paste and maybe take a screenshot, create a note and jot down some notes. And now we’ve made that very, very seamless and fluid across the whole system. It even works the other way where, if I’m now in Safari and I have a note that refers to that page in Safari, you’ll see it revealed as a thumbnail at the bottom right-hand side of the screen. So we’ve really tried to bring the notes experience to be something that just permeates the system and is easily accessible from everywhere.”
Many of the system-wide capabilities that Apple is introducing in iPadOS 15 and iOS 15 have an API that developers can tap into. That is not always the case with Apple’s newest toys, which in years past have often been left to linger in the private section of its list of frameworks rather than be offered to developers as a way to enhance their apps. Borchers says that this is an intentional move that offers a ‘broader foundation of intelligence’ across the entire system.
This broader intelligence includes Siri moving a ton of commands to its local scope. This involved having to move a big chunk of Apple’s speech recognition to an on-device configuration in the new OS as well. The results, says Borchers, are a vastly improved day-to-day Siri experience, with many common commands executing immediately upon request — something that was a bit of a dice roll in days of Siri past. The removal of the reputational hit that Siri was taking from commands that went up to the cloud never to return could be the beginning of a turnaround for the public perception of Siri’s usefulness.
The on-device weaving of the intelligence provided by the Apple Neural Engine (ANE) also includes the indexing of text across photos in the entire system, past, present and in-the-moment.
“We could have done Live Text only in Camera and Photos, but we wanted it to apply anywhere we’ve got images, whether it be in Safari or Quick Look or wherever,” says Mariners-Mes. “One of my favorite demos of Live Text is actually when you’ve got that long, complicated field for a password for a Wi-Fi network. You can just bring it up within the keyboard, take a picture of it, get the text in it and copy and paste it into the field. It’s one of those things that’s just kind of magical.”
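The search side of Live Text is conceptually just an inverted index built over per-image recognition results. Here is a minimal, purely illustrative sketch of that idea, with a canned stand-in for the actual on-device recognizer (every name below is invented for this example, not an Apple API):

```python
# Toy sketch of indexing recognized text so photos become keyword-searchable.
# recognize_text() is a stand-in for on-device OCR (e.g. Apple's Vision
# framework); here it just looks up canned strings.
from collections import defaultdict

def recognize_text(image_id, fake_ocr_results):
    """Stand-in for an OCR call; returns the text 'found' in an image."""
    return fake_ocr_results.get(image_id, "")

def build_index(image_ids, fake_ocr_results):
    """Map each lowercased word to the set of images containing it."""
    index = defaultdict(set)
    for image_id in image_ids:
        for word in recognize_text(image_id, fake_ocr_results).lower().split():
            index[word].add(image_id)
    return index

def search(index, query):
    """Return images whose recognized text contains every query word."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results
```

In the shipping feature the recognizer is Apple’s on-device model and search surfaces through Spotlight and Photos; the toy above only shows the index-then-query shape of the idea.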
On the developer tools front of iPadOS 15, I ask specifically about Swift Playgrounds, which adds the ability to write, compile and ship apps to the App Store for the first time completely on iPad. It’s not the native Xcode some developers were hoping for but, Borchers says, Playgrounds has moved beyond just ‘teaching people how to code’ and into a real part of many developer pipelines.
“I think one of the big insights here was that we also saw a number of pro developers using it as a prototyping platform, and a way to be able to be on the bus, or in the park, or wherever, if you wanted to get in and give something a try. This was a super accessible and easy way to get there, and could be a nice adjunct to hey, I want to learn to code.”
“If you’re a developer,” adds Mariners-Mes, “it’s actually more productive to be able to run that app on the device that you’re working on because you really get great fidelity. And with the open project format, you can go back and forth between Xcode and Playgrounds. So, as Bob said, we can really envision people using this for a lot of rapid prototyping on the go without having to bring along the rest of their development environment, so we think it’s a really, really powerful addition to our development tools this year.”
Way back in 2018 I profiled a new team at Apple that was building out a testing apparatus to help make sure it was addressing real-world use cases and workflows involving machines like the (at the time unrevealed) new Mac Pro, iMacs, MacBooks and iPads. One of the demos that stood out at the time was a deep integration with music apps like Logic that would allow the iPad’s input models to complement the core app: tapping out a rhythm on a pad, or brightening and adjusting sound more intuitively with the touch interface. More of Apple’s work these days seems to be aimed at allowing users to move seamlessly back and forth between its various computing platforms, taking advantage of the strengths of each (raw power, portability, touch, etc.) to complement a workflow. A lot of iPadOS 15 appears to be geared this way.
Whether it will be enough to turn the corner on the perception of iPad as a work device that is being held back by software, I’ll reserve judgement until it ships later this year. But, in the near term, I am cautiously optimistic that this set of enhancements that break out of the ‘app box’, the clearer affordances for multitasking both in and out of single apps and the dedication to API support are pointing towards an expansionist mentality on the iPad software team. A good sign in general.
Apple went big on privacy during its Worldwide Developer Conference (WWDC) keynote this week, showcasing features from on-device Siri audio processing to a new privacy dashboard for iOS that makes it easier than ever to see which apps are collecting your data and when.
While the company was typically vocal about security during the Memoji-filled, two-hour-long(!) keynote, it also quietly introduced several new security- and privacy-focused features during its WWDC developer sessions. We’ve rounded up some of the most interesting — and important.
Passwordless login with iCloud Keychain
Apple is the latest tech company taking steps to ditch the password. During its “Move beyond passwords” developer session, it previewed Passkeys in iCloud Keychain, a method of passwordless authentication powered by WebAuthn, and Face ID and Touch ID.
The feature, which will ultimately be available in both iOS 15 and macOS Monterey, means you no longer have to set a password when creating an account on a website or in an app. Instead, you’ll simply pick a username, and then use Face ID or Touch ID to confirm it’s you. The passkey is then stored in your keychain and synced across your Apple devices using iCloud — so you don’t have to remember it, nor do you have to carry around a hardware authenticator key.
“Because it’s just a single tap to sign in, it’s simultaneously easier, faster and more secure than almost all common forms of authentication today,” said Garrett Davidson, an Apple authentication experience engineer.
While it’s unlikely to be available on your iPhone or Mac any time soon — Apple says the feature is still in its ‘early stages’ and it’s currently disabled by default — the move is another sign of the growing momentum behind eliminating passwords, which are prone to being forgotten, reused across multiple services and, ultimately, phished. Microsoft previously announced plans to make Windows 10 password-free, and Google recently confirmed that it’s working towards “creating a future where one day you won’t need a password at all”.
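Under the hood, a passkey login is a challenge-response exchange: the site sends a random challenge and the device answers with a signature from a credential scoped to that one site, which is what defeats phishing and replay. The sketch below shows only that flow. It uses an HMAC as a toy stand-in for the asymmetric signature a real WebAuthn passkey produces, and every function and field name is invented for illustration:

```python
# Toy challenge-response flow in the spirit of passkeys/WebAuthn.
# NOT real crypto architecture: real passkeys use asymmetric keys, so the
# server never holds a signing secret. Here an HMAC stands in for the
# signature purely to show the origin-scoping and challenge steps.
import hashlib
import hmac
import os

def make_credential(origin):
    """'Enroll': create a per-site secret bound to the site's origin."""
    return {"origin": origin, "key": os.urandom(32)}

def sign_challenge(credential, origin, challenge):
    """The client only answers for the origin the credential was made for."""
    if origin != credential["origin"]:
        return None  # a phishing site on another origin gets nothing
    msg = origin.encode() + challenge
    return hmac.new(credential["key"], msg, hashlib.sha256).digest()

def verify(credential, origin, challenge, response):
    """Check the response against the expected answer for this challenge."""
    expected = hmac.new(credential["key"], origin.encode() + challenge,
                        hashlib.sha256).digest()
    return response is not None and hmac.compare_digest(expected, response)
```

Because the credential is bound to an origin and each login uses a fresh random challenge, a stolen response is useless elsewhere — the property the quoted "easier, faster and more secure" claim rests on.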
Microphone indicator in macOS
Since the introduction of iOS 14, iPhone users have been able to keep an eye on which apps are accessing their microphone via a green or orange dot in the status bar. Now it’s coming to the desktop too.
In macOS Monterey, users will be able to see which apps are accessing their Mac’s microphone in Control Center, MacRumors reports, which will complement the existing hardware-based green light that appears next to a Mac’s webcam when the camera is in use.
iOS 15, which will include a bunch of privacy-bolstering tools from Mail Privacy Protection to App Privacy Reports, is also getting a feature called Secure Paste that will help to shield your clipboard data from other apps.
This feature will enable users to paste content from one app to another without the second app being able to read the clipboard until you actually paste. This is a significant improvement over iOS 14, which would notify you when an app took data from the clipboard but did nothing to prevent it from happening.
“With secure paste, developers can let users paste from a different app without having access to what was copied until the user takes action to paste it into their app,” Apple explains. “When developers use secure paste, users will be able to paste without being alerted via the [clipboard] transparency notification, helping give them peace of mind.”
While this feature sounds somewhat insignificant, it’s being introduced following a major privacy issue that came to light last year. In March 2020, security researchers revealed that dozens of popular iOS apps — including TikTok — were “snooping” on users’ clipboard without their consent, potentially accessing highly sensitive data.
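The pattern behind Secure Paste can be thought of as deferred access: an app may ask questions about what the clipboard holds, but the contents themselves are only released in response to an explicit user paste. A hypothetical sketch of that pattern (the class and method names are invented for illustration, not Apple’s actual API):

```python
# Hypothetical model of 'deferred clipboard access': metadata queries are
# free, but reading the contents requires a user-initiated paste.
class SecureClipboard:
    def __init__(self, contents):
        self._contents = contents

    def has_url(self):
        # Metadata query: reveals a yes/no answer, not the data itself.
        return self._contents.startswith(("http://", "https://"))

    def read(self, user_initiated_paste):
        # Contents are only released on an explicit user paste action.
        if not user_initiated_paste:
            raise PermissionError("clipboard access requires a user paste")
        return self._contents
```

The real implementation lives inside UIKit, of course; the point of the sketch is simply that "can I offer a paste-a-link button?" and "give me the clipboard" become two different privileges.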
Advanced Fraud Protection for Apple Card
Payments fraud is more prevalent than ever as a result of the pandemic, and Apple is looking to do something about it. As first reported by 9to5Mac, the company has previewed Advanced Fraud Protection, a feature that will let Apple Card users generate new card numbers in the Wallet app.
While details remain thin — the feature isn’t live in the first iOS 15 developer beta — Apple’s explanation suggests that Advanced Fraud Protection will make it possible to generate new security codes — the three-digit number you enter at checkout – when making online purchases.
“With Advanced Fraud Protection, Apple Card users can have a security code that changes regularly to make online Card Number transactions even more secure,” the brief explainer reads. We’ve asked Apple for some more information.
‘Unlock with Apple Watch’ for Siri requests
As a result of the widespread mask-wearing necessitated by the pandemic, Apple introduced an ‘Unlock with Apple Watch’ option in iOS 14.5 that enabled users to unlock their iPhone and authenticate Apple Pay payments using an Apple Watch instead of Face ID.
The scope of this feature is expanding with iOS 15, as the company has confirmed that users will soon be able to use this alternative authentication method for Siri requests, such as adjusting phone settings or reading messages. Currently, users have to enter a PIN, password or use Face ID to do so.
“Use the secure connection to your Apple Watch for Siri requests or to unlock your iPhone when an obstruction, like a mask, prevents Face ID from recognizing your face,” Apple explains. “Your watch must be passcode protected, unlocked, and on your wrist close by.”
Standalone security patches
To ensure iPhone users who don’t want to upgrade to iOS 15 straight away are up to date with security updates, Apple is going to start decoupling patches from feature updates. When iOS 15 lands later this year, users will be given the option to update to the latest version of iOS or to stick with iOS 14 and simply install the latest security fixes.
“iOS now offers a choice between two software update versions in the Settings app,” Apple explains (via MacRumors). “You can update to the latest version of iOS 15 as soon as it’s released for the latest features and most complete set of security updates. Or continue on iOS 14 and still get important security updates until you’re ready to upgrade to the next major version.”
This feature sees Apple following in the footsteps of Google, which has long rolled out monthly security patches to Android users.
‘Erase all contents and settings’ for Mac
Wiping a Mac has long been a laborious task, requiring you to erase your device completely and then reinstall macOS. Thankfully, that’s going to change. Apple is bringing the “erase all contents and settings” option that’s been on iPhones and iPads for years to macOS Monterey.
The option will let you factory reset your MacBook with just a click. “System Preferences now offers an option to erase all user data and user-installed apps from the system, while maintaining the operating system currently installed,” Apple says. “Because storage is always encrypted on Mac systems with Apple Silicon or the T2 chip, the system is instantly and securely ‘erased’ by destroying the encryption keys.”
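The "instantly and securely 'erased'" part is standard cryptographic erasure: if user data is only ever stored encrypted, destroying the key renders the ciphertext permanently unreadable, with no need to overwrite the whole disk. A toy demonstration of the principle (the XOR stream cipher here is for illustration only and bears no resemblance to the real AES-based FileVault machinery):

```python
# Toy illustration of crypto-erase: forgetting the key 'erases' the data.
# The SHA-256-based XOR stream cipher is a demonstration device, not a
# real disk-encryption scheme.
import hashlib
import os

def keystream(key, length):
    """Derive a pseudorandom byte stream of the given length from the key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    """XOR the plaintext with the keystream; same function decrypts."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

key = os.urandom(32)
stored = encrypt(key, b"user files and settings")
# 'Erase all contents and settings': destroy the key. The ciphertext on
# disk is now unrecoverable, instantly.
key = None
```

This is why the feature can be instant on Apple Silicon and T2 Macs: the storage is always encrypted anyway, so the reset only has to destroy the keys.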
In Friday acquisition news, Shopify shared today that they’ve acquired augmented reality startup Primer, which makes an app that lets users visualize what tile, wallpaper or paint will look like on surfaces inside their home.
In a blog post, co-founders Adam Debreczeni and Russ Maschmeyer write that Primer’s app and services will be shutting down next month as part of the deal. Debreczeni tells TechCrunch that Primer’s team of eight employees will all be joining Shopify following the acquisition.
Primer had partnered with dozens of tile and textile design brands to allow users to directly visualize what their designs would look like using their iPhone and iPad and Apple’s augmented reality platform ARKit. The app has been highlighted by Apple several times including this nice write-up by the App Store’s internal editorial team.
Terms of the deal weren’t disclosed. Primer’s backers included Slow Ventures, Abstract Ventures, Foundation Capital and Expa.
There’s been a lot of big talk about how augmented reality will impact online shopping, but aside from some of the integrations made in home design, there hasn’t been an awful lot that’s found its way into real consumer use. Shopify has worked on some of their own integrations — allowing sellers to embed 3D models into their storefronts that users can drop into physical space — but it’s clear that there’s much more room left to experiment.
Apple incorporated the announcement of this year’s Apple Design Award winners into its virtual Worldwide Developer Conference (WWDC) online event, instead of waiting until the event had wrapped, like last year. Ahead of WWDC, Apple previewed the finalists, whose apps and games showcased a combination of technical achievement, design and ingenuity. This evening, Apple announced the winners across six new award categories.
In each category, Apple selected one app and one game as the winner.
In the Inclusivity category, winners supported people from a diversity of backgrounds, abilities and languages.
This year, winners included U.S.-based Aconite’s highly accessible game, HoloVista, where users can adjust various options for motion control, text sizes, text contrast, sound, and visual effect intensity. In the game, users explore using the iPhone’s camera to find hidden objects, solve puzzles and more. (Our coverage)
Another winner, Voice Dream Reader, is a text-to-speech app that supports more than two dozen languages and offers adaptive features and highly customizable settings.
In the Delight and Fun category, winners offer memorable and engaging experiences enhanced by Apple technologies. Belgium’s Pok Pok Playroom, a kids’ entertainment app that spun out of Snowman (of the Alto’s Adventure series), won for its thoughtful design and use of subtle haptics, sound effects and interactions. (Our coverage)
Another winner was the U.K.’s Little Orpheus, a platformer that combines storytelling, surprises and fun, offering a console-like experience in a casual game.
The Interaction category winners showcase apps that offer intuitive interfaces and effortless controls, Apple says.
The U.S.-based snarky weather app CARROT Weather won for its humorous forecasts, unique visuals, and entertaining experience, which is also available as Apple Watch faces and widgets.
Canada’s Bird Alone game combines gestures, haptics, parallax, and dynamic sound effects in clever ways to bring its world to life.
The Social Impact category doled out awards to Denmark’s Be My Eyes, which enables people who are blind or have low vision to identify objects via their camera by pairing them with volunteers from around the world. Today, it supports over 300K users who are assisted by over 4.5M volunteers. (Our coverage)
U.K.’s ustwo games won in this category for Alba, a game that teaches about respecting the environment as players save wildlife, repair a bridge, clean up trash and more. The game also plants a tree for every download.
The Visuals and Graphics winners feature “stunning imagery, skillfully drawn interfaces, and high-quality animations,” Apple says.
Belarus-based Loóna offers sleepscape sessions which combine relaxing activities and atmospheric sounds with storytelling to help people wind down at night. The app was recently awarded Google’s “best app” of 2020.
China’s Genshin Impact won for pushing the visual frontier on gaming, as motion blur, shadow quality, and frame rate can be reconfigured on the fly. The game had previously made Apple’s Best of 2020 list and was Google’s best game of 2020.
Innovation winners included India’s NaadSadhana, an all-in-one, studio-quality music app that helps artists perform and publish. The app uses A.I. and Core ML to listen and provide feedback on the accuracy of notes, and generates a backing track to match.
Riot Games’ League of Legends: Wild Rift (U.S.) won for taking a complex PC classic and delivering a full mobile experience that includes touchscreen controls, an auto-targeting system for newcomers, and a mobile-exclusive camera setting.
The winners this year will receive a prize package that includes hardware and the award itself.
A video featuring the winners is here on the Apple Developer website.
“This year’s Apple Design Award winners have redefined what we’ve come to expect from a great app experience, and we congratulate them on a well-deserved win,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations, in a statement. “The work of these developers embodies the essential role apps and games play in our everyday lives, and serve as perfect examples of our six new award categories.”
If you’ve ever bought a subscription inside an iOS app and later decided you wanted to cancel, upgrade or downgrade, or ask for a refund, you may have had trouble figuring out how to go about making that request or change. Some people today still believe that they can stop their subscription charges simply by deleting an app from their iPhone. Others may dig around unsuccessfully inside their iPhone’s Settings or on the App Store to try to find out how to ask for a refund. With the updates Apple announced in StoreKit 2 during its Worldwide Developers Conference this week, things may start to get a little easier for app customers.
StoreKit is Apple’s developer framework for managing in-app purchases — an area that’s become more complex in recent years, as apps have transitioned from offering one-time purchases to ongoing subscriptions with different tiers, lengths, and feature sets.
Currently, users who want to manage or cancel subscriptions can do so from the App Store or their iPhone Settings. But some don’t realize that the path to this section from Settings starts by tapping on their Apple ID (their name and profile photo at the top of the screen). They may also get frustrated if they’re not familiar with how to navigate their Settings or the App Store.
Meanwhile, there are a variety of ways users can request refunds on their in-app subscriptions. They can dig in their inbox for their receipt from Apple, then click the “Report a Problem” link it includes to request a refund when something has gone wrong. This could be useful in scenarios where you’ve bought a subscription by mistake (or your kid has!), or where the promised features didn’t work as intended.
Apple also provides a dedicated website where users can directly request refunds for apps or content. (When you Google for something like “request a refund apple” or similar queries, a page that explains the process typically comes up at the top of the search results.)
Still, many users aren’t technically savvy. For them, the easiest way to manage subscriptions or ask for refunds would be to do so from within the app itself. For this reason, many conscientious app developers tend to include links to point customers to Apple’s pages for subscription management or refunds inside their apps.
But StoreKit 2 is introducing new tools that will allow developers to implement these sorts of features more easily.
One new tool is a Manage subscriptions API, which lets an app developer display the manage subscriptions page for their customer directly inside their app — without redirecting the customer to the App Store. Optionally, developers can choose to display a “Save Offer” screen that presents the customer with a discount of some kind to keep them from cancelling, or an exit survey asking the customer why they decided to end their subscription.
When implemented, the customer will be able to view a screen inside the app that looks just like the one they’d visit in the App Store to cancel or change a subscription. After cancelling, they’ll be shown a confirmation screen with the cancellation details and the service expiration date.
If the customer wants to request a refund, a new Refund request API will allow the customer to begin their refund request directly in the app itself — again, without being redirected to the App Store or another website. On the screen that displays, the customer can select which item they want refunded and check the reason why they’re making the request. Apple handles the refund process and will send either an approval or a refund-declined notification back to the developer’s server.
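Under stated assumptions (iOS 15 and StoreKit 2’s Swift concurrency APIs), the two in-app flows described above might look roughly like this sketch, where `scene` is the app’s active window scene:

```swift
import StoreKit
import UIKit

// Present the system manage-subscriptions sheet without leaving the app.
@MainActor
func manageSubscriptions(in scene: UIWindowScene) async {
    try? await AppStore.showManageSubscriptions(in: scene)
}

// Begin an in-app refund request; Apple, not the developer, decides the outcome.
@MainActor
func requestRefund(for productID: String, in scene: UIWindowScene) async {
    // Look up the customer's latest verified transaction for this product.
    guard case .verified(let transaction) = await Transaction.latest(for: productID) else {
        return // No verified purchase to refund.
    }
    do {
        // Shows the system refund sheet; returns .success or .userCancelled.
        let status = try await transaction.beginRefundRequest(in: scene)
        print("Refund request status: \(status)")
    } catch {
        print("Refund request failed: \(error)")
    }
}
```

Note that “success” here means the request was submitted, not that the refund was granted — the decision still arrives later via Apple’s server notifications.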
However, some developers argue that the changes don’t go far enough. They want to be in charge of managing customer subscriptions and handling refunds themselves, through programmatic means. Plus, it can take up to 48 hours for the customer to receive an update from Apple on their refund request, which can be confusing.
“They’ve made the process a bit smoother, but developers still can’t initiate refunds or cancellations themselves,” notes RevenueCat CEO Jacob Eiting, whose company provides tools to app developers to manage their in-app purchases. “It’s a step in the right direction, but could actually lead to more confusion between developers and consumers about who is responsible for issuing refunds.”
In other words, because the forms are now going to be more accessible from inside the app, the customer may believe the developer is handling the refund process when, really, Apple continues to do so.
Some developers pointed out that there are other scenarios this process doesn’t address. For example, if the customer has already uninstalled the app or no longer has the device in question, they’ll still need to be directed to other means of asking for refunds, just as before.
For consumers, though, subscription management tools like this mean more developers may begin to put buttons to manage subscriptions and ask for refunds directly inside their app, which is a better experience. In time, as customers learn they can more easily use the app and manage subscriptions, app developers may see better customer retention, higher engagement, and better App Store reviews, notes Apple.
The StoreKit 2 changes weren’t limited to APIs for managing subscriptions and refunds.
Developers will also gain access to a new Invoice Lookup API that allows them to look up the in-app purchases for the customer, validate their invoice and identify any problems with the purchase — for example, if there were any refunds already provided by the App Store.
A new Refunded Purchases API will allow developers to look up all the refunds for the customer.
And a new Renewal Extension API will allow developers to extend the renewal date for paid, active subscriptions in the case of an outage — for example, when dealing with customer support issues after a streaming service goes down. This API lets developers extend the subscription up to twice per calendar year, each time by up to 90 days.
Another change will help customers when they reinstall apps or download them on new devices. Before, users would have to manually “restore purchases” to sync the status of the completed transactions back to that newly downloaded or reinstalled app. Now, that information will be automatically fetched by StoreKit 2 so the apps are immediately up-to-date with whatever it is the user paid for.
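Assuming StoreKit 2’s transaction APIs, an app relying on this automatic sync could simply walk the customer’s current entitlements at launch rather than offering a “restore purchases” button — a minimal sketch:

```swift
import StoreKit

// With StoreKit 2, entitlements sync to new or reinstalled apps automatically;
// iterating Transaction.currentEntitlements yields every active purchase.
func unlockPurchasedContent() async {
    for await result in Transaction.currentEntitlements {
        // Only act on transactions whose signature StoreKit has verified.
        if case .verified(let transaction) = result {
            print("Active entitlement: \(transaction.productID)")
            // Unlock the content associated with transaction.productID here.
        }
    }
}
```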
While, overall, the changes make for a significant update to the StoreKit framework, Apple’s hesitancy to allow developers more control over their own subscription-based customers speaks, in part, to how much it wants to control in-app purchases. This is perhaps because it got burned in the past when it tried allowing developers to manage their own refunds.
As The Verge noted last month while the Epic Games-Apple antitrust trial was underway, Apple had once provided Hulu with access to a subscription API, then discovered Hulu had been offering a way to automatically cancel subscriptions made through the App Store when customers wanted to upgrade to higher-priced subscription plans. Apple realized it needed to take action to protect against this misuse of the API, and Hulu later lost access. It has not since made that API more broadly available.
On the flip side, having Apple, not the developers, in charge of subscription management and refunds means Apple takes on the responsibilities around preventing fraud — including fraud perpetrated by both customers and developers alike. Customers may also prefer that there’s one single place to go for managing their subscription billing: Apple. They may not want to have to deal with each developer individually, as their experience would end up being inconsistent.
These changes matter because subscription revenue contributes a sizable amount to Apple’s lucrative App Store business. Ahead of WWDC 21, Apple reported the sale of digital goods and services on the App Store grew to $86 billion in 2020, up 40% over the year prior. Earlier this year, Apple said it has paid out more than $200 billion to developers since the App Store launched in 2008.
Margrethe Vestager is the European regulator trying to do something audacious: get companies like Apple, Amazon and Facebook to play fair and pay taxes.
Apple in 2018 closed its $400 million acquisition of music recognition app Shazam. Now, it’s bringing Shazam’s audio recognition capabilities to app developers in the form of the new ShazamKit. The new framework will allow app developers — including those on both Apple platforms and Android — to build apps that can identify music from Shazam’s huge database of songs, or even from their own custom catalog of pre-recorded audio.
Many consumers are already familiar with the mobile app Shazam, which lets you push a button to identify what song you’re hearing, and then take other actions — like viewing the lyrics, adding the song to a playlist, exploring music trends, and more. Having first launched in 2008, Shazam was already one of the oldest apps on the App Store when Apple snatched it up.
Now the company is putting Shazam to better use than being just a music identification utility. With the new ShazamKit, developers will now be able to leverage Shazam’s audio recognition capabilities to create their own app experiences.
There are three parts to the new framework: Shazam catalog recognition, which lets developers add song recognition to their apps; custom catalog recognition, which performs on-device matching against arbitrary audio; and library management.
Shazam catalog recognition is what you probably think of when you think of the Shazam experience today. The technology can recognize the song that’s playing in the environment and then fetch the song’s metadata, like the title and artist. The ShazamKit API will also be able to return other metadata like genre or album art, for example. And it can identify where in the audio the match occurred.
When matching music, Shazam doesn’t actually match the audio itself, to be clear. Instead, it creates a lossy representation of it, called a signature, and matches against that. This method greatly reduces the amount of data that needs to be sent over the network. Signatures also cannot be used to reconstruct the original audio, which protects user privacy.
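Assuming ShazamKit’s session-and-delegate API, catalog matching against streaming microphone audio might look roughly like this; the audio engine supplying buffers is left out of the sketch:

```swift
import AVFAudio
import ShazamKit

// A minimal sketch of Shazam catalog recognition. ShazamKit converts each
// buffer into a lossy signature internally and matches it over the network.
final class SongMatcher: NSObject, SHSessionDelegate {
    private let session = SHSession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Feed microphone buffers (e.g. from an AVAudioEngine tap) into the session.
    func match(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        session.matchStreamingBuffer(buffer, at: time)
    }

    // Called when a signature matches a catalog entry.
    func session(_ session: SHSession, didFind match: SHMatch) {
        if let item = match.mediaItems.first {
            print("Matched \(item.title ?? "unknown") by \(item.artist ?? "unknown")")
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        // No match for this signature; keep listening.
    }
}
```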
The Shazam catalog comprises millions of songs and is hosted in the cloud and maintained by Apple. It’s regularly updated with new tracks as they become available.
When a customer uses a developer’s third-party app for music recognition via ShazamKit, they may want to save the song in their Shazam library. This is found in the Shazam app, if the user has it installed, or it can be accessed by long pressing on the music recognition Control Center module. The library is also synced across devices.
Apple suggests that apps make their users aware that recognized songs will be saved to this library, as there’s no special permission required to write to the library.
ShazamKit’s custom catalog recognition feature, meanwhile, could be used to create synced activities or other second-screen experiences in apps by recognizing the developer’s audio, not that from the Shazam music catalog.
This could allow for educational apps where students follow along with a video lesson, where some portion of the lesson’s audio could prompt an activity to begin in the student’s companion app. It could also be used to enable mobile shopping experiences that pop up as you watch a favorite TV show.
ShazamKit is currently in beta on iOS 15.0+, macOS 12.0+, Mac Catalyst 15.0+, tvOS 15.0+, and watchOS 8.0+. On Android, ShazamKit comes in the form of an Android Archive (AAR) file and supports music and custom audio, as well.
At its Worldwide Developer Conference, Apple announced a significant update to RealityKit, its suite of technologies that allow developers to get started building AR (augmented reality) experiences. With the launch of RealityKit 2, Apple says developers will have more visual, audio, and animation control when working on their AR experiences. But the most notable part of the update is how Apple’s new Object Capture technology will allow developers to create 3D models in minutes using only an iPhone.
Apple noted during its developer address that one of the most difficult parts of making great AR apps was the process of creating 3D models. These could take hours and thousands of dollars.
With Apple’s new tools, developers will be able to take a series of pictures using just an iPhone (or iPad or DSLR, if they prefer) to capture 2D images of an object from all angles, including the bottom.
Then, using the Object Capture API on macOS Monterey, it only takes a few lines of code to generate the 3D model, Apple explained.
To begin, developers would start a new photogrammetry session in RealityKit that points to the folder where they’ve captured the images. Then, they would call the process function to generate the 3D model at the desired level of detail. Object Capture allows developers to generate the USDZ files optimized for AR Quick Look — the system that lets developers add virtual, 3D objects in apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
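The steps above can be sketched with RealityKit’s photogrammetry API on macOS Monterey; the folder and output URLs here are placeholders, and detail levels range from `.preview` up to `.full` and `.raw`:

```swift
import Foundation
import RealityKit

// Minimal sketch of Object Capture: point a session at a folder of photos,
// then request a USDZ model file at the desired level of detail.
func createModel(from imagesFolder: URL, writingTo outputURL: URL) throws {
    let session = try PhotogrammetrySession(input: imagesFolder)

    // Observe progress and completion messages from the session.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .processingComplete:
                print("USDZ model written to \(outputURL.path)")
            default:
                break
            }
        }
    }

    // Kick off reconstruction at a reduced level of detail, suitable for AR Quick Look.
    try session.process(requests: [.modelFile(url: outputURL, detail: .reduced)])
}
```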
Apple said developers like Wayfair, Etsy and others are using Object Capture to create 3D models of real-world objects — an indication that online shopping is about to get a big AR upgrade.
Wayfair, for example, is using Object Capture to develop tools for their manufacturers so they can create a virtual representation of their merchandise. This will allow Wayfair customers to be able to preview more products in AR than they could today.
In addition, Apple noted developers including Maxon and Unity are using Object Capture for creating 3D content within 3D content creation apps, such as Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine tune the look and feel of AR objects; dynamic loading for assets; the ability to build your own Entity Component System to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, scale and explore AR worlds in RealityKit-based games.
One developer, Mikko Haapoja of Shopify, has been trying out the new technology (see below) and shared some real-world tests on Twitter, where he shot objects using an iPhone 12 Pro Max.
Developers who want to test it for themselves can leverage Apple’s sample app and install Monterey on their Mac to try it out.
Last month, Apple announced it would soon add lossless audio streaming and Spatial Audio with support for Dolby Atmos to its Apple Music subscription at no extra charge. That upgrade has now gone live, Apple announced this morning — though many noticed the additions actually rolled out yesterday, following the WWDC keynote.
The entire Apple Music catalog of 75+ million songs will support lossless audio.
The lossless tier begins at CD quality — 16 bit at 44.1 kHz, and goes up to 24 bit at 48 kHz, Apple previously said. Audiophiles can also opt for the high-resolution lossless that goes up to 24 bit at 192 kHz. Apple has said you’ll need to use an external, USB digital-to-analog converter to take advantage of the latter — simply plugging in a pair of headphones to an iPhone won’t work.
Apple Music subscribers will be able to enable the new lossless option under Settings > Music > Audio quality. Here, you’ll be able to choose the different resolutions you want to use for different connections, including Wi-Fi, cellular, and download.
When you make your selection in Settings, iOS warns that lossless files will use “significantly more space” on your device, as 10 GB of storage would allow you to store approximately 3,000 songs at high quality, 1,000 songs with lossless, or 200 songs with high-res lossless.
Meanwhile, Spatial Audio will be enabled by default on hardware that supports Dolby Atmos, like Apple’s AirPods and Beats headphones with an H1 or W1 chip. The latest iPhone, iPad, and Mac models also support Dolby Atmos. Spatial Audio on Apple Music will also be “coming soon” to Android devices, Apple said.
To kick off launch, Apple Music is today rolling out new playlists designed to showcase Spatial Audio. These include:
- Made for Spatial Audio
- Hits in Spatial Audio
- Rock in Spatial Audio
- Pop in Spatial Audio
- Hip-Hop in Spatial Audio
- Country in Spatial Audio
- Jazz in Spatial Audio
- Classical in Spatial Audio
Apple is also adding a special guide to Spatial Audio on Apple Music, which will help music listeners hear the difference. This will include tracks from artists like Marvin Gaye and The Weeknd, among others. And Apple will air a roundtable conversation about Spatial Audio featuring top sound engineers and experts, hosted by Zane Lowe at 9 am PT today on Apple Music.
Because songs have to be remastered for Dolby Atmos specifically, these guides and playlists will help music fans experience the new format without having to hunt around. Apple says it’s working with artists and labels to add more new releases and the best catalog tracks in Spatial Audio. To help on this front, Apple notes there are various initiatives underway — including doubling the number of Dolby-enabled studios in major markets, offering educational programs, and providing resources to independent artists.
Apple also said it will build music-authoring tools directly into Logic Pro. Later this year, the company plans to release an update to Logic Pro that will allow any musician to create and mix their songs in Spatial Audio for Apple Music.
Just after the release of iOS 12 in 2018, Apple introduced its own built-in screen time tracking tools and controls. It then began cracking down on third-party apps that had implemented their own screen time systems, saying they had done so via technologies that risked user privacy. What wasn’t available at the time? A Screen Time API that would have allowed developers to tap into Apple’s own Screen Time system and build their own experiences that augmented its capabilities. That’s now changed.
At its Worldwide Developer Conference on Monday, Apple introduced a new Screen Time API that offers developers access to frameworks for building parental control experiences that also maintain user privacy.
The company added three new Swift frameworks to the iOS SDK that will allow developers to create apps that help parents manage what a child can do across their devices and ensure those restrictions stay in place.
The apps that use this API will be able to set restrictions like locking accounts in place, preventing password changes, filtering web traffic, and limiting access to applications. These sorts of changes are already available through Apple’s Screen Time system, but developers can now build their own experiences where these features are offered under their own branding and where they can then expand on the functionality provided by Apple’s system.
Apps that take advantage of the API can also be locked in place so they can only be removed from the device with a parent’s approval.
The apps can authenticate the parents and ensure the device they’re managing belongs to a child in the family. Plus, Apple said the way the system will work lets parents choose the apps and websites they want to limit, without compromising user privacy. (The system returns only opaque tokens instead of identifiers for the apps and website URLs, Apple told developers, so the third-parties aren’t gaining access to private user data like app usage and web browsing details. This would prevent a shady company from building a Screen Time app only to collect troves of user data about app usage, for instance.)
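A hedged sketch of how the opaque-token model described above might be used, assuming the new FamilyControls and ManagedSettings frameworks: the parent picks apps with the system’s family activity picker, and the developer’s app receives only tokens, never bundle identifiers.

```swift
import FamilyControls
import ManagedSettings

// Request Screen Time authorization, then shield the apps the parent selected.
// `selection` comes from the system FamilyActivityPicker; its tokens are opaque,
// so the third-party app never learns which apps the child actually uses.
func shieldApps(from selection: FamilyActivitySelection) {
    AuthorizationCenter.shared.requestAuthorization { result in
        switch result {
        case .success:
            // Apply restrictions; they persist until explicitly cleared.
            let store = ManagedSettingsStore()
            store.shield.applications = selection.applicationTokens
        case .failure(let error):
            print("Screen Time authorization failed: \(error)")
        }
    }
}
```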
The third-party apps can also create unique time windows for different apps or types of activities, and warn the child when time is nearly up. When it registers that time’s up, the app can lock down access to websites and apps and perhaps remind the child it’s time to do their homework — or whatever other experience the developer has in mind.
And on the flip side, the apps could create incentives for the child to gain screen time access after they complete some other task, like doing homework, reading or chores, or anything else.
Developers could use these features to design new experiences that Apple’s own Screen Time system doesn’t allow for today, by layering their own ideas on top of Apple’s basic set of controls. Parents would likely fork over their cash to make using Screen Time controls easier and more customized to their needs.
Other apps could tie into Screen Time too, outside of the “family” context — like those aimed at mental health and wellbeing, for example.
Of course, developers have been asking for a Screen Time API since the launch of Screen Time itself, but Apple didn’t seem to prioritize its development until the matter of Apple’s removal of rival screen time apps was brought up in an antitrust hearing last year. At the time, Apple CEO Tim Cook defended the company’s decision by explaining that apps had been using MDM (mobile device management) technology, which was designed for managing employee devices in the enterprise, not home use. This, he said, was a privacy risk.
Apple has a session during WWDC that will detail how the new API works, so we expect we’ll learn more soon as the developer info becomes more public.
Is the world finally ready to take on tax havens?
Apple today is releasing a new version of its App Store Review Guidelines, the lengthy document that dictates the rules apps must abide by in order to be published on its App Store. Among the more notable changes rolling out today are several sections that will see Apple taking a harder stance on App Store fraud, scams and developer misconduct, including a new process that aims to empower other developers to hold bad actors accountable.
One of the key updates on this front involves a change to Apple’s Developer Code of Conduct (Section 5.6 and 5.6.1-5.6.4 of the Review Guidelines).
This section has been significantly expanded to include guidance stating that repeated manipulative or misleading behavior or other fraudulent conduct will lead to the developer’s removal from the Apple Developer Program. This is something Apple has done for repeated violations, it claims, but wanted to now ensure was clearly spelled out in the guidelines.
In an entirely new third paragraph in this section, Apple says that if a developer engages in activities or actions that are not in accordance with the developer code of conduct, they will have their Apple Developer account terminated.
It also details what, specifically, must be done to restore the account, which includes providing Apple with a written statement detailing the improvements they’ve made, which will have to be approved by Apple. If Apple is able to confirm the changes have been made, it may then restore the developer’s account.
Apple explained in a press briefing that this change was meant to prevent a sort of catch and release scenario where a developer gets caught by Apple, but then later reverts their changes to continue their bad behavior.
As part of this update, Apple added a new section about developer identity (5.6.2). This is meant to ensure the contact information for developers provided to Apple and customers is accurate and functional, and that the developer isn’t impersonating other, legitimate developers on the App Store. This was a particular issue in a high-profile incident of App Store fraud involving a crypto wallet app that scammed a user out of his life savings (~$600,000) in Bitcoin. The scam victim had been deceived because the app was using the same name and icon as a different company that made a hardware crypto device, and because the scam app was rated 5 stars (illegitimately, that is).
Related to this, Apple clarified the language around App Store discovery fraud (5.6.3) to more specifically call out any type of manipulation of App Store charts, search, reviews and referrals. The former is meant to crack down on the clearly booming industry of fake App Store ratings and reviews, which can send scam apps higher in the charts and search results.
Meanwhile, the referral crackdown would address consumers being shown incorrect pricing outside the App Store in an effort to boost installs.
Another section (5.6.4) addresses issues that come up after an app is published, including negative customer reports and concerns and excessive refund rates, for example. If Apple notices this behavior, it will investigate the app for violations, it says.
Of course, the question here is: will Apple actually notice the potential scammers? In recent months, a growing number of developers believe Apple is allowing far too many scammers to fall through the cracks of App Review.
One particular thorn in Apple’s side has been Fleksy keyboard app founder Kosta Eleftheriou, who is not only suing Apple for the revenue he’s personally lost to scammers, but also formed a sort of one-man bunco squad to expose some of the more egregious scams to date. This has included the above-mentioned crypto scam; a kids game that actually contained a hidden online casino; and a VPN app scamming users out of $5 million per year, among many others.
The rampant fraud taking place on the App Store was also brought up during Apple’s antitrust hearing, when Georgia’s Senator Jon Ossoff asked Apple’s Chief Compliance Officer Kyle Andeer why Apple was not able to locate scams, given they’re “trivially easy” to identify.
Apple downplayed the concerns then, and continues to do so through press releases like this one which noted how the App Store stopped over $1.5 billion in fraudulent transactions in 2020.
But a new update to these Guidelines seems to be an admission that Apple may need a little help on this front. It says developers can now directly report possible violations they find in other developers’ apps. Through a new form that standardizes this sort of complaint, developers can point to guideline violations and any other trust and safety issues they discover. Often, developers notice the scammers whose apps are impacting their own business and revenue, so they’ll likely turn to this form now as a first step in getting the scammer dealt with.
Another change will allow developers to appeal a rejection if they think there was unfair treatment of any kind, including political bias. Previously, Apple had allowed developers to appeal App Store decisions and suggest changes to guidelines.
Apple told us it has 500 app reviewers covering 81 languages who see new scenarios daily that have to be accounted for in updated guidelines and policies. Apple says it takes what it learns from these individual issues it encounters to invest in its systems, algorithms and training so it can prevent similar issues in the future. The company believes the new Code of Conduct rules, in particular, will give it the tools needed to better crack down on App Store fraud.
The rules about scams are only a handful of the many changes rolling out with today’s updated App Store Review Guidelines.
There are a few others, however, also worth highlighting:
- Apple clarified rules around “hookup” apps to ensure developers understand porn and prostitution are not allowed on the App Store — often an issue with the fly-by-night hookup apps, which bait and switch users.
- Creator content apps are instructed that they must follow rules for user-generated content, when applicable, meaning they must have content blocking, reporting and robust moderation.
- Apple added the ability for licensed pharmacies and licensed cannabis dispensaries to facilitate purchasing, provided they're legal and geo-gated.
- Apps that report criminal activity require the developers to work with local law enforcement. (Citizen is a recent example of an app gone awry when users hunted down the wrong person. That level of carelessness may be coming to an end now.)
- Bait-and-switch marketing and ads about app pricing aren't allowed.
- Cellular carrier apps can now include other kinds of subscription apps besides music and video services.
- Apple clarifies that developers can communicate with anyone over email, but says they can't target customers acquired through the App Store with messages about how to make purchases outside of the App Store.
- Apple has enough drinking game apps. Stop sending them in.
- Apps that offer account creation also have to offer account deletion.
- Other clarity was added around in-app purchases for gift cards, app metadata, bug fix submissions, and more. But these were not major changes.
Apple’s privacy push has put the company at odds with rivals. Despite protests from some corners of Silicon Valley, Monday’s announcements show that Apple has doubled down on privacy features.
Apple today announced a number of coming changes and improvements to the App Store that will help developers better target their apps to users, get their apps discovered by more people, and even highlight what sort of events are taking place inside their apps to entice new users to download the app and encourage existing users to return.
The company said its App Store today sees 600 million weekly users across 175 countries, and has paid out over $230 billion to developers since the App Store launched, highlighting the business opportunity for app developers.
However, as the App Store has grown, it’s become harder for app developers to market their apps to new users or get their apps found. The new features aim to address that.
One change involves the app’s product page. Starting this year, app developers will be able to create multiple custom product pages to showcase different features of their app for different users. For instance, they’ll be able to try out things like different screenshots, videos, and even different app icons to A/B test what users like the most.
They’ll also be able to advertise the dynamic things that are taking place inside their apps on an ongoing basis. Apple explained that apps and games are constantly rolling out new content and limited time events like film premieres on streaming services, events like Pokémon Go fests, or Nike fitness challenges. But these events were often only discoverable by those who already had the app installed and then opted in to push notifications.
Apple will now allow developers to better advertise these events, with the launch of in-app events "front and center on the App Store." The events can be showcased on the app's product page. Users can learn more about the events, sign up to be notified, or quickly join the event, if it's happening now. They can also discover events with personalized recommendations and through App Store search.
App Store editors will curate the best events and the new App Store widget will feature upcoming events right on users’ homescreens, too.
Apple says the feature will be open to all developers, including those who already run events and those who are just getting started.
During the WWDC conference today, Apple unveiled the new macOS 12 Monterey. A major feature in the macOS update is Universal Control, which builds upon the Continuity features first introduced in OS X Yosemite. For years, it’s been possible to open a news article on your iPhone and keep reading it on your MacBook, or to copy and paste a link from your iPad to your iMac. But Universal Control takes these features further.
With Universal Control, you can use a single mouse and keyboard to navigate across multiple Apple devices at once. This functionality works across more than two devices – in the demo video, the feature is used to seamlessly move across an iPad, MacBook, and iMac. Users can drag and drop files across multiple devices at once, making it possible, for example, to use a multi-screen setup while editing video on Final Cut Pro.
What's possible in Universal Control isn't necessarily new – this has been made possible before through third-party apps. Plus, in 2019, Apple debuted Sidecar, which allowed users to connect their iPad as a second monitor for their MacBook or iMac. But Universal Control improves upon Sidecar – and maybe renders it obsolete – by allowing users to link any Apple devices together, even ones that aren't iPads. Though this update may not be ground-breaking, it's a useful upgrade to existing features.
Apple didn't announce that rumored Apple TV device that would combine the set-top box with a HomePod speaker during its WWDC keynote, but it did announce a few features that will improve the Apple TV experience — including one that involves a HomePod Mini. Starting this fall, Apple said you'll be able to select the HomePod Mini as the speaker for your Apple TV 4K. It also introduced a handful of software updates for Apple TV users, including a new way to see shows everyone in the family will like, and support for co-watching shows through FaceTime.
The co-watching feature is actually a part of a larger FaceTime update, which will let users stream music, TV, and screen share through their FaceTime calls. The Apple TV app is one of those that’s supported through this new system, called SharePlay. It will now include a new “Shared with You” row that highlights the shows and movies your friends are sharing, as well.
Another feature called "For All of You" will display a collection of shows and movies based on everyone's interests within Apple TV's interface. This is ideal when you're planning to watch something as a family — like for movie night, for example. And you can fine-tune the suggestions based on who's watching.
A new Apple TV widget is also being made available, which now includes iPad support.
And the new support for HomePod Mini will help deliver "rich, balanced sound" and "crystal clear dialogue" when you're watching Apple TV with the Mini set up as your speaker, Apple said.
Among many updates coming to iOS 15, Apple Maps will receive a number of upgrades that will bring more detailed maps, improvements for transit riders, AR experiences and other changes to the platform. The improvements build on the new map Apple began rolling out two years ago, which had focused on offering richer details, and — in response to user feedback and complaints — more accurate navigation.
Since then, Apple Maps has steadily improved.
The new map experience has since launched in the U.S., U.K., Ireland and Canada and will now make its way to Spain and Portugal, starting today. It will then arrive in Italy and Australia later this year, Apple announced during the keynote address at its Worldwide Developers Conference on Monday.
In addition, Apple said iOS 15 Maps will include new details for commercial districts, marinas, buildings, and more. Plus, Apple has added things like elevation, new road colors and labels, as well as hundreds of custom-designed landmarks — for example, for places like the Golden Gate Bridge.
Apple also built a new nighttime mode for Maps with a “moonlit glow,” it said.
For drivers, Apple added new road details to the map to help them better see and understand important features like turn lanes, medians, and bus and taxi lanes as they move through a city. The changes are competitive with some of the updates Google has recently made to its own Google Maps platform, which brought street-level details in select cities. These allowed people — including those navigating on foot, in a wheelchair, on a bike, or on a scooter, for example — to better see things like sidewalks and intersections.
Apple is now catching up, saying it, too, will show features like crosswalks and bike lanes.
It will also render things like overlapping complex interchanges in 3D space, making it easier to see upcoming traffic conditions or what lane to take. These features will come to CarPlay later in the year.
For transit riders, meanwhile, Maps has made improvements to help users find nearby stations.
Users can now pin their favorite lines to the top, and even keep track on their Apple Watch so they don't have to pull out their phone. The updated Maps app will automatically follow your transit route and notify you when it's time to disembark, making the app more competitive with third-party apps often favored by transit riders, like Citymapper, for instance.
When you exit your station, you can also now hold up your iPhone to scan the buildings in the area and Maps will generate an accurate position, offering directions in augmented reality. This is similar to the Live View AR directions Google announced last year.
This feature is launching in select cities in 2021 with more to come in the year ahead, Apple said.
As part of its FaceTime update in iOS 15, Apple introduced a new set of features designed for shared experiences — like co-watching TV shows or TikTok videos, listening to music together, screen sharing and more — while on a FaceTime call. The feature, called SharePlay, enables real-time connections with family and friends while you’re hanging out on FaceTime, Apple explained, by integrating access to apps from within the call itself.
Apple demonstrated the new feature during its Worldwide Developers Conference keynote this afternoon, showing how friends could press play in Apple Music to listen together, as the music streams to everyone on the call. Shared playback controls also let anyone on the call play, pause or jump to the next track.
The company also showed off watching video from its Apple TV+ streaming service, where the video was synced in real-time between call participants. This was a popular trend during the pandemic, as people looked to virtually watch movies and TV with family and friends, prompting services like Hulu and Amazon Prime Video to add native co-watching features.
But Apple’s SharePlay goes much further than streaming music and video from just Apple’s own services.
The company announced a set of launch partners for SharePlay including Disney+, Hulu, HBO Max, NBA, Twitch, TikTok, MasterClass, ESPN+, Paramount+, and Pluto TV. It’s also making an API available to developers so they can integrate their own apps with SharePlay.
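Apple didn't detail that API in the keynote, but based on the GroupActivities framework it previewed for developers at WWDC, adoption might look roughly like the sketch below. This is a hedged illustration, not Apple's sample code — the `WatchTogether` type, its title, and the `startWatching` helper are all made up for the example:

```swift
import GroupActivities

// Hypothetical SharePlay activity for co-watching a show.
struct WatchTogether: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"
        meta.type = .watchTogether  // tells the system this is synced video
        return meta
    }
}

// Offer the activity to everyone on the current FaceTime call.
func startWatching() {
    let activity = WatchTogether()
    Task {
        switch await activity.prepareForActivation() {
        case .activationPreferred:
            // A FaceTime call is active and sharing is allowed; activating
            // lets the system sync playback across all participants.
            _ = try? await activity.activate()
        case .activationDisabled, .cancelled:
            break  // no call in progress, or the user backed out
        @unknown default:
            break
        }
    }
}
```

The appeal of a design like this is that the app only describes *what* is being shared; FaceTime handles the session, the participant list, and keeping playback in sync.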
Users can screen share via SharePlay, too, so you can do things like browse Zillow listings together or show off mobile gameplay, Apple suggested.
“Screen sharing is also a simple and super effective way to help someone out and answer questions right in the moment, and it works across Apple devices,” noted Apple SVP of Software Engineering, Craig Federighi.
The feature will roll out with iOS 15.
An email has been going around the internet as part of a release of documents related to the App Store suit brought against Apple by Epic Games. I love this email for a lot of reasons, not the least of which is that you can extrapolate from it the very reasons Apple has remained such a vital force in the industry for the past decade.
The gist of it is that SVP of Software Engineering, Bertrand Serlet, sent an email in October of 2007, just three months after the iPhone was launched. In the email, Serlet outlines essentially every core feature of Apple’s App Store — a business that brought in an estimated $64B in 2020. And that, more importantly, allowed the launch of countless titanic internet startups and businesses built on and taking advantage of native apps on iPhone.
Forty-five minutes after the email, Steve Jobs replies to Serlet and iPhone lead Scott Forstall, from his iPhone: "Sure, as long as we can roll it all out at Macworld on Jan 15, 2008."
Apple University should have a course dedicated to this email.
Here it is, shared by an account I enjoy, Internal Tech Emails, on Twitter. If you run the account let me know, happy to credit you further here if you wish:
First, we have Serlet’s outline. It’s seven sentences that outline the key tenets of the App Store. User protection, network protection, an owned developer platform and a sustainable API approach. There is a direct ask for resources — whoever we need in software engineering — to get it shipped ASAP.
It also has a clear ask at the bottom, ‘do you agree with these goals?’
Enough detail is included in the parentheticals to allow an informed reader to infer scope and work hours. And at no point during this email does Serlet include an ounce of justification for these choices. These are the obvious and necessary framework, in his mind, for accomplishing the rollout of an SDK for iPhone developers.
There is no extensive rationale provided for each item, something that is often unnecessary in an informed context and can act as psychic baggage that telegraphs one of two things:
- You don’t believe the leader you’re outlining the project to knows what the hell they’re talking about.
- You don’t believe it and you’re still trying to convince yourself.
Neither one of those is the wisest way to provide an initial scope of work. There is plenty of time down the line to flesh out rationale to those who have less command of the larger context.
If you're a historian of iPhone software development, you'll know that developer Nullriver had released Installer, a third-party installer that allowed apps to be natively loaded onto iPhone, in the summer of 2007. Early September, I believe. It was followed in 2008 by the eventually far more popular Cydia. And there were developers that August and September already experimenting with this completely unofficial way of getting apps on the phone, like the venerable Twitterrific by Craig Hockenberry and Lights Off by Lucas Newman and Adam Betts.
Though there has been plenty of established documentation of Steve being reluctant about allowing third-party apps on iPhone, this email establishes an official timeline for when the decision was not only made but essentially fully formed. And it’s much earlier than the apocryphal discussion about when the call was made. This is just weeks after the first hacky third-party attempts had made their way to iPhone and just under two months since the first iPhone jailbreak toolchain appeared.
There is no need or desire shown here for Steve to ‘make sure’ that his touch is felt on this framework. All too often I see leaders that are obsessed with making sure that they give feedback and input at every turn. Why did you hire those people in the first place? Was it for their skill and acumen? Their attention to detail? Their obsessive desire to get things right?
Then let them do their job.
Serlet’s email is well written and has the exact right scope, yes. But the response is just as important. A demand of what is likely too short a timeline (the App Store was eventually announced in March of 2008 and shipped in July of that year) sets the bar high — matching the urgency of the request for all teams to work together on this project. This is not a side alley, it’s the foundation of a main thoroughfare. It must get built before anything goes on top.
This efficacy is at the core of what makes Apple good when it is good. It's not always good, but nothing ever is 100% of the time, and the hit record is incredibly strong across a decade's worth of shipped software and hardware. Crisp, lean communication that does not coddle or equivocate, coupled with a leader who is confident in their own ability and in the ability of those they hired, means there is no need to bog down the process in order to establish a record of involvement.
One cannot exist without the other. A clear, well-argued RFP or project outline that is sent up to insecure or ineffective management just becomes fodder for territorial games or endless rounds of requests for clarification. And no matter how effective leadership is and how talented their employees, if they do not establish an environment in which clarity of thought is welcomed and rewarded, then they will never get the kind of bold, declarative product development that they wish for.
All in all, this exchange is a wildly important bit of ephemera that underpins the entire app ecosystem era and an explosive growth phase for Internet technology. And it’s also an encapsulation of the kind of environment that has made Apple an effective and brutally efficient company for so many years.
Can it be learned from and emulated? Probably, but only if all involved are willing to create the environment necessary to foster the necessary elements above. Nine times out of ten you get moribund management, an environment that discourages blunt position taking and a muddy route to the exit. The tenth time, though, you get magic.
And, hey, maybe we can take this opportunity to make that next meeting an email?