The last time Apple appeared at CES in an official capacity was in 1992, when CEO John Sculley introduced the Newton. While many tech giants limped through 2019 paying sky-high fines over privacy issues, Apple, which has long prided itself on respecting privacy, opened the new year by teaching them a lesson on the big stage at CES.
“Apple defines privacy as putting the user in sole control of their data and letting the user decide how that data is used. We take Privacy by Design very seriously,” Jane Horvath, Apple’s senior director of privacy, said at the event on January 7th, local time, in Las Vegas.
Figure: The “What Consumers Want” roundtable; from left to right, Rajeev Chand, Erin Egan, Jane Horvath, Susan Shook, and Rebecca Slaughter
Horvath joined Facebook’s chief privacy officer Erin Egan, Procter & Gamble’s global privacy officer Susan Shook, and Federal Trade Commission commissioner Rebecca Slaughter at the roundtable on “What Consumers Want.” The moderator was Rajeev Chand, head of research at Wing Venture Capital, and the panel discussed how technology companies can build privacy protections amid questions of scale, regulation, and consumer demand.
It was also Apple’s first official return to CES in 28 years. At last year’s CES, Apple put up a giant billboard near the venue reading: “What happens on your iPhone, stays on your iPhone.”
Three privacy protections Apple currently uses
At the start of the conversation, Jane Horvath was asked: “Is consumer technology doing well enough on privacy issues?”
Her answer partly represents Apple’s view of privacy: “We’ve never done enough; we should always try to do more. The world is always changing, and we certainly don’t have a one-size-fits-all solution right now. We have to constantly look for innovative solutions.”
Horvath later said that Apple’s management, including CEO Tim Cook, holds a consistent view on privacy, emphasizing that Apple assigns privacy engineers and privacy lawyers to work alongside the other engineers on every new product, starting from the earliest design stage.
In a subsequent discussion, Horvath listed three privacy practices that Apple currently uses.
The first is differential privacy, one of the industry’s most common methods of data anonymization. The technique deliberately adds mathematical noise to collected data through algorithms, similar to artificially adding or deleting small amounts of data, making individual records difficult to classify or identify accurately.
Obviously, this approach sacrifices some of the data’s usefulness, but it remains effective for large-scale statistical analysis and for extracting overall trends, so it can protect user privacy without compromising functionality.
Horvath pointed out that Apple uses this technique in its popular-emoji recommendation feature, keeping individual users’ habits private while still gathering statistics on overall emoji usage.
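The idea can be illustrated with the classic randomized-response mechanism, the simplest form of local differential privacy. This is only a sketch of the concept, not Apple’s actual deployment (which uses more elaborate mechanisms): each device perturbs its report before sending it, yet aggregate frequencies can still be estimated from the noisy reports.

```python
import random

def randomized_response(true_value: bool, p: float = 0.75) -> bool:
    """Report the true value with probability p, otherwise flip it."""
    return true_value if random.random() < p else not true_value

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the noise: observed = true*(2p-1) + (1-p)."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Simulate 100,000 users, 30% of whom actually used a given emoji.
random.seed(42)
truth = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

No individual report can be trusted (any single user may have lied), but over the whole population the bias introduced by the coin flips can be corrected exactly, which is why aggregate trends survive the noise.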
The second is on-device processing. Thanks to the powerful computing performance of today’s mobile devices, many computing tasks and models can run directly on the device without sending information to a server, Horvath said.
For example, Apple device features such as the camera and Face ID use a range of deep learning algorithms that run on the device, with the resulting information stored on the device’s secure chip, where Apple itself cannot access it.
The third is random identifiers. When using Apple devices, services such as Siri and Maps inevitably send data to servers, but the identifier representing the user is randomly generated rather than tied to the Apple ID. As a result, Apple knows that some user generated the data but has no way of telling who it is.
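A minimal sketch of the idea (hypothetical, not Apple’s implementation): outgoing requests are tagged with a random, periodically rotated identifier instead of an account ID, so the server can group requests into short sessions without ever seeing a stable identity.

```python
import time
import uuid

class RandomIdentifierSession:
    """Tag outgoing requests with a random ID that rotates periodically,
    so the server never receives a stable, account-linked identifier."""

    def __init__(self, rotate_after_seconds: float = 15 * 60):
        self.rotate_after = rotate_after_seconds
        self._rotate()

    def _rotate(self):
        # Random identifier, derived from nothing about the user or account.
        self.identifier = uuid.uuid4().hex
        self.created = time.monotonic()

    def tag(self, payload: dict) -> dict:
        if time.monotonic() - self.created > self.rotate_after:
            self._rotate()
        return {"id": self.identifier, **payload}

session = RandomIdentifierSession()
request = session.tag({"query": "weather"})
print(request["id"])  # a random hex string, unlinked to any account
```

Because the identifier rotates, even the request history attached to one ID stays short: requests from the same person on different days cannot be joined back together on the server side.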
In addition, Horvath reaffirmed the importance of end-to-end encryption and Apple’s understanding and practice of the principle of data minimization. Citing the Siri voice assistant as an example, she said that when users ask about the weather, Apple uses location data only to city-level precision; only when users ask about nearby gas stations or supermarkets does Apple collect precise geographic information such as latitude and longitude.
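Data minimization of this kind can be sketched very simply (an illustration under assumed request types, not Apple’s code): the precision of the location attached to a request depends on what the request actually needs.

```python
def minimized_location(lat: float, lon: float, request_type: str):
    """Return only as much location precision as the request needs."""
    if request_type == "weather":
        # City-scale: round to ~0.1 degree (~11 km) before sending.
        return round(lat, 1), round(lon, 1)
    if request_type == "nearby_places":
        # Precise coordinates are genuinely required here.
        return lat, lon
    # Default: send no location at all.
    return None

print(minimized_location(37.33182, -122.03118, "weather"))        # (37.3, -122.0)
print(minimized_location(37.33182, -122.03118, "nearby_places"))  # (37.33182, -122.03118)
```

The design choice is that coarsening happens on the device, before anything leaves it, so the server never has the option of handling more precise data than the feature requires.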
“Through these measures, we minimize the potential impact of privacy issues,” Horvath concluded.
Innovating with privacy as the premise
Looking across the global technology industry, it is hard to find a company better positioned than Apple to talk about privacy.
Horvath has been Apple’s head of privacy since September 2011. That same year, Apple unveiled its first iPhone feature built on artificial intelligence: a voice assistant called Siri that answers users’ questions and helps manage everyday life.
But it also meant that the phone, as an electronic device, was beginning to reach deeper into the details of users’ lives. Around that time, Apple began strengthening its commitment to privacy in consumer technology, and it has since been regarded as a leader among the Silicon Valley giants in user privacy protection.
Even in 2019, when facial recognition was hit hard by data-breach scandals, Apple emerged almost unscathed. According to a 2019 AppleInsider report, researchers had used various means to obtain image databases built from people who were frequently captured by facial-recognition systems without their knowledge. When the researchers collated the collected images and traced their sources, which included publicly available images and other companies’ collections, they found none that came from Apple.
It is not clear what specific data set Apple used in face recognition research, but at least there is no evidence that the data came from its users.
In fact, on-device face recognition is a typical example of how Apple pursues innovation under its basic privacy principle.
Apple has published an article on its official machine learning blog systematically describing its face recognition algorithms, the neural-network mechanisms behind the Vision API, and how face recognition was initially implemented with simple, non-neural algorithms.
In 2014, Apple saw deep learning maturing on large computing platforms and reasoned that it should have great potential on mobile platforms as well. Given the technology of the time, however, running deep learning on a phone to achieve smarter, more accurate recognition was close to a fantasy.
Today we are used to phones that integrate dedicated processing units for AI computation, such as the NPU Huawei used in the Kirin 970; but in 2014, mobile chips were far too weak to serve as a computing platform for deep-learning vision models.
One shortcut was to provide deep learning through a cloud API: with the computation done on servers, terminals such as mobile phones could still use deep learning to solve problems.
But this raised another problem. Apple’s iCloud is subject to strict privacy and data-usage restrictions, so despite the huge amount of photo data stored there, it could not be used for deep learning. For deep learning, Apple could only choose to do the computation directly on the phone rather than in the cloud.
The privacy implications of cloud AI thus prompted Apple to change its design philosophy and turn to on-device solutions to protect privacy.
In the end, rather than taking the cloud route chosen by other vendors, Apple successfully ran deep neural networks for face recognition on the phone: it replaced traditional Viola-Jones feature detection with the OverFeat deep-learning approach, established a standard processing pipeline, and made other supporting system optimizations.
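Conceptually, both the old Viola-Jones approach and OverFeat-style detection share one core mechanic: slide a fixed-size scorer over the image and keep the windows it scores highly. The toy sketch below illustrates only that mechanic, with a trivial brightness score standing in for a real classifier; it is not Apple’s pipeline.

```python
def sliding_window_detect(image, window, stride, score_fn, threshold):
    """Score every window position of a 2D image; return positions above threshold.

    image: 2D list of pixel values
    window: (height, width) of the scorer's fixed input size
    score_fn: maps a window's pixels to a detection score
    """
    h, w = len(image), len(image[0])
    wh, ww = window
    hits = []
    for y in range(0, h - wh + 1, stride):
        for x in range(0, w - ww + 1, stride):
            patch = [row[x:x + ww] for row in image[y:y + wh]]
            if score_fn(patch) >= threshold:
                hits.append((y, x))
    return hits

# Toy "classifier": mean brightness of the patch (a stand-in for a CNN score).
def brightness(patch):
    return sum(sum(row) for row in patch) / (len(patch) * len(patch[0]))

img = [[0] * 8 for _ in range(8)]
for y in range(2, 5):
    for x in range(3, 6):
        img[y][x] = 255  # a bright 3x3 blob the detector should find

print(sliding_window_detect(img, window=(3, 3), stride=1,
                            score_fn=brightness, threshold=200))  # [(2, 3)]
```

What changed between Viola-Jones and OverFeat is essentially `score_fn`: hand-crafted Haar features gave way to a convolutional network, and the naive per-window loop was replaced by running the network fully convolutionally so overlapping windows share computation, which is what made it feasible on a phone.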
Apple has changed as iPhone sales stagnate
Unlike Google, Amazon, and Facebook, Apple has been reluctant to use customer data for targeted advertising or personalized recommendations. According to previous media reports, several former Apple employees said that any work involving the collection of Apple user data required approval from three “privacy tsars” and a senior executive.
Many employees are proud of Apple’s stance, and CEO Tim Cook treats it as a matter of principle.
“Customers expect Apple and other technology companies to do everything in our power to protect their personal information,” Cook wrote in an open letter, even as he opposed a government request to help unlock the iPhone of an attack suspect.
However, that chapter is turning, and a new test has arrived.
Apple’s results for the fourth quarter of fiscal 2017 showed that while the iPhone still generated $37.185 billion in revenue, sales growth was virtually zero, nearly flat against the prior two years. With iPhone sales stagnant, questions have grown about the sustainability of Apple’s hardware innovation.
In contrast to hardware sales, Apple’s services revenue has grown steadily for years, reaching $31 billion in 2017, about 13 percent of total revenue, and Apple has made services such as iCloud and Apple Music a major source of new growth.
Figure: Apple’s services business (source: aboveavalon.com)
However, a rule that has been proven again and again is that the more thoroughly users’ personal data is mined, the better the Internet services that can be built on it.
For Apple, which wants to create more room for growth in its services, this will be a bigger test of the company’s commitment to limiting the use of personal data.
In 2019, Apple wavered on this point and tasted the consequences.
In August 2019, Apple was revealed to have hired outside contractors to manually review voice commands received by Siri in order to improve Siri’s performance; the recordings included location information, contact details, and more. In response, Apple admitted to hiring staff to analyze Siri voice commands, but said the sampled recordings amounted to less than 1% of Siri’s daily activations, and it suspended the program.
Since then, Apple has issued an apology on its website, saying that from the fall of 2019, recordings of users interacting with Siri would no longer be retained by default.
In November of that year, Apple published an updated privacy statement on its website.
Recently, Apple’s management has highlighted a goal of roughly $50 billion in services revenue by 2020. Apple’s services revenue in fiscal 2019 was $46.3 billion, nearly nine times its 2010 level.
Under this $50 billion goal, will the privacy foundation Apple prides itself on be shaken even more dramatically?
Privacy options alone are far from enough
Of course, the tension between user privacy boundaries and making money is not Apple’s problem alone. It also confronts Facebook, a fellow roundtable participant, which is still dealing with the massive privacy fallout from the Cambridge Analytica scandal.
When even an event like CES dedicates a forum to this topic, the pervasiveness of privacy issues in the technology industry is plain to see.
CES, formally the International Consumer Electronics Show, began in 1967. Over more than 50 years it has come to cover consumer electronics, smart home, automotive, AIoT, and other hot areas, and serves as a bellwether for the global technology industry.
At the start of 2020, this CES carries a new significance: which companies, over these few days on this crowded stage, will set the tone for the next 10 years?
At the very least, attention to privacy, information security, and related issues in consumer electronics is rising again. In the smart home category especially, more and more products at the show no longer rely solely on the cloud for interaction, but have begun shifting some control to local devices to address privacy concerns.
As ordinary consumers, we would love to see these efforts: more and more technology companies making data processing transparent, giving users as many options as possible, and letting users decide for themselves how their data is shared and used.
But no matter how transparent the process, users and technology companies remain in a state of information asymmetry. Users can never fully verify what a company does with their data: whether it is shared with third parties, whether the company truly complies with user agreements and legal requirements, and so on.
In response, FTC commissioner Rebecca Slaughter offered a thought-provoking point: data collectors may write 300-page privacy policies offering complex privacy options, but that does not mean they can shift the burden of data protection onto users.
“The trend toward having consumers control privacy options is worrying; the amount of information far exceeds what they can handle. Data collectors must explore how to ensure quality of service with minimal data collection, sharing, and retention, and truly fulfill their responsibility to protect user privacy.”
In the coming, inevitable Internet-of-Everything war, more and more companies will have to prove whether user privacy is, for them, a momentary “honey” or a lasting “frost.”