A Bias Toward Innovation, Not Intrusive Regulation

June 3, 2016


View of Zurich, Switzerland using Google Earth. Photo by Zavier Caballe, via Flickr. Creative Commons license.

Takeaways

Protecting privacy is essential, but unnecessary regulations could block or delay exciting innovations.

“Of all new technology, I think someone from 50 years ago would be most impressed with Google Earth,” the business journalist Morgan Housel recently commented on Twitter. “And they’d faint when you told them it’s free.”

Housel’s ranking of visual satellite mapping technology as the most astonishing modern tool is, of course, subjective, and yet it points to an important phenomenon: large-scale, low-cost platforms let others build applications on top of them, making previously unimaginable information available to the masses. Google Earth allows us to visualize our world from any mobile device and empowers thousands of unaffiliated apps to plug into it, from real estate firms and restaurants to non-profit and government services. There’s even the recent story of a 15-year-old Canadian who thinks he’s discovered a lost Mayan city using Google Earth and star maps.

Facebook is an analogous platform for social networking. The stock market is a kind of precursor data platform. Similar platforms are coming for health care, transportation, education, retail, and recreation.

By 2020, Cisco estimates there will be 11.6 billion personal mobile devices, or around one-and-a-half for each person on Earth. The number of wireless connections to machines, sensors, and other data-gathering devices, however, will be far larger—perhaps 50 billion. Then there are the millions of computers in “the cloud” storing, churning, sharing, and spitting out yet more data. Together, these connected multimedia and sensory systems will generate some 44 zettabytes (44 trillion gigabytes) of new data in 2020, adding to the gigantic troves of existing information.

These zettabytes of location, health, and other personal data are causing us to reexamine our legal and social frameworks for privacy. Some people intuit that more data mean less privacy. Some worry that more data mean more opportunity for anti-consumer behavior. Too often, the temptation is to constrain the existing data platforms or to foreclose new ones before they even emerge. Policy makers in Washington and the states are proposing all sorts of new privacy laws and rules.


For decades, the Federal Trade Commission (FTC) has served as the chief privacy regulator, and it has used a series of workshops and studies to begin thinking about privacy in a zettabyte world. Other regulators, however, now want in on the act. The Federal Communications Commission (FCC), for example, has issued a proposed rule that not only usurps much of the FTC’s previous oversight of the internet but also dumbs down the modern view of privacy regulation, which is based on costs, benefits, technology neutrality, and consumer welfare. The FCC proposal mostly ignores these factors in favor of specific ex ante prescriptions and proscriptions.

Former FTC chairman Jon Leibowitz is wary of the new proposal. “Parts of the FCC’s proposed rule are consistent with the FTC approach,” Leibowitz said. “However, in many important areas it overshoots the mark, proposing regulations for broadband providers that go well beyond those imposed upon the rest of the Internet economy and which, if adopted, undercut benefits to the very consumers it seeks to protect.”

He added: “FCC rulemaking consistent with the FTC’s privacy framework would ensure that privacy enforcement remains technology neutral, based on the type of data being collected and how it is used, rather than turning on the type of entity collecting the data.”

Digital companies increasingly realize that protecting customer privacy is an important part of their product offering and can yield competitive advantages. A secondary effect of proactive privacy protection is to lessen the need for counterproductive regulation. Amazon, for example, has for two decades been one of the most voracious users of customer data, but it has also developed a reputation for serving the customer first by delivering packages fast, making helpful product suggestions, and protecting privacy. Its new Echo voice-activated personal assistant has privacy built into the product. The cylindrical home computer, which sits in your kitchen or family room, is made to answer your questions, play music, control your lights, and interact with the Internet. But a voice-activated device that sits in the middle of your home poses obvious privacy concerns. So Amazon built Echo with an analog switch that shows when it is off. When Echo’s red light is on, it cannot hear or transmit sound—period.

But how many other innovations like Echo might be blocked or delayed with intrusive privacy regulation?

The Technology Policy Institute notes that the FCC proposal lacks basic economic analysis and, in particular, ignores the broad and deep benefits of data. The proposal would instead impose a rigid but cockeyed view: data collection by some firms is necessarily dangerous, while data collection by dozens of other types of companies can never be harmful and so is not covered by the rule at all.

An economic view of privacy would take just the opposite approach. It would assume that data collection can serve many valuable customer needs, but it would also adopt a general standard of consumer welfare that could be used to judge complaints against any entity across the vast Internet ecosystem that collects and uses personal data.

In this way, we would have a bias toward innovation, like Google Earth, Siri, and Amazon Echo, but we could also punish and deter firms that use data to harm consumers.