Landslide Digital Feature

IoT Big Data: Consumer Wearables, Data Privacy and Security

Katherine Britton

©2015. Published in Landslide, Vol. 8, No. 2, November/December 2015, by the American Bar Association. Reproduced with permission. All rights reserved. This information or any portion thereof may not be copied or disseminated in any form or by any means or stored in an electronic database or retrieval system without the express written consent of the American Bar Association or the copyright holder.

[T]he world contains an unimaginably vast amount of digital information which is getting ever vaster ever more rapidly. . . . The effect is being felt everywhere, from business to science, from government to the arts. Scientists and computer engineers have coined a new term for the phenomenon: “big data.”1



In the United States, the age of big data is upon us. In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on a computer chip would double every two years while the chip’s price would remain constant. “Moore’s law” meant consumers could buy the same technology two years later for about the same price. Fifty years later, Moore’s prediction has remained remarkably accurate to the point that technology companies have recognized Moore’s law as a benchmark they must meet, or fall behind in the market.2 The wearables market generally follows Moore’s law, creating a “mad rush” among companies to bring products to market. Consumers have come to expect technological products to be faster, cheaper, and more compact over time; this expectation has driven trends of rapid growth in computing power, smaller devices, better battery life, ability to connect to the Internet, and reduction in cost.
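The doubling described above compounds quickly. A minimal sketch (not from the article; the base figure and function name are illustrative) of projecting Moore's law:

```python
# Illustrative sketch: projecting Moore's law, which predicts that
# transistor counts double roughly every two years.

def projected_transistors(base_count: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a chip's transistor count `years` into the future."""
    return int(base_count * 2 ** (years / doubling_period))

# Example: a chip with ~2,300 transistors (Intel 4004, 1971) projected 50 years out.
print(f"{projected_transistors(2_300, 50):,}")  # about 77 billion
```

Twenty-five doublings over fifty years multiply the count by over 33 million, which is why the trend functions as a market benchmark rather than a curiosity.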

Ideally, this consumer demand should drive the market; however, the wearables market exhibits certain market imperfections pertaining to data privacy. For example, consumers have imperfect information about how companies collect and use personal data. Federal data privacy regulations in the United States focus on following the Fair Information Practice Principles: notice, choice, access, accuracy, data minimization, security, and accountability. Third-hand collected personal data—the data of consumers who do not use wearables but whose data are collected by others’ wearables—would not be protected by the Fair Information Practice Principles.

The benefits wearables pose to consumers are considerable, assuming data security and data privacy concerns are addressed. This article explores the existing and developing infrastructure and technological features supporting wearables, the specific data privacy and security concerns wearables pose in the United States commercial sphere in the age of big data, particularly in the healthcare space, and the idea that policymakers should address the data privacy and security concerns posed by wearables because consumers and businesses are unlikely to do so.


IoT Infrastructure Supporting Wearables Might Not Address Data Privacy or Security


IoT Connectivity Is Based on RFID Technologies

Kevin Ashton, one of the founders of the Massachusetts Institute of Technology (MIT) Auto-ID Center, is credited with coining the term “the Internet of Things” (IoT). The term refers to objects embedded with technologies like microchips, sensors, and actuators that often use Internet Protocol (IP) and share data with other machines or software over communications networks. Wearable computing devices, or “wearables,” are a subset of IoT. The MIT Auto-ID Center was founded in 1999 with the mission of pioneering a global open standard system for radio-frequency identification (RFID) technologies. By developing RFID technologies, the Center laid the foundation for the many architectures supporting IoT.

RFID technologies use radio waves, microchips, and antennas to identify people, products, and objects automatically. RFID technologies use machine-to-machine (M2M) transmissions, which refer to direct communications between machines such as a microchip and a microchip scanner, a wearable and a third-party application (app), or a wearable and a monitoring hub. M2M transmissions share information without any special configuration or other setup requirements. For example, veterinarians use RFID technology to identify missing microchipped pets. In 2004, the Food and Drug Administration (FDA) approved a similar technology for use on humans.3 The technology relies on a slender capsule of bioglass imbedded in the skin. That capsule contains a microchip with a unique serial number, and is attached to a tiny antenna (the chip and the antenna together are called an RFID transponder or an RFID tag). The capsule’s sole function is to store and transmit a unique identification code to a reader. The code can be read with a microchip scanner passed over the skin. The reader converts the radio waves reflected back from the RFID tag into digital information that can be compared to a veterinary or medical database.
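The tag-and-reader workflow described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API; the serial number, registry entries, and names are all invented:

```python
# Hypothetical sketch of the RFID workflow described in the text: a passive
# tag stores only a unique ID; the reader looks that ID up in a registry.
# All identifiers and records here are invented for illustration.

PET_REGISTRY = {
    "985112003456789": {"name": "Rex", "owner": "J. Smith", "clinic": "Oak St. Veterinary"},
}

class RFIDTag:
    """A passive tag whose sole function is to report its serial number."""
    def __init__(self, serial: str):
        self._serial = serial

    def respond(self) -> str:
        # A real tag reflects modulated radio waves; here we simply return the ID.
        return self._serial

def scan(tag: RFIDTag, registry: dict):
    """The reader converts the tag's response into a registry lookup."""
    return registry.get(tag.respond())

record = scan(RFIDTag("985112003456789"), PET_REGISTRY)
print(record["name"] if record else "Unknown tag")  # prints "Rex"
```

The key design point is that the tag itself holds no personal data, only a pointer; the sensitive information lives in the back-end database, which is why database access controls matter as much as the tag's radio link.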


IoT Connectivity Relies on Systems That Handle Security Independently

Wearables are subject to cybersecurity attacks. In April 2014, a vulnerability in Internet encryption (named the Heartbleed bug) was so widespread that it affected wearables.4 The Federal Trade Commission (FTC) held a workshop titled “Internet of Things: Privacy and Security in a Connected World” (FTC Workshop), solicited public comments, and published a staff report in January 2015 summarizing the various viewpoints. When considering how to handle data security, there was widespread agreement among panelists at the FTC Workshop on the need for companies manufacturing IoT devices to incorporate reasonable security measures.5 These devices, however, also rely on legacy systems that may not be secure.

Sanjay Sarma, one of the MIT Auto-ID Center’s founders, described the problem as not IoT itself but the “pell-mell rush to build systems in any which way” without regard to a comprehensive security plan.6 The underlying challenge, Sarma explained, is that even if independent systems were secure, these systems are cobbled together, and “the chain will only be as strong as the weakest link.”7 The software used for IoT apps also poses a problem for data security because, like the infrastructure, such programs “are hard to upgrade or improve” and use a “patchwork of legacy systems [such] that it is virtually impossible to replace any one without a wholesale replacement of all.”8


Exploding Wearables Market Might Not Address Data Privacy or Security


Sensors Embedded in Wearables Allow Them to Gather Huge Amounts of Data

Wearables collect tremendous amounts of data. The technologies surrounding wearables allow that data to be used and analyzed in a variety of ways. Wearables today are embedded with increasingly advanced technologies, including microchips, sensors, and actuators. As of 2012, 3.5 billion sensors were already on the market.9 According to a June 2015 Lux study analyzing patents filed between 2010 and May 2015, 41,301 patents were granted for wearable electronics, and patent applications for wearable electronics are increasing at over 40 percent annually.10

Information about a person derived from wearables data, such as the time, duration, and proximity of an activity to other tracked individuals, combined with demographic information, can provide crucial and detailed context to each individual interaction. The data gathered affect how businesses market their products and how companies recruit talent and motivate their employees. Wearables gather a new class of sensitive data about people: not only who they are, what they do, and who they know, but also how healthy they are, what movements they make, and how well they feel.11 Heart rate monitors can provide insight into people’s excitement and stress levels, and glassware can reveal exactly what they are seeing. Microsoft’s health-tracking wearable, the Microsoft Band, incorporates exotic sensors like galvanic skin response, the same technology used in lie detectors. By adding heart rate and temperature information, it is now possible to make educated guesses about a user’s emotional state. There is now a hands-free Tinder app for the Apple Watch that, instead of allowing the user to decide consciously on a match by swiping left or right on his or her smartphone, makes the decision using the wearer’s heartbeat.12


Consumers Demand Wearables

Great Wolf Resorts, owner of 11 water parks in North America, has used RFID wristbands since 2006 that allow the resort company to track users throughout the park and tie their activities and purchases to their names.13 These wristbands allow users to pay for food and beverages on account and allow them to avoid carrying money or keys on waterslides. In 2013, Walt Disney World introduced a similar vacation management system to provide users with a more customized park experience. Economist Paul Krugman cited the “Varian rule,” which provides that the future can be forecasted by examining what the rich have today, supporting the idea that consumers would want resort-like experiences in their daily lives.14 For example, the super-rich do not wait in line; rather, “[t]hey have minions who ensure that there’s a car waiting at the curb, that the maître-d escorts them straight to their table, that there’s a staff member to hand them their keys and their bags are already in the room. . . . [S]mart wristbands could replicate some of that for the merely affluent.”15


Companies’ Demand for Big Data Is Increasing

The European Commission’s new antitrust chief, Margrethe Vestager, described data as the “new currency of the Internet.” FTC Chairwoman Edith Ramirez made a similar comment: “Today’s currency is data.”16 Apart from the consumer goodwill and trust earned by self-disclosing “we won’t collect your data” (as Apple CEO Tim Cook has done), there is little incentive for a company not to collect data on consumers using wearables.17 A 2011 McKinsey report noted that when a competitor fails to use data and business analytics to guide decision making, it suffers competitively.18

Data collected by wearables can be analyzed to create highly targeted, individually tailored marketing campaigns. Marketers could derive from raised stress levels, poor sleep, and a combination of other behavior that a romance is in trouble. Wearable data could determine if a user was habitually late for work, largely immobile when at the office, or spent little time with his or her colleagues, and determine such behavior is due to low morale or dissatisfaction with his or her current job.

Analyzing data from wearables in conjunction with other information will allow businesses to deliver messages and services tailored to a particular customer’s location, activity, and mood.19 Recruitment firms could use big data to target dissatisfied workers, and employers can use the same data to implement policy changes.20 De-identified and aggregated data from wearables reveal otherwise indiscernible patterns and trends in a number of socially beneficial contexts. Medical and epidemiological research, energy conservation, and commercial productivity and efficiency are benefits of using big data.21 Companies can use aggregated data to have a better idea of consumer demand and develop better products and services.


Companies Innovate Independently without Addressing Data Security

In the rush to bring new wearables to market, companies may not address data security threats. According to Cisco, 24 billion networked devices are expected to come online by 2019, up from 14 billion in 2014. By the end of 2012, 8.7 billion devices were connected to the Internet; that figure is expected to reach 40 billion by 2020 as cars, refrigerators, ovens, thermostats, medical devices, and other objects come online.22


IoT Innovation and Infrastructure in Healthcare Wearables


Healthcare Wearables Present the Greatest Potential for Consumer Gains

Healthcare wearables contain wireless sensors embedded in the device and worn on the body. M2M technologies and healthcare apps along with healthcare wearables could improve patient outcomes, reduce health expenditures, and allow providers to deliver care in more patient-friendly ways. For example, insulin pumps and blood-pressure cuffs that connect to mobile apps could let people record, track, and monitor their own vital signs without having to go to a doctor’s office.23 Healthcare providers can monitor patients’ blood pressures, respiration rates, and a variety of other biometric information remotely and continuously thanks to wearables.

Healthcare wearables engage patients in their own care. A clinical trial of diabetic users of continuous glucose monitors showed an average blood sugar level reduction of two points; to put this finding in perspective, the FDA considers medications that reduce blood sugar by as little as one-half point to be successful.24 Economist Paul Krugman said that he uses a Fitbit “because the thing spies on me all the time, and therefore doesn’t let me lie to myself about my efforts.”25

Healthcare wearables also help medical providers better understand patients’ health and healthcare issues in general. By analyzing continuous data, healthcare providers are better able to spot trends and make better decisions. In the case of continuous glucose monitors, healthcare providers can examine a patient’s blood glucose levels throughout the day and over the course of the disease. Examining aggregated data, they can identify broader patterns and better understand diabetes and how it can be controlled.26


Healthcare Wearables May Pose Data Security Risks

Security risks of healthcare wearables increase with the degree of human interaction, and telehealth apps involve a significant degree of it. The data captured by healthcare wearables typically flow across short, unlicensed wireless links to a monitoring hub in the patient’s home, which passes the information to the broadband network, routing it to the cloud, where analytics continuously monitor the patient’s status and notify a healthcare provider in case of anomalies.27 Healthcare wearables measure a patient’s biometric data; an on-premises healthcare worker or a medical professional can receive the data on the other end of a wireless communications link.
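The cloud-analytics stage of the pipeline described above can be sketched as a simple threshold check. This is an illustrative assumption, not the architecture of any actual telehealth product; the metric names and normal ranges are invented:

```python
# Minimal sketch (invented, not the article's system) of the data path
# described in the text: wearable readings reach cloud-side analytics,
# which flag anomalies for a healthcare provider.

from dataclasses import dataclass

@dataclass
class Reading:
    patient_id: str
    metric: str      # e.g. "heart_rate", "resp_rate"
    value: float

# Hypothetical normal ranges, for illustration only.
NORMAL_RANGES = {"heart_rate": (50.0, 110.0), "resp_rate": (10.0, 24.0)}

def detect_anomalies(readings: list) -> list:
    """Cloud-side analytics: return readings outside their normal range."""
    flagged = []
    for r in readings:
        low, high = NORMAL_RANGES.get(r.metric, (float("-inf"), float("inf")))
        if not (low <= r.value <= high):
            flagged.append(r)
    return flagged

stream = [Reading("p1", "heart_rate", 72.0), Reading("p1", "heart_rate", 131.0)]
for alert in detect_anomalies(stream):
    print(f"Notify provider: {alert.patient_id} {alert.metric}={alert.value}")
```

Note how many trust boundaries the data cross before reaching this check — wireless link, home hub, broadband network, cloud — each of which is a separate attack surface.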

In the hospital setting, medical devices have become the key points of vulnerability within healthcare networks and have been subject to attacks.28 Medical devices including x-ray equipment, picture archive and communications systems, and blood gas analyzers have been the subject of cybersecurity attacks.29 These attacks threaten overall hospital operations and the security of patient data. If a hospital, with a fixed infrastructure, cannot keep its medical devices secure, it is highly likely that consumers will be more vulnerable to cybersecurity attacks.


Does Government Regulation Address the Data Privacy and Security Concerns Wearables Pose?


U.S. Data Privacy Regulations Follow Fair Information Practice Principles

Even if a company follows Fair Information Practice Principles and a consumer trusts a particular company with his or her data today, those conditions may change in the future. Additionally, even if a customer consents to having his or her data collected and used for a particular purpose today, the data could be put to different uses in the future. For example, although a consumer may today use a fitness tracker solely for wellness-related purposes, the data gathered by the device could be used in the future to price health or life insurance or to infer the user’s suitability for credit or employment (e.g., a conscientious exerciser is a good credit risk or will make a good employee).30 Use of data for credit, insurance, and employment decisions could bring benefits—e.g., enabling safer drivers to reduce their rates for car insurance or expanding consumers’ access to credit—but such uses could be problematic if they occurred without consumers’ knowledge or consent, or without ensuring accuracy of the data.31

The Fair Credit Reporting Act (FCRA) applies to third-party consumer reports used for credit or employment purposes; it requires consent for a report to be generated and allows that report to be reviewed for inaccuracies. The FCRA excludes most “first parties” that collect consumer information. Thus, it would not generally cover IoT device manufacturers that do their own in-house analytics. Nor would the FCRA cover companies that collect data directly from consumers’ connected devices and use the data to make in-house credit, insurance, or other eligibility decisions—something that could become increasingly common as IoT develops.32

Consumers’ tolerance of how companies use their data will depend on the company’s transparency and how much the consumer trusts the company with his or her data. Companies, marketers, and employers collecting data can de-identify it, but it is often possible to re-identify that data, especially if inadequate security measures are in place.
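Re-identification typically works by linking a "de-identified" dataset to a public one through shared quasi-identifiers such as ZIP code, birth year, and sex. A minimal sketch of that linkage, with all records and names invented for illustration:

```python
# Illustrative sketch of the re-identification risk described above:
# joining a "de-identified" dataset to a public roster on shared
# quasi-identifiers. All records here are fabricated.

deidentified_health = [
    {"zip": "75201", "birth_year": 1980, "sex": "F", "condition": "diabetes"},
]

public_roster = [
    {"name": "A. Doe", "zip": "75201", "birth_year": 1980, "sex": "F"},
    {"name": "B. Roe", "zip": "75201", "birth_year": 1992, "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def reidentify(health_rows, roster):
    """Join the two datasets on shared quasi-identifiers."""
    matches = []
    for h in health_rows:
        key = tuple(h[q] for q in QUASI_IDENTIFIERS)
        for p in roster:
            if tuple(p[q] for q in QUASI_IDENTIFIERS) == key:
                matches.append((p["name"], h["condition"]))
    return matches

print(reidentify(deidentified_health, public_roster))  # [('A. Doe', 'diabetes')]
```

The lesson for data holders is that stripping direct identifiers is not enough; the combination of innocuous-looking attributes can be unique to one person.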


Demand Side of Wearables Market May Not Be Able to Address Data Privacy and Security

Targeted ads based on data gathered from wearables could reduce marketing spam for consumers and provide them with more relevant offers. Customer service can be improved and the gulf between offline and online shopping experiences can be bridged using wearable technology. Consumers, however, increasingly view the privacy and security of their personal data as more important than quality of service, and are starting to give false information in exchange for access to free services.33 The trust consumers have in a company will influence how willing they are to reveal truthful personal information and how willing they are to have their data collected.

Nest Labs is a company known for its smart thermostat that can be controlled remotely by an app. The app learns a consumer’s temperature preferences and when he or she is home. The app does not collect much data about the consumer apart from what it needs to function. Google acquired Nest Labs in January 2014 for over $3.2 billion in cash. Although Nest Labs has repeatedly insisted that it is not merging its data with Google’s, consumers may not fully trust the company’s assurances.34

Users are aware of the potential data privacy implications of wearables. One study specifically found that users are aware that when data are continuously collected, stored, published, and shared, they could include information that users would not want to recall later or would not be willing to capture or be reminded of later.35 Users are also aware that when data from wearables are stored in the cloud, that data could be revealed without the user’s knowledge or consent. Users’ data privacy concerns primarily result from devices that include cameras and microphones, followed by devices with GPS and displays. Activity trackers that monitor heart rate, steps, and pulse are seen by users as inoffensive to data privacy; however, the authors of the study posited that users are likely unaware of how third parties could misuse data, or of the data privacy implications when data are collected long term or associated with complementary information.



The technology supporting wearables was developed at a time when security risks were low and the end users were mainly businesses. Consumers have demanded ever more technology over the past decades, and business models have changed to require more and better consumer data. While wearables offer significant gains to consumers, especially in healthcare, a concerted effort must be made to address privacy and security. The technological infrastructure supporting today’s wearables has not addressed the security risks. The data privacy risks have not been addressed, and companies have incentives to gather more data from consumers rather than less. Consumers have shown that they are willing to trade privacy for lower-cost, more innovative products. Where the demand or supply side of the wearables market does not address privacy, policy or self-regulation should address the data privacy and security concerns posed by wearables.



1. Data, Data Everywhere, Economist (Feb. 25, 2010),

2. Davey Alba, 50 Years On, Moore’s Law Still Pushes Tech to Double Down, Wired (Apr. 19, 2015),

3. Rob Stein, Implantable Medical ID Approved by FDA, Wash. Post, Oct. 14, 2004,

4. Robert McMillan, It’s Crazy What Can Be Hacked Thanks to Heartbleed, Wired (Apr. 28, 2014),

5. FTC Staff Report, Internet of Things: Privacy & Security in a Connected World 20 (2015),

6. Sanjay Sarma, I Helped Invent the Internet of Things. Here’s Why I’m Worried about How Secure It Is, Politico (June 2015),

7. Id.

8. Id.

9. See Stanford Univ., TSensors Summit for Trillion Sensor Roadmap (Oct. 23–25, 2013), [hereinafter TSensors Summit].

10. Carole Jacques, Led by Samsung, Wearable Electronics Patents Are Growing at over 40% Annually, Lux Res. (June 30, 2015),

11. Anthony Mullen, Fearing the Quantified Life—Privacy, Data and Wearable Devices, The Next Web (June 5, 2015),

12. Jeff Beer, Your Heart Does the Swiping on This Hands-Free Tinder App for Apple Watch, Fast Company (July 6, 2015),

13. Theresa M. Payton & Theodore Claypoole, Privacy in the Age of Big Data 108–09 (2014).

14. Paul Krugman, Apple and the Self-Surveillance State, N.Y. Times, Apr. 10, 2015,

15. Id.

16. Allen P. Grunes & Maurice E. Stucke, No Mistake About It: The Important Role of Antitrust in the Era of Big Data, 14 Antitrust Source, no. 4, Apr. 2015, at 1, 2.

17. James Vincent, Apple CEO Tim Cook: Unlike Other Companies, We Don’t Want Your Data, Just Your Money, Indep. (Sept. 16, 2014),

18. Brad Brown et al., Are You Ready for the Era of “Big Data”?, McKinsey Q., Oct. 2011,

19. Mullen, supra note 11.

20. Id.

21. Comments of AT&T Inc. at 8, Workshop to Explore Privacy and Security Implications of the Internet of Things (F.T.C. May 31, 2013), available at

22. See TSensors Summit, supra note 9.

23. FTC Staff Report, supra note 5, at 7.

24. Id.

25. Krugman, supra note 14.

26. Jennifer Britton-Colonnese & Devin Steenkamp, Continuous Blood Glucose Monitoring in Newly Diagnosed Type 1 Diabetes, Endocrinology Advisor (Jan. 9, 2015),

27. Comments of AT&T Inc., supra note 21, at 5.

28. TrapX Labs, Anatomy of an Attack: Medical Device Hijack (MedJack) 5 (May 7, 2015).

29. Id. at 6.

30. FTC Staff Report, supra note 5, at 16.

31. Id.

32. Id. at 17.

33. Nicole Kobie, Tech Firms Need to Use Data Ethically around the Internet of Things, Guardian (June 10, 2015),

34. Allison Kade, How to Manage the Threats to Our Privacy and Financial Security in the Digital Age, The Street (June 17, 2015),

35. Scott Amyx, Data Privacy Playbook for Wearables and IoT, InformationWeek (June 8, 2015),

Katherine Britton

Katherine E. Britton has her own practice, is of counsel at Simmons Legal, PLLC, and is an affiliate professor at the University of North Texas Dallas College of Law in Dallas, Texas. Katherine specializes in complex civil litigation, employment and human resources counseling, probate, estate planning, consumer protection, and privacy law matters. Katherine is a Certified Information Privacy Professional (CIPP/US) through the International Association of Privacy Professionals and is admitted to the bars of Illinois, the District of Columbia, and Texas.