FTC Calling On Ad Networks to Limit and Justify Data Collection

The issue of data collection is an important one in online privacy, particularly as it applies to ad networks. This issue is especially contentious in the context of Do Not Track mechanisms. A number of browsers – such as Safari, Internet Explorer, and Firefox – have mechanisms that permit consumers to instruct websites not to track their activities across the web. The FTC has said on numerous occasions, though, that an effective Do Not Track system should go beyond opting consumers out of receiving targeted advertisements; it should opt them out of the collection of behavioral data for all purposes, unless the purpose is consistent with the context of the interaction (e.g., to prevent click-fraud). Such sentiments were expressed in the FTC’s Privacy Report, as well as its testimony before Congress.
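
The browser mechanisms described above work by sending a Do Not Track signal (the `DNT: 1` HTTP request header) with each request. As a rough sketch only, with a hypothetical function name and an illustrative headers dictionary, a server honoring the FTC's broader reading of Do Not Track would gate all behavioral data collection on that header, not just the delivery of targeted ads:

```python
def honor_do_not_track(headers):
    """Return True if the request carries a Do Not Track signal.

    Browsers with DNT enabled send the header "DNT: 1" with each
    request. Under the FTC's position, a server respecting that
    signal should skip behavioral data collection entirely, not
    merely stop serving targeted ads.
    """
    return headers.get("DNT") == "1"

# Example: a request from a browser with Do Not Track enabled
request_headers = {"User-Agent": "Mozilla/5.0", "DNT": "1"}

if honor_do_not_track(request_headers):
    # Skip behavioral logging for all purposes except those
    # consistent with the context of the interaction
    # (e.g., click-fraud prevention).
    collect_behavioral_data = False
else:
    collect_behavioral_data = True
```

The header itself is the only part of this sketch defined by the browser vendors; how far a server must go in suppressing collection once it sees the signal is exactly the policy question the FTC is pressing.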

Last month, FTC Commissioner Julie Brill delivered a speech at the State of the Net West conference in San Francisco, calling on ad networks to spell out why they say they must collect data from consumers who do not wish to be tracked. "On numerous occasions, the FTC and other stakeholders have asked the advertising networks for specific market research and product improvement uses that require retention of linkable consumer data. The advertising networks are the only ones who can make the case for such use; without input from them it will be hard to see how such uses can be justified when a consumer has opted out of tracking." In that same speech, Commissioner Brill cited a recent study by the Pew Research Center on privacy concerns with mobile devices. That study noted that 54 percent of app users have decided not to install an app after they discovered how much personal information they would need to share in order to use it, and that 30 percent of app users have uninstalled an app that was already on their phone because they learned it was collecting personal information that they didn’t wish to share.

Ad networks, and more broadly, all companies that collect data about consumers, should take note of the FTC’s focus on data collection. Data collection efforts should be limited appropriately and be accompanied with accurate disclosures.

FTC to MySpace: Watch What You Do in Consumers' Space

On Tuesday, the FTC approved a final order and consent decree settling charges that MySpace misrepresented its protection of users’ personal information. The settlement bars MySpace from future misrepresentations about its privacy practices, and requires MySpace to implement a comprehensive privacy program with regular, independent privacy assessments for the next 20 years.

For more information, please visit the Global Regulatory Enforcement Law Blog and read their Client Alert, "FTC’s Final Order with MySpace Focuses on Privacy by Design and Protection of Unique Device Identifiers" written by Paul Bond, Amy S. Mushahwar, and Christine E. Nielsen.

FTC Issues Guidance to Mobile App Developers

On September 5, 2012, the Federal Trade Commission published "Marketing Your Mobile App: Get It Right from the Start", a set of guides addressing compliance with truth-in-advertising and privacy principles for mobile app developers. Disclosures and privacy protection for mobile apps are major issues, and the FTC's guidance is important. In its summary, the FTC provided an overview that advised that app developers:

  • Tell the Truth About What Your App Can Do
  • Disclose Key Information Clearly and Conspicuously
  • Build Privacy Considerations in From the Start
  • Offer Choices that are Easy to Find and Easy to Use
  • Honor Your Privacy Promises
  • Protect Kids’ Privacy
  • Collect Sensitive Information Only with Consent
  • Keep User Data Secure

Pretty basic stuff, but the reminders from the FTC are well taken and should be carefully digested by anyone in the mobile app business, whether they’re developers or marketers. For a copy of the full publication, click here.

SEC Proposes Amendments to Permit Hedge Funds to Advertise - Data Privacy Issues

This post was written by Alexandra Poe, Paul Bond, Keri Bruce, and Frederick Lah.

On August 29, the SEC proposed amendments to lift a long-standing ban on advertising for hedge funds and other issuers. Reed Smith will be releasing a series of blog posts over the next few weeks about the various implications this proposed rule may have if it goes into effect. Please visit the Global Regulatory Enforcement Law Blog, where we consider the data privacy issues with the proposed rule.

FTC OKs Self-Regulation Program for Online Behavioral Advertising

The Federal Trade Commission issued an advisory opinion letter this week saying that it has no present intention to challenge the Council of Better Business Bureaus' accountability self-regulatory program for companies engaged in online behavioral advertising. The program is designed to foster compliance with the Self-Regulatory Principles for Online Behavioral Advertising, which were released by the FTC in 2009. The issue presented to the FTC by the CBBB was whether the accountability program would be viewed as a restraint of trade under the antitrust laws.

For more information, please read Reed Smith's Client Alert, written by Christopher G. Cwalina, Amy S. Mushahwar and Frederick Lah.

Commissioner Brill Introduces Competition Analysis to Privacy Debate

FTC Commissioner Julie Brill stated today that "there may be a tipping point" at which self-regulatory privacy initiatives "turn[] anticompetitive, particularly in cases where the mechanisms are developed by a trade association or industry players that have a dominant market position." A self-regulatory privacy proposal could raise "competition concerns," she said, if it "disadvantages competitors of the platform offering the proposal, especially if the platform operator has a dominant market share and is vertically integrated." Proposals offered by a group of rivals or their trade association could raise competition concerns if the rivals "act in ways that favor their own economic interests to the detriment of other competitors and consumers." In the recent Google Buzz settlement, FTC Commissioner Rosch raised similar concerns about whether seemingly pro-privacy moves could have anticompetitive effects.

For more information on Commissioner Brill's Analysis, please read Paul Bond's and Chris Cwalina's article on the Global Regulatory Enforcement Law Blog.

Rep. Markey Releases a Kids Do Not Track Discussion Draft Bill

This post is written by John Feldman and Amy Mushahwar.

Bill Adds to the Web of Proposed Privacy Legislation and Contains Much More Than Kids Do Not Track

Today, Rep. Ed Markey (D-Mass.) circulated a discussion draft of his kids' online do-not-track bill, co-sponsored by Rep. Joe Barton (R-Tex.). The bill proposes to make it illegal to use kids' or teens' information for targeted marketing and to require parental consent for online tracking of that information. Both Congressmen co-chair the House Privacy Caucus, and their kids' privacy bill will join other, more generally applicable privacy legislation pending in the 112th Congress from Representatives Cliff Stearns (R-Fla.), Fred Upton (R-Mich.), Jackie Speier (D-Calif.) and Bobby Rush (D-Ill.), and Senators John Kerry (D-Mass.) and John McCain (R-Ariz.), with Senator Jay Rockefeller (D-W.Va.) promising to release a generally applicable privacy bill containing Do Not Track provisions next week.

But members of the privacy community were expecting this piece of proposed legislation; Markey had promised since late 2010 that the bill was coming. Specifically, the bill would update the Children's Online Privacy Protection Act of 1998 ("COPPA") provisions relating to the collection, use and disclosure of children's personal information. Further, it would establish protections for the personal information of teens, who were previously not addressed in COPPA at all.

Key provisions of the bill include:

Scope Updates: The bill would expand the scope of the definition of covered Internet operators to include online applications and the mobile web. The Federal Trade Commission ("FTC") would also be empowered with rulemaking authority to create more flexible definitions of operators that account for the development of new technology. The bill also expands the personal information protected to include IP addresses, mobile SIMs, or any other computer or device identifying numbers.

Privacy Policies/Disclosure: The bill would require online companies to explain the types of personal information collected, how that information is used and disclosed, and the policies for collection of personal information.

Further Parental Choice: In addition to keeping the existing requirements for online companies to obtain parental consent for collection of children's personal information, the bill also includes provisions requiring companies to provide parents access to the information collected about their child and the opportunity to opt out of further use or maintenance of their child's data.

Targeted Marketing Prohibitions for Kids & Minors: Website operators and other online providers would be prohibited from knowingly collecting personal information for behavioral marketing purposes from children and minors. The FTC would be required to issue regulations within one year of the bill's passage.

Digital Marketing Bill of Rights for Teens & Fair Information Practices Principles: This section incorporates the Fair Information Practice Principles ("FIPPs") concept that was in the Department of Commerce's Privacy Green Paper. Under this proposed bill, website operators and other online providers are prohibited from collecting personal information from any minors, unless they adopt a Digital Marketing Bill of Rights for Teens. Such a bill of rights or FIPPs must include provisions regarding data: collection, quality, purpose specification, use limitations, security, use transparency, access and correction.

Geolocation Information Collection of Kids and Minors: Website operators and service providers must establish procedures for notice and choice regarding geolocation information. In the case of information collection from children, an operator/provider must obtain verifiable parental consent before this information would be collected, in most cases.

Eraser Button: Website operators must create an "Eraser Button" for parents and children by requiring companies to permit users to eliminate publicly available personal information content when technologically feasible. (Such a provision, however, could lead parents and children into a false sense of security on the web. With multiple outlets for data caching, it is difficult to wholly erase data on the web.)

Expansion of FTC Jurisdiction to Telecom: In keeping with the Kerry bill, the Markey bill also seeks to expand FTC jurisdiction to telecommunications carriers.

We will be carefully evaluating these provisions while this bill pends, but we can readily identify that complications are likely to arise for marketing to young adults. For example, teens are far more likely to lie when faced with traditional age screens. So, even though the statute contains a 'knowing' information collection requirement, to what degree would marketers be required to 'fortify' their existing age screens to account for teens? If more stringent age screens must be employed, will the more tedious screens reduce marketing to adults, too?

If this bill advances on the Hill, please look out for upcoming privacy bill updates from our team.

Email Marketing Under Attack

In recent months, a number of major brands have faced complex legal and reputational risks that arose from the hacking of email fulfillment vendors.

These cases generally present the following challenges:

  1. Gaining a technical, business and legal understanding of what happened, to whom, when and how, and developing privileged and unprivileged messaging about the event to interested constituencies;
  2. Analyzing US and non-US notice obligations to customers, business partners, government and others, as well as assisting with identifying the judgment calls regarding such notice that must often be made, in real time;
  3. Monitoring and helping in discussions with responding/complaining customers, including developing scripts, protocols and a risk-based triage approach;
  4. Analyzing relevant insurance coverage and coordinating with in-house insurance experts or brokers on notice to carriers and responses to initial denials;
  5. Preparing a litigation-ready story, and advising on privilege issues that arise during investigation and response activities.

In light of these recent email vendor breaches, forward-looking and consumer-focused companies are working through the following ongoing questions:

  1. Your own company's policies, procedures and training, including event and litigation-preparedness ("is my own house in order?");
  2. Knowledge/due diligence concerning vendors on these issues ("are we working with the right people and how do we measure that?");
  3. The process/flow among the company and its business partners ("do we have to outsource this work at all, and, if so, is there a simpler or lower-risk way to design the system?");
  4. The information being collected ("what are we collecting, how long are we keeping it, and why?");
  5. Contracts/indemnification provisions and insurance coverage ("if the worst happens, do we have the right contractual protections and insurance coverage?").

The recent data security breaches have highlighted legal and reputational vulnerabilities for national brands. No amount of spending on data security technology, or attention to policies, procedures and training on consumer privacy issues, immunizes a brand from the risks of relying on vendors. Their data security events end up becoming yours. The letter notifying customers of such a breach ends up on your letterhead.

For more information, feel free to contact Douglas J. Wood at 212 549-0377, email dwood@reedsmith.com or Mark Melodia at 609 520-6015, email mmelodia@reedsmith.com.

Lots of "Buzz" Around Google Buzz

Earlier this week, Google, Inc. agreed to a proposed consent order over charges that it used deceptive tactics and violated its privacy promises to consumers when it launched its social network, Google Buzz. For more on this important development, please click here.

Privacy Challenges in Marketing Practices: European (Over)ruling of the Use of Personal Data?

This post was written by Avv. Felix Hofer.

Multinational companies planning to target EU consumers with sophisticated marketing techniques may easily find themselves on marshy ground if they do not devote sufficient attention to European privacy laws. Customer profiling, monitoring and categorizing offer essential information for crucial business decisions, but they must also comply with restrictions and prescriptions likely to prove far stricter than those non-European companies may be familiar with. The following article offers a summary of what to worry about when marketing strategies are meant to cross the ocean and reach out to Europe.

Corporations Do Not Have Personal Information Protected Under FOIA

The Supreme Court came down this week with a watershed decision that may affect businesses small and large that collect and store customers' personal information. Despite AT&T's attempt to argue that it was entitled to "personal privacy" protection under the Freedom of Information Act, the Supreme Court ruled otherwise and clarified that FOIA's personal privacy protection applies only to actual persons, and not "corporate" persons. For more information on this important development, please click here.

For Privacy, European Commission Must Be Innovative

This blog post is republished by permission of the Center for Democracy and Technology where it first appeared.

This post is part of "CDT Fellows Focus," a series that presents the views of notable experts on tech policy issues. This month, CDT Fellow Omer Tene writes about the consultation launched by the European Commission to update the European Union Data Protection Directive. Posts featured in "CDT Fellows Focus" don't necessarily reflect the views of CDT; the goal of the series is to present diverse, well-informed views on significant tech policy issues.

In a way, the process undertaken by the European Commission to review the current framework applicable to privacy and data protection is akin to speeding on a highway at 100 mph while looking at the rearview mirror. The consultation launched by the EC and comments filed by some of the main players (see, e.g., here and here) are strongly anchored in the text of the EU Data Protection Directive ("EU DPD"), enacted in 1995, negotiated several years before then, and based on documents dating back to the late 1970s. That was the era of mainframe computers and punched cards; long before PCs, the Internet, and mobile, not to mention cloud services, ubiquitous computing, smart grid, genetics and biometrics.

Building on acquired knowledge and proceeding with care in small increments is firmly rooted in legal culture. Ours is a discipline based on precedent and cautious tweaking of existing texts. Torts, contracts, and even public law today are strikingly similar to those in Roman times or ancient Jewish law. Yet given the scope and pace of technological innovation over the past 40 years and its massive impact on the collection, storage and use of personal information, it seems that an innovative mindset is needed to overcome some of the shortcomings of the current framework.

General structure

The EU DPD is a structure based on two pillars – fair information practice principles (FIPPs) and a regulatory bureaucracy – with an overarching concept of consent hovering above. The FIPPs are not unique to the EU DPD and are in fact almost ubiquitous. They come under different names and are clustered differently, but are essentially the principles of data minimization (collection limitation), purpose specification, use limitation, retention limitation, transparency, accuracy (data quality), individual participation (access, rectification and right to object), security and accountability. I don’t think there’s reason enough to delve into these, as they are largely agreed upon from Canada and the US, through Europe, Israel, South Korea, and Japan, all the way to Australia and New Zealand. To be sure, data minimization has come under stress in the era of "big data;" and we have not fully figured out the principle of accountability yet. But all in all, there is a great degree of convergence with respect to the FIPPs. Put another way: where the US Department of Homeland Security is in accord with European Parliamentarians, who’s to argue?

Much more discord surrounds the regulatory bureaucratic aspects of the privacy framework. Here, different jurisdictions vary significantly, with the EU leading the way with its "fully independent" supervisory authorities charged with enforcing the law vis-à-vis both private sector and state. The EU DPD is inundated with form filling and filing processes that currently occupy a vast ecosystem of regulators, data protection officers (DPOs), private sector lawyers, accounting firms, and consultants (to name a few). "Notifying" or registering data processing operations; approving cross border data transfers; executing "model clauses" or certifying "binding corporate rules" – are just some of the activities undertaken by privacy professionals. A bit like sorcery, this meticulous activity yields questionable benefits to anyone but the professionals engaged in it. As one CPO once told me: "I view the notification form filed annually with the data protection authority as an envelope for the filing fee; I’m happy to send them the check without the envelope." Little doubt remains, even among regulatory strongholds, that the EU DPD’s bureaucratic processes must be greatly simplified.

This brings us to the challenging issue of consent. Consent is a wild card in privacy regulation: difficult to tame but impossible to get rid of. It is a concept so intertwined with the meaning of privacy that one cannot exist without the other. Any privacy infringement presupposes lack of consent. You invade my privacy by lurking around my home and peeking through the window; yet if I invite you to my home you come as a visitor, a guest, not an infringer. If I use Google to search my date’s name and seek personal information about her, I may be invading her privacy; if she volunteers medical information over a drink, I am a polite listener.

The EU DPD currently authorizes the processing (meaning collection, storage, use or transfer) of personal data based on "unambiguous consent" or "explicit consent" in the case of sensitive data. The problem, of course, is that consent is often illusory. The state does not need citizens’ consent to process data about them; employers can obtain employee consent to anything save (perhaps) pay cuts; and businesses bury statements about privacy and data use in dense legal documents undecipherable to non-experts.

Some have called for the abolition of consent as legal basis for processing data in certain situations. That is, prohibiting certain data processing operations outright, with or without consent. I view this as highly problematic. Data processing can be justified based on "implicit" consent (e.g., Article 7(b) of the EU DPD: "processing is necessary for the performance of a contract to which the data subject is party") or with no consent at all (e.g., Article 7(c) of the EU DPD: "processing is necessary for compliance with a legal obligation" or Article 7(f): "processing is necessary for the purposes of the legitimate interests pursued by the controller"). But I do not think the converse is true: processing cannot be outlawed in the presence of consent. To be sure, consent must be real – that is, free and informed. If it’s not free and informed, it’s not consent; and many common situations fall into this category. But overruling individual choice where it is present is paternalistic and fails to capture the nonconsensual element of any privacy infringement.

In addition, current debate about consent is often fixated on opt-in vs. opt-out. I think the more salient issue is transparency. Consider which is a better expression of individual autonomy – signing a 36-page contract printed in 6-point font which includes a hidden paragraph on data usage (opt-in consent), or receiving conspicuous, clear notice and being offered a simple, no-cost right to refuse (opt-out)? The point is that opt-out is not inherently inferior to opt-in; it depends on the notice. The FTC recognized this in its recent Report on Protecting Consumer Privacy in an Era of Rapid Change, noting: "Different mechanisms for obtaining opt-in and opt-out consent can vary in their effectiveness. Indeed, a clear, simple, and prominent opt-out mechanism may be more privacy protective than a confusing, opaque opt-in." I support searching for mechanisms to provide transparency and robust notice to individuals, such as icons, privacy dashboards, and layered notices written in plain English. Improving consent, not doing away with it, is the right way to go.

Definitions

Every legal text is only as good as its basic building blocks – the definitions. Unfortunately, the definitions in the EU DPD are in danger of unraveling. Look no further than the most fundamental term – "personal data" – currently defined as "information relating to an identified or identifiable natural person (…); an identifiable person is one who can be identified, directly or indirectly (…)". Endless debate has raged concerning the identifiability of an IP address or cookie and the use of anonymization to render data un-identifiable. Yet recent advances in analytics and de-anonymization attacks have shown the futility of the "personal, non-personal" dichotomy.

Moreover, it is the singling out of an individual for unique treatment (e.g., the pricing of a loan or targeting of an ad) based on his or her profile, even without the ability to unmask his or her name, which has significant privacy implications. It is precisely this "commodification" of individuals that Ruth Gavison warned about in her 1980 Yale Law Journal article, "Privacy and the Limits of Law." Arguably, a company purchasing individual "profiles" without even addressing such individuals by name inflicts a more severe dignitary harm than one associating profiles with identified individuals. After all, it is statements like "gather all the ones with the yellow badge" that led to the adoption of the data protection framework in the first place. However, extending the EU DPD to apply to the processing of any form of data, personal or non-personal, seems like an over-expansion.

An additional dichotomy in need of review is that between controllers and processors. Data protection law allocates responsibility and delineates duties according to a categorization of an organization as a "controller" or "processor." A controller, defined in the EU DPD as the party that "determines the purposes and means of the processing of personal data," is traditionally viewed as the owner of the database, the one who has a direct relationship with the individual and is therefore the locus of liability. The processor (or "mere processor") is traditionally perceived as a service provider, a servant to the master-controller, whose sole responsibility is keeping the data secure. Yet how far this description is from market reality today, where layer upon layer of service providers (processors?) undertake an increasing role in the clients’ (controllers?) business processes, including providing consulting services, driving innovation, and managing change. Moreover, with the advent of cloud computing and its architecture as a stack of infrastructure, platform and software layers, the neat distinction between controllers and processors has become muddled. This is a critical matter, since in the absence of a clearly identified controller the framework remains teetering without a focal point for responsibility/accountability.

An additional sticky point concerns choice of law. The EU DPD was initially adopted as a common market measure intended to harmonize data protection regulation and thus remove barriers to data flows among EU Member States. As practitioners in Europe know well, harmonization remains a utopian vision far from a reality where large multinationals struggle to reconcile sometimes conflicting regulations. In addition, application of the European framework seems overextended under Article 4(1)(c) of the EU DPD, which applies European law to a controller established outside of Europe processing the personal data of non-Europeans if such "controller (…) for purposes of processing personal data makes use of equipment, automated or otherwise, situated on the territory of [a] Member State." European regulators interpreted "use of equipment" broadly, applying the EU DPD for example where a US-based website places a cookie on the browser of a user in the EU.

The Article 29 Working Party, the group of European regulators charged with enforcing the law, recently issued a document analyzing choice of law under the EU DPD. Yet much confusion remains, and will continue to exist given the inherent geographic indeterminacy of data flows. Peter Hustinx, the European Data Protection Supervisor, recently called for replacing the EU DPD with a regulation, European legislation with direct effect in Member States, to avoid the inevitable disharmony in transposition of a directive. While an appealing prospect, such a regulation would be excruciatingly difficult to negotiate and agree upon among 27 Member States.

Enforcement is a sore issue for the EU DPD. It is an open secret that the framework is largely not enforced. Indeed, implementation of the EU DPD is probably highest among US based multinationals, which implement strict compliance programs for risk management purposes and as part of overall corporate governance schemes. To increase enforcement, mechanisms must be put in place to facilitate cooperation among data protection authorities; incentivize individual enforcement by consumers and consumer organizations; and engage the press.

Call in the engineers

These issues and others, such as the expansion of the EU DPD to the sphere of law enforcement and national security pursuant to abolition of the "pillar structure" under the Lisbon Treaty, pose very difficult problems for us lawyers to solve. Play as we will with the language of the EU DPD, "personal data" will remain an amorphous notion, consent a treacherous concept, and enforcement problematic. John Palfrey recently called for new collaborative policymaking mechanisms in the context of use of social media by youth. I echo this call with respect to the EU data protection framework: to make real progress, let’s call in the engineers.

California Supreme Court Halts ZIP Code Collection

Reed Smith colleagues on our Global Regulatory Enforcement Law Blog discussed a recent California Supreme Court ruling that declared illegal the collection of an individual’s ZIP code when completing a credit card transaction. As a result, the ability of many retailers to generate in-store marketing leads becomes even more difficult. We encourage you to visit the blog to read the full summary and analysis.

Privacy Remains At the Forefront

Last week saw a flurry of activity on the privacy front, likely unprecedented at least in recent history. Over the course of less than 48 hours, three different privacy bills were introduced in the House of Representatives, one by Rep. Bobby Rush (D-Ill.), and two by Congresswoman Jackie Speier (D-Calif.). Speier is no stranger to the privacy arena, having been the primary driver behind very similar legislation, the California Financial Privacy Act, which was passed in her home state back in 2003. In an unusual twist, Speier introduced two bills on Friday – the "Financial Information Privacy Act of 2011" and the "Do Not Track Me Online Act." We discuss the Rush bill and the Speier "Do Not Track Me Online" bill below (with a separate article on Speier’s "Financial Information Privacy Act of 2011" bill to follow shortly).

Rush Bill

Rush’s bill, essentially the same bill he introduced in July 2010 during the last Congress, is focused on enhancing consumer privacy online. Rush’s bill, dubbed "Best Practices Act for Online Privacy," allows for the collection and use of information from consumers, but requires entities to provide consumers with the ability to opt out from such collection, and to obtain a consumer’s consent before his/her data may be shared with any third party. Rush's proposed legislation, which would apply to both online and offline companies collecting personally identifiable data from customers, attempts to build federal standards around the ways personal data can be collected and used.

More specifically, the Rush bill provides (again):

  • Companies are required to provide concise, meaningful, timely, prominent and easy-to-understand notice to users about their privacy policies and practices, including what information is collected and why
  • Internet companies, like search engines and social networks, would be required to get explicit consent from consumers before using any sensitive personal information for commercial purposes
  • Companies that have already collected personal information may keep such data on hand as long as it either serves a legitimate business need or is used for law-enforcement needs
  • State attorneys general may also bring actions against companies that violate customers’ privacy rights, with a maximum penalty of $5 million
  • Companies outside the Federal Trade Commission’s traditional jurisdiction — including financial services firms, nonprofits and agricultural businesses — are exempted
  • The FTC would be tasked with establishing regulations under the proposed law, including a safe harbor program for companies that wish to self-regulate. Under Rush's proposal, companies that voluntarily pledge to follow the new privacy framework would no longer need to obtain user consent to share information.

In contrast to Speier's "Do Not Track Me Online" bill – and interesting in its own right – Rush's bill does not mandate a do-not-track mechanism that would give consumers an easy way to opt out of having their Web activities tracked for advertising purposes.

Speier’s "Do Not Track Me Online" Bill

By way of background, the "Do Not Track Me Online" bill is intended to define (i) who is subject to the bill, (ii) the nature of the data that is subject to the bill, (iii) the Federal Trade Commission's ("FTC") responsibility to establish online opt-out mechanisms, and (iv) the penalties to be assessed against violators of the proposed Do Not Track Me laws.

The term covered entity is defined to include any party that collects and stores online data containing covered information in interstate commerce. Covered information is represented by a fairly extensive rundown of information generated from an individual's online activity, including: (i) the websites and content accessed, (ii) the date and hour of online access, (iii) the computer and geo-location from which online information was accessed, (iv) the means by which such information is accessed (i.e., device, browser or application), (v) any unique user identifiers (i.e., customer numbers, IP addresses, etc.), and (vi) personal information (i.e., name, address, email addresses, etc.). From there, the bill creates a further category – sensitive information. The term sensitive information is defined to encompass medical history (including both physical and mental health information), an individual's social security number, unique biometric data, race or ethnicity, religious beliefs, sexual behavior, income, assets, financial records and related information, and a user's geo-location information.

The bill directs the FTC to establish and promulgate, within 18 months of its enactment, standards for an online opt-out mechanism that allows consumers to stop the collection or use of any covered information, and to require covered entities to honor such individuals' opt-out decisions. Covered entities would be required to disclose their information collection and use practices, and to have processes and procedures in place to abstain from collecting covered information from consumers who have opted out of such collection or use, unless a consumer changes his or her opt-out preferences. Moreover, the FTC is given the authority to prescribe any regulations it deems necessary to carry out the purposes of the bill, to perform random audits of covered entities for investigative purposes to ensure compliance with the regulations, and to take any action it deems necessary to monitor, implement and enforce the regulations.

Sensitive to the realities that there are many uses of data, the bill enumerates several data uses that the FTC may exempt from some or all of the regulations. For example, the bill contemplates that there are data uses where consumer choice is not necessary, including analyzing data related to use of a product (e.g., web metrics), customer service, basic business functions (e.g., accounting, inventory, quality assurance and supply chain management), protecting or defending one’s rights or property, and compliance with applicable federal, state or local laws. 

The Speier bill provides that a violation of the regulations amounts to a deceptive and unfair advertising and marketing practice under Section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)). In contrast to the Rush bill, Speier's bill more closely follows the FTC's recent privacy report, which asked for comment on a proposed do-not-track mechanism. While the Rush bill contemplates the FTC establishing rules to implement his Best Practices for Online Privacy initiative, Speier's bill goes further by specifically empowering the FTC, under Section 553 of Title 5, to prosecute deceptive and unfair advertising practices. The most immediate challenges facing Speier: the bill has no GOP co-sponsor, she is not a member of the House Energy and Commerce Committee, and several more privacy bills are likely to be introduced in the coming weeks and months.

Senate Judiciary Committee on Privacy, Technology and the Law

Lastly, on February 14, 2011, Sen. Patrick Leahy (D-Vt.), Chairman of the Senate Judiciary Committee, announced the creation of a subcommittee on Privacy, Technology and the Law. The subcommittee will be chaired by Sen. Al Franken (D-Minn.), and its jurisdiction will include oversight of laws and policies that govern the commercial collection, use and dissemination of personal information. Both the niche and the agenda of this subcommittee remain somewhat in flux, as does the manner in which it will navigate the choppy and increasingly crowded privacy waters. While the subcommittee will increase the Senate's focus on privacy issues, it is likely to encounter both political and jurisdictional conflicts with the Financial Services and Commerce Committees when proposing legislation.

Why This Is Important

While Congress continues to consider and debate various incarnations of a privacy law and model, this issue is clearly picking up momentum. There is also fervent activity within the states and courts, as privacy causes of action continue to be used by class-action plaintiff attorneys. With the FTC and DOC both issuing final privacy reports this year, 2011 promises to be an interesting year in the privacy world. 

UK Data Privacy Development

As data privacy heats up on this side of the pond, last week the UK government announced a package of measures focused on extending the scope of the Freedom of Information Act (FOIA) and strengthening the independence of the UK’s data protection and freedom of information regulator, the Information Commissioner’s Office (ICO).

The anticipated Freedom Bill (to be published in February 2011) will include proposals to extend the scope of FOIA to a number of organisations for the first time. The Government announced the definite inclusion of the Financial Ombudsman Service and has proposed including The Advertising Standards Authority, The Panel on Takeovers and Mergers, The Law Society, Bar Council and other approved regulators under the Legal Services Act 2007, subject to consultation.

Reed Smith's entire alert on this development, written by Nick Tyler and Cynthia O'Donoghue, can be accessed here.

NTIA Green Paper Released

If you thought Washington had nothing more to say on the issue of privacy...think again. Check out Legal Bytes' (our sister-blog) article on the latest report to come out of Washington – this time focusing on the release by the NTIA of its "Green Paper" (which you can download and read), Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework. In their own words, "The Commerce Department's Office of the Secretary, leveraging the expertise of the National Telecommunications and Information Administration ("NTIA"), the Patent and Trademark Office ("PTO"), the National Institute of Standards and Technology ("NIST"), and the International Trade Administration ("ITA"), has created an Internet Policy Task Force to conduct a comprehensive review of the nexus between privacy policy, copyright, global free flow of information, cybersecurity, and innovation in the Internet economy." The Federal Register notice of this paper will seek public comments, noting that they will be due on or before January 28, 2011.

We'll have more to share in the coming days...

Déjà Google

Give Google credit: when it announced its acquisition of AdMob, a leading provider of mobile advertising services and technology, in November 2009, it proactively addressed the likelihood of a Federal Trade Commission (FTC) investigation into the transaction. Google even went as far as posting a web page that the media, regulators and other interested parties could access, explaining why it believed the deal did not pose any "competitive" (note: antitrust) concerns. Whether it was a self-fulfilling prophecy or just an inevitable step whenever Google makes an acquisition in the digital advertising space, Google last week announced it received a second request for information from the FTC on the AdMob acquisition. This, however, is familiar territory for Google, which has been the target of government scrutiny over previous deals. The FTC held an eight-month investigation into Google's plan to buy DoubleClick Inc. in 2007 before approving that transaction, and last year Google walked away from a search deal with Yahoo after the U.S. Justice Department indicated that it would consider blocking the agreement and strategic alliance.

What Google may not have expected is the backlash from data privacy and consumer protection groups, which have taken up the not-yet-completed transaction as a battle over consumer data and the mobile advertising market. At least two prominent consumer groups reportedly approached the FTC, asking it to block the acquisition and arguing that a Google/AdMob combination would put "significant amounts of data for tracking, profiling and targeting" of U.S. mobile consumers into the hands of a single advertising network. Google and AdMob combined will form the largest mobile-advertising company, with 30 to 40 percent of the market, according to Karsten Weide, an analyst with researcher IDC in San Mateo, California. These groups want the FTC to consider whether Google's access to AdMob's technology will give it an unfair advantage in selling mobile advertising.

Understandably, Google has asserted that the economic/market impact of such an acquisition would be almost impossible to measure against the dozens of other mobile ad networks that compete with AdMob on a daily basis. Moreover, a spokesperson for Google has suggested the deal will provide users with more free mobile applications, in some cases as an alternative to pay-to-download apps, since it will allow developers to subsidize their products through better and more targeted mobile advertising.

One interesting issue that has arisen from this and other similar transactions over the past couple of years is whether and how consumer privacy fits into an FTC antitrust analysis. It is well documented that the FTC primarily rests its antitrust analysis on two categories: (i) agreements that are per se illegal, and (ii) agreements that are analyzed under the Rule of Reason. Types of agreements that have been held per se illegal include agreements among competitors to fix prices or output, rig bids, or share or divide markets by allocating customers, suppliers, territories, or lines of commerce. On the other hand, agreements not challenged as per se illegal are analyzed under the Rule of Reason to determine their overall competitive effect. A Rule of Reason analysis entails a flexible inquiry and varies in focus and detail, depending on the nature of the agreement and market circumstances. While this analysis still begins with a review of the primary agreement (e.g., merger, joint venture, license, etc.) driving the FTC’s analysis, it will then extend to other external factors.

Until 2007 and the Google/DoubleClick transaction, the issues and types of analysis described above were largely centered on consolidations and combinations of goods and services, not on privacy or consumer information. During the FTC's review of Google's acquisition of DoubleClick, however, all five FTC commissioners who reviewed that transaction agreed that data privacy can constitute a form of non-price competition under a Rule of Reason analysis and, where appropriate, should be considered as one of many pieces in their study and review of a prospective transaction. In fact, the FTC, in its decision approving the Google/DoubleClick transaction, provided, "We investigated the possibility that this transaction could adversely affect non-price attributes of competition, such as consumer privacy." At the core of the FTC's review was whether, given the nature and economics of online and digital advertising, the concentration of user information resulting from a Google/DoubleClick combination meant that no other company would be able to buy, target and optimize ads as profitably, thereby substantially reducing the ability of other ad networks to compete.

On what basis, then, is consumer privacy evaluated? Proponents have successfully argued that privacy harms can reduce consumer welfare, which is a principal goal of modern antitrust analysis. In addition, these same groups have argued that privacy harms can lead to a reduction in the quality of a good or service, which is a standard category of harm resulting from excessive market power. On the other hand, those who oppose the incorporation of a privacy review into any antitrust analysis generally rest their argument on two points: (i) they disagree that privacy is a competition-related issue, pointing to precedents in which non-competition issues (like pollution) have not traditionally been factored into an antitrust analysis, and (ii) these transactions have proved to create market efficiencies and improved offerings and technology that ultimately benefit consumers with a more personalized online experience. This latter opinion may best be summarized in a Yahoo statement from 2008: "The advertising model has made Internet content and services available to millions of people in the United States and around the world—for free. The business model of relying on advertising revenue to fund websites has meant that vast amounts of information on the Internet has been fully accessible to people of all ages and income levels."

Why This Matters

Those who ignore history are doomed to repeat it. Our economy today is flush with companies created to trade in almost every aspect of behavioral advertising and consumer data. In fact, one might argue that consumer data has become a currency of sorts in the digital advertising and media industries. As consumer privacy becomes both increasingly protected by legislation and self-regulatory initiatives (leaving aside the even more complex implications of cross-border transactions and acquisitions, where the same piece of consumer data may be subject to varying laws) and an increasingly valuable, highly sought-after commodity, companies should be more aware of the associated legal implications in all spheres of their business – including the arena of mergers and acquisitions. Whether or not one agrees that consumer privacy should be factored into an FTC antitrust analysis, it seems unlikely that the FTC will shift from the position it has taken over the past couple of years (as evidenced by the Google/AdMob transaction). Companies contemplating mergers or acquisitions in the digital media and advertising arenas should therefore at least consider the implications that consumer privacy may have on their deals.

Ten Data Security Questions Faced by Every Company

This post was written by Paul Bond

When emergencies hit, Reed Smith's clients routinely call upon the Firm's Privacy, Security, and Management Group. We've dealt with everything from lost laptops to international hacking, database thefts by employees to pharmacy dumpster diving. On the litigation side, we have defended more than sixty (60) consumer class actions arising from privacy incidents, along with significant government relations and insurance recovery work. We are assisting a wide variety of clients with emerging challenges to what personal information they capture and utilize, both in connection with marketing as well as for day-to-day business operations.

There is no perfect privacy compliance program. However, in an article recently published in The Privacy & Data Security Law Journal, Paul Bond presents a series of "Ten Data Security Questions Faced by Every Company." A comprehensive approach to privacy compliance will address each of these questions, and reduce the incidence and likely severity of privacy events going forward.

If you have questions about this article, feel free to contact Paul directly, the Group's head Mark Melodia, or the Reed Smith attorney with whom you regularly work.

Reed Smith DC Office Hosting FCBA Privacy/Data Security & Legislative Committees Meeting Tomorrow

Reed Smith will host the next brown bag lunch meeting of the Federal Communications Bar Association's joint Privacy/Data Security and Legislative Committees. The meeting will be held tomorrow, October 13, 2009, from 12:00 noon to 2:00 p.m. at Reed Smith's Washington, D.C. offices (1301 K Street, NW, Suite 1100 East Tower). The Committees will discuss the legislative priorities for the 111th Congress, with special emphasis on behavioral marketing and data security legislation. The following speakers are confirmed to date:

  • Amy Levine, Legislative Counsel to Congressman Rick Boucher; and
  • Paul Cancienne, Legislative Aide to Congresswoman Mary Bono Mack.

We also have invited staff from the U.S. Senate. It's not too late if you'd still like to attend -- please RSVP to Desiree Logan at dlogan@reedsmith.com.

Global Regulation of Behavioral Marketing Teleseminar

In response to the global needs of our clients, Reed Smith is pleased to announce the next installment of our 2009 "Doing Business Globally" teleseminar programs, a series focused on issues that companies should understand about doing business in the global marketplace.

Our next teleseminar will take place Wednesday, September 30 at 12 p.m. (EDT), and will focus on "Global Regulation of Behavioral Marketing." This teleseminar explores the labyrinth of global regulation of targeting consumers—on and off the web—through behavioral marketing. Regulators and consumer advocates object to such sophisticated techniques, fearful that they further erode what little privacy protection remains and that they violate data protection laws. Marketers respond that such advances allow for a far more efficient and consumer-friendly marketplace, pointing out that the personal information retrieved is used not to target individuals "one-on-one" but for aggregate marketing to large groups of consumers with similar demographic and psychographic profiles. The two sides are far apart, and understanding the legal minefield is critical for every marketer.

Join moderator Doug Wood as he probes the issues with Joe Rosenbaum (New York) and Gregor Pryor (UK) for a 45-minute discussion, followed by a Q&A period to address your specific concerns.

Douglas Wood is a partner in Reed Smith's New York office and head of the firm's Media & Entertainment Industry Group, and co-chair of the firm’s global Advertising Technology & Media practice. He has more than 30 years' experience representing national and multinational companies in advertising, marketing, promotions, unfair competition, intellectual property, and e-commerce matters.

Joseph Rosenbaum is a partner in Reed Smith's New York office and co-chair of the firm’s global Advertising Technology & Media practice. Joe focuses on law and policy relating to digital, online and interactive advertising & marketing, e-commerce & information technology, digital content, media & entertainment law, online and interactive gaming & promotions, and privacy & data protection.

Gregor Pryor is a partner in Reed Smith's London office and a member of the Advertising Technology & Media team. Gregor advises clients concerning data protection and privacy matters, particularly in relation to online operations and targeted advertising.

To view the invitation, please click here.

To register for this event, please click here.

Age Verification Technology Enables Targeted Advertising

As regulators push website operators to adopt age verification technology to protect children from inappropriate content and social contact with adults, a new opportunity has arisen for advertisers.

Nancy Willard, who calls herself an expert on Internet safety, says age verification companies are using information gained from verifying children's ages to target them with advertising. She points to California-based eGuardian, which solicits personal information about children from parents – including kids' birthdates, as well as their addresses, schools and genders. The company then offers schools the entire $29 sign-up fee collected from parents for every parent the school steers to the site.

The company's business plan is to solicit websites willing to pay a commission for each eGuardian member, which would allow them to use the data collected to tailor their advertising. eGuardian Chief Executive Ron Zayas notes that parents are given the choice to opt out of having data shared with advertisers, and says the privacy concerns are a "tradeoff."

"When children go to Web sites today, they are already exposed to ads," Zayas said. "We make sure the ads are appropriate for children. We do not increase the volume of ads shown, nor do we 'sell them out' in any way to advertisers."

Read more about the controversy at nytimes.com