Oklahoma Joins the Rapidly Growing Number of States with Social Media Password Laws

This post was written by Rose Plager-Unger and Frederick Lah.

On May 21, 2014, Oklahoma enacted H.B. 2372, following the trend outlined in our earlier article on the growing number of states prohibiting employers from requesting employee or applicant social media account passwords.  H.B. 2372 prohibits employers from requesting or requiring the user names and passwords of employees’ or applicants’ personal social media accounts, or from demanding that employees or applicants access those accounts in front of the employer.  The law also prohibits employers from firing, disciplining, or denying employment to employees or applicants who refuse to provide the requested information.

Currently, fifteen states have enacted laws similar to Oklahoma’s, and legislation has been introduced in at least thirteen more states.  The issue continues to gain momentum, as Facebook has publicly supported laws restricting employer access.  Further, even if a state has not adopted a law specifically prohibiting employers from accessing employees’ or applicants’ passwords, employees and applicants can invoke common-law privacy principles to protect their privacy rights.

While questions remain as to how widespread the practice of requesting employee or applicant social media passwords was, there is no denying that Oklahoma’s new law represents the latest in a growing trend.  Employers should take proactive steps to protect themselves from potential violations, including updating internal policies and procedures to comply with the new legislation.  Given the patchwork of laws governing this issue, the best practice for an employer with employees in several of these jurisdictions is to comply with the most restrictive state’s law.  Further, once employers revise their policies and procedures, they should make sure that employees involved in the hiring process and in managerial positions are aware of what they can and cannot ask of applicants and employees.

Online Advertising Targeted by Federal Trade Commission

On May 15, 2014, Maneesha Mithal, Associate Director of the Division of Privacy and Identity Protection at the Federal Trade Commission (“FTC” or “Commission”), testified on behalf of the FTC before the U.S. Senate Committee on Homeland Security and Governmental Affairs, addressing the Commission’s work regarding three consumer protection issues affecting online advertising: (1) privacy, (2) malware, and (3) data security. Below is a summary of the Commission’s testimony regarding these three key areas and the Commission’s advice for additional steps to protect consumers.

Privacy

Privacy has been a top priority for the Commission since the early 1990s. In March 2012, the Commission released its Privacy Report, and it continues to engage in privacy enforcement actions involving the online advertising industry. In its testimony, the Commission highlighted several key enforcement actions in this area that demonstrate significant principles regarding privacy:

  • Chitika, Inc., No. C-4324 (F.T.C. June 7, 2011) – The FTC alleged that Chitika, an online advertising network, violated section 5 of the FTC Act when it offered consumers the ability to opt out of the collection of information to be used for targeted advertising – without telling them that the opt-out lasted only 10 days.
  • ScanScout, Inc., No. C-4344 (F.T.C. Dec. 14, 2011) – The FTC charged that ScanScout deceptively claimed that customers could opt out of receiving targeted ads by changing the computer’s web browser settings to block cookies, when, in fact, ScanScout used Flash cookies, which browser settings could not block.
  • Epic Marketplace, Inc., No. C-4389 (F.T.C. Mar. 13, 2013) – The company settled charges that it used “history sniffing” to secretly and illegally gather data from millions of consumers about their interest in sensitive medical and financial issues, ranging from fertility and incontinence to debt relief and personal bankruptcy. 
  • Google, Inc. – Google agreed to pay a $22.5 million civil penalty to settle charges that it misrepresented to Safari browser users that it would not place tracking cookies or serve targeted ads to them, violating its earlier privacy order with the Commission (No. C-4336 (F.T.C. Oct. 13, 2011)).

Spyware and Other Malware

The Commission’s testimony emphasized that spyware and malware can cause substantial harm to consumers and to the Internet as a medium of commerce. Since 2004, the Commission has initiated a number of malware-related enforcement actions, which focus on three key principles:

  • Installing Software: A consumer’s computer belongs to him or her, not to the software distributor, and it must be the consumer’s choice whether or not to install software. Downloading spyware to a consumer’s computer without his/her knowledge is a violation of section 5 of the FTC Act.
  • Disclosures: Buried disclosures of material information necessary to correct an otherwise misleading impression are not sufficient in connection with software downloads. Burying material information in an End User License Agreement will not shield a malware purveyor from liability under section 5 of the FTC Act.
  • Removal of Malware: If a distributor puts a program on a computer that the consumer does not want, the consumer should be able to uninstall or disable it.

Data Security

The security measures that companies implement to protect consumer data from unauthorized third-party access have been the focus of 53 enforcement actions initiated by the FTC. In addition to enforcing the FTC Act, the Commission enforces several specific statutes and rules that impose obligations upon businesses to protect consumer data, including the Commission’s Safeguards Rule implementing the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act, and the Children’s Online Privacy Protection Act. Through its 53 data-security-related enforcement actions, the Commission has developed the following principles, which apply to online advertising networks, as well as other businesses:

  • Reasonable and appropriate security measures must be continuously assessed to address risks
  • There is no one-size-fits-all data security program
  • The Commission does not require perfect security, but assesses security measures for reasonableness in light of the sensitivity and volume of consumer information held by the company, the size and complexity of the company’s data operations, and the cost of the available security tools

Recommendations for Next Steps

To continue to protect consumers in the areas of privacy, malware, and data security – particularly with respect to online advertising – the Commission offered three recommendations:

  • More widespread consumer education about how consumers can protect their computers against malware
  • Continued industry self-regulation to ensure that ad networks are taking reasonable steps to prevent the use of their systems to display malicious ads to consumers
  • Enactment of a strong federal data security and breach notification law to prevent breaches and protect consumers from identity theft and other harm

Privacy Issues Continue to Be Focus of State AGs

States continue to focus their investigation and enforcement efforts on privacy issues, with no sign that the focus will shift anytime soon. The most recent example is a $17 million settlement between 37 states (and D.C.) and Google related to Google’s use of tracking cookies on Safari browsers. For more information about this case and some other developments on the state level, read "State Attorneys General Maintain Sharp Focus on Privacy" on Reed Smith's Global Regulatory Enforcement Law Blog.

California's Eraser Button Law Triggers Privacy Requirements

Widely known as California's "Eraser Button" law, SB 568 was recently signed into law. The new law, which takes effect on January 1, 2015, adds two key privacy-related requirements for operators of websites, online services, and mobile apps directed toward minors. Find out whether it will apply to your business by reading the latest post on our sister site, the Global Regulatory Enforcement Law Blog.

Privacy Concerns to be Addressed at FTC Workshop in November

In response to the mounting data privacy concerns attributed to the proliferation of smart devices, the FTC will be holding a public workshop on November 21, 2013, addressing questions over the “Internet of Things.” Two public interest groups, Electronic Privacy Information Center (EPIC) and Center for Digital Democracy (CDD), have already submitted comments expressing their concerns over the privacy implications related to this topic, which include the tracking of daily behaviors and personal habits. To find out more, please read the latest post on our sister blog, the Global Regulatory Enforcement Law Blog.

Model Suing Lions Gate Over Opening Credits of Mad Men

The next time you watch Mad Men, you may find yourself paying a little closer attention to the opening credits. Last week, Lions Gate Entertainment Corporation was sued by a model from the 1950s and 1960s who alleges that the company violated her publicity and privacy rights by using a photograph of her in the show’s opening credits without her permission.

According to a complaint filed in California state court (PDF), the woman in the picture is Gita Hall May, now 79 years old. In the opening credits of Mad Men, a shadow of a businessman is shown falling against the backdrop of office buildings and advertisements from the 1950s and 1960s. One of the advertisements, per the complaint, depicts a cropped photograph of May from an old Revlon campaign. According to May, she consented to the use of her likeness and the photo embodying it only for the Revlon campaign, not its use 40 years later as part of the title sequence of a television series. May asserts a number of claims, including misappropriation of the right of publicity for commercial purposes and invasion of privacy.

The image at issue appears around the 22-second mark of the opening credits.


FTC Addresses Guidance on Mobile Privacy

On February 15, Chris Olsen of the FTC’s Division of Privacy and Identity Protection spoke at the National Telecommunications and Information Administration (NTIA) stakeholders’ meeting in Washington, D.C., to address the FTC’s recently released Mobile Privacy Disclosures guidance.

For a full list of the meeting's highlights, please read the latest post on our sister blog, Global Regulatory Enforcement Law Blog.

Our attorneys will be monitoring the situation closely and will keep you informed of any updates as they develop.

Who's Watching You? Facebook Agrees to Provide More Transparency with Online Advertising

This post was written by Christine Nielsen, John Feldman, and Caroline Klocko.

Yielding to pressure from advertisers, ad agencies, the media, consumers, and, perhaps, the FTC, Facebook has agreed to place The Digital Advertising Alliance’s (DAA) “AdChoices” logo on ads served on its site via its FBX ad exchange. The move makes Facebook more accountable for educating users about online behavioral advertising and allowing them to opt out.

The AdChoices icon alerts users when an ad has been placed on their screen based on behavioral-targeting methods. When clicked, the icon takes the user to an informational page that allows the user to learn more about targeted marketing and to opt out of such advertising. The industry has promoted the opt-out with the understanding that information such as the products users view online, who their friends and connections are on Facebook and Twitter, which websites they have visited, and their relevant search history, while not historically defined as “personally identifiable,” is nonetheless very private, and very valuable.

The Council of Better Business Bureaus’ Online Interest-Based Advertising Accountability program has lauded the move, finding in a recent decision that it is “a meaningful step in increasing transparency and choice.” However, the change, which Facebook has said will take place by the end of the current quarter, is not as transparent as some would like, and it remains to be seen whether it complies with the FTC’s “clear and prominent notice” standard. The social media giant has committed to displaying the AdChoices logo only when a user hovers over the ad with the mouse and then hovers over the grey “x” that appears. This unconventional display of the icon, which usually appears on the ad regardless of whether the user hovers over it, was nonetheless approved by the Online Interest-Based Advertising Accountability program.

Facebook users are accustomed to clicking on the “About this ad” link to find information about online behavioral advertising methods. The text of this link, while not fully descriptive, indicates to the user that something might be going on behind the scenes that they might want to check out. By agreeing to participate in the DAA’s program, Facebook is adding something new to its current practice of informing consumers about advertising: accountability.

Facebook’s voluntary agreement to participate in the program means that the social networking site will now be subject to compliance reviews and scrutiny by the entities that enforce the AdChoices program. Facebook’s participation also increases the icon’s visibility, which in turn can help make more consumers aware of their choices about receiving targeted advertising. This increase in visibility is also due to the use of the icon in the mobile space, which began last August and was covered in a previous Reed Smith blog post.

In contrast with this self-regulatory approach, Blackberry has just announced what it touts as a technology-driven solution to the consumer choice conundrum in the mobile and web environments. Please click here to read about Blackberry’s announcement on our sister blog, Global Regulatory Enforcement Law Blog.

FTC to Mobile Industry: $800K... Can You Hear Me Now?

On February 1, the FTC announced its largest settlement to date with a mobile app developer – an $800,000 penalty – in conjunction with the release of guidance on mobile app best practices for app platforms, app developers, third parties and app trade associations.  The guidance document and the enforcement action together demonstrate the need for companies with mobile apps to review disclosures, vendor agreements, and consumer consent mechanisms for data collection.

Please click here to read the complete report on our sister blog, Global Regulatory Enforcement Blog.

Privacy Stakeholders Meet Again Over Mobile Privacy Best Practices

The National Telecommunications and Information Administration (NTIA) of the U.S. Department of Commerce held its latest meeting with privacy stakeholders November 30, to discuss privacy best practices for the mobile environment. We have previously blogged about some of these earlier meetings on our sister blog from our Global Regulatory Enforcement Group.

This latest meeting once again joined members of the app developer industry with members of consumer advocate and privacy groups, along with other stakeholders from government, academia, and the private sector. Discussion focused on the latest draft of the Mobile Application Transparency Code of Conduct. Some of the issues considered in this latest draft are:

  • The scope of the definition of “Mobile Devices,” and whether, in addition to smart phones and tablets, that term should include “similar portable computing devices”
  • What type of data should be covered by the Code
  • Whether third-party service providers should be subject to the Code
  • Whether mobile app providers should be required to provide a “Short Notice” in addition to other Notice
  • What elements should be included in the Notice
  • Whether the format of the Short Notice should be standardized or within the provider’s discretion
  • How the Short Notice should be presented at download, if at all
  • Whether the Code should require that companies establish a mechanism for consumers to access data

Why This Matters. While the Code is not yet finalized, app developers and marketers who include apps in their communications strategy should pay close attention to these proceedings. There is clearly a focus among all stakeholders on the broader issues of how widely the Code should apply and what type of notice app providers should be required to give consumers, as well as how that notice should be delivered. In an age when consumer consent, opt-in vs. opt-out, and geo-location are front and center in the privacy and mobile debate, the outcome of these discussions will have a major impact on the market. While the Code will only represent a series of best practices, as opposed to binding law, it will still set standards for mobile app providers and marketers regarding how they operate in the mobile environment with respect to consumer privacy.

We will continue to report on this process as the meetings continue to move forward. The next meeting is scheduled for December 18, 2012.

A Growing Trend - Employers Prohibited from Requesting Employee or Applicant Social Media Log-In Information

Earlier this year, Maryland enacted Labor and Employment Code §3-712, becoming the first state to pass a law explicitly prohibiting employers from requesting or requiring employees or applicants to disclose their usernames and passwords for their personal social media accounts. The law also prohibits an employer from discharging, disciplining, or penalizing the employee (or threatening to do so) or refusing to hire an applicant for refusal to disclose this information.

This law only applies to Maryland employers, but employers across the nation should take note as a number of other states are contemplating similar laws. Illinois’ version has already passed both legislative houses and is likely to become law within the next couple of months. California, Delaware, Michigan, Minnesota, New Jersey, New York, Ohio, South Carolina, and Washington are also contemplating their own versions of the law. The federal government has introduced similar bills in both the Senate and House as well.

While the need for such legislation has been debated, there is undeniably a growing trend among the states to introduce legislation prohibiting employers from asking employees or applicants for their social media log-in information. In response, employers should take steps now to account for this growing trend, such as reviewing and revising their internal policies and procedures so that they are consistent with the legislation. Further, employers should communicate these revised policies and procedures across the company so that hiring personnel are aware of what they can and cannot ask of employees or applicants.

We will continue to monitor this arena for developments.

European Mobile Operators Back App Privacy Guidelines

On February 27, 2012, with the support of Europe’s largest mobile operators, the GSMA published a set of global Privacy Design Guidelines for Mobile Application Development. These guidelines come just days after the largest U.S.-based app providers, including Google, Apple, and Amazon, agreed to legally enforceable privacy standards.

Learn more by visiting our sister blog, Global Regulatory Enforcement Law Blog.

Ad industry to honor do-not-track browser header

Amy Mushahwar was featured in ADWEEK article "Regulators and Business to Work on Privacy Bill of Rights: Ad industry to honor do-not-track browser header". The full article can be found here.

Facebook Gets In Your Face

This post was written by Spencer Wein.

Facebook has rolled out a new feature that uses photo recognition technology to suggest friends’ names to tag in uploaded photos. While certainly an impressive feature, the problem is that the social network giant introduced the feature as a default setting rather than as an opt-in option. This has left privacy advocates up in arms. 

Prior to facial recognition technology, Facebook users could manually tag pictures without permission from their friends. Under the new default settings, when a Facebook user adds photos to his or her pages, facial recognition software suggests names of people in the photos to tag based on pictures in which that user’s friends have already been identified. The feature is active by default on existing users’ accounts, though Facebook does explain on its blog how users can disable the function if they don’t want their names to be automatically suggested in their friends’ photographs.

On June 10, Washington-based Electronic Privacy Information Center (“EPIC”) filed a complaint with the Federal Trade Commission regarding Facebook’s new automated tagging feature. EPIC uses strong, Orwellian language, alleging that “[u]sers could not reasonably have known that Facebook would use their photos to build a biometric database in order to implement a facial recognition technology under the control of Facebook.” EPIC further warns that “absent injunctive relief by the Commission, Facebook will likely expand the use of the facial recognition database it has covertly established for purposes over which Facebook users will be able to exercise no meaningful control.” In its request for relief, EPIC urges the FTC to force Facebook to suspend photo-recognition “pending a full investigation, the establishment of stronger privacy standards and a requirement that automated identification, based on user photos, require opt-in consent.”

Facebook, on the other hand, has downplayed the recent complaints, writing on its blog that “Tag Suggestions” had already been widely available in most countries as it had been phased in over several months. “We launched Tag Suggestions to help people add tags of their friends in photos; something that’s currently done more than 100 million times a day,” Facebook further noted in an emailed statement. “Tag suggestions are only made to people when they add new photos to the site, and only friends are suggested.”

By limiting the feature to a user’s friend list, Facebook has attempted to minimize privacy infringement. However, there is no guarantee that Facebook will not eventually extend tag suggestions to its complete user base. A person could conceivably take a picture in a public place and then easily learn a stranger’s identity. At worst, this could help facilitate criminal acts. At a minimum, the technology could create online reputation problems.

Additionally, it is unclear if Facebook will make the technology available to advertisers. Though certainly not an imminent danger, it is possible that the technology will reach a point where advertisers will recognize people in physical spaces and present personally tailored ads. Further, we cannot know how Facebook will respond to subpoenas and government requests for such data. 

Given the technology’s “Big Brother” potential, as well as the backlash that Facebook seemingly experiences with the rollout of each new feature and tool, it is surprising that the company maintains the same modus operandi of making its privacy-related features and tools the “default” setting. It will be both important and interesting to monitor how Facebook and its opponents handle this latest privacy issue.

California Senate Committee Passes Broad Online Privacy Bill

The Judiciary Committee of the California State Senate on Tuesday passed SB761, legislation that would broadly regulate all online data collection and impose a “do not track” regulatory regime for online behavioral advertising.  The bill was approved on a 3-2 vote and now goes to the Senate Appropriations Committee.  We need your help to defeat this proposal.

Senate Bill 761 was introduced on March 24th by Senator Alan Lowenthal.  The bill covers an overly broad range of information collected online, both personally identifiable information and anonymous data such as IP addresses and click-stream data.  It would direct the California Attorney General, in consultation with the California Office of Privacy Protection, to develop a regulatory regime to require companies engaged in online behavioral advertising or other Internet information collection to provide consumers with notice and the ability to opt out of the collection and use of information for those purposes.  Any company that failed to comply with the regulations would be subject to civil penalties of up to $1,000 and possibly punitive damages.  The bill authorizes the Attorney General to impose a requirement that companies provide access to consumers to all of the information collected about them online. 

ANA joined with a broad coalition of state and national industry groups to file comments describing the serious problems with this legislation.  We also filed separate comments with the Judiciary Committee members.  We will now be reaching out to the Appropriations Committee members.

It would be very helpful if your company would also contact the members to express your opposition.  Contact information is available at http://www.sen.ca.gov.

This bill is bad for consumers, bad for the business community and bad for the national economy.  We agree with the bill’s sponsor that consumers are entitled to notice about the collection of data for interest-based advertising and should have the ability to opt out of those practices.  However, we believe this can be best accomplished through strong, effective self-regulation rather than through government regulation.  ANA and other leading marketing associations launched a comprehensive new self-regulatory program for online behavioral advertising last October.  Information about the program is available at http://www.aboutads.info.

Consumers can go to that website today and opt-out of information collection for interest-based advertising.  We have encouraged all of our members to join this program, which covers the entire online ecosystem.  You can access our OBA toolkit here.

By imposing state-specific rules on information practices in a global medium, SB761 raises very serious interstate commerce concerns.  The bill is similar to the approach of legislation introduced in Congress by Congresswoman Jackie Speier (D-12/CA), the “Do Not Track Me Online Act” (H.R. 654).  A government-imposed “Do Not Track” regime is a bad idea at the federal level and an even worse idea at the state level, where a checkerboard of inconsistent state regulation is almost certain to occur.

If you have any questions or comments about the California bill, please contact Dan Jaffe (djaffe@ana.net) or Keith Scarborough (kscarborough@ana.net) in ANA’s Washington, DC office at (202) 296-1883.

McCain-Kerry Bill Introduced

After months of deliberations, Senators John McCain and John Kerry introduced a comprehensive privacy bill entitled the Commercial Privacy Bill of Rights Act of 2011 (the “Act”). Released at a press conference held by McCain and Kerry last week, the bill establishes a baseline framework for the privacy, security, and management of personal information, an issue of growing concern to all businesses and consumers.

We have provided a summary of the bill’s definitions and key provisions (the bill contemplates five FTC rulemakings), all of which might change once the bill is debated in the Senate. To learn more, I encourage you to read our recent client alert.

The Online Behavioral Advertising Debate Heats Up in D.C.

A draft Commerce Department report being reviewed by the White House that recommends the creation of a privacy policy office and passage of legislation that establishes "a baseline privacy framework" was leaked yesterday and is proliferating as we speak (or write). In all, the report makes 10 recommendations and poses dozens of questions on many of the proposals. The department plans to seek formal comment on the questions in a separate Federal Register notice.

The 54-page draft document, entitled "Privacy and Information Innovation: A Dynamic Privacy Framework for the Internet Age," is the work of Commerce’s Internet Policy Task Force. The Task Force held more than six months of consultations, issued a notice of inquiry in April 2010, and held a symposium in May. The document is expected to be released in the coming weeks. The Task Force is a joint effort of the Office of Commerce Secretary Gary Locke, the National Telecommunications and Information Administration, the International Trade Administration, and the National Institute of Standards and Technology.

Recently, the Obama administration created a federal interagency panel to work on privacy and Internet policy. It is chaired by Commerce General Counsel Cameron Kerry and Assistant Attorney General Christopher Schroeder.

The report seeks to demonstrate that a compelling need exists "to provide additional guidance to businesses, to establish a baseline privacy framework to afford protection for consumers, and to clarify the U.S. approach to privacy to our trading partners – all without compromising the current framework’s ability to accommodate new technologies."

However, several industry groups, like broadband industry providers, have staunchly opposed any legislation, recommending in its stead that online privacy protections be pursued through self-regulation, industry standards, and best practices.

The Commerce report said that baseline legislation should be "built on an expanded set of Fair Information Practice Principles (FIPPs). Widespread adoption of comprehensive FIPPs is essential to achieving the goals we have set for the Dynamic Privacy Policy Framework. Widespread adoption of FIPPs would protect privacy interests in data that currently receive little or no statutory privacy protection. Also, given the flexibility inherent in the individual principles, a FIPPs baseline would help ensure consumer privacy protection as new technologies emerge. Finally, the FIPPs-based framework that we envision would allow companies to direct resources to the principles that matter most for protecting privacy in a particular technological, business, or social context. Legislation would authoritatively establish a FIPPs-based framework, but action by industry, civil society, the Executive Branch, and enforcement agencies can also help this framework take hold." It asks whether the Federal Trade Commission should be given authority to impose rules implementing the privacy principles adopted by Congress.

As for other congressional action, the report said that lawmakers "should pass a data breach law for electronic records that includes notification provisions, encourages companies to implement strict data security protocols, and allows states to build upon the law in limited ways. The law should track the effective protections that have emerged from state security breach notification laws and permit enforcement by state authorities." And while it called for "baseline" privacy legislation, the report said that such a measure "should not preempt the strong sectoral laws that already provide important protections to Americans, but rather should act in concert with these protections."

In addition, the document said that "[a]ny federal law or regulation should seek to balance the desire to create uniformity and predictability across state jurisdictions with the desire to permit states the freedom to protect consumers and to regulate new concerns that arise from emerging technologies when federal law lags behind privacy issues created by a rapidly changing technological environment." Among the questions posed is whether state attorneys general should be given the authority to enforce national legislation.

The report also called on the Obama administration to "review the Electronic Communications Privacy Act (ECPA), paying particular attention to assuring strong privacy protection in cloud computing and location-based services. The goal of this effort should be to ensure that, as technology and market conditions change, the ECPA continues to provide a fair balance between individuals' expectations of privacy and the legitimate needs of law enforcement to gather the information it needs to keep us safe."

Regarding the privacy policy office (PPO), the Task Force has suggested that it could either be housed within Commerce or in the Executive Office of the President. The office would not have enforcement authority. As both a convener of diverse stakeholders and a center of Executive Branch privacy policy expertise, the PPO would work with the FTC in leading efforts to develop voluntary but enforceable codes of conduct. Voluntary principles developed through this process would be enforceable by the Federal Trade Commission and would serve as a safe harbor for companies facing complaints about their privacy practices.

We will certainly report on more developments with respect to this topic, as these leaks turn into babbling brooks and streams of information.

Survey on Privacy Law Developments

Although somewhat outside our traditional scope, I came across an excellent PowerPoint that our colleague, Christopher Walina, presented to the Council of Chief Privacy Officers in New York yesterday. We all deal with privacy issues in the workplace, and these materials will hopefully provide our readership with a contemporary and comprehensive state-of-the-law summary.

Enjoy...

Teen COPPA?

The Federal Trade Commission testified that while teens are heavy users of the digital environment and may benefit from using the Internet to socialize with peers, learn about issues that interest them, and express themselves, that same environment can also pose unique challenges for them. The FTC's testimony to the Senate Committee on Commerce, Science, and Transportation, Subcommittee on Consumer Protection, Product Safety, and Insurance notes that the Commission will continue to use law enforcement, education, and policy tools to protect teens in the digital environment.

@SecuredTweets: Twitter settles privacy charges brought by Federal Trade Commission

Today, Twitter and the Federal Trade Commission settled charges that the micro-blogging site had engaged in unfair and deceptive trade practices because of “serious lapses in the company’s data security.” The FTC began an investigation into Twitter after hackers obtained administrative control of the service, accessed tweets that consumers had designated private, and sent out phony tweets (from then-Presidential candidate Barack Obama, Fox News, and others).

In its complaint, the FTC alleged that Twitter was vulnerable to these attacks because it failed to take certain reasonable steps to prevent unauthorized administrative control of its system. Those steps included:

  • Requiring employees to use hard-to-guess administrative passwords that are not used for other programs, websites, or networks
  • Prohibiting employees from storing administrative passwords in plain text within their personal e-mail accounts
  • Suspending or disabling administrative passwords after a reasonable number of unsuccessful login attempts
  • Providing an administrative login webpage that is made known only to authorized persons and is separate from the login page for users
  • Enforcing periodic changes of administrative passwords by, for example, setting them to expire every 90 days
  • Restricting access to administrative controls to employees whose jobs required it
  • Imposing other reasonable restrictions on administrative access, such as by restricting access to specified IP addresses
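The FTC's list maps onto familiar engineering controls. As a rough illustration only (no code appears in the complaint or settlement, and every name and threshold below is hypothetical), two of the steps, suspending an account after repeated failed logins and expiring administrative passwords after 90 days, might be sketched like this:

```python
import time

# Hypothetical sketch of two controls from the FTC's list: lockout after
# repeated failed login attempts, and 90-day administrative password expiry.
MAX_ATTEMPTS = 5                    # lock the account after this many failures
PASSWORD_MAX_AGE = 90 * 24 * 3600   # 90 days, expressed in seconds

class AdminAccount:
    def __init__(self, password_hash, password_set_at):
        self.password_hash = password_hash
        self.password_set_at = password_set_at  # Unix timestamp of last reset
        self.failed_attempts = 0
        self.locked = False

    def password_expired(self, now=None):
        now = time.time() if now is None else now
        return now - self.password_set_at > PASSWORD_MAX_AGE

    def check_login(self, supplied_hash, now=None):
        if self.locked:
            return "locked"
        if self.password_expired(now):
            return "password_expired"   # force a reset before allowing login
        if supplied_hash != self.password_hash:
            self.failed_attempts += 1
            if self.failed_attempts >= MAX_ATTEMPTS:
                self.locked = True      # suspend after repeated failures
            return "denied"
        self.failed_attempts = 0        # reset the counter on success
        return "ok"
```

In practice, checks like these would sit behind a separate, non-public administrative login page, which is another of the steps the FTC enumerated.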

Under the settlement, the FTC will require Twitter to set up a new security program to be assessed by a third party. Twitter will also be prohibited from what the agency described as “misleading consumers about the extent to which it maintains and protects the security, privacy, and confidentiality of nonpublic consumer information, including the measures it takes to prevent unauthorized access to information and honor the privacy choices made by consumers.”

According to the FTC, this marks the 30th case brought as a result of lax security procedures, and the first against a social network.

Why This Matters:  As we have known for some time now, privacy is a hot-button issue at the FTC. To avoid an FTC investigation, you must consider whether your current privacy practices live up to both: (1) what the Commission considers “standard, reasonable” security procedures; and (2) your own privacy policy, which operates as a set of promises to consumers who use your service/patronize your business. If your security procedures fall short of either mark (or worse, both), the FTC could come calling. This raises the question: when was the last time you audited your security and privacy procedures?

What the New Consumer Privacy Bill Means for Data Collection

On Monday, May 10th, 2010, the article "What the New Consumer Privacy Bill Means for Data Collection" appeared on Mobile Marketer, a widely read publication within the mobile marketing and advertising community. The article, written by Adam Snukal, summarizes the proposed privacy legislation introduced in the U.S. House of Representatives last week. If you have any questions about the article or the new bill, please contact Adam Snukal or another attorney within Reed Smith.

Arms' War in Italy: Aggressive Marketers Versus Privacy Watchdog

This post was written by Avv. Felix Hofer, and first appeared in Volume V of the Gala Gazette.

1. In implementing both EU Directive 2002/58/EC of July 12, 2002 (Directive on privacy and electronic communications) and Directive 2000/31/EC of June 8, 2000 (Directive on electronic commerce), the Italian legislature decided that unsolicited commercial communications must always follow a strictly “opt-in” approach. The choice clearly didn’t drive marketers into a state of happiness: they felt that their business was unnecessarily harassed by complex and costly burdens. Initially, therefore, they decided not to pay much attention to the requirements set by the new regulations, and continued with their proven aggressive marketing techniques.

In doing so, they nevertheless underestimated a couple of factors: on one hand, consumers’ reaction (consumers became more and more annoyed by spam and behavioural targeting, and were no longer tolerant of disturbing intrusions into their sphere of personal intimacy); on the other hand, the role of a special authority (the Privacy or Information Commissioner – DPA), in charge in all EU member states of supervising compliance with the key principles of personal data protection (and quick to focus on achieving a correct balance between consumers’ privacy and electronic marketing).
 

Click here to read the full article as published in Volume V of the Gala Gazette.

What, Me Worry? Legal Best Practices for Small Publishers An Informative Webinar Sponsored by the IAB Long Tail Alliance And Presented by Reed Smith

If you haven’t already registered for the Interactive Advertising Bureau’s educational webinar, entitled “What, Me Worry? Legal Best Practices for Small Publishers”, THERE IS STILL TIME! The webinar, which is scheduled for this coming Friday, October 23, 2009 from 12:00 pm – 1:00 pm (Eastern US Time), is being presented by Reed Smith and sponsored by the Long Tail Alliance Program of the Interactive Advertising Bureau (IAB). The webinar will provide an overview of the legal issues and suggested best practices in the following areas:

  • Advertising Compliance
  • Privacy
  • Social Media

There will be a Q&A session at the end, as time permits, and a PDF copy of the presentation will be available on Legal Bytes after the webinar is over.

The webinar is open not only to IAB members and Reed Smith clients, but also to anyone who is interested - on a first-come, first-served basis. So register now. You can get more information and register right here for What, Me Worry? Legal Best Practices for Small Publishers. 

About the Long Tail Alliance Program

The IAB formed the Long Tail Alliance program in summer 2008 to encourage involvement with individuals and small businesses who, powered by interactive advertising, have turned their interests and passions into a media revolution. The Alliance is the beginning of something the IAB envisions as a much larger portrait of American entrepreneurs who are pursuing and achieving the American dream, even as they row hard against strong economic currents. The IAB hopes to expand its Long Tail Membership in order to encourage advocacy, training, and a coming-together of smaller publishers across America as their businesses grow, all while the dynamic of technology and media continues to change.

For more information, click here: http://iamthelongtail.com/707346

About the IAB

The Interactive Advertising Bureau is comprised of more than 375 leading media and technology companies that are responsible for selling 86 percent of online advertising in the United States. On behalf of its members, the IAB is dedicated to the growth of the interactive advertising marketplace, of interactive's share of total marketing spend, and of its members' share of total marketing spend. The IAB educates marketers, agencies, media companies and the wider business community about the value of interactive advertising. Working with its member companies, the IAB evaluates and recommends standards and practices, and fields critical research on interactive advertising. Founded in 1996, the IAB is headquartered in New York City, with a Public Policy office in Washington, D.C.

About Reed Smith

Reed Smith is a global, full-service law firm with nearly 1600 lawyers in 23 offices around the world. Joseph I. (“Joe”) Rosenbaum, a partner in the New York office, chairs the firm’s global Advertising Technology & Media law practice, is the editor and publisher of Legal Bytes, is Corporate Secretary & General Counsel to the IAB, and is an ex-officio member of the IAB Board. Adam Snukal is a senior associate who works with Joe in the Advertising Technology & Media law group and is editor of Adlaw by Request, the gold standard in advertising legal publications in the industry.

Join us for this exciting and timely IAB Long Tail Alliance webinar presented by Reed Smith. We look forward to your participation.

Ten Data Security Questions Faced by Every Company

This post was written by Paul Bond.

When emergencies hit, Reed Smith's clients routinely call upon the Firm's Privacy, Security, and Management Group. We've dealt with everything from lost laptops to international hacking, database thefts by employees to pharmacy dumpster diving. On the litigation side, we have defended more than sixty (60) consumer class actions arising from privacy incidents, along with significant government relations and insurance recovery work. We are assisting a wide variety of clients with emerging challenges to what personal information they capture and utilize, both in connection with marketing as well as for day-to-day business operations.

There is no perfect privacy compliance program. However, in an article recently published by The Privacy & Data Security Law Journal, Paul Bond has presented a series of "Ten Data Security Questions Faced by Every Company." A comprehensive approach to privacy compliance will address each of these questions, and reduce the incidence and likely severity of privacy events going forward.

If you have questions about this article, feel free to contact Paul directly, the Group's head Mark Melodia, or the Reed Smith attorney with whom you regularly work.

Reed Smith DC Office Hosting FCBA Privacy/Data Security & Legislative Committees Meeting Tomorrow

Reed Smith will host the next brown bag lunch meeting of the Federal Communications Bar Association’s joint Privacy/Data Security and Legislative Committees. The meeting will be held tomorrow, October 13, 2009, from 12:00 noon to 2:00 p.m. at Reed Smith’s Washington, D.C. offices (1301 K Street, NW, Suite 1100 East Tower). The Committees will discuss the legislative priorities for the 111th Congress with special emphasis on behavioral marketing and data security legislation. The following speakers are confirmed to date:

  • Amy Levine, Legislative Counsel to Congressman Rick Boucher; and
  • Paul Cancienne, Legislative Aide to Congresswoman Mary Bono Mack.

We also have invited staff from the U.S. Senate. It's not too late if you'd still like to attend -- please RSVP to Desiree Logan at dlogan@reedsmith.com.

Don't Worry, Be Informed!

On Friday, October 23, 2009, from 12 – 1 p.m. (Eastern US Time), Joseph I. (“Joe”) Rosenbaum, Partner at Reed Smith and General Counsel of the IAB, assisted by Adam Snukal, Senior Associate at Reed Smith, will be presenting an educational webinar, sponsored by the Long Tail Alliance Program of the Interactive Advertising Bureau (IAB), entitled: What Me Worry? Legal Best Practices for Small Publishers.

During the webinar, Joe and Adam will be discussing and fielding questions in each of the following areas:

Trademarks: Buying someone else’s key words? Displaying advertising? Sponsoring or hosting contests, sweepstakes, co-branded promotions? Using social media or virtual worlds? Trademarks are everywhere. When should you worry?

Compliance: What’s new at the FTC and FCC? Industry groups want self-regulation. Privacy and consumer advocacy groups want more regulation. Congress is poised to “do something.” What you need to know about marketing to children and adults, and about compliance with sectoral advertising regulations, from finance and health care to product safety. Plus the new FTC guidelines for Endorsements and Testimonials, and the slow death of diet commercials.

Privacy: Behavioral targeting has everyone up in arms. What should a small publisher do if she feels her privacy policy has been violated?

Social Media: Blogs, splogs and vlogs. Virtual worlds, avatars and pseudonyms. Profiles and networks, friends and fans. Testimonials and endorsements – from celebrities to consumers, paid and unpaid. Buzz, viral and word of mouth. Defamation, libel, copyright and personalized URLs. Sound confusing? It is. But ignorance won’t insulate you from liability. Don’t want to become a regulatory target? What you should know.

Q&A: IAB and Reed Smith to answer questions from participants.

The webinar is open to IAB members, to Reed Smith clients, and to the general public on a first-come, first-served basis. Register now. You can get more information and register right here for What, Me Worry? Legal Best Practices for Small Publishers.

About the Long Tail Alliance Program

The IAB formed the Long Tail Alliance program in summer 2008 to encourage involvement with individuals and small businesses who, powered by interactive advertising, have turned their interests and passions into a media revolution. The Alliance is the beginning of something the IAB envisions as a much larger portrait of American entrepreneurs who are pursuing and achieving the American dream, even as they row hard against strong economic currents. The IAB hopes to expand its Long Tail Membership in order to encourage advocacy, training, and a coming-together of smaller publishers across America as their businesses grow, all while the dynamic of technology and media continues to change.

For more information, click here.

About the IAB

The Interactive Advertising Bureau is comprised of more than 375 leading media and technology companies that are responsible for selling 86 percent of online advertising in the United States. On behalf of its members, the IAB is dedicated to the growth of the interactive advertising marketplace, of interactive's share of total marketing spend, and of its members' share of total marketing spend. The IAB educates marketers, agencies, media companies and the wider business community about the value of interactive advertising. Working with its member companies, the IAB evaluates and recommends standards and practices, and fields critical research on interactive advertising. Founded in 1996, the IAB is headquartered in New York City, with a Public Policy office in Washington, D.C.

About Reed Smith

Reed Smith is a global, full-service law firm with nearly 1600 lawyers in 23 offices around the world. Joseph I. (“Joe”) Rosenbaum, a partner in the New York office, chairs the firm’s global Advertising Technology & Media law practice; is the editor and publisher of Legal Bytes; is Corporate Secretary & General Counsel to the IAB; and is an ex-officio member of the IAB Board. Adam Snukal is a Senior Associate who works with Joe in the Advertising Technology & Media law group and is editor of Adlaw by Request, the gold standard in advertising legal publications in the industry.

Join us for this exciting and timely IAB Long Tail Alliance webinar presented by Reed Smith. We look forward to your participation.

Maine Children's Privacy Law Update

This post was written by Dan Jaffe.

The business community has won an important victory in a lawsuit challenging a Maine law that severely restricts the collection, transfer and use of “personal information” or “health-related information” from minors.  The Maine Attorney General has publicly committed not to enforce the law, which was scheduled to take effect on September 12th.  Although the federal court stopped short of granting a preliminary injunction, it sent a clear message that any private cause of action under the new law could suffer from “constitutional infirmities.”  We are very hopeful that this will give the business community an opportunity to work with the Attorney General, the bill’s sponsor and others in the Maine Legislature to resolve the serious defects with the legislation.

On August 26th, a lawsuit was filed in federal court in Maine by the Maine Independent Colleges Association, the Maine Press Association, NetChoice and Reed Elsevier challenging the Maine “Act to Prevent Predatory Marketing Practices Against Minors.”  The lawsuit argues that the law is unconstitutional on both First Amendment and dormant commerce clause grounds and is preempted by the federal Children’s Online Privacy Protection Act (COPPA).

After hearing arguments yesterday on the motion for a preliminary injunction against the Act, the federal court found that the Plaintiffs had “met their burden of establishing a likelihood of success on the merits of their claims that Chapter 230 is overbroad and violates the First Amendment.”  The court’s order specifically noted that the Attorney General has publicly acknowledged First Amendment concerns and has committed to not enforce the Act.  In addition, the order put potential third parties on notice that any private cause of action under the Act could suffer from “the same constitutional infirmities.”  We are very hopeful that this will discourage any such private lawsuits.  With these strong findings of the court, the parties agreed to dismiss the lawsuit without prejudice, allowing the parties to relitigate if some third party tries to enforce the law. 

ANA has provided financial support for the lawsuit and we are pleased with this result.  Also, there has been a commitment to revisit and consider carefully revising the law when the Maine Legislature reconvenes this January.

If you have any questions about the Maine lawsuit, please contact Dan Jaffe or Keith Scarborough in ANA’s Washington, DC office at (202) 296-1883.

"No Credible Risk of Enforcement" - Opponents of Maine Privacy Law Await Decision

The lawsuit filed in Maine to stay enforcement of a Maine privacy law targeting minors received a hearing today in federal district court. The Maine attorney general argued that the motion for a preliminary injunction should be denied and that the case should be dismissed. MediaPost reports that Attorney General Janet Mills, having already stated that she will not enforce the law, sought dismissal of the case on the grounds that "It is well-established that a federal court has no jurisdiction over a challenge to a state statute when there is no credible risk of enforcement." Even though the plaintiffs in the case fear that the private right of action in the statute (which becomes effective Sept. 12, 2009) could bring an avalanche of lawsuits, the Maine AG contends that those lawsuits are hypothetical. She states in her papers, "Essentially, the courts do not require state officials to defend against theoretical lawsuits that might be brought by private parties against private parties." The judge in the case, the Hon. John A. Woodcock, did not rule from the bench at today’s hearing. He indicated that he would have a ruling no later than Friday (Sept. 11, 2009). Stay tuned. . . .

Maine AG Supports Stay on Privacy Law Targeted at Minors

The news from the front is that progress is being made toward staying enforcement of the Maine privacy law targeting minors. The law, which contains a private right of action, has caused many to avoid Maine in their promotional plans for the fall and to adjust their data collection practices.

Background

The new Maine privacy law targeted at minors suffers from serious constitutional flaws. 

Under the new Maine law, which is scheduled to take effect Sept. 12, 2009, an entity may not collect, receive or use personal or health-related information from a minor for marketing purposes without first obtaining “verifiable parental consent.” To obtain such consent, the entity must undertake a “reasonable effort, taking into consideration available technology” to notify the parent and obtain parental consent. Any such notice must describe the entity’s practices regarding the collection, use, and disclosure of the information, and the consent provided must authorize such practices before any information may be collected, received or used.

Maine is following the lead of other states that have tried to expand the federal Children’s Online Privacy Protection Act (“COPPA”) to address adolescents between 13 and 17 years of age and their use of social networking websites. Like COPPA extension proposals in New Jersey (extending COPPA to cover the 13-17 age range) and Illinois (applying COPPA to most social networking sites), the Maine law tries to build on COPPA’s "verifiable parental consent" requirement for the 13-17 age range.

But, the Maine law addresses the following additional items:

  • Online & Offline Information Collection: The Maine law applies to all collection, receipt or use of information from a minor, whether online or offline, whereas COPPA only applies to online activities.
  • Personal Information: Although both COPPA and the Maine law define “personal information” generically as any “individually identifiable information,” the examples provided in the Maine law are less focused on the online collection of information than COPPA.
  • Health Related Information: The Maine law applies to the collection and use of both “personal information” and “health-related information,” whereas COPPA only applies to personal information. 

This statute could greatly complicate children’s marketing compliance, because it will create a marketing environment in Maine that is inconsistent with COPPA. Because the Maine legislature will not be in session until Jan. 6, 2010, and there have been no rumors of a special legislative session before September, the industry has been busy seeking a way to stay enforcement of the law. Among the bases for challenge that could forestall enforcement of the law might be:

  • Statutory Preemption: Section 1303(d) of COPPA preempts state or local government laws that are inconsistent with COPPA. The legislative history of COPPA reveals Congressional findings that: (1) adolescents over the age of 13 have privacy rights and a greater understanding of commercial content, and (2) a national uniform standard was necessary because of the global distribution of the Internet. With this knowledge, Congress chose to regulate only the online collection of information from children younger than 13, and included this preemption provision to specifically guard against a patchwork of inconsistent regulation.
  • Dormant Commerce Clause: Under Pike v. Bruce Church, 397 U.S. 137 (1970), if “the burden imposed . . . is clearly excessive in relation to the putative local benefit, and if the local interest can be promoted by other regulations that have a lesser impact on interstate activities,” the court may strike down a state law that burdens interstate commerce. Courts have invalidated a number of Internet-related state laws (regarding matters such as obscenity and SPAM regulation) on these grounds. In this case, the Maine law would be excessive because it forces out-of-state websites to treat Maine users differently – or to treat all Internet users as if they were located in Maine. Further, the interest of protecting children’s activities online is already addressed in COPPA, a uniform federal statute that has less impact on interstate commerce.
  • First Amendment Commercial Speech: Under Central Hudson Gas v. Public Service Commission, 447 U.S. 557 (1980), commercial speech that is not illegal or deceptive is afforded First Amendment protection. Courts may overturn statutes where the government does not demonstrate that its regulation: (1) directly advances a substantial government interest, and (2) is no more restrictive of speech than necessary. In this case, the Maine statute is overbroad and would not directly advance the government’s interest of protecting children’s activities online – the statute pertains to any collection of a youth’s information whether online or offline. Likewise, advertisers could find less restrictive and less comprehensive approaches to deter the perceived harm. For example, a parent could monitor his child’s computer use, and prevent the child from providing personal information. Or, parents could purchase “Net Nanny” software, which has settings to prevent personal information disclosure. Both of these solutions require no regulation at all.
  • Higher Value First Amendment Concerns: This statute has the potential to raise issues justifying a higher level of judicial scrutiny. For example, if government regulation could cause a chilling effect on any form of speech or regulate political speech, courts generally afford the speech strict scrutiny. In this case, it is not out of the realm of reasonableness to assume that some website operators could avoid information collection to the 14- to 17-year-old age group altogether, chilling all forms of youth marketing. Or, for political speech matters, groups like the Young Democrats or the Young Republicans may want to avoid collecting youth information as well, because much political activity could be viewed as marketing (i.e., party donation solicitations and memorabilia sales e-mails). 

The news from the front is that the AG of Maine understands and supports the stay. At least we know for sure that the AG will not be bringing any actions under this law until the legislature revises it. It is critical that a stay be put in place to ensure that the industry is not inundated with nuisance private lawsuits for violation of the law. On the whole, things are moving in the right direction.

We will, of course, be following this carefully. Please call if you have any questions.

Secrecy and Blogging - When the Two Don't Mix

Has blogging made critics out of us all? Maybe so, but we still have to watch what we say as illustrated in a recent New York case, Cohen v. Google/Blogger.com. Fashion model Liskula Cohen filed suit demanding that Google disclose the name of an anonymous blogger (who we now know was Rosemary Port) who created and operated the blog now infamously known as “Skanks in NYC.” Cohen alleged that Port posted sexually suggestive pictures featuring her, together with derogatory comments about her — labeling her as, among other things, “skank,” “ho” and “whoring.” Google refused to reveal the blogger’s IP address, citing its policies on protecting the privacy of bloggers. 

In New York, the elements for a cause of action for defamation “are a false statement, published without privilege or authorization to a third-party, constituting fault as judged by, at a minimum, a negligence standard, and, it must either cause special harm or constitute defamation per se.”   Cohen argued that because Port had posted what was, essentially, “per se” defamatory content about her, the blogger’s identity had to be disclosed so that she could pursue a viable claim for defamation. Port filed papers on an anonymous basis in response to the petition, claiming that the statements were “non-actionable opinion and/or hyperbole,” and further argued that even if the words were capable of defamatory meaning, “the context here negates any impression that a verifiable factual assertion was intended” since blogs “have evolved as the modern day soapbox for one’s personal opinions,” by “providing an excessively popular medium not only for conveying ideas, but also for mere venting purposes, affording the less outspoken, a protected forum for voicing gripes, leveling invective, and ranting about anything at all.” While this pleading is certainly an accurate description of how blogs are frequently used by bloggers, the court was not persuaded that Port’s identity should be protected from disclosure because her statements were “reasonably susceptible of a defamatory connotation and are actionable.” The court held that Cohen was entitled to an order directing Google to disclose information as to the identity of the blogger. 

It turns out that Port is an acquaintance of Cohen who frequently attended the same social functions as she did. With Port's identity discovered, it has been reported that Cohen will file a defamation suit against her. In turn, Port has indicated that she will sue Google for $15 million for failure to protect her privacy, claiming that Google “breached its fiduciary duty to protect her expectation of anonymity.” While the merits of Cohen’s claim against Port and Port’s claim against Google are yet to be determined, the lesson for every blogger is that he or she may not hide behind a mask of anonymity with respect to blogs that may cross legal lines and create causes of action, such as defamation. 

Companies also need to be vigilant in connection with the development of policies around blogging by employees or agents. Imagine the scenario where an employee of a consumer products manufacturer posts malicious statements on a consumer opinion blog regarding a competing product. The rival company petitions for the identity of the blogger, and ultimately discovers that the blogger is an employee of a competitor. In this situation, not only the individual, but potentially the company as well, may be subject to a claim by the rival company for unfair competition, or for certain other Lanham Act or Communications Decency Act claims.

Tread carefully with your blogging, or you might get a flogging. (Sorry, we couldn’t resist).

FTC Regulators Take New Approach to Online Advertising and Consumer Protection

This post was written by Rachel Rubin.

Website users have grown accustomed to the quid pro quo of Internet use and advertising: we browse websites, and those same websites collect personal data or browsing habits that are used to generate targeted advertising. But how far is too far in terms of data collection? Is our current system of consumer privacy protection a functional one, or one that falls short of adequately protecting the individual and his/her personal information and data?

According to the Federal Trade Commission’s new chief of the Bureau of Consumer Protection, David C. Vladeck, the answer is the latter, and our system gets a failing grade. The FTC has expressed both distrust and displeasure with the current standard practice of online disclosure statements and one-click, cookie-cutter privacy statements that consumers rarely read or understand, as neither may be enough to protect consumers from increasingly invasive Internet tracking practices and technologies. Also, the FTC sees this issue as having a consumer dignity interest at stake, not merely consumer economic interests. The New York Times and the Wall Street Journal recently reported that Mr. Vladeck will be scrutinizing online advertising and consumer privacy issues closely. Within his first few days on the job, Mr. Vladeck announced that one of his “major goals” was “rethinking” the FTC’s approach to consumer privacy issues. 

As yet, Mr. Vladeck has not articulated what these changes will be, but he told The New York Times that he is not committed to “imposing regulation.” In his first few weeks in office, Mr. Vladeck has been working with companies, public interest groups, and academics to evaluate the current rules and to suggest new ways to better protect consumer privacy. According to the Wall Street Journal, “the goal [is to have] new privacy guidelines in place by next summer.” 

These changes are part of the FTC’s move toward closer evaluation of online advertising practices, which included the June settlement of a case with Sears Holdings Management Corp. In that case, Sears invited customers to download “research” software onto their computers that allowed the company to track their online browsing; in return, customers were paid $10. The FTC found that the software also tracked consumers’ bank statements and prescription records, which some consumers did not realize despite a lengthy privacy policy. The company was required to stop the program and to destroy the information it had collected. Mr. Vladeck told The New York Times that the FTC was not just interested in protecting consumers from economic harm, but also in protecting consumers’ “dignity interest[s] wrapped up in having somebody looking at your financial records when they have no business doing that.”  

What consumer data is collected, and how, has been a hot issue in recent months, with the release of a report from the FTC on its online behavioral advertising principles, followed shortly thereafter by self-regulation guidelines from industry groups. 

Why This Matters

Some industry groups fear that stricter regulations could harm their business models. It is clear that the FTC will expect more transparency from companies, but Mr. Vladeck’s approach seems to be a collaborative one so far, and advertisers and industry groups should take advantage of this opportunity. As a practical matter, advertisers should be vigilant in adhering to the consumer privacy and consumer information protection guidelines already in place, and should stay abreast of any and all developments in this area. Advertisers should also evaluate the programs and policies they use to protect the consumer information they collect, as well as alternative means of communicating to consumers the extent and use of the personal data collected.

Ninth Circuit CDA Decision

In what is likely to be seen as a watershed moment for the application of the Communications Decency Act of 1996 (the "CDA"), the Ninth Circuit Court of Appeals has released an opinion in Barnes v. Yahoo that has the potential to dramatically increase the cost of defending social media and computer service providers.

The Barnes case centered on the posting of defamatory “fake” profiles on Yahoo’s social networking pages. The profiles, which appeared to be from Ms. Barnes but were in fact created by her ex-boyfriend, included several pictures of her in the nude. Ms. Barnes asked Yahoo to remove the profiles, but Yahoo took no action until local media ran a story on the events, prompting Yahoo to promise to remove the fake profiles. Two months later, the profiles still appeared on the Internet, and Ms. Barnes sued Yahoo.

Yahoo moved to dismiss based on the immunity provided to it by the CDA. The motion was granted, and Ms. Barnes appealed to the Ninth Circuit. In remanding the case to the District Court, the Ninth Circuit did two things that could prove problematic for the future of the CDA.

First, it held that a promissory estoppel-like claim can survive CDA immunity (at least at the motion to dismiss stage). At its core, a promissory estoppel claim requires someone to make a promise, and someone to rely upon that promise to his/her detriment. The court explained that Yahoo could be seen as having made a promise to Ms. Barnes, as part of its privacy policy and terms of service, and reiterated through local media, that it would take down profiles such as the one at issue. The making of a promise would be an activity that would fall outside of the CDA's scope. Thus, a promissory estoppel claim can survive a CDA-based motion to dismiss.

The second, and potentially more problematic, result of this decision is the treatment of the CDA as an affirmative defense rather than a source of immunity from suit. Although this may seem like a small detail, the proverbial devil is in the details. If the CDA is a source of immunity from suit, it supports a motion to dismiss for failure to state a claim (a Rule 12(b)(6) motion), which must be decided before an answer is filed and before discovery opens. An affirmative defense, on the other hand, is raised by a motion for judgment on the pleadings, and for that type of motion the defendant must first file an answer asserting the defense. The filing of an answer is where things go awry: once an answer is filed, the court can open discovery, and an overly cautious judge might allow discovery to proceed before ruling on the motion for judgment on the pleadings. Given that discovery can be expensive and time-consuming, it is not difficult to imagine that the potential costs of invoking CDA immunity have greatly increased.

Why This Matters: This case should be of great interest to purveyors of social media and those who seek to tap into the power of social networks. Not only does it provide a wake-up call as to the consequences of statements made in privacy and terms-of-service policies, but it also suggests a way to avoid future promissory estoppel-like claims. Promissory estoppel requires a promise and reasonable reliance – if it is unreasonable to rely on the promise, then the estoppel claim may fail. It is possible that artful drafting of a terms-of-service document can make this kind of reliance unreasonable, and social media and other interactive website purveyors should consider whether their privacy policies need revision of this type.

Beyond the need to revisit one’s policies, the case is also noteworthy for its shift in interpretation of the CDA. If the CDA is properly an affirmative defense rather than the basis for immunity from suit, then the potential cost of invoking the CDA’s protections may rise significantly.

Italy: The Use of a Person's Image

This post was written by Avv. Felix Hofer.

1. The general principle is that the use of a person's image without his/her consent is prohibited, all the more so when such use is for marketing or, more generally, commercial purposes. In Italy, the right to a person's image is governed both by the Civil Code and by the Intellectual Property Act (Law no. 633 of April 22, 1941, as subsequently amended and supplemented).

(i) According to Section 10 of the Italian Civil Code, the image of a person, or of his/her parent, spouse, or child, can be exhibited or published only if such use is explicitly permitted by law, and provided that the use does not prejudice the dignity or reputation of the person represented.

Should abuse occur (save for cases in which the use amounts to a criminal offense), a local (civil) court can order that the abuse cease and can award damages.

(ii) The local Intellectual Property Act contains additional provisions on the use of a person's image.

(ii.a.) A person's image may not be exhibited, reproduced or put on sale without his/her consent (Section 96 of Law no. 633 of April 22, 1941).
(ii.b.) Exceptions to this basic provision (i.e., use without consent) are allowed where the reproduction of a person's image is justified by his/her notoriety (see Section 97) and by a public interest (e.g., for purposes of informing the general public).
(ii.c.) Finally, a person's image may not be exhibited or put on sale if such use prejudices the represented person's honour, dignity or reputation (Section 97 of the Intellectual Property Act). 


Facebook Makes a U-Turn

On Feb. 4, 2009, Facebook decided to change (aka “update”) its Terms of Use Policy. The new policy essentially gave Facebook the right to continue using a user’s data even after he/she left the service. The following is an excerpt from the amended Terms of Use Policy:

You hereby grant Facebook an irrevocable, perpetual, non-exclusive, transferable, fully paid, worldwide license (with the right to sublicense) to (a) use, copy, publish, stream, store, retain, publicly perform or display, transmit, scan, reformat, modify, edit, frame, translate, excerpt, adapt, create derivative works and distribute (through multiple tiers), any User Content you (i) Post on or in connection with the Facebook Service or the promotion thereof subject only to your privacy settings or (ii) enable a user to Post, including by offering a Share Link on your website and (b) to use your name, likeness and image for any purpose, including commercial or advertising, each of (a) and (b) on or in connection with the Facebook Service or the promotion thereof.

The change that caused the uproar, however, was the deletion of the following, which appeared at the end of the aforementioned section: “You may remove your User Content from the Site at any time. If you choose to remove your User Content, the license granted above will automatically expire, however you acknowledge that the Company may retain archived copies of your User Content.”

Interestingly, Facebook’s amended policy went largely unnoticed until the popular consumer rights advocacy site, Consumerist.com, brought these changes to light.

This has sparked a very interesting debate on data ownership, and one that Facebook for now has decided to avoid as it backed down last week and reverted to its previous Terms of Use. According to Mark Zuckerberg, founder and CEO of Facebook, “Going forward, we’ve decided to take a new approach towards developing our terms. We concluded that returning to our previous terms was the right thing for now.”

While the arguments supporting why a user should have the right to control his/her data and information are both persuasive and intuitive, one must also consider the “reality” of the situation. For example, Facebook currently boasts a user base of approximately 175 million users around the world. Without first-hand knowledge of Facebook’s IT policies and protocols, one can presume that a user’s data is stored across multiple networks and servers that are backed up regularly. Is it even possible for Facebook to delete all of a user’s data when he/she leaves Facebook? Is it reasonable to demand that Facebook undertake a search-and-destroy mission for each departing user by deleting his/her data from each and every server that ever touched such data (including each back-up server), and then scrub the same servers to ensure that the deleted data can never be recovered? Moreover, if a user elects to leave the service without deleting his/her information, should Facebook then be required to do so?

Furthermore, social networking sites like Facebook are designed for data sharing between users—hence the term “social network.” Is it reasonable to expect Facebook to comb through millions of user pages to hunt down data that must be deleted and purged when a user leaves the service? Perhaps the changes reflected above were merely intended to address rights-clearance issues and to protect and insulate Facebook against claims from old users.

Whichever position one wishes to take in this debate, two points are certain: one, the reaction to Facebook’s changes to its Terms of Use reflects a much wider issue about user data, who owns the personal information, and what should happen to it if a user decides to leave a service; and two, the industry will be keen to see what Facebook decides to do next.

What Do We Have to Look Forward to in 2009?

It’s a new year, and change is in the air. Although the holidays are over, some groups in Washington are hanging on to their wish lists with the hopes that President Obama will grant their desires.

Over the past few months, Obama has sent agency review teams into dozens of government offices, ranging from the Pentagon to the EPA to the FTC. These teams are dissecting agency initiatives, poring over budgets and reviewing functionality. Many lobbying groups see this time of transition as a prime opportunity to achieve desired changes by gaining the ear of the new administration.

In fact, in December, leading privacy and consumer groups met with leaders of the FTC review team to spread the message that the FTC has allowed industries to self-regulate online privacy practices – to the detriment of consumers – for far too long. Privacy groups are not alone in their concern. Obama himself said during his campaign that “[d]ramatic increases in computing power, decreases in storage costs and huge flows of information that characterize the digital age bring enormous benefits, but also create risk of abuse. We need sensible safeguards that protect privacy in this dynamic new world.” He committed to “strengthen the privacy protections for the digital age and to harness the power of technology to hold government and business accountable for violations of personal privacy.”

During their meeting with the FTC agency review team, privacy groups stressed a need for better (more?) regulation of targeted online marketing, oversight in the data broker industry, and privacy policies for medical information, just to name a few. Susan Grant, director of consumer protection at the Consumer Federation, called the Network Advertising Initiative’s behavioral advertising self-regulatory code of conduct “deceptive on its face,” and called for the FTC to establish a “Do Not Track” registry, similar to the popular “Do Not Call” registry for telemarketing. In support of increased oversight of data brokers, Beth Givens of the Privacy Rights Clearinghouse cited numerous complaints from consumers about use of their personally identifiable information by companies in violation of stated privacy policies.

In addition to Obama taking office, a Democratic shift in Congress has the potential to lead to increased regulation. In fact, two senators (Markey (D-Mass.) and Dorgan (D-N.D.)) have already expressed an interest in introducing Internet privacy legislation that would likely outlaw behavioral targeting, cookies and “deep packet inspection.” In addition, a bill currently pending in Congress would expand and enhance the authority of the FTC, possibly increasing the volume of FTC litigation.

What does this mean?

Online privacy issues are just the tip of the iceberg. The combination of the financial crisis (which many blame on self-regulation), and a new Democratic administration and Congress in Washington, will likely lead to both increased regulatory action and legislation in several areas affecting advertising and marketing, including:

  • Increased scrutiny on mergers and acquisitions (note the recent demise of the proposed Google-Yahoo advertising deal)
  • Stronger antitrust enforcement
  • Sweeping Internet privacy legislation and an end to self-regulation
  • A ban on advertising food to children: This is a hot topic, and politicians will likely look to the regulation of tobacco advertising as a basis for such a ban
  • An end to drug companies’ direct-to-consumer advertising: The United States is one of only two countries in the world that allow direct-to-consumer prescription drug advertising. Many politicians feel that it adds to the cost of medicine and health care
  • An end to the corporate tax deduction for advertising: You can expect the government to be looking at any and every way possible to generate tax dollars without raising the income tax