Although not directly linked to litigation, information management planning should be performed by all organizations, and ideally it should be an ongoing effort—it is the most efficient way to prevent pornography from having an impact on the e-discovery process. It is a good idea to develop policies that address the use of personal computers and other devices at work as well as the use of work computers and devices at home. When creating or modifying your own policy, include provisions to deal specifically with pornography. Consider implementing tools such as site blockers or filters, and draft clear corporate guidelines to protect the company from future embarrassment or, worse, spoliation sanctions when litigation arises.
Although there certainly is plenty of inappropriate material that gets into business systems through employee misbehavior, it’s also important to realize that much adult content is present because many email spam filters are insufficient. Spam is not the only source of porn in the workplace, but it’s a very common source of the porn that infiltrates ESI, and it highlights one of the trickier issues that organizations face when developing policies to deal with porn: Much of it is consumed inadvertently. In fact, when litigation arises and collections begin, some organizations may end up relaxing their stated policies to help alleviate employees’ fear they may be held responsible for images or other material they did not actively solicit.
Whether or not you have a formal information policy in place, you need to be able to effectively identify ESI to preserve potential evidence. Inside counsel and outside counsel, as well as corporate chief information officers, are now being held to very high standards with regard to data preservation, and key custodians can also be held personally responsible for data they control. The goal of the identification stage of e-discovery is to determine, preferably in advance of litigation, who is likely to have relevant data, what types of data exist, and where the data reside so that you can effectively manage preservation efforts. As potential sources of relevant information are identified and located, keep in mind that it is quite possible that some of your key custodians’ relevant data sources may include pornography, such as emails with embedded images or attachments, social media entries, and text messages. And the location of such material will probably not be confined to workplace computers or laptops. Adult content may reside on your network servers, third-party sites, cloud storage, as well as smartphones and other devices—any device that is used to check email or perform other work-related activities. Consider including a clause in your policies to account for these data sources.
Establishing and enforcing strict IT policies dealing with adult content, along with using tools that actively scan and remove pornographic materials from company systems, will go a long way toward minimizing the amount of pornography on employees’ computers and devices, thus reducing the costs and risks associated with preserving pornographic ESI once litigation holds take effect. No matter how many precautions you take, however, it is likely some porn will remain in ESI, and some custodians may be apprehensive. What can be done about this? It may be wise not to tip off relevant custodians that they have been “identified,” lest they take it upon themselves to “cleanse” their data stores of porn—and, in the process, spoliate potential evidence.
During the preservation stage, legal teams take steps to ensure that all potentially relevant evidence is properly retained and protected from inadvertent or deliberate destruction or deletion. Whatever your organization’s normal business policy for data destruction, when there is an event that causes reasonable anticipation of litigation, a legal hold temporarily suspending that policy must be issued immediately. Proper preservation protocols reduce or prevent the loss or destruction of evidence and possible sanctions for preservation failures.
It is especially important to instruct employees not to make a hasty attempt to erase inappropriate or “personal” material, because they could inadvertently delete relevant ESI in the process. It simply is not worth the spoliation sanction risk. Remember, the requester is not likely to be looking for X-rated content. Also, deletions are not necessarily going to expunge the offensive material permanently. Nevertheless, a litigation hold may incite fear-driven “delete-o-thons” by employees attempting to hide workplace pornography; people who wouldn’t dream of shredding paper files will hit the “delete” button without a moment’s hesitation to wipe traces of adult content before a collection. Employees need to be advised that the potential for spoliation and sanctions presents a risk of a much higher magnitude than the risk that pornographic material may be discovered. If the fear of a delete-fest is high, you might have to collect to preserve, with the shortest possible gap in time between the issuance of the “litigation hold” and collection.
Another possible approach may be to announce a temporary suspension of HR enforcement of “permissible use” policies with regard to the data in question to give people reassurance and decrease the likelihood that they will delete material in a panic. Whatever you decide, it’s wise to resist the common assumption that the presence of pornography in ESI is a problem confined to low-level employees. It can reach across the entire organization, including top executives.
During collection—the acquisition of potentially relevant ESI—data are retrieved from sources such as computers, cell phones, “smart” devices, and servers controlled by you or hosted service providers. The ESI may include emails, spreadsheets, Word documents, presentations, digital photographs, and so on. ESI should be collected in a manner that is documented and legally defensible. Some organizations self-collect while others opt to use an outside service provider to collect. The decision of whether to use a vendor generally turns on the purpose of collection (internal investigation or litigation) and who will be the recipient (internal, government, or opposing counsel). Regardless of who does the collection, there can be pushback from custodians who don’t want to cooperate. Sometimes it stems from a belief that the ESI is not actually needed, sometimes it is due to a perceived need to protect confidential information, and sometimes it is because there is porn on the machine and they don’t want it collected.
At a recent e-discovery conference, during a session on data mapping, a general counsel from the audience commented that his company executives actually refused to turn over work computers during collections because they wanted a chance to wipe hard drives first to remove some “inappropriate” material before giving up their machines. This raises a legitimate question: If porn is not relevant in civil litigation, why does it have to be collected? The answer is that a quick deletion of porn can inadvertently delete relevant discoverable evidence, too. Again, the requesting party is generally not seeking adult content, and often deletions do not really erase the pornographic content. It’s also important to emphasize to custodians that, in all likelihood, the next phase of the EDRM will probably remove the material they may be trying so hard to hide—and it may even happen automatically.
All of which leads to another question: What is going to happen to all the porn on the devices that were just collected?
Once the ESI is collected and before it moves on to review, it is processed to winnow down the volume of data that goes to attorney review. During this data reduction, the porn will likely be identified as nonresponsive and removed from the document universe that moves on to be reviewed. If you use an external service provider for processing, the vendor may encounter some images or video files, but in most cases, images and video files will be given little attention because they are not likely to be the focus of searching and not what is being sought in most civil litigation matters.
Again, the best practice to reduce costs (porn files, especially video, are often huge and can add up when paying by the gigabyte) and minimize the risk that reviewers will ever see the porn is to have strict policies in place that govern Internet access and to document any offenses. In addition to implementing such a policy (or even without one), corporations and their counsel can regard the processing phase of the EDRM as a “second chance” opportunity to limit the likelihood that reviewers will ever see porn in their ESI. Generally, porn consists of image files and video files, and most relevant evidence in the majority of cases will be in written form. In many instances, culling by file type will eliminate most of the offensive material. Even if there are legitimate reasons to search for graphic files, there still may be opportunities to limit exposure to porn. For example, if you know that you are looking for a two-minute HR video, you can probably estimate the file size and eliminate video files that don’t meet that criterion.
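The culling steps described above—dropping media files outright when the evidence sought is written, or narrowing video files to a plausible size window—can be sketched in a few lines. This is an illustrative sketch only: the `CollectedFile` record, the extension list, and the size window are assumptions for the example, not features of any particular e-discovery platform, which would apply similar filters through its own interface.

```python
from dataclasses import dataclass

@dataclass
class CollectedFile:
    """Hypothetical metadata record for a collected file."""
    name: str
    size_mb: float

# Extensions commonly excluded when relevant evidence is text-based
# (an assumed list; tune it to the matter at hand).
MEDIA_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif",
                    ".mp4", ".avi", ".mov", ".wmv")
VIDEO_EXTENSIONS = (".mp4", ".avi", ".mov", ".wmv")

def cull_by_type(files):
    """Drop image and video files when the matter seeks written evidence."""
    return [f for f in files
            if not f.name.lower().endswith(MEDIA_EXTENSIONS)]

def keep_videos_near_size(files, target_mb, tolerance=0.5):
    """When a specific clip (e.g., a two-minute HR video) is sought,
    keep only videos within +/- tolerance of the estimated size."""
    return [f for f in files
            if f.name.lower().endswith(VIDEO_EXTENSIONS)
            and abs(f.size_mb - target_mb) <= target_mb * tolerance]
```

Run against a small collection, `cull_by_type` would leave only the documents, while `keep_videos_near_size` with a 50 MB estimate would keep a 45 MB clip and discard a 400 MB one.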
In spite of these measures, reviewers may still happen upon adult content; it is probably unrealistic to think you can get rid of all of it. Machines aren’t yet able to tell the difference between a pornographic image, someone’s vacation picture, and an image that may actually have relevance to specific litigation. But there are plenty of ways to minimize the amount of porn that slips through the cracks. There are even tools that can scan image files for pornographic material, although they are not designed specifically for e-discovery. Basically, they search for large percentages of skin tone material. While the level of precision may not be high, such tools are beginning to be used by corporations to scan stored content and root out potential problems before they expose an organization to serious risk or embarrassment.
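The skin-tone scanning described above can be approximated with a simple per-pixel classifier. The sketch below assumes pixels are available as RGB tuples (for instance, from Pillow's `Image.getdata()`); the specific thresholds are one common rule of thumb, the 40 percent flagging cutoff is an arbitrary assumption, and commercial scanners use far more sophisticated color-space models—hence the low precision noted above.

```python
def is_skin_tone(r, g, b):
    """Crude RGB skin-tone test (a common rule-of-thumb heuristic)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15
            and r > g and r > b)

def skin_fraction(pixels):
    """Fraction of pixels classified as skin tone.
    `pixels` is a flat sequence of (r, g, b) tuples."""
    if not pixels:
        return 0.0
    return sum(is_skin_tone(r, g, b) for r, g, b in pixels) / len(pixels)

def flag_for_review(pixels, threshold=0.40):
    """Flag an image for human follow-up when skin-tone coverage
    exceeds the threshold (assumed cutoff for illustration)."""
    return skin_fraction(pixels) > threshold
```

The weakness is obvious from the code: a beach vacation photo will trip the same threshold as adult content, which is why such tools surface candidates for human review rather than make final determinations.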
Following processing, a team of lawyers will review documents to prepare for production. If you are using predictive coding, lawyers aided by machines will perform the first-pass review to identify relevant documents. The second-pass review is then performed to find and remove privileged documents. Almost anything containing porn will not be found relevant and thus not produced. However, reviewers may be distracted. They may try to find more images or start texting friends about it. They may be shaken up or offended by it and demand to be removed from the project. Meanwhile, money and time are consumed in the process.
Other problems with serious consequences for the custodian or company can arise. For example, if an image is of a child, it is probably illegal, and there may be an affirmative duty to report it. Even if the image is not illegal, it may violate strict company policies and trigger an internal investigation. How this should be handled is a topic for another article, but you’ll need to be aware of the possibility and prepare for it. It may make sense for the designated supervising attorney to provide instructions to reviewers on what to do when porn is found during the normal course of their duties.
When parties to litigation or an investigation fulfill the discovery request to produce documents, they are required to turn over responsive non-privileged information to the requester in a pre-specified production format—it can be as simple as handing over a DVD containing electronic documents in the form of TIFF or PDF files or in “native” formats (the original file type). Time lines are often short, and with large volumes of ESI being produced, documents—including porn—can be produced inadvertently. It’s not an uncommon occurrence, as Craig Ball has noted in a blog post entitled “Do We Need a Porn Pass?” (Oct. 1, 2011). Ball writes: “You may wonder, ‘does that really happen?’ Let me assure you it occurs with astonishing regularity; and I expect it to happen more as we trade human review for mechanized categorization techniques like predictive coding.” Ball suggests stepping up review quality control processes before production.
United Factory Furniture Corp. v. Alterwitz, No. 2:12-cv-00059-KJD-VCF (D. Nev. Apr. 5, 2012), a recent opinion that has attracted considerable attention in e-discovery circles, has nothing to do with porn, but a key ruling in the case reminds us of an important principle governing the problem of porn in e-discovery: Personal machines are subject to discovery. The plaintiff’s motion to compel mirror-imaging of the defendant’s personal computer as part of required preservation, in spite of the defendant’s claim that such a practice would be “personally intrusive,” was granted. The decision can be taken as a warning to individuals—and organizations—that the line between work and private life is no longer as clear as it used to be. There is perhaps no better example of that cultural shift than the pervasive presence of porn in ESI. Workplace porn is not likely to disappear, but a solid information management policy will go a long way toward minimizing the amount of porn that finds its way into productions. Just make sure the policy accounts for all the potential locations of relevant information. In spite of the fact that people do things on smartphones and personal laptops that they would never dream of doing on a work computer, if they also perform work on those “personal” devices, the data on those devices may be subject to collection. That’s something we all would do well to remember.
Keywords: litigation, pretrial practice and discovery, data destruction, data preservation, Electronic Discovery Reference Model, EDRM, litigation hold, spoliation
Rebecca James is a program manager at Fios, Inc., in Portland, Oregon.