Tag Archives: eDiscovery case law

Is eDiscovery Existing in a Post-Sanctions World?


The short (and obvious) answer is no. Rule 37(e) isn’t going anywhere. But recent case law suggests sanctions are becoming harder to come by, which may shape what in-house legal teams worry about as they consider the technologies they need.

A recent infographic, General Counsel: From Lawyers to Strategic Partners (released by Raconteur with data from Wolters Kluwer), showed 66% of corporate legal teams citing “Data Breaches and Protection of corporate data” as a top issue keeping them up at night. Sanctions didn’t even make the list.

Recent case law from 2019 supports this lack of concern over sanctions: several cases showed that even when evidence was deleted (sometimes knowingly), courts aren’t doling out sanctions the way they did before the 2015 amendments to the Federal Rules of Civil Procedure (FRCP) went into effect.

As a refresher, Rule 37(e) of the FRCP lays out the threshold for sanctions as follows:

If Electronically Stored Information (ESI) was lost because:

  • a party didn’t take reasonable steps to preserve it when it should have (e.g., because it knew litigation was imminent),
  • the lost ESI can’t be restored or replaced through additional discovery,
  • there was an intent to deprive another party of the information, and
  • the loss of the ESI actually prejudices the outcome of the case,

…then the court may consider sanctions.
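As a rough illustration (and emphatically not legal advice), the threshold above can be sketched as decision logic. The function and argument names here are hypothetical labels for the article's four conditions; the actual rule treats prejudice and intent to deprive in separate subsections.

```python
# A minimal sketch of the Rule 37(e) threshold as summarized above.
# Argument names are hypothetical labels for the conditions; this is
# an illustration, not legal advice.

def sanctions_may_be_considered(took_reasonable_steps: bool,
                                restorable_via_discovery: bool,
                                intent_to_deprive: bool,
                                affects_outcome: bool) -> bool:
    """Return True only if every threshold condition is met."""
    return (not took_reasonable_steps      # duty to preserve was breached
            and not restorable_via_discovery  # loss can't be cured
            and intent_to_deprive             # deletion was deliberate
            and affects_outcome)              # loss prejudices the case

# Even a True result means only that the court MAY consider sanctions;
# the court retains discretion over whether and what to impose.
print(sanctions_may_be_considered(False, False, True, True))  # True
print(sanctions_may_be_considered(False, True, True, True))   # False
```

Note that a `True` result is where the analysis starts, not where it ends: as the cases below show, courts still weigh whether any sanction, and which one, is appropriate.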

The following examples show that consider is the key word: even when a court finds the threshold has been met, as in the third case below, sanctions remain discretionary.

United States et al. v. Supervalu, Inc. et al. (C.D. Ill. Nov. 18, 2019)

Three days after a subpoena, a district pharmacy manager for the defendant sent out an email stating, “Throw away all your competitor’s price matching lists and get rid of all signs that say we match prices.”

The plaintiff alleged inconsistencies in both the number and timing of the subsequent litigation holds and accordingly asked the Court for an in-camera review of the three litigation holds that were eventually sent. The plaintiff also contended that the defendant failed to preserve price-matching materials responsive to the government subpoena from approximately 80% of its pharmacies nationwide.

But the Court denied sanctions, stating, “Upon reviewing the record, the Court is unable to conclude that Defendants acted in bad faith. If the evidence at trial shows otherwise and bad faith on the part of the Defendants is established, the Court can revisit the issue and consider one or both of the sanctions requested by the Relators or another appropriate sanction.”

Mafille v. Kaiser-Francis Oil Co. (N.D. Okla. May 21, 2019)

In this case, the plaintiff’s computer was wiped after her termination as part of a standard retention policy. When the plaintiff moved for spoliation sanctions, the Court found that the computer’s contents had been uploaded daily to the defendant’s LAN server under company policy, so even though the computer was wiped, the contents could potentially be retrieved through discovery on the LAN server. The defendant also asked the plaintiff to identify which documents were vital to her case so it could attempt to retrieve them from the LAN server, but the plaintiff never identified any such items.

Univ. Accounting Serv., LLC v. Schulton (D. Or. June 7, 2019)

In this case, the defendant admitted, “I recognize fully that was in violation of the subpoena,” and later said of one particular piece of data, “I deleted the file as fast as I could, because I was petrified at its existence, because it’s exactly the type of damning information that UAS wants to catch me with.”

In his ruling, US District Judge Michael H. Simon stated that the Rule 37(e) sanction thresholds “have been satisfied.” Yet even after the threshold conditions were met, the judge didn’t order case-terminating sanctions, instead opting for a permissive adverse-inference spoliation instruction against the defendant.

No Sanctions, Why Worry?

Without the specter of sanctions haunting the dreams of in-house legal, does this mean they’ll finally get a good night’s sleep?

Only if they have the processes and technology to manage the exponential growth in data volumes and new file types, which continues to be one of the biggest challenges for corporations, particularly for in-house legal teams tasked with mitigating the risk around enterprise data. To do this, the ability to manage data in a flexible and scalable manner is vital.

Sending a legal hold notice is pretty straightforward. Gaining meaningful and speedy insight into petabytes of data from multiple file types for investigations and subpoenas is much more complex, and forward-looking legal teams are putting their technology to work doing just that.

 

Written by Jim Gill
Content Writer, Ipro

A first step in this process is to use Ipro’s Pre-Litigation Data Checklist as a guide to effectively manage enterprise data in order to avoid potential data pitfalls in the middle of a matter.

Download the Ipro Pre-Litigation Data Inventory Checklist


Download Ipro’s FRCP Cheat Sheet to brush up on your eDiscovery Rules!


The Federal Rules of Civil Procedure (FRCP) are just that: rules established by the Supreme Court and approved by Congress, specifying procedures for civil suits in US federal courts. Several of them apply specifically to eDiscovery, and understanding their role in the process is vital for any practitioner, whether you’re on a corporate legal team, part of a law firm, or a specialist at a service provider.

Don’t want to read pages and pages of legalese just to learn the Federal Rules of Civil Procedure (FRCP) that apply to eDiscovery? We’ve got you covered!

Need to brush up on your Federal Rules of Civil Procedure as they apply to eDiscovery?

Download Ipro’s FRCP Cheat Sheet!

Should Mobile Devices be Imaged for eDiscovery? Recent Case Law Provides Insight


Deciding whether mobile devices should be imaged can be difficult when it comes to eDiscovery. They contain a wide variety of file types and data intermingled with a great deal of private, potentially privileged information. Extracting specific information can be difficult, and imaging an entire device can be costly. So the question remains: to image or not to image? Fortunately, we don’t have to guess; that’s why we have case law.

On the surface, imaging an entire device would seem to fall beyond the usual scope of a matter as defined under FRCP Rule 26; specific relevant data would be the obvious choice over everything on a single device. That’s exactly how a magistrate judge saw it last year in Henson v. Turn (N.D. Cal. Oct. 22, 2018), when the court denied the defendants’ request to inspect the plaintiffs’ personal devices and collect web browsing history and cookies, on the basis that the data sought was neither relevant nor proportional to the needs of the case.

But in a recent ruling by Special Discovery Master Hon. Rebecca Westerfield (Ret.) in the class action suit In re: Apple Inc. Device Performance Litigation (N.D. Cal. Aug. 22, 2019), the court determined that imaging a sample set of the plaintiffs’ mobile devices was proportional to the needs of the case under Rule 26. So what sets this case apart from Henson v. Turn?

The Case:

In re: Apple concerns the plaintiffs’ claim that Apple used operating system updates to “throttle” and hamper the performance of certain iPhone 6 devices, allegedly impairing “the integrity, condition, quality, and usefulness of the Devices without Plaintiffs’ knowledge or consent.”

Each side brought forward forensic experts. The plaintiffs’ expert, Mary Frantz, Managing Partner of Enterprise Knowledge Partners, LLC, asserts that “Apple’s internal databases such as collected historical diagnostics, support, and potentially archived cloud of backup files would be a preferred and sufficient method to determine historical performance,” and that, “there is no difference between what would be found on any specific Devices and what could be found via iTunes and iCloud analysis (which Apple could test without a forensic inspection).”

Apple’s expert, Paul D. Martin, PhD, Computer Science, who has over a decade of experience in technology and forensics, including performance testing and benchmarking of computer programs, explained that the word “performance” can have a broad variety of meanings for a device. “To assess performance conditions,” Dr. Martin asserted, “it is important to perform tests on a device that is configured in a way that matches, as closely as possible, the configuration of the user’s device. Configuration depends both on the hardware and on what is installed on the hardware, including operating system, applications, and data.” He also added that “each user controls the state of his or her Device to the extent that it deviates from the basic iOS configuration,” which, along with variations in installed software, specific device usage patterns, and network or Wi-Fi conditions, will impact performance. That is why he concluded, “the best record of what is installed on a particular Device is the Device itself.”

The Ruling:

Special Discovery Master Westerfield ruled against the plaintiffs on the issue of imaging, stating that “other types of discovery would not provide sufficient information on the issue at hand,” and that “the defendant’s privacy concerns could be addressed through a robust protective order containing the following:

  • “The plaintiffs would select a neutral forensic expert to produce a mirror image of the computer’s hard drive in a timely fashion
  • “That expert would execute a confidentiality agreement and also abide by the protective order in place in the action
  • “Only that expert would be authorized to inspect or handle the computer or the mirror image (the plaintiffs and their counsel would not inspect or handle the mirror image)
  • “The expert would not examine any non-relevant files or data on the computer, or anything designated as privileged or work-product protected information
  • “That expert would produce a report based upon his or her inspection that describes the files found and any relevant file-sharing information
  • “And that expert would disclose his or her report only to the defendant’s counsel, who could then lodge objections to the report based on privilege.”

SDM Westerfield also cited Herskowitz/Juel v. Apple, Inc. (N.D. Cal. Feb. 12, 2014) in which the court ordered the plaintiffs to deposit computers and devices at issue with a third-party vendor for forensic inspection, because the “data contained on Plaintiffs’ computers and devices is likely to be highly relevant, and admissible evidence under Federal Rule of Civil Procedure 26(b)(1).”

The SDM noted the potential privacy intrusions in Herskowitz/Juel were not as widespread, because only a small number of devices were imaged. In response, the number of devices available for forensic inspection was limited to far fewer than the 115 devices Apple had requested.

Meet and Confer:

Cooperation is the name of the game when it comes to determining whether mobile devices should be imaged, as in other complex eDiscovery matters like this one. The SDM drove that home by stressing the importance of both parties continuing to meet and confer to ensure the order is carried out as the court determined.

In her ruling, SDM Westerfield writes, “Given the above direction and following Apple’s designation of specific Devices to be examined, the meet and confer process is likely to be more productive than the parties’ past efforts. In this regard, the parties and their experts are in the best position to meet and confer on a proposal that minimizes exposure of content and private information to Apple, the parties’ experts, and Apple’s outside and inside attorneys and provides an appropriate tailored approach to discovery from these Devices.”

Written by Jim Gill
Content Writer, Ipro

 


Fitbit Data Provides Clues in Murder Case: eDiscovery & Criminal Investigation


Once again, eDiscovery and emerging data sources are at the center of a murder investigation. A recent article in Wired highlights how investigators used data from the victim’s Fitbit and a neighbor’s Ring digital doorbell camera to establish a timeline, identify a suspect (the victim’s 92-year-old stepfather), and obtain a warrant to search the suspect’s home, all of which led to his arrest.

Similar to other cases involving data from mobile phones, social media, and the Internet of Things, the electronically stored information (ESI) didn’t lead to the arrest on its own; instead, it gave investigators leads that otherwise wouldn’t have existed in a pre-digital world. Those leads produced enough evidence to request warrants for further searches and investigations of specific suspects, which then brought about arrests. In other words, solving crimes still boils down to good old-fashioned detective work; investigators simply have new ways to uncover what might have happened and who may have been involved.

Fitbit first introduced its personal fitness tracking device ten years ago, and today around 27 million people use them. Add competitors’ devices (such as the Apple Watch), and it’s not hard to imagine how investigators often have ready access to biometric data of suspects and/or victims. Last year alone, 170 million wearables were shipped worldwide.

But while much electronic data is self-authenticating under Federal Rule of Evidence 902(14), biometric data gathered from wearable devices is much harder to authenticate as accurate. An analysis of 67 studies on Fitbit’s movement tracking concluded that, “the device worked best on able-bodied adults walking at typical speeds. Even then, the devices weren’t perfect—they got within 10 percent of the actual number of steps a person took half of the time—and became even less accurate in counting steps when someone was resting their wrist on a walker or stroller, for example. ‘It’s not measuring actual behavior,’ says Lynne Feehan, a clinical associate professor at the University of British Columbia and the lead researcher on the paper. ‘It’s interpreting motion.’”

Evidence from fitness trackers has been admitted in homicide cases in Europe and the US, but expert witnesses and analysts are often used in conjunction with the data to authenticate it. Only a few judges have ruled on how to handle evidence from fitness trackers. For example, “In a 2016 Wisconsin case, Fitbit data was used to eliminate the possibility that a woman was murdered by her live-in boyfriend. The judge ruled that an affidavit from Fitbit established the device’s authenticity and allowed lawyers to introduce its step-counting data; at trial, a sheriff’s department analyst vouched for the reliability of the man’s particular device. However, the judge barred the Fitbit’s sleep data, citing a class-action suit that claims the sleep tracking could be off by as much as 45 minutes.”

Similar to older technologies (such as the polygraph), electronically created data isn’t necessarily irrefutable. Antigone Peyton, an intellectual property and technology law attorney who has used data from wearables in civil cases, says people tend to assume “data is equivalent to truth,” but there are “many ways the information on these devices can be interpreted.”

Another aspect investigators must consider with this type of data is user privacy. Last year, the Supreme Court ruled that police must have a warrant to search phone location data under the 4th Amendment. At times, the companies that hold the data refuse to hand it over to law enforcement if they feel privacy is being infringed upon, but by and large the tech industry cooperates, especially as procedures for handling this type of data become more defined.

What’s important for the legal industry to consider is how different types of data and data sources work, and what that means for authentication. It’s similar to when wiretaps, phone records, and other electronic surveillance were introduced into the detective’s toolkit in the 20th century. The difference is that far more information is created today, by every person, on a near-continuous basis, from a large number of sources. And none of this data can be looked at in the same way.

But what this new data does provide is more ways to reconstruct scenes, rule out potential suspects, corroborate or contradict testimony, and gather information which allows investigators to pursue new tactics and lines of questioning which lead to further warrants and arrests. For attorneys, it’s important to stay up on the most recent investigative uses and court-rulings on these data sources, which will continue to define and clarify how ESI is being used in both criminal and civil cases.

 

Written by Jim Gill
Content Writer, Ipro

 


Redaction Errors in Federal Opioid Case Reveal Importance of Legal Technology


For as long as humans have been writing things down, redactions have been a part of the process. In the beginning, they were used to integrate disparate stories and folktales, but these days, when we hear about redactions, it’s usually in regard to investigations and legal actions.

For the public, especially those hungry for conspiracy theories and secrets, redactions are a tantalizing hint at what’s not being said; for those in the legal industry, redactions are a part of everyday life. But this doesn’t mean they’re mundane! On the contrary, failing to redact documents, or to ensure redacted content is produced in its redacted format, can be case-ending (and job-ending for the person responsible for the error).

The most common reason this happens is that law firms take the “redact by hand” route instead of using tools that properly manage productions and ensure documents meet the expected production requirements (i.e., making sure information meant to be kept private stays hidden). By not using redaction technology, law firms are flirting with disaster.

That is exactly where one law firm found itself after exposing secret grand jury information in a court filing, the result of using Microsoft Word and Adobe Acrobat instead of specialized redaction technology, which, as the responsible partner admitted, “is specifically designed to avoid such issues. The failure to use this software was inadvertent oversight.”

At first glance, the filing appeared redacted, but a member of the press was able to defeat the redaction by simply copying the black-out boxes and pasting the text into a new document.
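The failure mode here can be shown with a toy sketch, using plain strings rather than real PDFs (the function names and sample text are hypothetical): drawing a black box changes only the presentation layer, while true redaction removes the underlying characters before production.

```python
# Toy illustration of cosmetic vs. true redaction (plain strings,
# not real PDF internals; names and sample text are hypothetical).

def cosmetic_redaction(text: str, start: int, end: int) -> dict:
    # Mimics a black box drawn over text: the display changes, but the
    # underlying characters survive, so copy/paste defeats it.
    return {"display": text[:start] + "█" * (end - start) + text[end:],
            "underlying": text}

def true_redaction(text: str, start: int, end: int) -> dict:
    # Removes the characters themselves, as proper redaction tools do
    # before a document is produced.
    scrubbed = text[:start] + "[REDACTED]" + text[end:]
    return {"display": scrubbed, "underlying": scrubbed}

doc = "Witness name: Jane Doe, grand jury exhibit 7"
boxed = cosmetic_redaction(doc, 14, 22)
scrubbed = true_redaction(doc, 14, 22)

print("Jane Doe" in boxed["underlying"])     # True: text recoverable
print("Jane Doe" in scrubbed["underlying"])  # False: text removed
```

The two outputs look identical on screen; only the second survives a copy-and-paste attack like the one the reporter used.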

Ryan Joyce, VP of Strategy at Ipro, commented, “Time and time again we have seen the same headline—a law firm or government agency getting in trouble for not handling their redactions correctly. Why is this still an issue after all these years? Any software can draw a colored box over text, but only the right software will produce it correctly.”

But even when redactions are properly handled, a newly published study by the University of Zurich may make them a moot point. By using a combination of AI and over 120,000 legal records, researchers “were able to identify the participants in confidential legal cases, even though such participants had been anonymized.” And if that doesn’t give pause, they did so with an 84% accuracy rate after mining data for only one hour.

Still, even if anonymized information in legal documents can be defeated, whether by algorithms or by gross oversight, the requirement to redact documents correctly isn’t going away anytime soon. That means law firms should have the technology in place to properly handle redactions, along with the processes to ensure that technology is actually used. If they don’t, it could mean sanctions for the firm and unemployment for the individual who made the error.

How Ipro Can Help with Redactions

Tools like Ipro’s Production Shield (which is included in the enterprise and desktop eDiscovery solutions by Ipro) allow administrators to add another layer of protection for documents that should not be produced. When using Production Shield, such documents are identified during the validation phase of the export process, giving administrators the opportunity to correct conflicts and ensure only appropriate documents are produced.

In addition to Production Shield, Ipro ensures accurate redactions by:

  • Automatically re-OCRing the document so that no text remains beneath the redaction
  • Running validations to ensure redactions are burned in and the text is correct
  • Creating layered redactions, so multiple production sets can be sent to multiple parties
  • Having 30 years’ experience in the legaltech industry – we know our redactions!

 Find Out More About How Ipro Can Ensure Accurate Productions for your Firm!

 

Written by Jim Gill
Content Writer, Ipro

How 3 Cases Involving Self-Driving Cars Highlight eDiscovery and the IoT


Litigation is nothing new for the auto industry. But recent lawsuits over accidents involving self-driving vehicles show that Electronically Stored Information (ESI) is a key component in these cases, because modern cars collect enormous amounts of data, which can be used to determine fault in an accident, whether it stems from human or machine driver error or from a design flaw in the automobile itself.

Self-driving cars continue to become a regular part of daily life. But like any new technology, flaws and user-errors become more apparent as they’re taken from the test bed to the real world. While we’re still years away from truly autonomous vehicles ruling the roads, the question that continues to come up with driver-assisted technologies is whether they protect us against human error, or if human reliance on technology makes drivers less competent.

The following cases show how self-driving cars (and the data they collect) will continue to highlight the role eDiscovery plays when it comes to the Internet of Things and all the new data sources that everyday devices create.

The Self-Driving Uber:

In March of 2018, pedestrian Elaine Herzberg was killed by one of Uber’s self-driving cars as she crossed a multi-lane road in Tempe, Arizona. The investigation included data from the car as well as dashcam video. While it was determined that the automatic braking system had been turned off to avoid erratic driving behavior, the human safety driver, Rafaela Vasquez, was found at fault for the death. Dashcam video showed Vasquez repeatedly looking down at her lap in the final minutes before the crash, including the five seconds before impact. Additional ESI from the driver’s Hulu account showed that Vasquez was streaming the television show The Voice just before the crash. The car’s data showed that the accident most likely could have been avoided if the driver had been paying attention.

Tesla on Autopilot:

In March 2019, Jeremy Banner’s Tesla Model 3 collided with a tractor-trailer that was crossing his path on a Florida highway. An investigation is currently underway, as his family is suing Tesla for wrongful death. A preliminary report from the National Transportation Safety Board revealed Banner turned on autopilot just before the crash, and the vehicle “did not detect the driver’s hands on the steering wheel.”

This incident is similar to a 2016 accident which killed 40-year-old Joshua Brown when a tractor-trailer crossed his path while he was using autopilot. Tesla said in that investigation that its camera system failed to recognize the white broadside of the truck against the bright sky. However, it was also found that Brown was not paying attention to the road, though the NTSB said a lack of safeguards contributed to his death.

Tesla and the 2-year-old Driver:

In this incident, Mallory Harcourt of Santa Barbara claims that in December 2018, while she was unloading groceries from her Tesla Model X parked in the driveway, her two-year-old son jumped into the driver’s seat and the car unexpectedly lurched forward, ultimately pinning her to the garage wall. Harcourt, who was pregnant, suffered a broken leg and pelvis and went into labor, leading to the premature delivery of her daughter.

The plaintiff and defendant agree that the vehicle was shifted from Park into Drive. Because the Model X (and older Model S) don’t have a driver-facing camera like the Model 3, Tesla can’t confirm how this happened. But the car’s data does confirm that “the brake pedal was released shortly after the vehicle was put into Drive, and the accelerator pressed for the following seven seconds, interrupted briefly by application of the brake pedal.” The impact occurred with the left rear door still open and before any brake application heavy enough to activate the anti-lock brakes.

There have been other lawsuits against Tesla for “unintended acceleration,” but none has succeeded.

Conclusion:

What is clear from each of these cases is how vital ESI collected from vehicles and peripheral devices is in determining cause and liability. In these cases, the combination of video, car data (indicating details as fine-grained as how long the accelerator and brakes were applied, whether someone was in the driver’s seat or had hands on the wheel, whether a door was open, and when impact occurred), and ESI from other devices, such as a tablet or a driver’s Hulu account, were all part of the investigations. As the Internet of Things continues to grow, items that were once merely mechanical devices, such as phones, cars, and refrigerators, are now repositories of electronic evidence. And should litigation arise, the data they contain must be made discoverable.

These scenarios involving self-driving cars may not fall under the day-to-day operations of most law firms, but they do highlight new examples of how electronic data is collected during litigation (which is eDiscovery). In the same way that email data was still a novel source of evidence 15 years ago or social media data was the new ESI 5 years ago, IoT data will continue to be requested, and legal teams will have to respond.

 

Written by Jim Gill
Ipro Content Writer

 

Learn more about how Ipro’s hybrid approach to eDiscovery can help your organization meet any challenge.

It’s Not Just About the Money (or Privacy): The Role of Specificity, Technology, and FRCP Rule 26


What Does FRCP Rule 26 Say about Scope and Proportionality?

In 2015, when the Federal Rules of Civil Procedure were amended, the issue of scope and Rule 26 was a hot topic of discussion, mainly around the issue of costs. But proportionality doesn’t just apply to the cost of discovery. With concerns around privacy becoming a daily headline due to data breaches, privacy laws, and the use of personal data by large corporations and governments, will the cry of “privacy” take the place of “burdensome costs” in proportionality rulings?

Before the 2015 amendments, it was common for parties to submit broad discovery requests that, if carried out, would cost far more than the lawsuit was worth in the first place. For organizations with deep pockets, large discovery requests became a tactic akin to raising the bet in a poker hand until your opponent has no choice but to fold.

But since 2015, broad discovery requests, or “fishing expeditions,” have been effectively banned. Under the amended rule, the burden of proving proportionality lies with both the requesting and responding parties. The first two questions to answer are:

  • Is the information requested relevant to the outcome of the case?
  • Is the information privileged?

If the data in question passes these two tests (yes, it’s relevant to the case, and no, it’s not privileged information) then the courts look at the following six factors laid out in FRCP Rule 26(b)(1) to help determine rulings on proportionality.

  • The importance of the issues at stake
  • The amount in controversy
  • The parties’ relative access to relevant information
  • The parties’ resources
  • The importance of the discovery in resolving the issues
  • Whether the burden or expense of the proposed discovery outweighs its likely benefit
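As a purely illustrative sketch (the function and its boolean weighing are hypothetical, not a real legal test), the two threshold gates and the factor balancing described above can be expressed as:

```python
# Illustrative sketch of the Rule 26(b)(1) analysis described above:
# two threshold gates (relevance, privilege), then the six factors.
# Real courts weigh the factors qualitatively; the simple count here
# is a hypothetical stand-in for that balancing.

def proportionality_analysis(relevant: bool, privileged: bool,
                             factors_favoring: list,
                             factors_against: list) -> str:
    # Gate 1: irrelevant information is not discoverable.
    if not relevant:
        return "not discoverable: irrelevant"
    # Gate 2: privileged information is not discoverable.
    if privileged:
        return "not discoverable: privileged"
    # Only then are the six Rule 26(b)(1) factors weighed.
    if len(factors_favoring) >= len(factors_against):
        return "likely proportional"
    return "likely disproportionate"

print(proportionality_analysis(
    relevant=True, privileged=False,
    factors_favoring=["importance of issues", "requester lacks access"],
    factors_against=["burden outweighs benefit"]))
# prints "likely proportional"
```

The point of the structure, not the arithmetic, is what matters: relevance and privilege are gates that end the inquiry, while the six factors are balanced against one another only after both gates are passed.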

 

How Does Privacy Fit into the Discussion of Scope and Proportionality?

Henson v. Turn, Inc. (N.D. Cal. Oct. 22, 2018) is a fairly recent case dealing specifically with proportionality and privacy. In it, the plaintiffs brought a class action against the defendant, claiming the defendant engaged in the practice of using “zombie cookies” that users cannot delete, block, or opt out of.

In response, the defendant requested the plaintiffs:

  • Produce their mobile devices for inspection or produce complete forensic images of their devices
  • Produce their full web browsing histories from their devices
  • Produce all cookies stored on or deleted from their devices

The court ruled that the defendant’s request to directly inspect the plaintiffs’ mobile devices or for complete forensic images of the devices “threatens to sweep in documents and information that are not relevant to the issues in this case, such as the plaintiffs’ private text messages, emails, contact lists, and photographs.”

Because the parties had protocols in place for producing information from the plaintiffs’ devices or forensic images of them, the defendant instead issued nine requests for specific information from the plaintiffs’ devices, which the plaintiffs carried out.

The same happened with the request for the browsing histories and cookies. The plaintiffs produced or offered to produce their web browsing history and cookies associated with the defendant’s partner websites and the date fields of all other cookies on their mobile devices. The plaintiffs also offered to meet and confer with the defendant to consider requests for specific cookies.

And the court ruled for the plaintiffs.

 

What is the Role of Technology in Scope and Proportionality?

So why does this matter? The key here is not really scope and proportionality, or even privacy; yes, that’s the topic of the case, but the bigger issue at stake is how the creators of legal technology will respond. With Rule 26, it’s all about specificity: I want this specific data, from that specific custodian, from these specific date ranges, because it affects the case in this way. After that, it’s just a matter of having the tools to get those specific items easily and cost-effectively.

In Henson v. Turn, the judge cited a case from 2006 (Sony BMG Music v. Arellanes), where a request was made to image an entire hard drive, and it was determined that the production would reveal irrelevant data when all that was needed were specific emails. Now we have technology that allows us to target specific emails and other data on a computer. We can deNIST and dedupe, we can redact, we can do all kinds of things within our eDiscovery tools that keep data within the scope and proportionality of a request. It wasn’t always so. It took the creators and innovators of technology to make it a relatively easy and standardized process.

This technology made the cries of “overburdensome discovery” seem moot. No discovery is overburdensome these days when you can pinpoint the exact data that is relevant to the case. You just have to ask for it. With the onus on the requester to follow the guidelines of FRCP Rule 26, if you make a request that’s overburdensome, you’re just being lazy. And judges aren’t having it.

With this case’s highlighting of the role of privacy in Rule 26, I think leaders in the eDiscovery industry should be looking ahead in the same way that at least some of them were in 2006. How can we create tools that allow the handling of electronic data through the entire litigation process? Only now, instead of hard drives full of emails and Word documents, it’s data from a number of unique sources that live across platforms available on mobile devices and the Internet of Things.

The guidelines for proportionality and scope are very clearly laid out in the FRCP. The only difference is the need for tools that make the process easier considering the digital landscape that exists in 2019, not just the one in 2006.

 

Written by Jim Gill
Content Writer, Ipro

 

Two Murder Investigations in the Last Week Highlight the Role of eDiscovery in Criminal Investigations

ediscovery in criminal investigations

eDiscovery plays an ever-larger role in criminal investigations. These two recent murder cases have again highlighted the use of electronic forensics to solve cases that, only a few decades ago, would have been difficult to crack in the relatively short time frame between the crime and the arrest.

The Sydney Loofe Case

In the second week of his trial this June, murder suspect Aubrey Trail tried to cut his own throat in the courtroom. A year earlier, the then-51-year-old Trail and his alleged girlfriend, 24-year-old Bailey Boswell, had been charged with the murder of 24-year-old Nebraska woman Sydney Loofe.

Loofe was last seen on November 15, 2017, before going on a Tinder date with Boswell. Police found her remains in a field a few weeks later and, in the months that followed, used a wide variety of electronic data, along with traditional forensics, to link Trail and Boswell to the murder. From an eDiscovery point of view, the list of evidence pieces together a vivid story:

  • Tinder Profiles: 140 messages between Loofe and Boswell in the days before November 15th were pulled from their online dating profiles. The last was on Nov. 15 at 6:54 p.m., when Boswell said she’d arrived at Loofe’s apartment. Police also found that Boswell went by “Audrey” on her online-dating profile.
  • Snapchat Photo: Loofe sent a selfie to a friend via Snapchat on November 15th with the caption, “Ready for my date.”
  • Facebook Videos: Trail and Boswell both posted Facebook videos claiming innocence while police were looking for them. In one, Boswell said she was “Audrey on Tinder and a few other names because I have warrants.”
  • iPhone Reset: After her arrest, Boswell gave investigators permission to search her iPhone 7, which they found had been reset to factory default settings on November 17.
  • Cellphone Pings/GPS Locations: Loofe’s phone last pinged a cell tower near Wilber, where Boswell and Trail lived in a basement apartment. When detectives searched that residence, the landlord, who lived upstairs, “reported a strong odor of bleach coming from the basement.” Data from Boswell’s phone showed its location was “in close proximity to the area where the remains were discovered Dec. 16th.”
  • Security Video Footage: Security footage from a local Home Depot showed Trail and Boswell on Nov. 15 around 10:35 a.m., shopping for tools and supplies that could be used to cover up the crime.
  • Phone Calls from Jail: In two different phone calls, one to the Lincoln Journal Star and the other to the Omaha World-Herald, Trail gave different accounts, claiming he unintentionally killed Loofe in a sex game gone wrong.

All of this led to a confession from Trail, stating that he had killed Loofe, and then he and Boswell covered up the crime scene and disposed of the body.

The MacKenzie Lueck Case

MacKenzie Lueck was a 23-year-old University of Utah student who disappeared in the early hours of June 17th, 2019. Her parents reported her missing on June 20th, and on June 28th, Ayoola Ajayi, 31, was arrested as a suspect for her murder. Once again, a combination of digital and traditional forensics created a narrative of what happened to her, and led investigators to her remains and the arrest in just over a week.

  • Airport Data: Lueck arrived at the Salt Lake City airport at 2:09 a.m. on June 17th from California after attending her grandmother’s funeral. Security photos show Lueck in the airport after deplaning, giving investigators an image of what she looked like, what she was wearing, and the personal belongings she had with her at the time.
  • Text to her Mother: While at the airport, Lueck texted her mother to let her know she had arrived safely.
  • Lyft Data: Lueck requested a ride through the ride-share app Lyft. Data from Lyft showed that the driver took her to Hatch Park, which is 8.5 miles from her home in Trolley Square. It also showed that nothing out of the ordinary happened on the drive, and that the Lyft driver immediately picked up other passengers after dropping her off. An interview with the driver indicated she was meeting someone in the park, but the driver didn’t know who that might be or what vehicle they were driving.
  • Park Surveillance: There are no surveillance cameras inside Hatch Park, but police began investigating security footage in the areas surrounding the park.
  • Social Media: Police looked at Lueck’s social media and dating profiles, and also asked the public if they had any information on alternate accounts, due to the growing trend of people keeping online profiles that aren’t known to family or close friends in order to maintain a layer of anonymity.
  • Phone Data: Cell phone data showed that the last person Lueck communicated with before going missing was Ayoola Ajayi. It also showed that Lueck’s and Ajayi’s phones were in Hatch Park the night of the 17th within one minute of each other. Hatch Park was the last place Lueck’s phone transmitted data. Ajayi’s phone also had several pictures of Lueck, including one from an online profile, after he had said in a police interview that he had no idea what Lueck looked like, had never visited her online profiles, and had last texted her at 6pm on June 16th.
  • Suspect’s Online Image: Investigators used data from Ajayi’s online footprint—from sites as varied as LinkedIn and Goodreads—along with interviews of people who knew the suspect, as well as public records, to put together a profile.
  • Search of Suspect’s Home: A forensic excavation found charred items matching Lueck’s clothing and personal belongings, and Lueck’s DNA matched remains found on site. Police also located a mattress and box-spring Ajayi had given away using a social media app just prior to his arrest.

Conclusion:

These tragic events show how law enforcement increasingly relies on electronic data to solve cases. And as these cases move to the courts, they highlight how important eDiscovery is in today’s legal world. Mobile phone data, GPS, social media, surveillance video, and other new media types are continually being used to investigate criminal cases and bring them to trial. That is why companies that create eDiscovery and trial presentation software must stay on the cutting edge of technology trends in order to maintain the integrity of data used in criminal and civil court. And since these types of data are being used to solve crimes, attorneys and courts must be prepared to handle the electronic evidence when trying these cases.

Written By Jim Gill
Content Writer, Ipro Tech

 

Want the latest industry news? Visit the Ipro eDiscovery Newsroom

Deleted ESI Doesn’t Automatically Mean Sanctions: Two Recent Cases Highlight the Spoliation Thresholds in FRCP Rule 37(e)

FRCP Rule 37(e)

Two Recent Cases Highlight the Spoliation Thresholds in FRCP Rule 37(e)  

With the 2015 FRCP amendments quickly nearing a half-decade in existence, case law continues to define how these rules are upheld in court, especially when it comes to the handing out of sanctions. Two recent cases show how strictly judges are adhering to the thresholds laid out in FRCP Rule 37(e). As a reminder, here are those thresholds in plain language:

If Electronically Stored Information (ESI) was lost because a party didn’t take reasonable steps to preserve it when they should have (i.e. because they knew litigation was imminent); and if the lost ESI can’t be restored or replaced by simply doing discovery again; and if there was an intent to deprive the party of information by the loss of the ESI; and if the lost ESI actually affects the outcome of the case, then the court may consider sanctions. 

In other words, sanctions are not given just for spoliation of ESI.  

Mafille v. Kaiser-Francis Oil Co. May 21, 2019 (N.D. Okla. 2019) 

“Lecturing Plaintiffs about their obligation to preserve electronically stored evidence is exceedingly poor form.”

In this case, the Plaintiff, Marlana Mafille, was terminated in part because of alleged performance issues. As part of a standard retention policy, Ms. Mafille’s company computer was given to a charitable organization with other retired computers and the data was presumably destroyed, even though the plaintiff had submitted an EEOC charge of discrimination three months earlier. 

After the defendant tried to blame the plaintiff for not requesting the computer be saved, US Magistrate Judge Frank H. McCarthy stated in his ruling, “…lecturing Plaintiffs about their obligation to preserve electronically stored evidence…is exceedingly poor form and beyond zealous advocacy.” He continued, “The court finds that Mrs. Mafille’s work computer should have been preserved and further that Defendant is solely and entirely at fault for failing to take reasonable steps to preserve the computer. However, that finding does not necessarily equate to an award of the sanctions Plaintiffs have requested.”

Why were sanctions denied? The plaintiff’s computer contents were uploaded daily to the defendant’s LAN server as part of a company policy. So even though her computer was destroyed, the contents could potentially be retrieved if discovery were done on the LAN server. Also, the defendant asked the plaintiffs to identify which documents were vital to their case so it could attempt to retrieve them from the LAN server, but the plaintiffs never identified any such items. Or as Judge McCarthy ruled, “In the absence of such a showing the court must find that Plaintiffs have not suffered any prejudice as a result of the destruction of Mrs. Mafille’s work computer.”

Univ. Accounting Serv., LLC v. Schulton. June 7, 2019 (D. Or. 2019) 

“I deleted the file as fast as I could, because it’s exactly the type of damning information they want to catch me with.”

Ethan Schulton was a lead software developer for ScholarChip who decided to leave the company and start his own endeavor to compete directly with his former employer. He also took his entire email file, ScholarChip’s client list, and some client webinars. Litigation began on March 7, 2018, and four days later Schulton started deleting files. He did so again on April 9, and then again in August. During his deposition, Schulton admitted, “I recognize fully that was in violation of the subpoena,” and later said of one particular piece of data, “I deleted the file as fast as I could, because I was petrified at its existence, because it’s exactly the type of damning information that UAS wants to catch me with.”

In his ruling, US District Judge Michael H. Simon states that the Rule 37(e) sanction thresholds “have been satisfied.” It was clear that ESI which should have been preserved was deleted after litigation had begun. Judge Simon continued, “Schulton has admitted facts sufficient to support the conclusion that he acted with the intent to deprive UAS of the information’s use in the litigation, at least to the extent of depriving UAS of the ability to prove precisely what Schulton took with him when he left ScholarChip’s employment at the end of 2017. Finally, UAS has attempted to restore or replace through additional discovery the deleted information but has been unsuccessful. Thus, UAS has satisfied the four threshold elements under Rule 37(e).”

Yet even after finding the threshold conditions met, the judge didn’t order case-terminating sanctions, instead choosing a permissive-inference spoliation instruction against the defendant.

Conclusion: 

Just because spoliation sanctions require proof that the stringent FRCP Rule 37(e) thresholds were met doesn’t mean you should be lax when it comes to your eDiscovery processes or technology. In fact, quite the opposite. The key element of the rule is showing that “reasonable steps” were taken, and the best way to ensure that is by having a defensible and repeatable eDiscovery process in place.

 

Written by:
Jim Gill
Content Writer for Ipro Tech

Stay up to date with the latest eDiscovery and industry news at the Ipro Newsroom

Judge Orders Airbnb to Release 17k Listings to Investigators

Big Data, Big Discovery: Judge Orders Airbnb to Release 17k Listings to Investigators

The battle between home-sharing platforms and city governments has been a cold-war of sorts for some time now with the standard back-and-forth of lawsuits, countersuits, and lobbying. New York City in particular has been trying to limit companies like Airbnb, HomeAway, and VRBO, claiming they add to the city’s housing problems and allow people to transform homes into illegal hotels.

On May 14, Airbnb offered an olive branch of sorts, agreeing to “give city officials partially anonymized host and reservation data for more than 17,000 listings,” in response to a subpoena, according to an article in Wired. A spokesperson for Airbnb told Wired, “We hope that our compliance with this subpoena—by providing data in line with our shared enforcement priorities against illegal hotel operators—is a first step toward finding such a solution that is consistent with Airbnb’s legal rights and obligations and allows us to share the kind of actionable data with the level of precision that the city needs.”

Two days later, a judge ordered Airbnb to comply with four additional city subpoenas, which Airbnb had claimed were overbroad and unduly burdensome. Even before the 2015 changes to the Federal Rules of Civil Procedure, this claim was a boilerplate response whenever large data sets were requested. The amendments to Rule 26’s language on scope raised the bar for such boilerplate objections, with the data-request deadline falling 30 days sooner, along with the requirement of specificity and relevance. But in this case, the City of New York wasn’t simply going on a fishing expedition with its data request; it was very detailed about the what and the why.

Airbnb says the judge’s order infringes on user privacy. But this ruling sends a message: if data is shown to be relevant to the investigation of criminal activity, it must be handed over. In the past, corporations often worked in good faith with law enforcement while avoiding releasing their users’ data absent a direct court order (e.g., Craigslist dropped its personal-ads section in response to pending human-trafficking legislation to avoid the issue altogether), and it seems Airbnb’s offer of anonymized data was an attempt to play the middle: give enough information to show cooperation with the investigation while protecting user privacy.

The amount of data continues to grow exponentially every day, and technology allows larger and larger amounts of data to be quickly collected, reviewed, and analyzed. Are investigators getting savvy with their use of technology and the ability to request data by using targeted requests in subpoenas as a way to force companies’ hands regarding user data?