The Hefty “Costs” Of Not Following e-Discovery Best Practices - Part 2
Part 2 (read Part 1 here)
E-Discovery is a process carried out by trained professionals using appropriate technology to collect, review and disclose relevant documents. When one (or two) of the corners of the People, Process and Technology triangle fails, the result can be disastrous, as illustrated in the decision recently handed down in the MGA v. Cabo Concepts patent claim.
According to the judgment, MGA opted to use their own internal IT resources to collect documents for this matter. MGA had recently adopted the Relativity e-Discovery review platform in-house, and two of their IT staff had taken the Relativity processing and administration courses.
The collection involved, among other things, emails within the mailboxes of several current and former employees. It appears from the judgment that the former employees' mailboxes had been stored as Outlook PST files. While the judgment does not indicate how the mailboxes of the current employees were accessed, it is likely that these were exported into PST format as well.
To reduce the volume of data hosted in Relativity (and keep costs down), the collection team decided to collect only emails within a specific date range. While this is commonly done as part of a targeted e-Discovery collection process, the way MGA carried it out was doomed from the start. MGA opted to use Microsoft Outlook as their pre-collection culling tool. Presumably each PST was opened in Outlook, searches were run to find emails within the date range, and only the identified emails were exported for loading into Relativity.
MGA stated that they had issues exporting from Outlook (there is mention of Outlook freezing several times). This is not surprising. Outlook's search index is part of the Windows operating system and is often incomplete (it runs in a low-priority background thread, and other activity on the machine can suspend indexing for hours or days). In addition, Outlook was not designed to move large volumes of records between folders - if you have ever tried to copy a lot of emails to a new folder, you probably already know that it can take hours, during which time Outlook appears to be frozen. Finally, copying within Outlook sometimes fails to copy all records, and, just for fun, Outlook does not tell you when it has missed copying emails to a folder. Clearly, MGA suffered these issues, but did not carry out any type of validation to determine whether their collection sets were complete.
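As a rough illustration of the kind of completeness check that was skipped, the sketch below counts the in-range emails in a source PST and compares that figure to the number of items the export produced. It assumes the open-source pypff bindings for libpff, and the file path and exported count are hypothetical; this is a sketch of the validation step, not a description of MGA's tooling.

```python
# Minimal completeness check: count in-range emails in the source PST and
# compare against what the export reported. Assumes the pypff bindings for
# libpff; attribute names may vary slightly between versions.
from datetime import datetime
import pypff

RANGE_START = datetime(2010, 1, 1)
RANGE_END = datetime(2019, 12, 31, 23, 59, 59)

def count_in_range(folder):
    """Recursively count messages whose sent date falls inside the review window."""
    count = 0
    for message in folder.sub_messages:
        sent = message.client_submit_time  # sent date; delivery_time would be the received date
        if sent is not None and RANGE_START <= sent <= RANGE_END:
            count += 1
    for sub in folder.sub_folders:
        count += count_in_range(sub)
    return count

pst = pypff.file()
pst.open("former_employee.pst")   # hypothetical source mailbox
source_count = count_in_range(pst.get_root_folder())

exported_count = 41_250           # hypothetical count reported after the Outlook export
if exported_count != source_count:
    print(f"GAP: source holds {source_count} in-range emails, export holds {exported_count}")
```

A mismatch here does not say which emails are missing, but it flags the problem immediately, which is exactly the signal MGA never generated.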
MGA also ran searches within the Microsoft 365 (M365) environment, but apparently did not construct their search queries properly, opting to use the “Creation Date” email metadata field instead of the more common “Sent Date” or “Received Date” metadata fields. “Creation Date” is the date that an email was created in its current location. In practical terms, if a user’s mailbox is transferred from an older Microsoft Exchange system to the M365 Exchange Online system, the creation date of each of the transferred emails will be set to the transfer date. If MGA migrated their users’ mailboxes to M365 in, say, 2020, but used a date range of 2010-2019 for the collection, none of the migrated emails would be identified, even though many may have been sent or received within the 2010-2019 time period.
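To make the effect concrete, here is a toy example using made-up metadata records (the field names are illustrative, not actual M365 search properties): an email sent in 2015 but migrated to Exchange Online in 2020 vanishes from a creation-date filter yet survives a sent-date filter.

```python
# Why a "Creation Date" filter silently drops migrated email.
# The records and field names below are illustrative only.
from datetime import datetime

emails = [
    # sent in 2015, but migrated into Exchange Online in 2020
    {"subject": "Q3 forecast", "sent": datetime(2015, 7, 1), "created": datetime(2020, 3, 15)},
    # sent and stored natively in 2018
    {"subject": "Launch plan", "sent": datetime(2018, 2, 9), "created": datetime(2018, 2, 9)},
]

start, end = datetime(2010, 1, 1), datetime(2019, 12, 31)

by_created = [e["subject"] for e in emails if start <= e["created"] <= end]
by_sent = [e["subject"] for e in emails if start <= e["sent"] <= end]

print(by_created)  # ['Launch plan'] -- the migrated 2015 email is silently excluded
print(by_sent)     # ['Q3 forecast', 'Launch plan']
```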
When MGA finally realized their mistakes and re-did the collection, they determined that the original collection had missed about 40% of the emails.
While MGA clearly made substantial errors collecting their data, the collection errors themselves were only one of several failures in the process. Others included:
- The MGA process had no quality control stage – the IT technicians assumed that the computer was providing accurate results without actually confirming this;
- There was no legal oversight of the process – MGA’s internal and external legal counsel left the entire collection process to the IT technicians;
- E-Discovery expertise was lacking – while the IT technicians had taken a couple of Relativity courses designed to instruct them on how to use the software, they had no training in e-Discovery best practices, and very little experience in collecting large volumes of data (this was the first UK e-disclosure exercise for MGA, and the largest e-Discovery exercise undertaken by MGA since they began doing self-collections in 2017).
A core pillar of the e-Discovery process is validating the work. This applies at every stage of the process, from collection through to disclosure. Validation involves independently confirming that the results of a step are correct. When it comes to collection, this can run the gamut from simply counting the number of records before and after exporting and making sure they are the same, to examining process logs to find exceptions, to running parallel collection jobs using different technologies and checking that the results match.
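As a sketch of the last of those techniques, the snippet below diffs the Message-ID sets produced by two independent collection runs, assuming both runs were exported as loose .eml files; the directory names are hypothetical.

```python
# Cross-check two parallel collections by comparing their Message-ID sets.
# Assumes each collection was exported as loose .eml files (paths are hypothetical).
from email import policy
from email.parser import BytesParser
from pathlib import Path

def message_ids(export_dir):
    """Return the set of Message-ID headers found under an export directory."""
    ids = set()
    for eml in Path(export_dir).rglob("*.eml"):
        with eml.open("rb") as fh:
            msg = BytesParser(policy=policy.default).parse(fh, headersonly=True)
        mid = msg.get("Message-ID")
        if mid:
            ids.add(mid.strip())
    return ids

primary = message_ids("exports/outlook_run")      # hypothetical path
control = message_ids("exports/independent_run")  # hypothetical path

missing = control - primary
print(f"{len(missing)} emails appear in the control collection but not in the primary collection")
```

Even a simple cross-check like this would have surfaced a 40% gap long before the opposing party did.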
While errors do happen, a sound e-Discovery collection process can both minimize errors and identify them when they occur so that remediation can be performed. MGA apparently did not appreciate the impact of collection errors. When presented with an email provided by the opposing party that should have been included in the collection, MGA responded by saying that “disclosure is an imperfect process and errors occur”, and promptly ignored the issue. A more reasonable approach might have been to determine whether the email existed in the custodian’s mailbox and, if so, why it was not collected. Had MGA done this, they likely would have discovered their errors much earlier.