Opinion
1:09-cv-01247 MJS
07-22-2016
ORDER DENYING MOTION FOR FURTHER MODIFICATION OF SCHEDULING ORDER
(Doc. 428)
I. Introduction
On June 17, 2009, Plaintiffs commenced this class action against Defendants on behalf of themselves and others similarly situated. As relevant to this motion, on April 19, 2014, the parties were granted leave to proceed with merits discovery with the specific goal of seeking representative, statistical evidence to assist with the presentation of claims at trial. In late 2015, Plaintiffs' attempt at a mail survey failed, and they moved for, and were granted, a modification of the scheduling order to attempt a second survey, this one to be conducted in person, door-to-door. For reasons discussed below, that survey also failed. Accordingly, Plaintiffs filed the instant motion for a second modification of the scheduling order to enable a third survey attempt. Defendants contend that Plaintiffs have been neither reasonable nor diligent in their efforts and therefore are not entitled to modification of the scheduling order.
More specifically, Plaintiffs filed the instant motion to modify the scheduling order on May 26, 2016. (ECF No. 428.) The motion asks the court to reopen discovery and allow Plaintiffs until September 2, 2016 to complete a new survey and until September 23, 2016, to disclose expert reports.
An opposition to the motion was filed by Defendant Delano Farms Company, and a separate opposition was filed by Defendants T&R Bangi Agricultural Services, Inc. and Cal-Pacific Farm Management, L.P., on June 10, 2016. (Opp'ns, ECF Nos. 433, 436.) Plaintiffs filed a reply on June 24, 2016. (Reply, ECF No. 443.) The parties appeared before the Court for oral argument on July 1, 2016. The matter was submitted and stands ready for adjudication.
II. Legal Standard
A. Good Cause Standard
The Court has broad discretion in supervising the pretrial phase of litigation. C.F. v. Capistrano Unified Sch. Dist., 654 F.3d 975, 984 (9th Cir. 2011); Zivkovic v. S. Cal. Edison Co., 302 F.3d 1080, 1087 (9th Cir. 2002). Generally, the Court is required to enter a pretrial scheduling order within 120 days of the filing of the complaint. Fed. R. Civ. P. 16(b). The scheduling order "controls the subsequent course of the action" unless modified by the Court. Fed. R. Civ. P. 16(e). Orders entered before the final pretrial conference may be modified upon a showing of "good cause." Fed. R. Civ. P. 16(b); see also Johnson v. Mammoth Recreations, 975 F.2d 604, 608 (9th Cir. 1992).
Rule 16(b)'s "good cause" standard primarily considers the diligence of the party seeking the amendment. Coleman v. Quaker Oats Co., 232 F.3d 1271, 1294-95 (9th Cir. 2000); Johnson, 975 F.2d at 609. The district court may modify the pretrial schedule "if it cannot reasonably be met despite the diligence of the party seeking the extension." Fed. R. Civ. P. 16 advisory committee's notes (1983 amendment); Johnson, 975 F.2d at 609. The Ninth Circuit has not described the diligence standard in detail. However, the determination of "good cause" "focuses on the reasonable diligence of the moving party." Noyes v. Kelly Servs., 488 F.3d 1163, 1174 n.6 (9th Cir. 2007) (citing Johnson, 975 F.2d 604, 609 (9th Cir. 1992)).
Additionally, carelessness is not compatible with a finding of diligence and offers no reason for a grant of relief. Johnson, 975 F.2d at 609. Although the existence or degree of prejudice to the party opposing the modification might supply additional reasons to deny a motion, the focus of the inquiry is upon the moving party's reasons for seeking modification. Id. (citing Gestetner Corp. v. Case Equip. Co., 108 F.R.D. 138, 141 (D. Me. 1985)). If the moving party was not diligent, the Court's inquiry should end. Id.
B. Agency
This inquiry touches on the issue of whether the actions of Plaintiffs' consultants (including expert witness and survey consultants) and attorneys should be imputed to Plaintiffs when determining if good cause exists for modification of the scheduling order.
The rules governing the attorney-client relation are "founded on the rules governing the relation of principal and agent." Moving Picture Etc. Union v. Glasgow Theaters, Inc., 6 Cal. App. 3d 395, 403-404 (1970) (quoting Fidelity & Casualty Co. v. Abraham, 70 Cal. App. 2d 776, 783 (1945)). "[N]otwithstanding the lack of express or apparent authority in the attorney, his act is binding on the client if the latter ratifies it or accepts the benefits of the attorney's acts." Id. Accordingly, it is without question that the acts of an attorney are binding on the client and are relevant to a diligence inquiry. See, e.g., Haeger v. Goodyear Tire & Rubber Co., 793 F.3d 1122, 1135 (9th Cir. 2015) (a client "is deemed bound by the acts of [its lawyers] and is considered to have 'notice of all facts, notice of which can be charged upon the attorney.'") (quoting Link v. Wabash R. Co., 370 U.S. 626, 634 (1962)).
There is little law regarding imputation to the client of actions of consultants hired and supervised by counsel. There is, however, no reason to think such relationships merit deviation from common law notions of agency and respondeat superior. See, e.g., Restatement 3d, Agency, § 2.01 (Actual Authority: An agent acts with actual authority when, at the time of taking action that has legal consequences for the principal, the agent reasonably believes, in accordance with the principal's manifestations to the agent, that the principal wishes the agent so to act.); § 2.04 (Respondeat Superior: An employer is subject to liability for torts committed by employees while acting within the scope of their employment.). Indeed, allowing a client to insulate himself from the actions of his attorney agent, or the attorney's agent, simply because the attorney worked through a sub-agent consultant to accomplish tasks for the benefit of the client would defeat the well-established principles of imputed agency liability.
Accordingly, the Court will consider the actions, and inaction, of Plaintiffs' counsel and expert witness/consultant and those hired by them in determining whether Plaintiffs have shown diligence.
III. Relevant Facts
A. Events Precipitating the Most Recent Failed Survey
On April 19, 2014, the Court allowed the parties to proceed with merits discovery as they saw fit, and ordered them to report the status of their efforts on or before March 12, 2015. (ECF No. 330.) Plaintiffs were put on notice early that any proposed survey evidence would be closely scrutinized for admissibility and reliability. (See Decl. of Mario Martinez, Ex. A, ECF No. 443-2 at 1-2 (Plaintiffs' counsel advising Dr. Roberts to anticipate all the ways he would defend the survey from attacks as to the scientific protocol used and its validity).)
Upon the Parties' request, the Court extended the deadline for the joint status report from March 12, 2015, to June 12, 2015. (ECF No. 365.) On June 12, 2015, the parties filed a joint status report. (ECF No. 372.) Defendants notified the Court that their efforts to conduct a pilot study of a random cross section of the class members were frustrated by the low response rate -- only 23% of the selected class members were willing to provide testimony. (Id.) Defendants determined that they could not guarantee that the results of such a small sampling would be representative of the whole, and made the decision to abandon that effort. (Id. at 2-9.) Despite the fact the discovery period had been open for over a year, Plaintiffs had yet to conduct their mail survey, and requested the Court allow them until September 25, 2015 to do so. (Id. at 13.) The Court granted Plaintiffs' request, and extended the deadline for the completion of Plaintiffs' mail survey until September 25, 2015. (ECF Nos. 373, 377.)
Plaintiffs attempted a mail survey in late summer of 2015, nearly a year and a half after the discovery period had opened. The response rate to the survey was unexpectedly low (3%, i.e., three completed responses per one hundred surveys). (ECF No. 393-4.) Defendants declined Plaintiffs' invitation to stipulate to extend the survey deadline, and so Plaintiffs moved to modify the scheduling order to provide additional time so they could conduct the survey in a different manner.
Specifically, on October 15, 2015, Plaintiffs requested a 120-day extension of the deadline to implement a new survey based on "tightly controlled direct interviews." Upon a showing of good cause, the Court granted the motion on January 14, 2016, and, on January 28, 2016, issued an amended scheduling order. (ECF Nos. 405, 407.) The amended scheduling order gave Plaintiffs until January 23, 2016, an additional 120 days from the previous September 25, 2015 deadline, to complete the new survey. (Id.)
The Court granted the motion to modify the scheduling order based on the representation from Plaintiffs' expert, Dr. William Roberts, that the original questionnaire process, though scheduled for implementation during the 2015 peak harvest season to ensure maximum response, had a response rate of only three percent; Roberts had projected a response rate close to 30 percent based on his prior experience. (Roberts Decl., ECF No. 393-2 at ¶¶ 12-13.) Roberts advised counsel that more representative results could be obtained through a "tightly controlled, direct interview process." (Roberts Decl., ECF No. 393-2 at ¶ 14.) He expected the direct interview process to take eight to ten weeks, and requested an additional sixty days thereafter to compile and analyze the data and create a final report. (Id. at ¶¶ 15-16.)
The Court found that Plaintiffs' expert had provided sufficient justification for delaying the mail survey until peak harvest season and that Plaintiffs were diligent in initiating their request to modify the scheduling order promptly upon learning of the problems with the mailed survey results. Accordingly, the Court granted the motion to modify the scheduling order.
While Plaintiffs' counsel was seeking this modification of the scheduling order, Plaintiffs' expert was securing survey administrators to implement the door-to-door survey. He discussed the project with California Survey Research Services ("CSRS"). (Decl. of Anna Walther, Ex. J, ECF No. 429-2 at 266-268.) On September 22, 2015, CSRS in turn contacted Bakersfield Market Research ("BMR") about actually conducting the door-to-door interviews. (Depo. of Margarita Rodriguez at 221:10-16.)
On October 1, 2015, CSRS provided Roberts a bid proposal for the project. It called for CSRS to coordinate door-to-door, face-to-face interviews of 300 or 400 absent class members, with the actual interviews to be done by a "partnering firm." (Walther Decl., Ex. J, ECF No. 429-2 at 266-268.) CSRS stated that it would be responsible for "overseeing the project management, Spanish language interviewing, training and quality control." (Id.) Additionally, CSRS provided an option for validating 20% of the interviews by follow-up phone calls to verify responses reported by the survey takers. (Id.) If such a validation raised questions, CSRS was to validate additional interviews and replace those that were problematic. (Id.) The CSRS employees principally responsible for managing the Delano Farms survey were Margarita Rodriguez, vice president of operations for CSRS, and Al Noiwangmuang, vice president for online data collection processing for CSRS. Both Rodriguez and Roberts confirmed that the proposal set forth the terms under which CSRS agreed to participate in the Delano Farms study. (Rodriguez Depo., 231:8-24; Roberts Depo., 156:15-18.)
In providing the bid on October 1, 2015, CSRS advised Roberts that it would need six to eight weeks to complete 300 interviews, and an additional two weeks if 400 interviews were requested. (Decl. of Greg Durbin, Ex. L, ECF No. 434-12.)
Roberts did not authorize CSRS to proceed with the survey until November 2, 2015. (Maricruz Estrada Depo. 180:1-181:10; Decl. Greg Durbin, Ex. J.) On November 10, 2015, Rodriguez traveled to Bakersfield, met with BMR personnel, provided them iPads with which to administer the survey, and provided training on the survey software. (Rodriguez Depo., 108:16-23, 240:3-7; Depo. Ex. 49; Durbin, Ex. N.) The iPads containing the survey software were wirelessly linked to a server at CSRS. (Rodriguez Depo., 243:13-21; Noiwangmuang Depo. 207:15-209:14.) When a survey was completed, the information on the iPad was uploaded to a CSRS server. (Id.) The data received from the iPads included: the responses to the survey questions; the respondent's signature; the respondent's phone number, if provided; the GPS coordinates where the survey was initiated; the start time and length of the interview; and the language used for the survey. (Noiwangmuang Depo. 29:8-18, 109:15-25, 111:17-21, 225:7-226:24; Durbin Decl., Exs. D, M.)
B. Administration of and Failure to Validate Survey
From November 13, 2015 until November 28, 2015 (notwithstanding Thanksgiving Day), BMR administered the survey. (See Walther Decl., Ex. AA.) During this time, employees from BMR were assumed to be traveling to the residences of named absent class members and obtaining their answers to the survey questions. Most of the addresses were located in Bakersfield and smaller surrounding communities, including Delano, McFarland, Wasco, Shafter, Lamont, and Arvin. (See Walther Decl., Ex. YY.) According to BMR, its employees attempted to contact 467 class members, and were able actually to complete 305 surveys, during this roughly two-week period. (Decl. Fink, ¶ 13; Walther Decl., Exs. N, W.)
CSRS was receiving the results of these surveys as they were being administered. It checked with BMR from time to time, comparing its computer records against BMR's manual records as to the number of completed interviews. (Rodriguez Depo., 282:16-283:8.) CSRS and Roberts communicated on November 17 and 18, 2015, while BMR was conducting the survey, regarding the script to be used for telephonic validation. (Rodriguez Depo., 257:17-25; Depo. Ex. 57; Roberts Depo., 128:3-129:16, 279:6-17; Depo. Exs. 100 & 112.)
Roberts confirmed the validation questions were acceptable, and Rodriguez agreed validation calls would begin the next day. The script for the validation calls asked respondents to confirm their identity, state whether they had recently completed a survey about their employment with Defendants, and confirm their answers to a few questions on the survey. (Rodriguez Depo. Ex. 100, p. 3.) As contracted, CSRS was to randomly contact by phone 20% of the respondents, i.e., roughly 60 of the 305 people surveyed.
Validation calls commenced on November 19, 2015, less than a week after BMR started administering the survey. (Roberts Depo. Ex. 112.) CSRS quickly identified substantial problems with the validation study: It was unable to validate any of the responses. CSRS then unilaterally decided to expand the validation study from 20% to 100% of the reported 305 survey respondents. (Rodriguez Depo., 273:6-13.) However, 136 of the respondents had no recorded telephone number. Of the remaining 169 with phone numbers listed, the validation study produced: 7 busy numbers; 32 answering machines; 8 business numbers; 2 call back "non-specified"; 1 call back specified; 1 complete validation; 83 disconnected numbers; 6 modem/fax numbers; 10 no answer; 17 wrong numbers; and 3 duplicates. (Rodriguez Depo., 279:23-280:11; Depo. Exs. 60 & 62; Walther, Ex. Y.) Thus, by early December 2015, with a single exception, CSRS had failed in its efforts to validate by phone any of the purported 305 responses. (Rodriguez Depo., 248:2-7.)
CSRS communicated the results of the validation efforts to Roberts at that time. (Rodriguez Depo., 248:8-249:5, 277:10-279:3.) The communication is corroborated by an e-mail from Rodriguez to Roberts on November 30, 2015, advising him that the door-to-door interviews were complete, but that they needed to discuss concerns regarding the validation calls. (Rodriguez Depo., 284:16-285:3; Depo. Ex. 61; Durbin Decl., Ex. V.) Later that day, after speaking to Roberts, Rodriguez sent an email to BMR telling it to stand by in light of the difficulty in validating interviews. (Rodriguez Depo., 283:14-23; Depo. Ex. 5, Durbin Ex. 4.) She explained that Roberts needed to discuss the results with the legal team to see if further work was required. (Id.) Rodriguez suggested to Roberts the possibility of conducting the validation study in person rather than by telephone. (Rodriguez Depo., 291:16-21, 293:22-294:3.) More than a week later on December 8, 2015, Rodriguez asked Roberts by email if there were updates on how to handle the validation issues. (Rodriguez Depo., 168:18-22, 173:16-24; Depo. Ex. 50; Durbin Decl., Ex. O.) Ultimately, Roberts advised CSRS there was nothing further CSRS needed to do. (Rodriguez Depo., 290:21-291:2.)
Roberts did discuss the validation failure with Plaintiffs' counsel. (Roberts Depo., 140:9-24.) He suggested at least some optional means of validating the study, such as by door-to-door interviews, but left the decision as to how to proceed with counsel. Plaintiffs' counsel admit they discussed the matter with Roberts in early December 2015. (ECF No. 429 at 9.) Roberts and Plaintiffs' counsel decided collectively "to live with the validation we have." (Roberts Depo., 141:12-17, 144:11-20.) Counsel told Roberts that difficulty contacting farm workers by telephone was understandable and consistent with counsel's prior experience dealing with migrant farm workers in low-income Latino communities. (Id.)
In short, notwithstanding the colossal validation failure and offers to undertake additional verification, Plaintiffs decided to use the existing data without further inquiry. Roberts proceeded to analyze that data and provide Defendants with his expert report based on its results.
It was not until Defendants began expert discovery that the problems with validation, and the resulting suggestion that the results may have been falsified, were exposed to the light of day. Specifically, during March 30, 2016, depositions of BMR employees, Defendants discovered that BMR effectively used only one interviewer to conduct all of the surveys, rather than the four to five proposed. At CSRS's training of survey takers, BMR had had two or three other individuals attend and submit nondisclosure agreements. (Armwood Decl., 28:7-30:25.) Those individuals were paid for attending the training, but did not participate in any of the interviews. (Id.) The deposition also revealed that neither BMR nor any of its employees had ever before conducted door-to-door interviews. (Estrada Depo., 183:7-16.)
Maricruz Estrada performed most of the interviews. Timothy Atwood drove in the same car and accompanied her, but as he spoke little Spanish, he was unable to assist with the interviews. (Armwood Depo. at 33:18-24, 54:22-55:8.)
CSRS employees were deposed at the end of April. In light of the questions raised and the explanations given, Plaintiffs' counsel undertook a review of the survey data that had been in their, or at least their survey company's, possession since the end of the previous year. Upon reviewing the time stamps and GPS locations recorded, it became apparent that results had been fabricated: interview times overlapped, and the GPS locations were not near the respondents' residences, but instead were grouped at or near fast food eateries (e.g., Starbucks, McDonald's) or public places (e.g., libraries, school parking lots). On May 5, 2016, Plaintiffs notified Defendants of these latter findings. On May 23 or 24, 2016, Plaintiffs formally withdrew Roberts' expert report based on the questionable survey results and asked Defendants to stipulate to modify the scheduling order to allow time for another survey. Defendants declined to so stipulate. Accordingly, on May 26, 2016, Plaintiffs filed the instant motion to modify the scheduling order to provide time for Plaintiffs to undertake a third survey using procedures not then, or even yet, specifically identified.
Defendants' experts identify such actions of survey takers as "curbstoning," reportedly a common term used to describe a survey taker who sits at a curb and fills out surveys himself rather than going door to door and soliciting genuine responses. (Decl. Arlene Fink at ¶ 9, ECF No. 439.)
IV. Arguments
A. Plaintiffs' Arguments Summarized
Plaintiffs assert that there is good cause to extend the discovery period for taking survey evidence because BMR's acts were fraudulent and concealed from Plaintiffs through March 2016. Plaintiffs acknowledge that CSRS was in possession of the GPS and time-stamp data that revealed the anomalies, but explain that CSRS failed to properly review the data and discover BMR's "curbstoning." Plaintiffs argue further that they were not able to interpret the data from CSRS until after Defendants' deposition of Al Noiwangmuang, who testified about what CSRS did with the collected data. His deposition responses motivated Plaintiffs' counsel to review the data the following week and determine that the survey had been compromised. Promptly after the defense rejected their requests to postpone depositions and to stipulate to a modification of the scheduling order, Plaintiffs filed the instant motion.
Though the Court accepts that Plaintiffs did not look into these matters until then, it sees no reason they could not have done so earlier. Thus, the question is whether they should have inquired earlier and, if so, whether their failure to do so reflects a lack of diligence.
In short, Plaintiffs' primary contention is that they acted reasonably and diligently in reliance on their hired consultants and agents and should not be charged with knowledge of what appears to have been fraud actively committed and concealed by sub-agent BMR, nor should they be charged with knowledge of CSRS's failure to quality-check BMR's work or Roberts' failure to supervise those he had doing the survey work.
B. Defendants' Arguments Summarized
Defendants dispute Plaintiffs' claim of reasonable diligence, asserting that Plaintiffs had been placed on notice of serious problems with the survey as early as the first part of December and yet did nothing to investigate or promptly address the problems. They contend that the wholesale failure of the validation process was sufficient to alert any reasonable person to the possibility, if not the likelihood, of very significant problems with the survey. Rather than undertake other methods to validate the study, or direct CSRS to do so, Plaintiffs did nothing. Defendants note that Plaintiffs' primary representative in this process, Roberts, spent little time monitoring the survey and took no real steps to ensure its quality. They claim Roberts' stated reasons for his inaction do not logically explain the inaction or show any realistic measure of diligence. They argue, in effect, that Plaintiffs cannot escape their obligation of diligence by arguing that decisions in these regards were left in the hands of their attorneys, who delegated survey responsibility to Roberts, who delegated much of his responsibility to CSRS, which delegated the actual surveying to BMR, with each disclaiming responsibility for the acts of its respective sub-agents in perpetrating the alleged fraud or at least in failing to supervise and catch the alleged fraud.
V. Framework for Diligence Inquiry
It is important to note the context in which the Court's analysis of Plaintiffs' diligence takes place.
Plaintiffs argue that they should not be held accountable for what they characterize, reasonably, as out-and-out fraud on the part of BMR. Instinctively, the Court is inclined to agree. See, e.g., Restatement 3d of Agency, § 7.07. But the Court need not resolve that issue. The Court's focus is not on the question of whether a principal should be charged with the fraud of its agent when that fraud is concealed from the principal, but rather on the question of whether a principal can abdicate responsibility to supervise its agents, or to ensure its agents supervise sub-agents, and then rely upon that abdication to justify its failure to discover fraud that was readily apparent from information actually in the possession of, or at least readily available to, the principal's prime, non-fraudulent agents. More specifically, when, in December, counsel and their agent Roberts were confronted with profound questions raised by the abject failure of the validation study, they consulted one another and decided to do nothing. Since both were actively involved in making this decision, there is no need to question whether the fraud of BMR should be imputed to Plaintiffs. The failure to act is the alleged lack of diligence, and that failure to act is directly attributable to Roberts, to counsel, and hence to Plaintiffs.
Comments to Section 7.07 state: "[E]mployee's tortious conduct is outside the scope of employment when the employee is engaged in an independent course of conduct not intended to further any purpose of the employer. ... When an employee commits a tort with the sole intention of furthering the employee's own purposes, and not any purpose of the employer, it is neither fair nor true-to-life to characterize the employee's action as that of a representative of the employer. The employee's intention severs the basis for treating the employee's act as that of the employer in the employee's interaction with the third party."
VI. Analysis
Several alarms sounded to alert Plaintiffs to problems and the need to act. Some were faint, but one was very loud. Regardless of volume, they were loud enough that it took Defendants only a few short weeks to hear and respond to them and identify their cause. Plaintiffs did not respond at all until after Defendants effectively forced them to do so. The question here is whether that failure to respond reflected diligence. The Court concludes it did not.
A. The Alarms
Plaintiffs, through their expert and his consultants, designed the door-to-door study and the methods for validating its results. Plaintiffs were well aware that the study would be rigorously reviewed and challenged by Defendants. (See Martinez Decl., Ex. A, ECF No. 443-2 at 1-2.) Accordingly, Plaintiffs opted to have 20% of the survey respondents called back and asked validating questions. Plaintiffs chose that method in lieu of other available validation methods. Presumably they did so for good reason, i.e., to satisfy themselves (and presumably the Court) of the validity of the survey results and to be able to respond to Defendants' challenges to those results.
The failure of that validation should have set off a very loud alarm and signaled a critical event for Plaintiffs: as noted, it was a total failure and necessarily called into question the survey's design, results, and reliability. Only one of the more than 300 respondents could be contacted by phone. Roberts thought that they might need to attempt to call 30% of the respondents to reach and verify 20% of the responses. Reportedly, only about half of the respondents even provided a phone number. Of the 166 who supposedly did so, 100 of the numbers did not correspond to any survey respondent's phone. CSRS's efforts to determine whether the roughly 60 remaining numbers were valid failed. Roberts himself was "surprised" by the failure of validation. (Roberts Depo., 260:23-231:2.)
106 of the numbers were either disconnected, to a modem or fax machine, or a wrong number. (Walther Decl., Ex. Y.)
At the very least, the failure of validation, coming only a few days or a week after the purported interviews, should have raised questions in the mind of any reasonable person as to why the chosen validation method could not be completed, whether an alternative validation study should be undertaken, and whether the data, including backup GPS data, should be scrutinized more carefully. It certainly was in Plaintiffs' interest to do what could be done to validate survey responses they intended to rely upon at trial and which they knew would be closely examined, to say the least, by opposing counsel and their experts.
Plaintiffs responded to the failure of the validation process by ignoring its results or, at best, trying to rationalize them away, despite the fact that they had designed the validation process and made it an essential element of their case. These rationalizations left them with a failed validation for which they pursued no alternative. They struthiously decided to accept the survey results, have their expert extrapolate from them and opine on them, and present the survey and the expert's analysis to Defendants. It was not until several months later, when the defense in very short order brought the deficiencies to the fore via limited discovery, that Plaintiffs finally delved more deeply and, based upon review of what had been within their reach for months, asked for a redo and more time to do it.
Other, albeit less noisy, alarms went off as well:
1. This survey produced a reported 65% response rate over a two-week period. Prior efforts by Plaintiffs generated only a 3% response. Defendants' deposition subpoenas produced responses from only about 20% of those identified. Surely something, enough to at least raise eyebrows, was likely amiss.
2. This survey reportedly contacted 400 people, 306 of them successfully, in a mere two-week period, one third or less of the time that had been projected.
3. The validation method the survey designers had chosen did not work. The designer took it upon itself to attempt 100% validation. That did not work either. The survey designer then recommended alternatives. Surely there was a reason for its original validation plan and for its recommendation of alternatives. Did Plaintiffs ignore the warnings implicit therein when they rationalized away those reasons and recommendations?
Plaintiffs contend that their actions and rationalizations were reasonable.
Defendants disagree. In support of its opposition to Plaintiffs' motion, Delano Farms presents a declaration from Dr. Arlene Fink, a professor of medicine and public health at the University of California, Los Angeles, and president of a survey research company. (See Fink Decl., ECF No. 439.) She holds a Ph.D. in education, with a specialization in research methodology, and has spent most of her career designing and evaluating surveys. (Id.) Dr. Fink opines, consistent with what the Court believes to be common knowledge, that proper survey practice includes adoption of measures to validate the survey results. (Id. at ¶ 23.) She defines validation as confirming that subsequent attempts to gather the same information produce the same results as the original survey. (Id.) Discrepancies in the validation process necessitate an investigation into their cause; the failure to investigate such discrepancies is not professionally appropriate. (Id.) Fink approved Plaintiffs' plan to validate 20% of the results by phone. (Id. at ¶ 24.) However, she characterized the decision to ignore the failure of validation as reckless, inconsistent with industry standards, and "indefensible." (Id. at ¶¶ 24-25.) Fink suggested two reasonable reactions to the validation failure: implementing another method to validate the survey or redoing the survey. Either option should have identified the alleged fraud, or at least produced truthful survey results, before the January 23, 2016 deadline.
Plaintiffs argue that they were dealing with other complex circumstances when the failed validation presented itself. Their original mail survey had failed and the initial September 25, 2015, discovery deadline had passed; Plaintiffs' motion to modify the scheduling order to allow additional time to complete a new survey was still pending while the new door-to-door survey was underway. Thus, discussions and decisions about investigating the survey results were occurring at a time when Plaintiffs did not know if the Court would allow the results to be submitted at all.
To the extent these latter claims have significance, the uncertainty was brought on by Plaintiffs themselves. Merits discovery had been open for over a year. Nothing had prevented Plaintiffs from initiating surveys earlier. By waiting until the end of the discovery period, which already incorporated one extension at Plaintiffs' instigation, Plaintiffs left themselves no time to address problems or do additional surveying if needed, as indeed it was.
B. Would Diligence Have Been Productive?
Plaintiffs argue that there was no reason for them to have investigated or tried to validate further. They contend that they reasonably relied upon Roberts, who reasonably relied on CSRS to ensure the quality of the data collected. They argue that it was reasonable to attribute the validation failure to class members' reticence about talking to strangers given their immigration and legal status.
The Court disagrees.
1. Plaintiffs' Counsel Had Actual Knowledge of the Failure of Validation
Again, the issue of whether in the normal course of events all of the actions and inactions of the various sub-agents of Plaintiffs should be attributed to the Plaintiffs need not be resolved here. Plaintiffs and everyone else down the line, at least through CSRS, apparently thought that a validation process was necessary to the integrity of the results and knew that the validation process used failed completely to validate; it was an abject failure. Yet none of them did anything substantive in response to that failure even though they were sitting on GPS data that documented the fraud of their survey takers.
Shared knowledge of this failure cannot be denied. When the attempt to validate 20% of the surveys failed, CSRS took it upon itself to call all of the respondents. When that too failed, CSRS asked Roberts how to proceed. (Walther Decl., Ex. O; Roberts Depo. 132:20-135:10.) A CSRS e-mail to BMR on November 30, 2015, noted that "getting validations was an important piece of the study" and that more interviews or in-person validations might be necessary. (Walther Decl., Ex. V.) CSRS suggested door-to-door validations. (Id.) Roberts thought that would be "fairly expensive," and not worth the price. (Id.)
2. Plaintiffs Did Nothing with this Knowledge
Despite the failure of the validation process and suggestions that alternative steps be taken in response, Plaintiffs did nothing. They did not attempt alternative validation, undertake a renewed survey, or investigate the reasons for the failure. They did not take CSRS up on its offer to undertake further review of the survey data.
Roberts' discussion with counsel about the failed validation and the availability of other options led to a decision not to take further action. Roberts believes that he told counsel of CSRS's offer of door-to-door validation, but counsel declined. (Id. at 271:8-272:13.)
During the hearing on this motion, Plaintiffs argued that no such action was necessary because there was no reason to suspect fraud had taken place. Fraud or not, the validation Plaintiffs had contracted for, even when extended to all respondents, had failed miserably. Plaintiffs cannot credibly say that they elected, for good reason, to have a validation process and then, when that process failed in its essential purpose, claim, in essence, that it was unnecessary to begin with. Clearly it was a warning sign calling for deeper inquiry into its cause. This is particularly so where, as here, falsification and "curbstoning" are known problems for which safeguards exist and are warranted. (Fink Decl., ¶ 9.) Plaintiffs' assumption that the survey results were valid was not reasonable.
Similarly, Roberts now claims that further validation was unnecessary because survey responses included the respondents' names, locations, and signatures, and because Roberts believed and/or assumed that CSRS was spot-checking the GPS data. (Roberts Depo., 276:3-277:9.) Roberts made no effort to confirm this latter belief or assumption. (Id. at 122:8-123:22.) The inclusion of the names and addresses of the alleged survey respondents provides no reassurance; that information was provided to the interviewers who apparently falsified the responses. Fraudulent pollsters certainly are capable of forging respondents' signatures. Roberts also found an indicator of reliability in the consistency of the data, which showed greater tool expenditures for those who worked both pre-harvest and harvest. (Reply at 16.) It is not mere hindsight to suggest that a potentially fraudulent surveyor might well be able to reason out such logical responses and falsify surveys accordingly.
3. Plaintiffs' Agent Negligent?
Counsel apparently delegated supervision of the survey to Roberts. He was counsel's primary contact with CSRS. However, Roberts had only minimal communication with CSRS about hiring BMR to conduct the interviews; indeed, he had only minimal communication with CSRS on any topic during the survey period. (Bigelow Decl., Ex. D, ECF No. 437-1 at 23 (Roberts' billing records show little to no time spent consulting with CSRS during the implementation of the survey).) Further, as Defendants argue, the Court believes Roberts should have known from past studies how unlikely it was that the survey could have been completed, and completed with such an unusually high response rate, in a two-week period; CSRS had estimated it would take six to eight weeks to complete. (Fink Decl. at ¶¶ 13-18.)
Roberts counters in his declaration that there were ten "brief" e-mails and "numerous phone contacts" with CSRS during the survey period, but the contacts were so de minimis that Roberts did not charge for the time spent. (Roberts Decl. at ¶ 19, ECF No. 443-3.) To the Court's mind, this supports the conclusion that Roberts had little to do with administration of the survey.
Roberts did not check the GPS data or confirm that anyone else was checking it. Roberts "thought" CSRS was confirming the GPS coordinates of interviews, but was not sure. (Roberts Depo., 122:22-124:21.) There is no evidence that Roberts questioned CSRS regarding its review of the GPS data or requested the data to review himself, even though he was concerned when he was told of the failure of the validation study. (Roberts Depo. at 136:1-14.) CSRS did perform a "spot check" of a limited amount of the data, but noted no inaccuracies. (Rodriguez Decl., ¶ 21.) Nevertheless, discussion with counsel led to a decision not to take further actions to validate. As noted, Roberts believes that he told counsel of CSRS's offer of door-to-door validation, but counsel declined. (Id. at 271:8-272:13.)
4. Irregularities Discoverable?
Plaintiffs argue, in effect, that even had they dug deeper into the validation problems they could not have discovered the real irregularities, i.e., the alleged fraud of BMR, until Defendants deposed CSRS employees and pinpointed misrepresentations.
The Court's simple response is that it does not find credible the claim that Plaintiffs, who had access to and control of all the salient information for months, could not with reasonable diligence (and, one would expect, equivalent motivation) have discovered what Defendants discovered in a few short weeks of discovery and adversarial questioning of Plaintiffs' agents and sub-agents.
It is in any event inescapable that by December 1, 2015, Plaintiffs had access to and control of GPS location and timing data which readily revealed major problems and even outright fraud. (Walther Decl., Ex. Y, ECF No. 429-2 at 450-552.) It was not reviewed.
C. Conclusion
Rule 16(b)'s "good cause" standard primarily considers the diligence of the party seeking amendment. Plaintiffs have not shown diligence in conducting discovery. Rather than investigate the failed phone validation, Plaintiffs elected to "live with" what they had. Rather than test the survey results further, they assumed that validation had failed for innocuous reasons. In so doing, Plaintiffs denied themselves the opportunity to discover the anomalies and undertake to obtain usable survey evidence before the discovery deadline. Failing to follow up on a failed survey validation does not meet the standard of reasonable diligence. Plaintiffs present several justifications for not doing so, but the Court finds them unpersuasive.
For all the reasons set forth above, the Court is unable to conclude that Plaintiffs acted reasonably and diligently in delaying their motion for further modification of the Scheduling Order. That finding alone warrants denial of their motion. There is no need to delve into issues of whether further delay of this seven-year-old case would prejudice Defendants (beyond perhaps invoking the old saw about justice delayed being justice denied). Nor is there need to note that, notwithstanding the inferences one might draw from the failure of three attempts to survey representative class members, Plaintiffs have proposed nothing that would suggest a greater likelihood of success even if more time were given.
The lack of diligence alone is compelling and determinative. Plaintiffs' motion to modify the scheduling order is denied.
VII. Sanctions
Delano Farms asks that, if the Court grants Plaintiffs' motion, it award Delano Farms monetary sanctions to recoup the fees and costs incurred in discovery into Plaintiffs' now-withdrawn study. Federal Rule of Civil Procedure 16(f) authorizes sanctions for violations of pretrial orders and states in relevant part:
On motion or on its own, the court may issue any just orders, including those authorized by Rule 37(b)(2)(A)(ii)-(vii), if a party or its attorney ... fails to obey a scheduling or other pretrial order. Fed. R. Civ. P. 16(f).
Imposing Fees and Costs. Instead of or in addition to any other sanction, the court must order the party, its attorney, or both to pay the reasonable expenses--including attorney's fees--incurred because of any noncompliance with this rule, unless the noncompliance was substantially justified or other circumstances make an award of expenses unjust.
Defendants seek sanctions only in the event the Court grants Plaintiffs' motion to modify the scheduling order. Since the motion to modify is denied, Defendants' request is moot.
But even if the contrary were true, the Court equates Plaintiffs' actions with a constructive, perhaps compelled, decision not to introduce that survey evidence which, if it were to be produced at all, had to have been produced by the January 23, 2016 deadline. This is not seen as a sanctionable refusal to comply with a deadline or a failure to produce that which they had.
In the same vein, the Court notes that while Plaintiffs' inaction was the antithesis of diligence, it appears not to have been motivated by bad faith, ill will, or other sanctionable activity.
No sanctions shall be awarded.
VIII. Order
IT IS HEREBY ORDERED that Plaintiffs' Second Motion for a Modification of the Scheduling Order is DENIED.
IT IS FURTHER ORDERED that each of the parties is to submit, within fourteen days of the date of this Order, a status report setting forth its views as to how the case shall henceforth proceed. The Court will schedule further proceedings, or a conference to discuss further proceedings, by way of a separate, later order.
IT IS SO ORDERED.
Dated: July 22, 2016
/s/ Michael J. Seng
UNITED STATES MAGISTRATE JUDGE