Opinion
No. 11-16146 D.C. Docket No. 1:09-cr-483-ODE-AJB-1
08-16-2013
[DO NOT PUBLISH]
Appeal from the United States District Court
for the Northern District of Georgia
Before WILSON and COX, Circuit Judges, and VOORHEES, District Judge.
VOORHEES, District Judge:
Honorable Richard Voorhees, United States District Judge for the Western District of North Carolina, sitting by designation.
Defendant-Appellant Dr. Rajashakher Reddy ("Dr. Reddy") was named in a thirty-seven count Indictment alleging wire fraud in violation of 18 U.S.C. § 1343 (Counts 1-25), mail fraud in violation of 18 U.S.C. § 1341 (Counts 26-32), health care fraud in violation of 18 U.S.C. § 1347 (Counts 33-36), and falsifying records in a federal investigation in violation of 18 U.S.C. § 1519 (Count 37). Following a seven-day trial, a jury convicted Dr. Reddy of all offenses except five wire fraud counts (Counts 5, 12, 15, and 20-21). Dr. Reddy was sentenced to fifty-four months' imprisonment.
On appeal, Dr. Reddy contends that the trial judge committed reversible error in two of his evidentiary rulings, entitling him to a new trial. More specifically, Dr. Reddy asserts that the trial judge abused his discretion by 1) excluding the proposed defense expert testimony of Dr. Benjamin Sacks pursuant to Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579, 113 S. Ct. 2786 (1993), and 2) permitting Government witness Mark Bronkalla to testify on matters Dr. Reddy suggests were beyond the scope of the expert testimony previously identified in the Government's Rule 16(a)(1)(G) summary. Dr. Reddy also challenges the legal sufficiency of the health care fraud counts as alleged in Counts 33 through 36 of the Second Superseding Indictment. We reverse and remand.
Absent the need to distinguish between the original, First Superseding, or Second Superseding Indictments, the Second Superseding Indictment will be referred to within this opinion as the "Indictment."
I.
Dr. Reddy, a licensed and board-certified radiologist, was the owner and President of Reddy Solutions, Inc. ("RSI"), a teleradiology business in Atlanta, Georgia. The criminal charges brought against Dr. Reddy stem from his operation of RSI. The Indictment alleges that Dr. Reddy perpetrated an eighteen-month health care fraud scheme, whereby he fraudulently signed and submitted radiology reports electronically and through the U.S. Mail for "tens of thousands of patients . . . in cases where neither he nor any other RSI physician had ever reviewed and analyzed the film." Allegedly, Dr. Reddy would either sign off on a draft report prepared by a technician without independently reviewing the image himself or he would instruct another RSI employee to sign off on a report using his electronic signature.
Radiology is an area of medicine wherein a highly trained specialist analyzes various images to help the patient's treating physician diagnose the medical problem and/or develop a treatment plan. Radiology modalities include X-rays, "CAT" scans, "MRIs," and other methods for assessing parts of the human anatomy. RSI's business relied upon "teleradiology," whereby images and reports were kept in digital format and shared electronically via the Internet.
There was no Government evidence of patient harm resulting from the alleged fraud. In other words, there was no evidence that any patients were harmed as a result of misinterpreted or misread images.
The defense theory at trial was that Dr. Reddy performed the services as represented, but that remote access and other indicia that he actually did the work were not adequately reflected by RSI systems and records. As a result, evidence was presented by both sides concerning the accessibility of radiological images by RSI physicians and how access was recorded. An appreciation of the workings of RSI is required to fully understand the factual issues presented.
RSI contracted with hospitals as well as rural and smaller clinics to provide radiology services, and issued as many as 1500 to 2000 reports to its clients every day. RSI employed board-certified radiologists to read and interpret images sent for evaluation. With the exception of Dr. Reddy, RSI radiologists were compensated based upon productivity and the relative complexity of the work. RSI also employed support personnel with specialized training in radiology.
RSI did not ordinarily bill government programs or insurance companies. Rather, RSI billed its clients who in turn would bill the government entity or the patient's insurer.
The method for calculating payment to RSI doctors was known as the Relative Value Unit ("RVU"). The record reflects that RSI radiologists, as a matter of course, would select the more complex films or images to analyze because the more complex work was compensated at a higher rate.
In addition to radiologists, RSI employed non-physician technicians known as Radiology Practice Assistants ("RPAs") to conduct preliminary image review and prepare draft reports. RPAs were not qualified to diagnose or provide any final radiology assessment.
RSI's internal operating system included a software program referred to as the Picture Archival and Retrieving System ("PACS"). PACS was used to store and transmit the actual radiology images, while a second software program called "Thinair" was used to store and transmit the report dictated by the doctor. The PACS logs documented only views of an image made directly from the RSI "hub" server in Atlanta. PACS did not record when images were viewed by RSI employees from remote client locations via RSI "spoke" servers.
RSI radiologists were permitted to work remotely. On occasion, RSI provided services to a client that required an on-site radiologist. The Atlanta server was able to capture image access by radiologists working remotely, but not by radiologists working on-site at a client's facility.
In the Government's case-in-chief, evidence was presented questioning Dr. Reddy's ability to review as many images and generate as many reports as he claimed to have interpreted. Analysis of PACS access logs by Government agents documented "views" by Dr. Reddy of a mere 5,840 images compared to the 71,512 reports ultimately issued under his name. Certain of the 71,512 reports were recorded as being issued while Dr. Reddy was on an overseas flight with no internet access. In addition to the PACS logs, the Government's expert radiologist testified that Dr. Reddy's numbers far exceeded the national average and opined that it would have been physically impossible to produce the results Dr. Reddy allegedly produced. Several RSI employees testified that they observed Dr. Reddy sign off on reports so rapidly that it appeared he was affixing his electronic signature without opening the related image.
The Government's case was met with defense witnesses, including a former spouse, who testified that Dr. Reddy was known for his extraordinary work ethic. Evidence was presented to the effect that Dr. Reddy regularly reviewed radiological images for inordinate periods of time and often in extraordinary conditions (e.g., while on vacation in South America). Witnesses also presented testimony explaining why an observer who did not realize that a doctor could access images remotely, or receive reports in batches, might be legitimately concerned when watching Dr. Reddy issue reports one right after another, just as quickly as the computer would permit him to do it. The RSI transcriptionist testified that Dr. Reddy's voice might be heard on 150-plus dictated reports each day. The transcriptionist further explained that she returned her edited versions of reports to Dr. Reddy in large batches so that he could sign off on a group of dictated reports by processing one after the other.
The PACS logs were shown to be less than comprehensive and were sometimes incomplete or missing. RSI IT Director Dan Rabideau testified that he had no faith in the accuracy of the PACS logs; that the PACS logs were "junk." In addition, RPA Mike Lowery testified concerning July 4, 2007, which Lowery specifically recalled because he and Dr. Reddy set a company record that day. The PACS access logs for July 4, 2007, reported that Dr. Reddy viewed only twenty-six images compared to issuing 253 reports under his signature. According to Lowery, the twenty-six "views" figure was "ridiculous" because Dr. Reddy was "in his ear" all day long providing intense feedback on hundreds of reports. Lowery further testified that Dr. Reddy's supervision of his work included Lowery being "hammered" for mistakes in his draft reports, indicating that Dr. Reddy had reviewed and analyzed the images.
One of the arguments advanced by Dr. Reddy at trial was that he spent a significant period of time working on-site at Upson Hospital, an RSI client that preferred to have a radiologist working within the hospital. According to Dr. Reddy, his access of images from Upson through a "spoke" server was not reflected in RSI's PACS logs and was not taken into account by the Government analysts. On appeal, Dr. Reddy asserts that another weakness in the Government's "volume-based" evidence is that it failed to consider the RVU method for measuring performance.
Corroborating evidence showed that another RSI physician who had worked from Upson's facility viewed only 64 images out of the approximately 7,077 reports issued under his name during the relevant time frame. The Government presented a witness who testified that the efficiencies of working via the "spoke" servers were not as represented by Dr. Reddy.
The RVU system assigns values to the work generated based upon its relative complexity, as opposed to calculating the raw number of reports issued. In seeking to exclude the peer review study, the Government contended in part that Dr. Reddy had relatively fewer complex assignments and was typically responsible for simpler tasks not sought out by other radiologists for remuneration reasons. If Dr. Reddy's work was less complex, one would expect his review of a plain film to take less time than review of a more complex modality, such that more reports could be completed in a given amount of time. Dr. Reddy suggests that if the same allegedly disproportionate numbers were converted to RVU figures, his work would appear only slightly more prolific than that of the next highest performing RSI radiologist.
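To illustrate the distinction between a raw report count and an RVU-weighted measure, a minimal sketch follows. The complexity weights and report counts are invented for demonstration only and are not drawn from the trial record.

```python
# Illustration only: hypothetical RVU weighting of report counts.
# Weights and counts below are invented, not taken from the record.

RVU_WEIGHTS = {"plain_film": 0.2, "ct": 1.0, "mri": 1.5}  # hypothetical relative values

def rvu_total(report_counts):
    """Convert raw report counts per modality into an RVU-weighted total."""
    return sum(RVU_WEIGHTS[modality] * count for modality, count in report_counts.items())

# Radiologist A reads mostly simple plain films; Radiologist B reads fewer,
# more complex studies.  Raw counts and RVU totals rank them differently.
radiologist_a = {"plain_film": 900, "ct": 50, "mri": 10}
radiologist_b = {"plain_film": 200, "ct": 250, "mri": 100}

print(sum(radiologist_a.values()), rvu_total(radiologist_a))  # 960 reads, 245.0 RVUs
print(sum(radiologist_b.values()), rvu_total(radiologist_b))  # 550 reads, 440.0 RVUs
```

As the hypothetical figures show, a radiologist handling a high volume of simple studies can appear far more prolific by raw count yet comparable, or less prolific, once complexity is weighted.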
Dr. Reddy did not testify on his own behalf at trial. Dr. Reddy's proposed defense expert, Dr. Benjamin Sacks, was to provide testimony summarizing his "peer review" of a sampling of images which Dr. Reddy had purportedly examined and diagnosed.
The Government moved to exclude the testimony of Dr. Sacks based upon its claim that the entire peer review study was unreliable for various reasons and that the testimony would not assist the jury. The district court adopted the findings and recommendation of the magistrate judge and agreed that the proposed peer review testimony failed all three aspects of the Daubert analysis.
II.
We review the district court's decision to exclude proposed expert testimony under Daubert for an abuse of discretion. See Rosenfeld v. Oceania Cruises, Inc., 654 F.3d 1190, 1192 (11th Cir. 2011).
The Supreme Court's decision in Daubert, which applies Rule 702 of the Federal Rules of Evidence, is our seminal authority for issues concerning the admissibility of expert testimony. Rule 702 provides:

A witness who is qualified as an expert by knowledge, skill, experience, training, or education may testify in the form of an opinion or otherwise if:
(a) the expert's scientific, technical, or other specialized knowledge will help the trier of fact to understand the evidence or to determine a fact in issue;
(b) the testimony is based on sufficient facts or data;
(c) the testimony is the product of reliable principles and methods; and
(d) the expert has reliably applied the principles and methods to the facts of the case.

Fed. R. Evid. 702. When determining the admissibility of expert testimony, trial courts must consider whether:

(1) the expert is qualified to testify competently regarding the matters he intends to address; (2) the methodology by which the expert reaches his conclusions is sufficiently reliable as determined by the sort of inquiry mandated in Daubert; and (3) the testimony assists the trier of fact, through the application of scientific, technical, or specialized expertise, to understand the evidence or to determine a fact in issue.

United States v. Frazier, 387 F.3d 1244, 1260 (11th Cir. 2004) (en banc) (quoting City of Tuscaloosa v. Harcros Chems., Inc., 158 F.3d 548, 562 (11th Cir. 1998)).
It is important to note, however, that "it is not the role of the district court to make ultimate conclusions as to the persuasiveness of the proffered evidence." Quiet Tech. DC-8, Inc. v. Hurel-Dubois UK Ltd., 326 F.3d 1333, 1341 (11th Cir. 2003). Instead, "[v]igorous cross-examination, presentation of contrary evidence, and careful instruction on the burden of proof are the traditional and appropriate means of attacking shaky but admissible evidence." Daubert, 509 U.S. at 596, 113 S. Ct. at 2798. For "in most cases, objections to the inadequacies of a study are more appropriately considered an objection going to the weight of the evidence rather than its admissibility." Hemmings v. Tidyman's Inc., 285 F.3d 1174, 1188 (9th Cir. 2002). See also Quiet Tech., 326 F.3d at 1346 (stating that, "[n]ormally, failure to include variables will affect the analysis' probativeness, not its admissibility").
Moreover, with respect to admissibility, we held in City of Tuscaloosa that an expert's proffered opinion may be unpacked so that the portions admissible under Daubert are permitted and any portion that does not meet Daubert's standards is excluded. 158 F.3d 548, 564-65 (11th Cir. 1998). Therefore, when irrelevant data can "readily be separated from" that which is germane, the relevant portions can still be used by the proponent of the evidence. Id. at 567. The proponent of the evidence has the burden of establishing its admissibility by a preponderance of the evidence. See McCorvey v. Baxter Healthcare Corp., 298 F.3d 1253, 1256 (11th Cir. 2002).
Dr. Manju Morrissey, RSI's Chief Administrative Officer, took the lead in designing how the peer review would be conducted. The data were originally compiled using the "RAT-STATS" Manual, which provides guidance on the management and analysis of voluminous data. The peer review sample was obtained from RSI files dated March 2007 through March 2008. The data, consisting of 90,000 images, were separated according to the month RSI performed the work, and then broken out into multiple subsets of fifty images each. RAT-STATS required random selection and review of a minimum of forty-nine images in order for the peer review to be representative of the larger array of 90,000 images with a twenty-five percent precision rate. In order to achieve a greater confidence level, Dr. Morrissey determined that Dr. Sacks should review a minimum of 287 images. As it happened, Dr. Sacks reviewed 1060 images.
RAT-STATS was authored by the U.S. Department of Health and Human Services ("HHS") and is recommended by the HHS Office of Inspector General. The Government analysts also used RAT-STATS. The Government contended below that because Dr. Morrissey did not adopt and use all of the techniques set out within RAT-STATS, the peer review sample was necessarily skewed. Dr. Reddy's position on this issue, supported by Dr. Morrissey, was that portions of RAT-STATS were not applicable because the sample data for the peer review were finite or predetermined. This is another potential area that would have been appropriately taken up on cross-examination of Dr. Morrissey.
Because of a computer crash in April 2007, RSI reportedly had no available records preceding April 2007.
Dr. Sacks began his peer review with images from March 2008 and worked backwards. The RAT-STATS system, in conjunction with a software program designed by Dan Rabideau, made it possible for Dr. Sacks to click on his computer screen and prompt the computer to generate a randomly selected sample of fifty patient images and corresponding reports available for review. Once Dr. Sacks selected an image and report to review, a new, randomly selected set of fifty patient images and reports would be presented from the same month's data. In a week's time, Dr. Sacks had reviewed more images and reports than the forty-nine RAT-STATS required and more than the 287 Dr. Morrissey initially contemplated. Dr. Morrissey instructed Dr. Sacks to wrap up his study without reviewing records prior to October 2007.
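For illustration only, the selection mechanism just described (a fresh, randomly drawn set of fifty cases presented after each selection) might be sketched roughly as follows. The pool size, month layout, and image identifiers are hypothetical; this is neither RAT-STATS nor the program Mr. Rabideau designed.

```python
# Illustration only: a simplified sketch of presenting randomly drawn
# fifty-image subsets from a large archive, in the spirit of the process
# described above.  All identifiers and sizes are invented.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical pool standing in for the archived images, keyed by month.
pool = {month: [f"{month:02d}-img-{i}" for i in range(7500)] for month in range(1, 13)}

def draw_subset(month, size=50):
    """Randomly draw one subset of `size` image identifiers from a month's data."""
    return random.sample(pool[month], size)

# Each time the reviewer selects a case from the fifty presented, a fresh,
# randomly drawn set of fifty is generated from the same month's data.
first_subset = draw_subset(3)
selected_case = first_subset[0]
next_subset = draw_subset(3)
print(len(first_subset), selected_case, len(next_subset))
```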
The "RADPEER" protocol (to be distinguished from RAT-STATS) was implemented by Dr. Sacks in performing his substantive medical review. The RADPEER protocol utilizes a scoring system designed to standardize the work of radiologists in comparing prior images with current ones and looking for any discrepancy between them. Dr. Sacks's substantive findings were that Dr. Reddy's work as a radiologist was "spot on" in that Dr. Sacks concurred with Dr. Reddy's conclusions in 1053 out of the original 1060 reports reviewed.
RADPEER's scoring system runs from "1" to "4." A score of "1" means the reviewer concurs with the earlier report; "2" means the reviewer noted something not observed by the first radiologist, but the discrepancy is understandable because the finding likely could not have been seen on the first review; "3" means there was something in the first report that was somewhat more apparent and should have been caught by the radiologist most of the time; and "4" means the first radiologist completely missed something that a competent doctor would find every time.
Of the seven reports where Dr. Sacks's assessment varied, Dr. Sacks assigned six reports a score of "2" (minor discrepancy) and one report a score of "3" (indicating that he would have preferred a "greater view" of the area).
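For illustration only, the tallies described above can be restated in a short sketch. The score descriptions are paraphrased from the protocol summary, and the counts simply restate the 1053-of-1060 concordance reflected in Dr. Sacks's review.

```python
# Illustration only: tallying Dr. Sacks's RADPEER scores as described above.
# Score meanings are paraphrased; the counts restate the figures in the opinion.
RADPEER_MEANINGS = {
    1: "reviewer concurs with the original report",
    2: "discrepancy noted, but understandable on first review",
    3: "finding should have been caught most of the time",
    4: "obvious finding missed entirely",
}

score_counts = {1: 1053, 2: 6, 3: 1, 4: 0}

total = sum(score_counts.values())     # 1060 reports reviewed
concordance = score_counts[1] / total  # share scored "1"
print(f"{total} reviewed, concordance {concordance:.1%}")  # 1060 reviewed, concordance 99.3%
```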
We turn now to our discussion of the Daubert criteria: the proposed expert's qualifications, reliability of the methodology, and relevance.
1. Qualifications
Pursuant to Rule 702, an expert witness includes persons qualified "by knowledge, skill, experience, training, or education." Fed. R. Evid. 702.
In this case, the district court precluded all of the proffered testimony from both Drs. Morrissey and Sacks. With respect to Dr. Morrissey, the court explained, "As the person who developed the statistical strategy to collect the sample for Dr. Sacks to review, Dr. Morrissey must qualify as an expert in statistics." The district court noted that Dr. Morrissey had "some knowledge of statistics" since "she studied statistics for nine months at Harvard and had exposure to statistics" in her prior work experience. Nevertheless, bothered that Dr. Morrissey had never used the RAT-STATS program before, and had never designed a peer review study, the district court ultimately found that Dr. Morrissey was not "an expert qualified to employ statistical techniques to develop the peer review procedure she developed here." In like fashion, the district court found that Dr. Sacks was "not qualified to testify that the group of images he selected to review was statistically accurate as a random selection of images." The district court was overly focused on what it deemed an unreliable sample selection process.
The district court's reliance on its finding that neither Dr. Morrissey nor Dr. Sacks possessed the requisite expertise in the field of statistics is perplexing. Dr. Morrissey, admittedly not a master's level statistician, is a board-certified internist with a master's degree in public health from Harvard University. More importantly, Dr. Morrissey completed significant coursework in the area of statistics while at Harvard and had a wealth of prior experience, including working with statisticians as a peer reviewer. Questions concerning Dr. Morrissey's qualifications were for cross-examination. See, e.g., United States v. Valencia, 600 F.3d 389, 425-26 (5th Cir. 2010) (affirming trial court's admission of statistical study and noting that challenges to the study could be highlighted by opposing counsel on cross-examination). Similarly, it was not seriously disputed that Dr. Sacks was qualified to provide competent opinion testimony concerning his medical peer review. Dr. Sacks graduated from Cornell University, obtained his medical degree from UCLA, completed a four-year residency in diagnostic radiology at Emory University, and was board-certified. In short, Dr. Sacks did not have to be trained in statistics to competently conduct a peer review of RSI radiology reports.
As far as Dr. Morrissey's qualifications to determine the array of 90,000 sample images for the peer review, Dr. Morrissey was assisted by IT Director Dan Rabideau from the outset. Rabideau, who testified for the prosecution at trial, provided the same sample in the same fifty-image subset format to the Government's analysts.
Further, the authority relied upon by the Government for this proposition is inapposite. See, e.g., Johnson v. DeSoto Cty. Bd. of Comm'rs, 204 F.3d 1335, 1342, 1342 n.14 (11th Cir. 2000) (voting rights case ruling that statistical evidence derived from voter registration data is admissible under Daubert to determine relevant population features; noting that if voter registration figures are less reliable than other sources of population demographics, or if a proposed expert makes faulty assumptions, those deficiencies go "to the weight of the evidence" and are "freely challengeable on cross-examination"). The purpose of the peer review study in the instant case was not to assist the jury in determining whether a certain fact or inference was statistically likely, but rather to assist the jury in determining whether the consistent radiological findings of Dr. Sacks made it more likely that Dr. Reddy in fact reviewed the images to which he attested. We conclude that these findings amount to clear error.
2. Reliability
The reliability of the methodology employed by Drs. Morrissey and Sacks in conducting the peer review was amenable to testing by cross-examination.
First, according to Dr. Reddy, the RADPEER method is endorsed by the American College of Radiology and considered the "gold standard" in the area of medical peer review. In fact, as we understand the district court's analysis, the use of RAT-STATS and the RADPEER protocol was not, in and of itself, directly at issue. The district court was troubled instead by their application.
The magistrate judge specifically found that Dr. Reddy failed to present evidence that these resources or methodologies were generally used for medical peer reviews.
During oral argument, the Government contended that the peer review study was "fundamentally flawed" because much of the sample selection was improperly left to Dr. Sacks. Dr. Sacks's ability to select the images and reports to review from within each randomly selected array of fifty reports presented the gravest concern because no set criteria were provided to guide his choices. The Government also took issue with Dr. Sacks's effort to include representative studies from all of the various modalities in his peer review even though the majority of the images were "plain films." The Government suggests that a statistically superior study would not have left the actual peer reviewer, Dr. Sacks, any role in the process of selecting reports or images to review. Dr. Sacks reviewed 1060 images and was instructed to stop before he had the opportunity to work his way through the entire peer review sample. The district court found that it would be "factually impossible" for Dr. Sacks to assert that the sample he selected was statistically random because he "excluded a portion of the entire sample in making his selections." At the same time, the voluminousness of the data was cited by the parties as an obstacle to establishing the parameters of a random, statistically sound model for peer review. Although the Government's challenges concerning methodology are arguable, we do not agree that Dr. Sacks's role was unduly determinative of the sample, or that his actions critically undermined the randomness of the process for selecting images.
The district court was disturbed that during the Daubert hearing conducted by the magistrate judge, Dr. Sacks described his selection process as being driven by his "gut feeling."
We likewise reject the Government's position that the time frame problems render all of the results and testimony inadmissible. It is undisputed that a portion of the data reviewed by Dr. Sacks fell outside the period of the Indictment, which alleged that the scheme was perpetrated from the middle of 2006 through January 2008. Dr. Reddy's counsel clarified how this occurred during oral argument and explained that it would have been cost-prohibitive to replicate the study after the Second Superseding Indictment was handed down with different dates. According to the Government, the entire peer review study was tainted because up to 70 percent of the data analyzed was beyond the scope of the Indictment. Nonetheless, Dr. Sacks reviewed a total of 336 images that fell within the Indictment's time period.
Several things contributed to Dr. Sacks's review of data outside the scope of the Indictment: 1) conducting the peer review study before an indictment issued; 2) beginning the peer review with the March 2008 data, working backwards, and ceasing review before exhausting the entire peer review sample; and 3) the Second Superseding Indictment's allegation that the scheme began nearly a year earlier than the period of time represented during the investigation.
On January 28, 2008, an administrative subpoena was served on RSI and the HHS investigation into Dr. Reddy's business was officially underway. The data were collected and provided to Dr. Reddy (and Drs. Morrissey and Sacks) during the pendency of the investigation. The peer review was undertaken in the fall of 2008 and included data spanning from April 2007 through March 2008.
All of the indictments issued after the peer review had been completed. The original Indictment, handed down in November 2009, alleged that the charged scheme began in or about May 2007 and ended in or about January 2008 (upon notification of the investigation). The First Superseding Indictment issued in December 2009 and charged additional offenses but maintained the same time period for the scheme. In July 2010, the Second Superseding Indictment expanded the alleged scope of the scheme by approximately twelve months by identifying the relevant time period as "at least as of mid-2006, through in or about January 2008."
The court below deemed the peer review evidence beyond the Indictment period irrelevant, yet excluded the entire study, including the portion of Dr. Sacks's peer review that fell within the relevant time period. This finding is inconsistent with our holding in City of Tuscaloosa that the preferred course of action is to carve out such portion of the evidence as is not germane. 158 F.3d at 565 (reversible error where trial court prohibited two of three proffered defense experts from offering any testimony on various grounds; one proposed expert was a CPA, the other a statistician). City of Tuscaloosa teaches that expert testimony need only assist the trier of fact in understanding the evidence or in determining a fact in issue—not prove the case at all events. Id. at 564-65 (circumstantial evidence "must merely constitute one piece of the puzzle that the [litigant] endeavor[s] to assemble before the jury"). Here, excluding the data outside of the Indictment period, and the time period after RSI was served with a subpoena (upon which RSI was on notice of the allegations), Dr. Sacks conducted a review of 336 cases—more than the minimum of 287 studies needed to achieve the greater confidence level. The district court could feasibly have limited Dr. Sacks's testimony to this discrete portion of his peer review and allowed the jury to consider its weight.
At oral argument, the Government was unable to answer direct and pointed questions concerning the efficacy of allowing the jury the benefit of hearing any portion of the proposed expert testimony, why cross-examination would not have been sufficient to highlight the allegedly statistically unsound aspects of Dr. Sacks's review, and the like.
3. Relevance
Moreover, we conclude that the peer review evidence was relevant. In this case, Dr. Reddy's theory at trial was simply that he "did the work" the Government accused him of not doing. The purpose of the peer review was to provide the jury with some sense of the accuracy of the work being generated by Dr. Reddy as indicative of the actual attention he must have paid to the images on which he reported. If so persuaded, the jury could infer from Dr. Sacks's peer review and attendant results that a radiologist (as opposed to only an RPA or other lesser trained staff member) had undertaken a review of the image. Dr. Sacks's independent medical review was probative of this fact because Dr. Sacks concurred with the interpretation and diagnosis generated by Dr. Reddy in almost every instance. As contended by Dr. Reddy, "If no doctor reviewed the images, it is inconceivable that the relatively untrained RPA's got it right each and every one of these 65,000 times." For this reason, Dr. Sacks's testimony, while not determinative of Dr. Reddy's guilt or innocence, was highly probative of the central issue in the case as to whether it was more or less likely that Dr. Reddy "did the work" RSI was getting paid to do. We agree with defense counsel's assertion that the peer review evidence had potential as "powerful" defense evidence regarding the alleged perpetration of a fraud. Dr. Sacks's peer review was proffered as circumstantial evidence (a single piece of the puzzle) probative of a factual question—whether the work had likely been performed by a radiologist, namely Dr. Reddy, consistent with the representations of RSI.
If Dr. Reddy actually reviewed the images and rendered diagnoses as promised, there could be no misrepresentation or fraud.
Notwithstanding the Government's various challenges to Dr. Sacks's proposed testimony, we are persuaded that what Dr. Sacks had to say about his peer review and the accuracy of the work performed by Dr. Reddy was highly probative and would have likely been helpful to the jury. We further find that legitimate criticisms of the peer review study could have been adequately handled on cross-examination, a proven and effective tool for identifying gaps or weaknesses in testimony. Rosenfeld, 654 F.3d at 1193. Thus, it bears repeating that "in most cases" it is for the jury to consider "inadequacies of a study" and weigh the evidence. Id. For all of these reasons, we hold that the district court abused its discretion in excluding the peer review study as a whole. We further hold that the error was not harmless. While the Government's circumstantial evidence was strong, Dr. Reddy proffered a cogent defense which was erroneously truncated by the Daubert ruling.
As discussed, the Daubert ruling went to the heart of Dr. Reddy's defense. According to counsel, the impact of the district court's ruling excluding Dr. Sacks's testimony in its entirety was devastating.
Accordingly, we reverse on this issue, vacate Dr. Reddy's convictions, and remand for further proceedings consistent with this opinion.
In vacating Dr. Reddy's convictions, we recognize that because Count 37 does not allege fraud, the district court's Daubert ruling would not have had the same impact upon the jury's evaluation of this offense (as compared to the mail, wire, and health care fraud offenses). Nonetheless, we conclude that the defense evidence was curtailed such that, in the interest of justice, a retrial on all counts is warranted.
In light of our decision to vacate Dr. Reddy's convictions pursuant to Daubert, we do not reach the second evidentiary issue concerning the scope of Mark Bronkalla's testimony, which is potentially moot and subject to rectification upon retrial in any event.
III.
For purposes of any further proceedings upon remand, we next consider whether Counts 33-36 were sufficiently alleged within the Indictment. We review the legal sufficiency of the Indictment de novo. United States v. Bobo, 344 F.3d 1076, 1083 (11th Cir. 2003); United States v. Poirier, 321 F.3d 1024, 1028 (11th Cir. 2003).
Counts 33 through 36 of the Indictment charge Dr. Reddy with violations of 18 U.S.C. § 1347. On appeal, Dr. Reddy contends that the Indictment is fatally deficient as to Counts 33-36 for failure to allege the "jurisdictional hook" (or federal nexus) that commerce be affected by the charged conduct. Dr. Reddy points to the failure of the Indictment to expressly mention the phrase "affecting commerce," as well as its exclusion of the statutory definition of "health care benefit program," which requires that the same "affect commerce." See 18 U.S.C. § 24(b).
Under 18 U.S.C. § 1347, health care fraud is defined as a scheme or artifice:

(1) to defraud any health care benefit program; or

(2) to obtain, by means of false or fraudulent pretenses, representations, or promises, any of the money or property owned by, or under the custody or control of, any health care benefit program,

in connection with the delivery of or payment for health care benefits, items, or services.

For purposes of § 1347, "any health care benefit program" includes:

"[A]ny public or private plan or contract, affecting commerce, under which any medical benefit, item, or service is provided to any individual, and includes any individual or entity who is providing a medical benefit, item, or service for which payment may be made under the plan or contract."

18 U.S.C. § 24(b) (emphasis added).
Our determination of the Indictment's legal sufficiency depends, in part, upon whether "affecting commerce" is an essential element for purposes of § 1347. Because § 1347 is a federal offense, there is little doubt that the underlying conduct must have an interstate nexus or other "jurisdictional hook." The statutory definition of "health care benefit program" requires that commerce be affected, and so we find, as a matter of law, that "affecting commerce" is an essential element of health care fraud. See, e.g., United States v. Klein, 543 F.3d 206, 211 (5th Cir. 2008) (holding that "affecting commerce" is an essential element of § 1347 without analyzing legal sufficiency of the indictment).
When an Indictment's language generally tracks the statutory language, the Indictment is sufficient to withstand a motion to dismiss. United States v. Gayle, 967 F.2d 483, 485 (11th Cir. 1992) (en banc); Fed. R. Crim. P. 7(c)(1). In Gayle, we looked to the underlying reasons for the rule requiring that an indictment set forth the essential elements of an offense. 967 F.2d at 485. We determined that the rule serves two purposes: 1) giving the defendant notice of "the nature and cause of the accusation as required by the Sixth Amendment of the Constitution"; and 2) "ensuring that a grand jury only return an indictment when it finds probable cause to support all the necessary elements of the crime" consistent with the Fifth Amendment. Id.
Rule 7 of the Federal Rules of Criminal Procedure reads in pertinent part:
"The indictment . . . must be a plain, concise, and definite written statement of the essential facts constituting the offense charged. . . . A count may incorporate by reference an allegation made in another count. . . . For each count, the indictment or information must give the official or customary citation of the statute, rule, regulation, or other provision of law that the defendant is alleged to have violated."Fed. R. Crim. P. 7(c)(1).
The first criterion, notice of the nature and cause of the accusation, is not seriously contested. Counts 33-36 include a chart that identifies the specifics for each Count, such as the offense date, the name of the client hospital (Upson), and the patient.
Here, the language and facts alleged within the Indictment as a whole support the inference that the grand jury understood that "affecting commerce" was part of the § 1347 offense. Counts 33-36 read as follows:
"[T]he Defendant, RAJASHAKHER P. REDDY, knowingly and willfully executed the aforesaid health care fraud scheme by causing a claim for payment to be made by the client hospital . . . from a health care benefit program, for a patient . . . which claim included services that were not, in fact, provided . . . all in violation of Title 18, United States Code, Section 1347."(Indictment, 1-5, 10 ¶ 2) (emphasis added). Thus, Counts 33-36 provide the statutory reference to § 1347 and track the statutory offense language. See United States v. Stefan, 784 F.3d 1093, 1101 (11th Cir. 1986). In addition, Counts 33-36 are prefaced with "paragraphs One through Three of Counts One through Twenty Five," which detail the workings of the alleged health care fraud, including how RSI clients submitted claims for reimbursement nationwide to Medicare and other health care benefit plans for reimbursement, and the reported gain to RSI in revenue upwards of five million dollars. For these reasons, although the "affecting commerce" question was not presented in so many words, we conclude that the grand jury necessarily found that probable cause existed in a context where commerce was affected.
There is no explicit reference in the Indictment to Subsection (1) or (2) of § 1347. Nonetheless, we find that United States v. Scott is distinguishable based upon the clear notice of the underlying factual allegations presented to Dr. Reddy. 993 F.2d 1520, 1521-22 (11th Cir. 1993) (per curiam) (holding indictment insufficient to provide the accused notice of the nature of charges where the statutory reference to 18 U.S.C. § 1703 failed to identify which subdivision of the statute was being charged).
IV.
REVERSED and REMANDED.