Opinion
3351-19
10-07-2020
Defendant Luis Reyes is charged with Burglary in the Second Degree and related offenses. Defendant moves to preclude trial testimony stemming from the use of facial identification software to identify the perpetrator of the crime in question. Not in issue here is that the burglar could be seen in crime scene videos. The case detective was able to recognize defendant as that individual after he examined a police file containing photographs of defendant. Defendant's complaint is that the detective retrieved that file because of an earlier "hit" on defendant obtained by analyzing the crime scene videos with facial recognition software. Defendant asks the court to "preclude the People's use of the results of any use of NYPD's facial recognition software." Defendant also seeks discovery about the facial recognition software used by the police.
A
For current purposes the court assumes these facts. On September 29, 2019, defendant committed a burglary at 507 West 113th Street in Manhattan, entering a mail room there to steal packages. Defendant's actions were recorded by security cameras. The detective assigned to lead the investigation obtained the videos. He made stills from them and sent those stills to the NYPD Facial Identification Section ("FIS") for analysis. Using facial recognition software, the FIS located a single "possible match": one of the burglar's pictures possibly matched defendant's mug shot. The FIS returned a report to the detective bearing a prominent statement that the "match" was only a lead. The report specified that the match did not create probable cause for an arrest, which could be produced only by further investigation.
The case detective therefore obtained defendant's police file. In it he found a copy of the same mug shot along with photos depicting defendant's distinctive forearm tattoos. After studying those photos, the detective again viewed the crime scene videos and recognized that the burglar indeed was defendant. He next prepared a probable cause I-card bearing three photos of the burglar taken from the crime scene videos -- photos which displayed, among other features, his tattoos. The I-card did not include any other picture or defendant's name. On October 14, 2019, officers recognized defendant from that I-card and arrested him.
It is perhaps worthy of note that the three photos of the burglar that were the key component of the I-card were in no way products of the software analysis of the crime scene videos.
During discovery the People gave notice that the case detective had made an "identification" of defendant from the videos. At the same time they stated that they did so out of an excess of caution, in that the viewing of the videos was in their opinion not a police identification procedure.
The People apparently will offer the testimony of a second detective who viewed defendant's photos and then recognized him in the crime scene videos. That will make no difference to the court's analysis.
Defendant now argues that the detective's recognition of defendant on the crime scene videos was in fact the product of an identification procedure and that the use of facial recognition software in the process leading up to that identification requires that he be able to challenge testimony about the viewing in a suppression hearing. And defendant also advises the court that he seeks general discovery about the NYPD's facial recognition techniques. The court rejects defendant's analysis and will not grant relief.
B
There was no "identification procedure" in this case and no identification of defendant that could be suppressed under the terms of Article 710 of the Criminal Procedure Law. In an identification procedure, the police generally arrange for a witness to a crime or a related event to see a person, almost always one suspected of being a criminal. The object is to determine whether the witness recognizes the suspect from the earlier occasion. The procedure may be a "show-up," with the witness looking at one person. It may be a lineup, in which the police display the suspect with other people. It may involve showing only the photo of a suspect to the witness, or showing it with other photos. And if there is no suspect the police may display many photos, in hopes that a picture of the relevant person will be among them. Any identification that results from these procedures may be reviewed pursuant to CPL Section 710.20 to resolve whether the viewing was done under unduly suggestive circumstances, i.e., circumstances conducive to an erroneous identification of the criminal. See People v. Gee, 99 N.Y.2d 158, 163, 753 N.Y.S.2d 19, 782 N.E.2d 1155 (2002).
But those procedures are not at all like what happened in this case. The challenged witness was not present at the crime. The witness therefore did not look at a known suspect or at a suspect's photo to state from a memory of the crime whether the suspect was the criminal. Nor did he view a group of photos, in hopes of a chance viewing of a picture of the criminal. The witness viewed a recording of the crime itself and of the person committing it to see if he recognized that person. Such a viewing may or may not produce a reliable recognition, depending on such factors as distance and lighting. But it is impossible for the procedure to be conducive to an erroneous suggestion that the person viewed is the criminal. As the Court of Appeals noted in a very closely related context:
[T]he only person the clerk could possibly confirm to be the robber was the person on the videotape who was concededly in the process of robbing her. There were no other choices, and there was nothing resembling a selection process.
People v. Gee, 99 N.Y.2d at 162, supra, 753 N.Y.S.2d 19, 782 N.E.2d 1155; see also People v. Jackson, 23 Misc. 3d 1128(A), 2009 WL 1405511 at 5–6, 2009 N.Y. Misc. LEXIS 1188 at 15-17 (Sup. Ct. N.Y. Co. 2009).
Not helpful to defendant is People v. Jones, 173 A.D.3d 1062, 102 N.Y.S.3d 265 (2nd Dept. 2019). There the police showed a robbery victim a cell phone video depicting the defendant, but it was not a video of the crime with which the defendant was charged. A taser had been employed in the robbery, and the video showed defendant using a taser. And the victim was told that the phone had been found near the location of the crime. Only after identifying the defendant in the video under those suggestive circumstances did the victim pick the defendant from a photo array composed with the help of facial recognition software. The viewing of a video that did not depict the crime was like a single photo showup that may have triggered the later array identification, and was nothing like the viewing of the crime scene video here.
This case differs from Gee in one respect. Here the witness who viewed the crime scene stills was not someone who had been present at the crime. Rather, he was a detective who had familiarized himself with defendant's appearance from his mug shot and other police file photos. But that is a distinction without a difference. The detective was not resolving whether the person in the video was the one who committed the burglary; that was a given. He looked at the video to determine whether he knew someone in it. In such a viewing there is nothing suggesting that the viewer's answer should be "yes" or "no."
Nor is it at all clear that the jury will learn of this pretrial recognition at trial.
A witness can of course err in reporting that he recognizes a criminal in a video that fairly shows a crime. Still, such a video is not constructed by the police in a manner that suggests the identity of the criminal. It follows that the People's notice pursuant to CPL Section 710.30 was superfluous, as the People in fact stated while giving that notice. No suppression claim could succeed, and section 710.30 did not require the People to advise defendant of the detective's recognition of the burglar. Whether the witness's recognition testimony is wrong is a jury question.
Whether a video is a fair representation is an evidentiary issue for the trial judge.
C
Defendant's substantive complaint is in fact not that there was anything suggestive about a viewing of a crime scene video. Rather, he seeks relief because he was identified as a suspect after the use of facial recognition software. A brief discussion of that circumstance is therefore pertinent.
First, "facial recognition" involves the use of software to analyze the front and perhaps side portions of the head of an unknown person, usually as depicted in a photo or a video still. The software measures the location and contours of facial features, including hair. It next compares the results with those for photos of known individuals — photos that are digitally maintained for these comparison purposes — to select any possible "matches." The authorities can then investigate whether the individual or individuals in the selected photos could be the unknown person. The results can show, for example, that an applicant for a driver's license has had licenses under different names. See, e.g., People v. Byrd, 96 A.D.3d 962, 963, 946 N.Y.S.2d 642 (2nd Dept. 2012).
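The comparison step described above can be pictured, in purely illustrative terms, as scoring the similarity between a set of measurements taken from the unknown person's photo and the stored measurements for each known individual, and flagging any score above a threshold as a possible lead. The sketch below is a toy model only; the vectors, names, and threshold are hypothetical and do not describe the actual software used by the FIS.

```python
import math

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def possible_matches(probe, gallery, threshold=0.95):
    """Return names of known individuals whose stored vectors score above
    the threshold -- investigative leads only, not identifications."""
    return [name for name, vec in gallery.items()
            if similarity(probe, vec) >= threshold]

# Hypothetical measurements from a crime-scene still and a photo database.
probe = [0.52, 0.31, 0.77, 0.18]
gallery = {
    "subject_A": [0.51, 0.30, 0.78, 0.19],  # close to the probe
    "subject_B": [0.10, 0.90, 0.20, 0.65],  # dissimilar
}
print(possible_matches(probe, gallery))  # a single possible lead
```

Even in this toy form, the output is a list of candidates for further investigation, which mirrors the point the FIS report made: a "match" is a lead, not probable cause.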
To the best of this judge's knowledge, a facial recognition "match" has never been admitted at a New York criminal trial as evidence that an unknown person in one photo is the known person in another. There is no agreement in a relevant community of technological experts that matches are sufficiently reliable to be used in court as identification evidence. See Frye v. United States, 293 F. 1013 (D.C. Cir. 1923). Facial recognition analysis thus joins a growing number of scientific and near-scientific techniques that may be used as tools for identifying or eliminating suspects, but that do not produce results admissible at a trial. Cf. People v. Williams, 35 N.Y.3d 24, 43-44, 124 N.Y.S.3d 593, 147 N.E.3d 1131 (2020). The People argue that the FIS results were just of this sort, and can provide investigative leads.
Some uses of facial recognition software are controversial. For example, many people fear that employment of such software will erode First Amendment rights by permitting unfriendly officials to identify and take action against those who demonstrate against government policies. That concern is especially pronounced given the ubiquity of security cameras in many big-city areas. See, e.g., The Secretive Company That Might End Privacy as We Know It, https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.
It may well be that legislative action should be taken to curtail the use of facial recognition techniques for such purposes. And this court understands what a "slippery slope" argument is. The court notes, however, that these and other common concerns about facial recognition techniques seem dramatically divorced from the use of those techniques to develop leads from photographs of people taken as they commit crimes. And such photos are often obtained from private entities, not governmental ones, which employ cameras precisely to protect themselves and their premises from criminals.
That is in fact what occurred in this burglary case. No reason appears for the judicial invention of a suppression doctrine in these circumstances. Nor is there any reason for discovery about facial recognition software that was used as a simple trigger for investigation and will presumably not be the basis for testimony at a trial, except as might otherwise be required by CPL Section 245.20.
* * *
Defendant's application for relief is denied.