Trials@uspto.gov Paper 14
571-272-7822 Date: July 13, 2020

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

GOOGLE LLC, Petitioner,
v.
UNILOC 2017 LLC, Patent Owner.

IPR2020-00170
Patent 6,253,201 B1

Before BRYAN F. MOORE, LYNNE E. PETTIGREW, and JON M. JURGOVAN, Administrative Patent Judges.

PETTIGREW, Administrative Patent Judge.

DECISION
Denying Institution of Inter Partes Review
35 U.S.C. § 314

I. INTRODUCTION

Petitioner, Google LLC, filed a Petition for inter partes review of claims 1–14 and 17–20 of U.S. Patent No. 6,253,201 B1 (Ex. 1001, “the ’201 patent”). Paper 1 (“Pet.”). Patent Owner, Uniloc 2017 LLC, filed a Preliminary Response. Paper 7 (“Prelim. Resp.”). Pursuant to our authorization for supplemental briefing on the issue of discretionary denial under 35 U.S.C. § 314(a), Petitioner filed a Reply to Patent Owner’s Preliminary Response, and Patent Owner filed a Sur-reply. Paper 9 (“Pet. Reply”); Paper 12 (“PO Sur-reply”); see Paper 8, 3 (authorizing reply and sur-reply).

Under 35 U.S.C. § 314 and 37 C.F.R. § 42.4(a), we have authority to institute an inter partes review if “the information presented in the petition . . . and any response . . . shows that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a). After considering the parties’ briefing and the evidence of record, we conclude the information presented does not show there is a reasonable likelihood that Petitioner would prevail in establishing the unpatentability of at least one of claims 1–14 and 17–20 of the ’201 patent. Accordingly, we do not institute an inter partes review.

II. BACKGROUND

A. Related Matters

The parties identify the following district court proceeding related to the ’201 patent: Uniloc 2017, LLC v. Google LLC, No. 2:18-cv-00548 (E.D. Tex., filed Dec. 30, 2018). Pet. 3; Paper 3, 2 (Patent Owner’s Mandatory Notices). That case has been transferred to the Northern District of California. See Ex. 3002 (Transfer Order, June 19, 2020).

B. Overview of the ’201 Patent

The ’201 patent describes a system and method for matching a target image to images stored in a database. Ex. 1001, 1:5–8. The disclosed method determines similarity between images based on a count of the number of corresponding partitions in each image having similar characteristics. Id. at 2:38–42. For efficiency purposes, indexed lists of image identifiers are maintained, and the count of similar characterizations of an image is determined by the count of occurrences of the image’s identifier in selected lists. Id. at 2:42–46. The selected lists are determined by a characterization of a target image from which similar images are to be identified. Id. at 2:46–48. The ’201 patent states that its indexing and retrieval techniques are suitable for a variety of image characterization techniques, such as those based on color content or edge content of partitions of an image. Id. at 2:48–52.

Figure 1 of the ’201 patent is reproduced below:

Figure 1 above illustrates a block diagram of an image retrieval system disclosed in the ’201 patent. Id. at 2:53–54.
The image retrieval system includes characterizer 120, which produces indices 102 and 112 to lists of image identifiers 130, and search engine 140, which processes selected lists of image identifiers 135 to determine images 181 that have a high number of occurrences 161 in selected lists 135. Id. at 2:54–59. A user provides a graphic representation of target image 101 to the image retrieval system to determine images 111 of reference database 110 that are similar in characteristics to target image 101. Id. at 2:60–63. Source 100 providing target image 101 may be, for example, an input device such as an image scanner, a digitizer, or a camera. Id. at 2:63–65. Reference image database 110 contains images that may be created and stored using similar input devices. Id. at 3:5–9.

Each image 111 in reference image database 110 is provided to characterizer 120 to create indexed lists of image identifiers 130. Id. at 3:24–26. Figure 2 of the ’201 patent, reproduced below, illustrates an example block diagram of characterizer 120:

Id. at 3:26–29. As shown in Figure 2 above, characterizer 120 partitions image 201 into an array of partitions (P) using partitioner 210. Id. at 3:31–33. Characteristic processor 220 processes each image partition P to produce characteristic measure 221 that describes the partition. Id. at 3:36–39. Characteristic measure 221 may be, for example, a histogram of the number of occurrences of particular colors or particular types of edges. Id. at 3:39–44. Quantizer 230 quantizes characteristic measure 221 to produce indexed characterizations (Idx). Id. at 3:48–51. For example, quantizer 230 may place the characteristic measure into bins such as quartiles (e.g., 0–25%, 26–50%, 51–75%, or 76–100% blue). Id. at 3:51–56; see Pet. 13.

Characterizer 120 outputs index 202, which includes the identification P of each partition and the indexed characterization Idx. Ex. 1001, 3:26–29, 4:10–11, 4:23–25. Partition and characterization information in index 202 is arranged into lists to facilitate searching. Id. at 4:10–5:3. Each image has identifier 215 identifying the image (e.g., by location). Id. at 4:10–22. To arrange the information into lists, the system stores identifier 215 of the image in each list associated with a partition P of the image that has an indexed characterization Idx. Id. at 4:23–25. For example, as shown in Figure 2, if partition P1 corresponds to the upper left corner of each image, and index I1 corresponds to an occurrence of predominantly red and blue colors, list 135a will be a list of the identifiers (A, D, Q, R, K) of all images 111 in database 110 that have predominantly red and blue colors in the upper left corner. Id. at 4:25–31. Similarly, list 135b is a list of identifiers (D, L) of images that have predominantly red and blue colors in partition P2 (e.g., the lower left corner). Id. at 4:31–34.

Once the system characterizes and indexes the database images, search engine 140 identifies reference images from the database similar to a target image, which undergoes the same characterization process as the reference database images. Id. at 2:53–59, 5:26–32, Fig. 1. The similarity is determined by counting the number of occurrences of each reference image 111 that has a corresponding partition with the same characteristics as target image 101. Id. at 5:28–32, Fig. 4. To count the occurrences, the system processes in a loop through the reference image database for each indexed characterization and maintains a running count of how many times an image identifier appears. Id. at 5:66–6:33, Fig. 4. The image identifier with the highest count is identified as the closest match to target image 101. Id. at 6:34–43.
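For orientation only, the partition-characterize-index-count flow summarized above can be illustrated with a minimal sketch. The 4x4 grid, the single “blue content” characteristic, the quartile bins, and every name below are assumptions chosen for illustration; they are not details taken from the ’201 patent.

```python
from collections import defaultdict

# Hypothetical sketch of indexed lists of image identifiers keyed by
# (partition, index value). Grid size, the blue-content characteristic,
# and the quartile quantization are illustrative assumptions only.

GRID = 4  # partition each image into a 4x4 array of partitions (assumed)
index_lists = defaultdict(list)  # (row, col, index value) -> image identifiers

def characterize(image, row, col):
    """Return an index value (0-3) for one partition, here the quartile of
    blue content; a real characterizer might use color or edge histograms."""
    h, w = len(image), len(image[0])
    r0, r1 = row * h // GRID, (row + 1) * h // GRID
    c0, c1 = col * w // GRID, (col + 1) * w // GRID
    pixels = [image[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    blue = sum(1 for p in pixels if p == "blue") / max(len(pixels), 1)
    return min(int(blue * 4), 3)  # quantize the measure into quartile bins

def index_image(image_id, image):
    """Append the image identifier to the list selected by each partition's index value."""
    for row in range(GRID):
        for col in range(GRID):
            index_lists[(row, col, characterize(image, row, col))].append(image_id)

def retrieve(target):
    """Accumulate counts of identifiers appearing in the lists selected by the
    target's characterization; the identifier with the highest count is the match."""
    counts = defaultdict(int)
    for row in range(GRID):
        for col in range(GRID):
            for image_id in index_lists.get((row, col, characterize(target, row, col)), []):
                counts[image_id] += 1
    return max(counts, key=counts.get) if counts else None
```

Because retrieval touches only the lists selected by the target image’s own characterization, its cost in this sketch tracks the number of identifiers on those lists rather than the size of the full database, which is consistent with the efficiency rationale quoted above.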
C. Illustrative Claims

Challenged claims 1, 5, and 10 are independent. Claims 2–4 depend directly from claim 1, claims 6–9 depend directly or indirectly from claim 5, and claims 11–14 and 17–20 depend directly or indirectly from claim 10. Claims 1 and 5 are illustrative of the claimed subject matter and are reproduced below:

1. A method of image retrieval, comprising the steps of:
partitioning a target image into a plurality of content-independent partitions,
characterizing each partition of the plurality of content-independent partitions to form an index value associated with each partition,
obtaining a list of image identifiers associated with the index value,
accumulating counts of each image identifier in the list of image identifiers associated with each partition of the plurality of content-independent partitions, and
retrieving at least one image associated with at least one of the image identifiers, based upon the counts of the at least one of the image identifiers.

5. A method of indexing an image, comprising the steps of:
identifying the image by an image identifier,
partitioning the image into a plurality of content independent partitions,
characterizing each content independent partition of the plurality of partitions to form at least one index value of a plurality of index values, and
appending the image identifier to at least one list of a plurality of lists of image identifiers associated with each partition, the at least one list being determined by the at least one index value that characterizes the each partition.

Ex. 1001, 8:2–16, 8:29–42.

D. Asserted Ground of Unpatentability

Petitioner asserts that the challenged claims are unpatentable based on the following ground (Pet. 7):

Claims Challenged: 1–14, 17–20
35 U.S.C. §: 103(a)1
References: Hull,2 Barber3

In support of its contentions, Petitioner relies on the Declaration of Dr. John R. Grindon (Ex. 1003).

1 The Leahy-Smith America Invents Act (“AIA”), Pub. L. No. 112-29, 125 Stat. 284, 285–88 (2011), revised 35 U.S.C. § 103 effective March 16, 2013. Because the ’201 patent has an effective filing date prior to the effective date of the applicable AIA amendment, we refer to the pre-AIA version of § 103.
2 U.S. Patent No. 5,465,353, issued Nov. 7, 1995 (Ex. 1005).
3 U.S. Patent No. 5,579,471, issued Nov. 26, 1996 (Ex. 1006).

III. ANALYSIS

A. Principles of Law

A claim is unpatentable under § 103(a) if the differences between the claimed subject matter and the prior art are such that the subject matter, as a whole, would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). The question of obviousness is resolved on the basis of underlying factual determinations, including (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of ordinary skill in the art; and (4) when in evidence, objective indicia of non-obviousness.4 Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966).
An invention “composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art.” KSR, 550 U.S. at 418. In an obviousness analysis, “it can be important to identify a reason that would have prompted a person of ordinary skill in the relevant field to combine the elements in the way the claimed new invention does.” Id. Further, “[t]o satisfy its burden of proving obviousness, a petitioner cannot employ mere conclusory statements. The petitioner must instead articulate specific reasoning, based on evidence of record, to support the legal conclusion of obviousness.” In re Magnum Oil Tools Int’l, Ltd., 829 F.3d 1364, 1380 (Fed. Cir. 2016).

4 With respect to the fourth Graham factor, the parties do not present arguments or evidence regarding objective indicia of non-obviousness. Therefore, the obviousness analysis at this stage of the proceeding is based on the first three Graham factors.

B. Level of Ordinary Skill in the Art

Petitioner asserts that a person of ordinary skill in the art at the time of the alleged invention of the ’201 patent would have had “a bachelor’s degree in electrical engineering, computer science, or computer engineering and had two years of relevant experience, or a graduate degree with a focus, in the field of image processing.” Pet. 18 (citing Ex. 1003 ¶¶ 52–53). Patent Owner does not offer a proposal regarding the level of ordinary skill in the art. Prelim. Resp. 24. On the present record, we determine that Petitioner’s proposed level of ordinary skill in the art is consistent with the ’201 patent and asserted prior art. See Okajima v. Bourdeau, 261 F.3d 1350, 1355 (Fed. Cir. 2001). Therefore, we adopt Petitioner’s proposal for purposes of deciding whether to institute inter partes review.

C. Claim Construction

In an inter partes review, we apply the same claim construction standard that would be used in a civil action under 35 U.S.C. § 282(b), following the standard articulated in Phillips v. AWH Corp., 415 F.3d 1303 (Fed. Cir. 2005) (en banc). 37 C.F.R. § 42.100(b). In applying this standard, we generally give claim terms their ordinary and customary meaning, as would be understood by a person of ordinary skill in the art, at the time of the invention and in the context of the entire patent disclosure. Phillips, 415 F.3d at 1312–13.

Neither party proposes a construction for any claim term. Petitioner contends that no claim terms require express construction for purposes of applying the prior art. Pet. 19. Patent Owner argues that, at this stage of the proceeding, the Board should adopt the ordinary and customary meaning of the claim terms. Prelim. Resp. 25. For purposes of this decision, we determine that no claim terms require express construction. See Nidec Motor Corp. v. Zhongshan Broad Ocean Motor Co., 868 F.3d 1013, 1017 (Fed. Cir. 2017) (holding that only claim terms in controversy need to be construed, and only to the extent necessary to resolve the controversy (citing Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999))).

D. Asserted Obviousness over Hull and Barber

Petitioner contends that claims 1–14 and 17–20 of the ’201 patent are unpatentable under 35 U.S.C. § 103 as obvious over the combined teachings of Hull and Barber. Pet. 19–81.
For the reasons discussed below, we conclude that Petitioner has not demonstrated a reasonable likelihood it IPR2020-00170 Patent 6,253,201 B1 11 would prevail in showing that any of the challenged claims are unpatentable on this asserted ground. 1. Overview of Hull Hull discloses a document matching and retrieval system that matches an input document against a database of documents. Ex. 1005, code (57), 1:6–9, 4:33–36. Documents are digitally represented, and the content of a document may be “text, line art, photographic images, computer-generated images, or other information, or combinations of these types of content.” Id. at 1:10–11, 1:27–29. In addition to a document database, the system includes a descriptor database, which lists descriptors derived from features of documents. Id. at 4:36–38. An entry in the descriptor database represents a descriptor and points to a list of all documents in the document database that include the feature associated with the descriptor. Id. at 4:39–42. “The descriptors are selected to be invariant to distortions caused by digitizing the documents and are redundant to accommodate noise in the digitization of the documents or differences in the input document and its match in the document database.” Id. at 4:42–46; see id. at code (57). IPR2020-00170 Patent 6,253,201 B1 12 Figure 1 of Hull is reproduced below: Figure 1 above illustrates a block diagram of a document storage system in which digital representations of documents are stored along with descriptor cross-references. Id. at 5:54–57. Document storage system 100 stores a digital representation of a document in document database 108 while the digital representation is passed to feature extractor 110. Id. at 6:59–61. Feature extractor 110, which is supplied with descriptor rules for determining which features of a document are extracted and converted to descriptors, produces descriptor elements from the document’s digital representation. Id. at 6:54–57, 6:62–63. “The descriptors are preferably invariant of translation, rotation, scaling, format font and subdivision, so that a given descriptor which is extracted from a document would still be extracted, even if the document is scanned in a different orientation, is resized or otherwise distorted.” Id. at 6:63–7:1. Feature extractor 110 provides descriptors extracted from a document, along with a document tag IPR2020-00170 Patent 6,253,201 B1 13 provided by document tag manager 112, to descriptor database 116 when hash generator 114 is not in use. Id. at 6:45–50; see id. at 7:66–8:1 (describing operation “[i]f hash generator 114 is not used”). Figure 5 of Hull, annotated by Petitioner, is reproduced below: Pet. 10 (citing Ex. 1005, Fig. 5). Annotated Figure 5 above illustrates the organization of descriptor database 116, showing each descriptor followed by a linked list of document tags identifying each document in document database 108 from which the descriptor is extracted and saved. Ex. 1005, 21:7–11. IPR2020-00170 Patent 6,253,201 B1 14 Figure 2 of Hull is reproduced below: Figure 2 above illustrates a block diagram of a query engine used to locate stored documents that match in whole or in part with a query document. Id. at 5:58–60. Query engine 200 accepts, for example, electronic document 203 or paper document 202 scanned by scanner 204 and, using descriptor database 116 and document database 108, outputs a match document (e.g., paper representation 220 or electronic document 222). Id. at 7:15–20. 
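As a rough, non-authoritative sketch of the organization just described, the following fragment shows a descriptor database in which each descriptor entry points to a list of tags for the documents containing the associated feature (cf. annotated Figure 5), along with the tag-voting match performed by the query engine, which is described in the next paragraph. The word-length-pattern stand-in descriptor, the window size, and all function names are assumptions for illustration and are not Hull's implementation.

```python
from collections import defaultdict

# Hypothetical sketch only: the toy word-length-pattern descriptor (one of
# the text descriptors Hull discusses), the window size, and the names below
# are illustrative assumptions, not details taken from Hull.

descriptor_db = defaultdict(list)  # descriptor -> list of document tags (cf. Figure 5)

def extract_descriptors(text, window=4):
    """Derive toy descriptors from a document transcript: ordered tuples of
    consecutive word lengths, standing in for Hull's distortion-invariant descriptors."""
    lengths = [len(word) for word in text.split()]
    return [tuple(lengths[i:i + window]) for i in range(len(lengths) - window + 1)]

def store_document(tag, text):
    """Append this document's tag to the list kept for each extracted descriptor."""
    for descriptor in extract_descriptors(text):
        descriptor_db[descriptor].append(tag)

def match_document(query_text):
    """Count one vote per appearance of a tag on a retrieved descriptor list and
    return the tag with the most votes as the output match."""
    votes = defaultdict(int)
    for descriptor in extract_descriptors(query_text):
        for tag in descriptor_db.get(descriptor, []):
            votes[tag] += 1
    return max(votes, key=votes.get) if votes else None
```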
To perform a query, the system inputs a document’s digital representation to feature extractor 210, which, using the provided descriptor rules, derives descriptors from the query input. Id. at 7:61–65. If hash generator 214 is not used, feature extractor 210 outputs the derived descriptors to document matcher 216, which retrieves records for each of the descriptors from descriptor database 116. Id. at 7:66–8:10. Each retrieved record includes a list of documents associated with that descriptor. Id. at 8:10–12. Document matcher 216 accumulates votes for documents by counting one vote for each time a document’s tag appears on a list associated IPR2020-00170 Patent 6,253,201 B1 15 with a descriptor. Id. at 8:17–21. Document retriever 218 retrieves the document with the most votes as the output match, or alternatively retrieves all documents with more than a threshold number of votes. Id. at 8:23–32. Hull describes the selection of descriptor rules provided to the feature extractors. Id. at 9:45–10:19. “The descriptors provide a distortion- invariant representation for local features in an image, and therefore a descriptor is an independent source of information.” Id. at 9:47–49. “The use of distortion-invariant descriptors requires that the feature extraction stage be specialized . . . and dependent on the type of content of the image.” Id. at 9:57–60. For example, when documents contain images of text, word length pattern (i.e., an ordered list of the lengths of consecutive words) is a good descriptor because it is not a function of the rotation, translation, or scale of a document. Id. at 9:60–64. Hull provides an example implementation of the disclosed system using such a descriptor for text images. Id. at 13:4–15:61. Hull further explains that its system is also applicable to graphic document images and images with combined text and graphics. Id. at 20:34–37. For such images, feature extractors 110 and 210 would include the ability to locate and characterize graphic features. Id. at 10:37–40. In one embodiment, for example, the feature extractor scans the document for regions of high spatial frequency objects or sharp contrast edges, referred to as “interesting points.” Id. at 20:40–43. “To be invariant through translation, rotation and scaling, a descriptor could be used which describes the angular relationship between three or more such interesting points. Additional relationships between interesting points can be used to normalize the descriptor so that it is also invariant of aspect ratio changes.” Id. at 20:46–51. IPR2020-00170 Patent 6,253,201 B1 16 2. Overview of Barber Barber discloses a system for retrieving images from a database in response to queries based on image characteristics such as colors, textures, shapes, and sizes. Ex. 1006, code (57), 1:13–15. In one embodiment, the system presents an image query window together with one or more image characteristic windows, each representing a particular predefined image characteristic and including a set of one or more thumbnails (i.e., icons) corresponding to various values of the image characteristic represented by the characteristic window. Id. at 3:8–14. The thumbnails represent aspects of the image such as color, texture, shape, and area, and may be dragged to the image query window where they may be arranged into a desired spatial orientation corresponding to the positioning of image features. Id. at 3:14– 19. 
In a second embodiment, an image characteristic selection area includes a color palette or wheel, and a cursor-based mechanism enables a user to point at a color in the selection area, select the color by clicking, and draw or paint with the selected color in the image query area. Id. at 3:51–56. This embodiment finds images according to a user-specified pattern consisting of image characteristics at specified positions. Id. at 12:40–42. Each image in the database is stored as a pixel array that is “spatially partitioned, such as into a rectangular grid, a radial grid providing finer granularity in the center of the image, or any other quantization that fits the image data and application.” Id. at 12:48–52. Then, for each image in the image database, using the areas defined by partitioning in the previous step, the system computes a set of image characteristics for each area. Id. at 12:64–67. Image characteristics can include measures such as image color, texture, or edge content. Id. at 12:67–13:2. The system stores the image IPR2020-00170 Patent 6,253,201 B1 17 characteristics in a database organized by image and spatial partition. Id. at 13:5–46. When a user specifies a query by drawing or painting features in an image query window, the system computes a similarity score for each image in the database, ranks the images by their scores, and returns the images with the best scores. Id. at 13:48–14:67. 3. Proposed Combination of Hull and Barber In its proposed combination of the teachings of Hull and Barber, Petitioner generally relies on Hull’s document matching algorithm and database arrangement and Barber’s image partitioning and characterization technique. See Pet. 19–20. For example, with respect to independent claim 1, Petitioner cites Hull for teaching a “method of image retrieval,” as recited in the preamble. Id. at 29–30 (citing Ex. 1005, code (57), 1:18–28). For the step of “partitioning a target image into a plurality of content- independent partitions,” Petitioner asserts that Hull teaches matching an input document (i.e., the claimed “target image”) against a database of documents (e.g., images) and cites Barber’s spatial partitioning (e.g., into a rectangular grid) for teaching partitioning an image into content-independent partitions. Id. at 30–32 (citing Ex. 1005, code (57), 1:18–28; Ex. 1006, 12:48–62, Figs. 11A, 11B). With respect to the next limitation of claim 1, “characterizing each partition of the plurality of content-independent partitions to form an index value associated with each partition,” Petitioner relies on a combination of Barber and Hull. Id. at 33–44. First, Petitioner contends that Barber’s disclosure of computing a set of image characteristics for each image in an image database, using the areas defined by the partitioning, corresponds to “characterizing each partition of the plurality of content-independent partitions.” Id. at 33–35 (citing Ex. 1006, 12:63–67; Ex. 1003 ¶¶ 83–84). IPR2020-00170 Patent 6,253,201 B1 18 For the claimed “index value,” Petitioner turns to Hull’s disclosure of a database organized by feature descriptors, in contrast to Barber’s database organized by images. Id. at 35–39 (citing, e.g., Ex. 1005, 21:23–35, Fig. 5; Ex. 1003 ¶¶ 87, 89). 
Petitioner contends that a person of ordinary skill in the art would have understood the descriptors of the input document in Hull to be within the scope of the claimed “form[ing] an index value” because the input document descriptors serve as an index to address or find database documents with the same descriptors. Id. at 39 (citing Ex. 1005, Fig. 5; Ex. 1003 ¶ 91). Further, Petitioner asserts that a person of ordinary skill in the art would have understood Hull’s descriptors (corresponding to the claimed “index value[s]”) to be “associated with each partition” in the proposed combination of Hull and Barber. Id. (citing Ex. 1003 ¶ 92). That is, Petitioner contends it would have been obvious to a skilled artisan to include partition information as part of the database index along with the descriptors in the combined system. Id. at 39–40 (citing Ex. 1003 ¶ 92). Petitioner relies on Hull for teaching the remaining limitations of claim 1. First, Petitioner contends that Hull “obtain[s] a list of image identifiers associated with the index value” when Hull’s descriptor database 116 returns document tags associated with a descriptor. Id. at 44 (citing Ex. 1005, 8:6–16, 8:60–64, 21:7–11). Next, Petitioner asserts that Hull “accumulat[es] counts of each image identifier in the list of image identifiers associated with each partition of the plurality of content- independent partitions” when Hull’s document matcher 216 accumulates votes for documents by counting one vote for each time a document’s tag appears on a list associated with a descriptor (which in the combination with Barber includes partition information). Id. at 45 (citing Ex. 1005, 8:17–22). Finally, Petitioner asserts that Hull “retriev[es] at least one image associated IPR2020-00170 Patent 6,253,201 B1 19 with at least one of the image identifiers, based upon the counts of the at least one of the image identifiers” when Hull’s document retriever 218 retrieves the document with the most votes as the output match. Id. at 45–46 (citing, e.g., Ex. 1005, 7:56–60, 8:6–32). We now turn to Petitioner’s reasoning for the proposed combination of Hull and Barber. See Pet. 19–29. Citing the testimony of Dr. Grindon, Petitioner contends it would have been obvious to a person of ordinary skill in the art to implement Hull’s search algorithm and database arrangement using known image characterization techniques and image similarity measures like those of Barber. Id. at 19–20 (citing Ex. 1003 ¶¶ 57–75). Petitioner begins with Hull, asserting that it discloses a matching and retrieval system that is equally applicable to text document images and non-text images (i.e., graphics or pictures). Id. at 20 (citing Ex. 1003 ¶ 59; Ex. 1005, 20:34–45). Petitioner contends that for graphic images, Hull “broadly explains that . . . feature extractors would need to ‘include the ability to locate and characterize graphical features,’ and provides, ‘in a specific embodiment,’ a single example where the feature extractor scans for edges.” Id. (quoting Ex. 1005, 20:34–45) (citing Ex. 1003 ¶ 59). Based on Hull’s “broad instruction that its matching method is applicable to graphical images and may use other feature extractors, a person of ordinary skill in the art would have been motivated to look to other known methods for detecting the ‘locat[ion] and characteriz[ation] [of] graphic[al] features,’” such as Barber’s methods, “which characterize color and location.” Id. (quoting Ex. 1005, 20:38–39) (citing Ex. 1005, 20:34–61, 22:45–60; Ex. 1003 ¶ 60). 
Thus, Petitioner argues Hull’s disclosure “provides an express teaching, suggestion, and motivation to modify Hull to use image feature extractors from the prior art, such as those disclosed in Barber, to extract features IPR2020-00170 Patent 6,253,201 B1 20 suitable for characterizing, and locating those characterizations within, graphical images.” Id. at 20–21 (citing Ex. 1003 ¶ 60). Petitioner provides additional reasons a person of ordinary skill in the art allegedly would have been motivated to use Barber’s feature extractor. Id. at 22–24. For instance, Petitioner contends that Barber’s feature extractor “specifically focuses on and accounts for the use of color in characterizing images,” whereas “the single non-text example disclosed in Hull teaches using spatial frequency components (e.g., edges), but is silent on color.” Id. at 22 (citing Ex. 1006, 16:20–43; Ex. 1005, 20:37–61; Ex. 1003 ¶ 62). Petitioner also argues that Barber’s image characterization algorithm “simply characterizes content-independent partitions and would likely be computationally easier than the single non-text descriptor example provided in Hull, which is more computationally demanding, involving characterization of areas as well as comparing characteristics to nearby areas.” Id. at 23–24 (citing Ex. 1003 ¶ 64). In its Preliminary Response, Patent Owner asserts that Petitioner has failed to demonstrate that a person of ordinary skill in the art, considering Hull and Barber as a whole, would have combined the references as proposed. Prelim. Resp. 25–35. Among other things, Patent Owner argues that the Petition fails to acknowledge an essential teaching of Hull—the use of descriptors selected to be invariant to distortions caused by digitizing the documents. Id. at 25–26. Patent Owner further argues that, in view of that key feature, Barber’s spatial partitioning technique is incompatible with Hull. Id. at 26. For the reasons discussed below, we agree with Patent Owner that Petitioner has not explained adequately why a person of ordinary skill in the art would have been motivated to combine the teachings of Hull IPR2020-00170 Patent 6,253,201 B1 21 and Barber in the manner proposed by Petitioner to achieve the claimed invention. As Patent Owner correctly points out, the second sentence of Hull’s Abstract states that descriptors are selected to be invariant to distortions caused by digitizing the documents. Ex. 1005, code (57); see Prelim. Resp. 26–27. Similarly, Hull states in its Summary of the Invention that “descriptors are selected to be invariant to distortions caused by digitizing the documents and are redundant to accommodate noise in the digitization of the documents or differences in the input document and its match in the document database.” Ex. 1005, 4:42–46; see Prelim. Resp. 27. The Summary of the Invention also identifies one of the problems addressed by invariant descriptors: “The descriptors are preferably invariant of translation, rotation, scaling, format, font and subdivision, so that a given descriptor which is extracted from a document would still be extracted, even if the document is scanned in a different orientation, is resized or otherwise distorted.” Ex. 1005, 6:63–7:1; see Prelim. Resp. 28–29. The description of Hull’s preferred embodiment also emphasizes that Hull’s descriptors are invariant to distortion such as would occur through translation, rotation, or scaling. 
For example, the first sentence under the heading “Selection of Descriptor Rules” reads: “The descriptors provide a distortion-invariant representation for local features in an image, and therefore a descriptor is an independent source of information.” Ex. 1005, 9:47–49; see Prelim. Resp. 29. Further, the first sentence of the following paragraph under the same heading states that “[t]he use of distortion-invariant descriptors requires that the feature extraction stage be specialized for this requirement.” Ex. 1005, 9:57–60; see Prelim. Resp. 29. And the section of the written description explaining that Hull’s system is applicable to graphic images discloses a descriptor that is invariant through translation, rotation, and scaling, i.e., one that describes the angular relationship between three or more “interesting points” determined by scanning the image for “regions of high spatial frequency objects or sharp contrast edges.” Ex. 1005, 20:40–48; see Prelim. Resp. 30.

Based on Hull’s several discussions of descriptors that are invariant to distortion, we determine that a person of ordinary skill in the art would have considered the invariant nature of Hull’s descriptors to be a significant feature of the disclosed system. The Petition refers to this feature only once, asserting that “Hull primarily utilizes text descriptors that are invariant through rotation and scaling,” but that Hull “does not require or limit itself to descriptors that are invariant to translation, rotation and scaling distortions.” Pet. 40 (citing Ex. 1005, 9:46–49, 6:63–7:1 (stating that Hull’s descriptors are “preferably invariant of translation, rotation, scaling, format, font and subdivision”), claim 2 (requiring invariance in a dependent claim)). In asserting that invariance is not required, however, the Petition fails to address Hull’s other passages suggesting invariance is an essential feature of Hull’s descriptors.

The Petition also dismisses Hull’s disclosure of a feature extractor for graphic images as a single example that “scans for edges.” Pet. 20 (citing Ex. 1005, 20:34–45). This overlooks Hull’s disclosure of a descriptor based on the angular relationship between “interesting points” identified by the feature extractor. See Ex. 1005, 20:40–48. Thus, rather than provide an “express teaching, suggestion, and motivation to modify Hull to use image feature extractors from the prior art, such as those disclosed in Barber,” as Petitioner contends, Hull describes a feature extractor and a descriptor that are consistent with Hull’s teaching of descriptors invariant of translation, rotation, and scaling. Pet. 20.

In view of Hull’s emphasis on descriptors selected to be invariant to distortions caused by digitizing documents, and Hull’s disclosure of a distortion-invariant descriptor for documents containing graphic images, Petitioner does not explain sufficiently why a person of ordinary skill in the art would have modified Hull by replacing its feature extractor and descriptor with Barber’s partitioning and characterizing process. See TriVascular, Inc. v. Samuels, 812 F.3d 1056, 1066 (Fed. Cir. 2016) (holding that an obviousness determination cannot be reached when the record lacks “explanation as to how or why the references would be combined to produce the claimed invention”).
This is particularly true because, as Patent Owner contends, Barber’s technique for spatial partitioning and characterization based on measures such as image color, texture, and edge content appears to be inconsistent with Hull’s teaching of descriptors that are invariant of translation and rotation. See Prelim. Resp. 26. And although Petitioner further contends that a person of ordinary skill in the art would have been motivated to use Barber’s feature extractor with Hull’s matching process when “comparison of color information is needed” or “speed/computation is a concern” (Pet. 24), on the present record these arguments do not overcome the Petition’s failure to consider Hull as a whole, including its teaching of distortion-invariant descriptors as a key feature. See Polaris Indus., Inc. v. Arctic Cat, Inc., 882 F.3d 1056, 1069 (Fed. Cir. 2018) (“[A reference’s] statements regarding preferences are relevant to a finding regarding whether a skilled artisan would be motivated to combine that reference with another reference.”).

For at least these reasons, based on the record before us, we determine that Petitioner has not adequately shown that a person of ordinary skill in the art would have combined the teachings of Hull and Barber in the manner asserted to achieve the claimed invention. Therefore, we conclude the information presented does not demonstrate a reasonable likelihood that Petitioner would prevail in establishing that claims 1–14 and 17–20 of the ’201 patent are unpatentable under 35 U.S.C. § 103(a) for obviousness over the combination of Hull and Barber.

E. Discretionary Denial

Patent Owner argues that we should exercise our discretion under 35 U.S.C. § 314(a) to deny institution of inter partes review in this case in view of the overlap between the Petition and district court litigation involving the same parties. Prelim. Resp. 10–20; Sur-reply 1–7; see Pet. Reply 1–7 (Petitioner addressing Patent Owner’s § 314(a) discretionary denial arguments). Patent Owner also argues that we should exercise our discretion under 35 U.S.C. § 325(d) to deny institution of inter partes review. Prelim. Resp. 20–23. Because we have considered the merits of the Petition and decline to institute an inter partes review on that basis, we need not determine whether it would be appropriate to deny the Petition pursuant to our discretion under § 314(a) or § 325(d).

IV. CONCLUSION

After considering the parties’ evidence and arguments, we determine that the information presented does not show a reasonable likelihood that Petitioner would prevail in establishing that at least one of claims 1–14 and 17–20 of the ’201 patent is unpatentable on the ground asserted in the Petition.

V. ORDER

Accordingly, it is ORDERED that the Petition is denied, and no inter partes review is instituted.

PETITIONER:
Erika H. Arner
Daniel C. Cooley
A. Grace Mills
FINNEGAN, HENDERSON, FARABOW, GARRETT & DUNNER LLP
erika.arner@finnegan.com
daniel.cooley@finnegan.com
gracie.mills@finnegan.com

PATENT OWNER:
Ryan Loveless
Brett Mangrum
James Etheridge
Brian Koide
Jeffrey Huang
ETHERIDGE LAW GROUP
ryan@etheridgelaw.com
brett@etheridgelaw.com
jim@etheridgelaw.com
brian@etheridgelaw.com
jeff@etheridgelaw.com