Google Inc., Appeal 2019-004630 (P.T.A.B. Mar. 19, 2021)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 15/222,024
FILING DATE: 07/28/2016
FIRST NAMED INVENTOR: Thomas Jenkins
ATTORNEY DOCKET NO.: GGL-2250-CON
CONFIRMATION NO.: 6620

100462 7590 03/19/2021
Dority & Manning P.A. and Google LLC
Post Office Box 1449
Greenville, SC 29602

EXAMINER: AHMED, SABA
ART UNIT: 2154
NOTIFICATION DATE: 03/19/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the following e-mail address(es): usdocketing@dority-manning.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________________
Ex parte THOMAS JENKINS
____________________
Appeal 2019-004630
Application 15/222,024
Technology Center 2100
____________________

Before JOSEPH L. DIXON, DAVID M. KOHUT, and JON M. JURGOVAN, Administrative Patent Judges.

KOHUT, Administrative Patent Judge.

DECISION ON APPEAL

Pursuant to 35 U.S.C. § 134(a), Appellant¹ appeals from the Examiner's decision to reject claims 2–5, 7, 9–13, 15, and 17–25.²,³ We have jurisdiction under 35 U.S.C. § 6(b). We REVERSE.

¹ We use the word "Appellant" to refer to "applicant" as defined in 37 C.F.R. § 1.42. Appellant identifies the real party in interest as Google LLC. Appeal Br. 1.
² Throughout this Decision we refer to the Final Office Action mailed September 20, 2018 ("Final Act."), the Advisory Action mailed November 27, 2018 ("Advisory Act."), the Appeal Brief filed January 25, 2019 ("Appeal Br."), the Examiner's Answer mailed March 18, 2019 ("Ans."), and the Reply Brief ("Reply Br.") filed May 16, 2019.

³ Claims 1, 6, 8, 14, and 16 were canceled. Final Act. 1–2.

INVENTION

The present invention relates to a computer-implemented method and system for "predictively presenting search capabilities" for "search types associated with a geographic location of a computing device." Spec., Title (capitalization altered), Abstr. Claim 2 is representative of the invention and is reproduced below.

2. A computer-implemented method comprising:
    before a user has initiated an image-based search at a mobile device:
        obtaining data indicating a current context including at least a geographic location associated with the mobile device, the geographic location being a location at which the mobile device is located,
        determining, using the data indicating the current context including at least the geographic location associated with the mobile device, that the mobile device is physically located proximate to a predefined, geographic location,
        in response to determining that the mobile device is physically located proximate to the predefined, geographic location:
            accessing a repository that associates, for each predefined geographic location of a plurality of predefined geographic locations, a respective image-based search type for the predefined geographic location, and wherein the respective image-based search types define multiple candidate image-based search types that each respectively invoke a search type capability of an image search system that is different from a search type capability invoked by each other candidate image-based search type;
            obtaining data from the repository, the data indicating a particular
image-based search type, from among multiple candidate image-based search types, that is identified by a predictive model as likely relevant to the current context including at least the geographic location associated with the mobile device, the identification based on a determination that the geographic location is within a threshold distance of the predefined, geographic location and that the predefined, geographic location is associated with the particular image-based search type, and
            providing a user interface including a control for initiating an image-based search of the particular image-based search type that is identified by the predictive model as likely relevant to the current context including at least the geographic location associated with the mobile device.

Appeal Br. 14 (Claims App.)

REFERENCES

Name      Reference            Date
Manber    US 2006/0089792 A1   Apr. 27, 2006
Florance  US 2009/0132316 A1   May 21, 2009
Mei       US 2014/0250120 A1   Sept. 4, 2014

REJECTIONS⁴

Claims 2–5, 7, 9–13, 15, and 17–21⁵ stand rejected under 35 U.S.C. § 103 as being unpatentable over Manber in view of Mei. Final Act. 7–29.

Claims 22–25 stand rejected under 35 U.S.C. § 103 as being unpatentable over Manber, Mei, and Florance. Final Act. 29–33.

⁴ Claims 2–5, 7, 9–13, 15, and 17–25 were rejected under 35 U.S.C. § 101 as directed to patent-ineligible subject matter. Final Act. 3–6. However, this rejection was withdrawn in the Examiner's Answer, and is no longer pending on appeal. Ans. 2.

⁵ Although the statement of the rejection does not include claim 21 (see Final Act. 7), claim 21 is rejected based on Manber and Mei (see Final Act. 28–29). We find this to be harmless error and add claim 21 here.
ANALYSIS

With respect to claim 2, the Examiner finds Mei teaches the claimed "candidate image-based search types [associated with predefined geographic locations] that each respectively invoke a search type capability of an image search system that is different from a search type capability invoked by each other candidate image-based search type" (emphasis omitted) because Mei's interactive multi-modal image search tool provides

    [s]peech recognition module 524 [that] may also access previously stored audio files and other similar data sources to generate textual representations of audio data. Speech recognition module 524 outputs a query in textual form. . . .

the search query is audio search type, and [an] optical character recognition capability includes the speech-to-text or the speech recognition module 524 taught by Mei, which is a different search type. Final Act. 9–10 (citing Mei ¶¶ 57, 68, Fig. 5); Ans. 7.

In the Advisory Action, the Examiner adds that the claimed "image based search type is taught by the image-address database search function in Manber [0033] and [0045], and also taught by the 'interactive image search tool' in Mei [0024] and [0068]." Advisory Act. 2.

With respect to the claimed "obtaining," the Examiner finds Mei selects "a particular image-based search type [from the candidate image-based search types] . . . relevant to . . . the geographic location associated with the mobile device" (as claimed) because Mei's "GPS module 516 and compass module 518 are used to identify images based on image location." Ans. 7 (citing Mei ¶ 65); Final Act. 10–11 (citing Mei ¶¶ 6, 53).
Having reviewed the evidence, we do not agree with the Examiner's findings that Mei and Manber teach selecting "a respective image-based search type [from multiple candidate image-based search types] for the predefined geographic location [identified by, or proximate to, the mobile device]" where each "image-based search type . . . respectively invoke[s] a search type capability of an image search system that is different from a search type capability invoked by each other candidate image-based search type," as recited in claim 2.

As Appellant explains, the claimed "search type capability" of an "image-based search type" is a searching capability of an image search system of a device, such as a barcode scanning capability of a barcode scanner, a quick response (QR) code scanning capability of a QR code scanner, an image recognizer capability of a mobile device's image recognizer, or an optical character recognition (OCR) capability of an optical sensor/portable scanner. Appeal Br. 10 (citing Spec. ¶¶ 17–18); see Spec. ¶¶ 17–18, 33, 38–39, 101. Appellant further explains that Mei's multi-modal image search tool provides speech-to-text and image searching capabilities, but Mei's search tool does not select an image-based search type and search type capability based on a geographic location associated with a mobile device. Appeal Br. 10–11; Reply Br. 3. Rather, Mei's multi-modal search tool "is using the [user's] query to select search results, not select a 'search type.'" Reply Br. 3; see also Appeal Br. 11.
We agree with Appellant's arguments, as Mei discloses a search based on a "voice query" from which entities (i.e., words, such as "lake," "sky," and "tree") are extracted and provided to an image clustering engine that "identifies candidate images from an image database 128 that correspond to each of the three entities," so that "a particular image for each entity can be selected [from the candidate images] and a composite visual query can be composed from the selected images." See Mei ¶¶ 26–27, 57. Mei's multi-modal search tool can also "apply location-based context information to candidate images" to identify "[c]andidate images and/or composite visual query results that are related to the current location [of the user's mobile device]." See Mei ¶¶ 53, 65.

Thus, Mei's search tool provides all search capabilities (e.g., speech-to-text and image search) regardless of a location of the mobile device, and provides search results based on the location of the mobile device. See Mei ¶¶ 26–27, 53, 57, 65, 68. As Appellant explains, however, Mei does not invoke or select an image-based search type and search type capability based on the location of the mobile device, as required by claim 2. Appeal Br. 10–12; Reply Br. 3.

The Examiner also has not shown that the additional teachings of Manber and Florance make up for the above-noted deficiencies of Mei. As Appellant explains, Manber merely "provide[s] a user with an image that is associated with the current location of the mobile phone, rather than a type of image-based search capability that is linked to the geographic location." Appeal Br. 9–10 (citing Manber ¶¶ 12, 33, 45); see Manber ¶¶ 33 ("the image-address database 43 permit[s] users of the mobile devices 20 to search and view online Yellow Pages in which each listing (e.g., business) is displayed with an image of an object (e.g., business building)"), 45 ("based on an IP address of the mobile device 20. . . .
only those listings corresponding to the general geographic position may be searched").

As the Examiner has not identified sufficient evidence to support the obviousness rejection of claim 2, and of independent claims 10 and 18 reciting similar limitations, we do not sustain the Examiner's § 103 rejection of claims 2, 10, and 18. We also do not sustain the Examiner's § 103 rejections of claims 3–5, 7, 9, 11–13, 15, 17, and 19–25 depending from one of claims 2, 10, and 18.

Because the issues discussed above with respect to claims 2, 10, and 18 are dispositive as to the rejections of all claims, we do not reach additional issues raised by Appellant's arguments as to the rejections of claims 2, 10, and 18. See Appeal Br. 8–9; Reply Br. 1–2.

CONCLUSION

The Examiner's decision rejecting claims 2–5, 7, 9–13, 15, and 17–25 under 35 U.S.C. § 103 is reversed.

DECISION SUMMARY

In summary:

Claim(s) Rejected        35 U.S.C. §   Reference(s)/Basis      Affirmed   Reversed
2–5, 7, 9–13, 15, 17–21  103           Manber, Mei                        2–5, 7, 9–13, 15, 17–21
22–25                    103           Manber, Mei, Florance              22–25
Overall Outcome                                                          2–5, 7, 9–13, 15, 17–25

REVERSED