Ex parte Meirman et al., Appeal 2015-004827, Application No. 12/757,993 (P.T.A.B. Aug. 30, 2016)

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte ILAN MEIRMAN, ROY NURIEL, YOSSI RACHELSON, and DEKEL TAL

Appeal 2015-004827
Application 12/757,993
Technology Center 2100

Before CAROLYN D. THOMAS, JEFFREY S. SMITH, and TERRENCE W. McMILLIN, Administrative Patent Judges.

THOMAS, Administrative Patent Judge.

DECISION ON APPEAL

Appellants seek our review under 35 U.S.C. § 134(a) of the Examiner twice rejecting claims 1, 3-7, 9, and 11-19, all the claims pending in the application. We have jurisdiction over the appeal under 35 U.S.C. § 6(b). We AFFIRM.

The present invention relates generally to development and testing of a software application, and more particularly to apparatus and methods that detect defects in a software application and report such defects to a defect tracking system (see Spec. ¶¶ 1, 8).

Claim 1 is illustrative:

1. A method executed by a computer, comprising:

opening, on a display of a computer, an inspection surface that displays an inspection tool and a user interface of a software application being tested for defects, wherein the inspection tool includes a number of tools executable by a user to detect errors in the user interface;

detecting, with the inspection tool, a defect in the user interface;

generating, in the inspection surface and with the inspection tool, a graphical annotation of the defect, wherein the graphical annotation describes the defect and includes information collected with the number of tools; and

sending a report to a defect tracking system, wherein the defect tracking system is different than the inspection tool, and wherein the report includes a screenshot of the user interface, the inspection tool, and the graphical annotation generated with the inspection tool.
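As a purely illustrative aid, and not part of the Board's decision or the record, the sequence of steps recited in claim 1 can be sketched in Python roughly as below. Every name (InspectionTool, Defect, capture_screen, tracker, and so on) is hypothetical, and nothing in the sketch is drawn from any actual implementation of the claimed method.

from dataclasses import dataclass

@dataclass
class Defect:
    element: str       # UI element where the error was detected
    description: str   # what the inspection tool observed

@dataclass
class Report:
    screenshot: bytes  # captures the UI, the tool, and the annotation together
    annotation: str    # graphical annotation describing the defect

class InspectionTool:
    # Stand-in for the claimed inspection tool and its "number of tools."

    def detect_defect(self, ui_state):
        # Hypothetical check: flag the first UI element reported as broken.
        for element, ok in ui_state.items():
            if not ok:
                return Defect(element, element + " failed to render")
        return None

    def annotate(self, defect):
        # Generate a (textual stand-in for a) graphical annotation.
        return "DEFECT at " + defect.element + ": " + defect.description

def inspect_and_report(ui_state, capture_screen, tracker):
    tool = InspectionTool()
    defect = tool.detect_defect(ui_state)       # detecting a defect
    if defect is not None:
        annotation = tool.annotate(defect)      # generating the annotation
        report = Report(capture_screen(), annotation)
        tracker.submit(report)                  # sending to a tracking system
                                                # separate from the tool

The point of the sketch is only to make the claim's final limitation concrete: the tracker that receives the report is an object distinct from the tool that detected and annotated the defect.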
Appellants appeal the following rejections:

R1. Claims 1, 9, 14, 15, and 17 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Mori et al. (US 2006/0279571 A1, Dec. 14, 2006), Hayutin (US 2009/0210749 A1, Aug. 20, 2009), and Ergan et al. (US 2010/0229112 A1, Sept. 9, 2010).[1]

[1] The statement of rejection made by the Examiner (see Ans. 3; see also Final Act. 5), and by Appellants (see App. Br. 9), includes claim 10, which has been cancelled. Furthermore, the statement of rejection made by the Examiner and Appellants omits claims 14 and 17, which are also rejected under 35 U.S.C. § 103(a) as being unpatentable over Mori, Hayutin, and Ergan.

R2. Claims 3, 4, 11, 16, 18, and 19 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Mori, Hayutin, Ergan, and Winer (US 5,796,401, Aug. 18, 1998).

R3. Claims 5 and 13 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Mori, Hayutin, Ergan, and McLaughlin et al. (US 5,499,040, Mar. 12, 1996).

R4. Claims 6 and 12 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Mori, Hayutin, Ergan, and Halstead, Jr. (US 2002/0118193 A1, Aug. 29, 2002).

R5. Claim 7 stands rejected under 35 U.S.C. § 103(a) over Mori, Hayutin, Ergan, and Ido et al. (US 2007/0046700 A1, Mar. 1, 2007).

Claim Groupings

Based on Appellants' arguments in the Appeal Brief, we will decide the appeal on the basis of claim 1, as set forth below. See 37 C.F.R. § 41.37(c)(1)(iv).

ANALYSIS

Rejection under § 103(a) over Mori, Hayutin, and Ergan

Issue 1: Did the Examiner err in finding that the combined teachings of Mori, Hayutin, and Ergan teach or suggest "generating, in the inspection surface and with the inspection tool, a graphical annotation of the defect, wherein the graphical annotation describes the defect and includes information collected with the number of tools," as recited in claim 1?

Appellants contend Ergan's general annotations "are added by a user, and describe what the user was trying to accomplish when performing a sequence of interactions," which is "not analogous to annotations that are generated by an inspection tool that is used to detect defects in a user interface" (App. Br. 12; Reply Br. 4-5). However, claim 1 does not recite that the annotations are generated by an inspection tool. Rather, the claim recites generating "with the inspection tool, a graphical annotation of the defect." The Examiner finds the scope of this limitation, when read in light of paragraphs 36 and 40 of Appellants' Specification, encompasses a user using a tool to generate a graphical annotation of the defect. Ans. 18-19. We agree with the Examiner. Appellants' contention is not commensurate with the scope of claim 1.

Further, Appellants' argument against Ergan separately from Hayutin does not persuasively rebut the combination made by the Examiner. One cannot show non-obviousness by attacking references individually, where the rejections are based on combinations of references. In re Merck & Co., Inc., 800 F.2d 1091, 1097 (Fed. Cir. 1986); In re Keller, 642 F.2d 413, 425 (CCPA 1981). Specifically, we agree with the Examiner's finding that Hayutin teaches a test automation tool or inspection tool that detects anomalies and pops up annotations commenting on details of the test which, "[w]hen combined with Ergan, it provides more detail on how the annotation describes the defect detected and includes information collected with the number of tools" (Ans. 19; see also Ans. 18, 20). For example, Hayutin discloses:
To test the GUI of the application 135, a user of the computing system 101 may execute a test automation tool 102 that can drive the application 135 automatically to detect anomalies during execution. ... The test automation tool 102 may run a number of predetermined test scenarios on the application 135 to determine whether the application's GUI is functioning properly. ... To keep a user of the test automation tool 102 informed of events that are supposed to happen during the execution of the application 135, one or more annotations 106 commenting on details of the test will pop up on the display 107. ...

(Hayutin ¶ 23, emphasis added). Ergan discloses:

Compressor 236 may begin operating on the event records at any suitable time. For example, in some embodiments the compressor 236 begins retrieving event records from the event log 235 after a stop event has occurred with an indication that a problem report should be sent. ... The compressor 236 records each filtered event record extracted from the event log 235 and the user friendly translation of the event record to the UI event file 237. The compressor may also save screen shots, if they were captured, in the UI event file 237. After the compressor 236 has processed all events to be included in the UI event file 237, the UI event file may be reviewed by the user. ... Additionally, the user may wish to provide additional comments and annotations to the UI event file. ... Once the user has had the option to review and possibly annotate the UI event file 237, the file can serve as a completed report. The controller 231 may provide instructions to the network interface 220 to transmit the report over the network 140 to a support entity. ...

(Ergan ¶¶ 71, 74, 81, 82, emphasis added).

In other words, Hayutin describes a testing tool that detects defects and generates annotations commenting on the details of the defects and test results as a pop-up on the display; and Ergan describes a recording tool performing UI error reporting as event records to create event files that can be additionally annotated by a user before being sent as a report to a support entity.

Appellants do not provide persuasive evidence or argument that the combination of Hayutin and Ergan (and Mori) does not teach generating a graphical annotation of a detected defect with an inspection tool, wherein the graphical annotation describes the defect and includes information collected with a number of tools executable by a user to detect errors in the user interface. Thus, we agree with the Examiner's finding that Hayutin's testing tool detecting defects and generating pop-up annotations describing those defects teaches "generating, in the inspection surface and with the inspection tool, a graphical annotation of the defect, wherein the graphical annotation describes the defect," as recited in claim 1; and Ergan's UI event file, produced by a recording tool to provide user-annotatable error reporting for a UI, teaches "the graphical annotation describes the defect and includes information collected with the number of tools," as required by claim 1.
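The division of labor the Board draws from these passages — automated detection with pop-up annotations (Hayutin) feeding an event file that a user can further annotate (Ergan) — can be sketched, again purely illustratively, under hypothetical names and APIs of the editor's choosing rather than anything disclosed in the references:

def run_test_scenarios(app, scenarios, event_log):
    # Drive the application and pop up an annotation on each anomaly,
    # in the manner the Board attributes to Hayutin's test automation tool.
    for scenario in scenarios:
        result = app.execute(scenario)          # hypothetical driver API
        event_log.append({"scenario": scenario, "result": result})
        if not result.ok:
            app.display.pop_up("Anomaly in " + scenario + ": " + result.detail)

def build_ui_event_file(event_log, screenshots, user_notes):
    # Collect filtered event records, screenshots, and user-supplied
    # annotations into a reviewable event file, as attributed to Ergan.
    return {
        "events": [e for e in event_log if not e["result"].ok],
        "screenshots": screenshots,
        "annotations": user_notes,              # comments added by the user
    }

On this reading, the automated pop-up supplies the defect description and the event file supplies the "information collected with the number of tools," which is how the combination maps onto the disputed limitation.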
Issue 2: Did the Examiner err in finding that the combined teachings of Mori, Hayutin, and Ergan teach or suggest "sending a report to a defect tracking system, wherein the defect tracking system is different than the inspection tool, and wherein the report includes a screenshot of the user interface, the inspection tool, and the graphical annotation generated with the inspection tool," as recited in claim 1?

Appellants contend Ergan's screenshots are merely "screenshots of an interaction" and are "not analogous to a screenshot of: a user interface, an inspection tool, and graphical annotations generated with the inspection tool" (App. Br. 12; see also App. Br. 13).

We agree with the Examiner's finding that Ergan "shows a screenshot that is annotated with comments from the user," and that Ergan's screenshot teaches "a screenshot of the user interface, the inspection tool, and the graphical annotation generated" (Ans. 21-22). For example, Ergan discloses:

The compressor 236 records each filtered event record extracted from the event log 235 and the user friendly translation of the event record to the UI event file 237. The compressor may also save screen shots, if they were captured, in the UI event file 237. ... Additionally, the user may wish to provide additional comments and annotations to the UI event file. ... Once the user has had the option to review and possibly annotate the UI event file 237, the file can serve as a completed report. The controller 231 may provide instructions to the network interface 220 to transmit the report over the network 140 to a support entity. ...

(Ergan ¶¶ 74, 81, 82, emphasis added). In other words, Ergan describes a complete report that comprises a UI event file comprising screenshots with event records and user annotations.

Appellants do not provide persuasive evidence or argument that the combination of Mori, Hayutin, and Ergan does not teach sending a report that includes a screenshot of the user interface, the inspection tool, and the graphical annotation. Thus, we agree with the Examiner's finding that Ergan's transmission of a completed report including screenshots, error records, and annotations teaches or suggests "sending a report to a defect tracking system, wherein the defect tracking system is different than the inspection tool, and wherein the report includes a screenshot of the user interface, the inspection tool, and the graphical annotation generated with the inspection tool," as required by claim 1.

For at least these reasons, we are unpersuaded the Examiner erred. Accordingly, the Examiner's 35 U.S.C. § 103(a) rejection of independent claim 1, as well as the rejection of commensurate independent claims 9 and 15, not separately argued (App. Br. 10-11, 13), and of dependent claims 14 and 17, not separately argued (App. Br. 14), is sustained.
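The transmission step the Board relies on — a completed, user-annotated report sent to a system distinct from the tool that produced it — might look like the minimal sketch below. The JSON-over-HTTP transport and payload shape are assumptions for illustration only; neither reference specifies them.

import json
import urllib.request

def send_report(event_file, screenshot_png, tracker_url):
    # Serialize the annotated event file and post it to a defect tracking
    # system that is distinct from the inspection tool itself.
    payload = {
        "report": event_file,                        # events plus annotations
        "screenshot_hex": screenshot_png.hex(),      # crude binary encoding
    }
    request = urllib.request.Request(
        tracker_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        response.read()                              # discard acknowledgment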
Rejection under § 103(a) over Mori, Hayutin, Ergan, and Winer

Appellants have provided no separate arguments towards patentability for claims 3, 4, 11, 16, 18, and 19. Therefore, the Examiner's 35 U.S.C. § 103(a) rejection R2 of claims 3, 4, 11, 16, 18, and 19 is sustained for similar reasons as noted supra.

Rejection under § 103(a) over Mori, Hayutin, Ergan, and McLaughlin

Appellants have provided no separate arguments towards patentability for claims 5 and 13. Therefore, the Examiner's 35 U.S.C. § 103(a) rejection R3 of claims 5 and 13 is sustained for similar reasons as noted supra.

Rejection under § 103(a) over Mori, Hayutin, Ergan, and Halstead

Appellants have provided no separate arguments towards patentability for claims 6 and 12. Therefore, the Examiner's 35 U.S.C. § 103(a) rejection R4 of claims 6 and 12 is sustained for similar reasons as noted supra.

Rejection under § 103(a) over Mori, Hayutin, Ergan, and Ido

Appellants have provided no separate arguments towards patentability for claim 7. Therefore, the Examiner's 35 U.S.C. § 103(a) rejection R5 of claim 7 is sustained for similar reasons as noted supra.

DECISION

We affirm the Examiner's § 103(a) rejections R1-R5.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED