Ex parte Iannucci et al., No. 10/938,412 (P.T.A.B. Mar. 21, 2013)

UNITED STATES PATENT AND TRADEMARK OFFICE
___________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
___________

Ex parte LOUIS A. IANNUCCI, ROBERT J. RONAN, and PETER E. KULIS
___________

Appeal 2012-000559
Application 10/938,412
Technology Center 3600
___________

Before MURRIEL E. CRAWFORD, ANTON W. FETTING, and MEREDITH C. PETRAVICK, Administrative Patent Judges.

FETTING, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE[1]

[1] Our decision will make reference to the Appellants' Appeal Brief ("App. Br.," filed April 20, 2011) and Reply Brief ("Reply Br.," filed July 25, 2011), and the Examiner's Answer ("Ans.," mailed June 24, 2011).

Louis A. Iannucci, Robert J. Ronan, and Peter E. Kulis (Appellants) seek review under 35 U.S.C. § 134 of a final rejection of claims 1, 5-7, 9, 10, 14-16, 18, 19, 23-25, and 27-30, the only claims pending in the application on appeal. Oral arguments were presented March 13, 2013. We have jurisdiction over the appeal pursuant to 35 U.S.C. § 6(b).

The Appellants invented a way for measuring how well interactive components of a service facility that are accessible to users through a communication network perform in serving users, the interactive components being a proper subset of all of the interactive components of the service facility. The results of the measuring are used to evaluate the performance of the service facility with respect to a target performance. (Specification 1:19 – 2:2).

An understanding of the invention can be derived from a reading of exemplary claim 1, which is reproduced below [bracketed matter and some paragraphing added].

1. A computer-based method, the method comprising:

[1] receiving a selection of a set of key components of a web page, the set of key components being a proper subset of a plurality of components of the web page;

[2] measuring, by one or more computers, a performance value for each of the key components of the web page sent to users through a communication network;

[3] comparing, by one or more computers, the measured performance value for each of the key components to a corresponding target performance value for each of the key components;

[4] defining, for each of the key components, an acceptable performance value, when the measured performance value is less than or equal to target performance value; and

[5] generating, by one or more computers, a customer service level value, with the customer service value based at least in part on a number of key components in the web page having acceptable performance values divided by a total number of key components in the web page.
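For illustration only, the computation recited in steps [2] through [5] can be sketched as follows. This is a minimal TypeScript sketch; the component names, measured values, and targets are hypothetical and are not drawn from the record or the Specification.

```typescript
interface KeyComponentMeasurement {
  name: string;        // hypothetical identifier for a key component
  measuredMs: number;  // measured performance value (e.g., response time in ms)
  targetMs: number;    // corresponding target performance value
}

// Steps [3]-[5]: compare each measured value to its target, treat the component
// as acceptable when measured <= target, and divide the number of acceptable
// key components by the total number of key components.
function customerServiceLevel(components: KeyComponentMeasurement[]): number {
  const acceptable = components.filter(c => c.measuredMs <= c.targetMs).length;
  return acceptable / components.length;
}

// Hypothetical key components (a proper subset of the page's components).
const keyComponents: KeyComponentMeasurement[] = [
  { name: "banner image", measuredMs: 180, targetMs: 250 },
  { name: "login form", measuredMs: 420, targetMs: 300 },
  { name: "quote applet", measuredMs: 950, targetMs: 1000 },
];

console.log(customerServiceLevel(keyComponents)); // 2 of 3 acceptable -> 0.666...
```

In this hypothetical, two of the three key components have measured values at or below their targets, so the resulting customer service level value is 2/3.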
The Examiner relies upon the following prior art:

Killian      US 6,438,592 B1       Aug. 20, 2002
Olsson       US 2004/0205184 A1    Oct. 14, 2004
Mangipudi    US 7,058,704 B1       Jun. 6, 2006

Sarrel, M. D., "Early Warning", PC Magazine, New York: Dec. 11, 2001, Vol. 20, Iss. 21, p. 159.

Claims 1, 5-7, and 9 stand rejected under 35 U.S.C. § 103(a) as unpatentable over Killian, Mangipudi, and Olsson.[2]

Claims 9, 10, 14-16, 18, 19, 23-25, and 27-30 stand rejected under 35 U.S.C. § 103(a) as unpatentable over Killian, Mangipudi, Olsson, and Sarrel.

[2] A rejection under 35 U.S.C. § 112 was withdrawn. Ans. 4.

ISSUES

The issues of obviousness turn primarily on whether Killian's composite data object is a proper subset of all the objects downloaded, whether the claimed acceptable performance value may be the measured performance value, and whether patentable weight is to be afforded the characterization of the result of a division as being a customer service value.

FACTS PERTINENT TO THE ISSUES

The following enumerated Findings of Fact (FF) are believed to be supported by a preponderance of the evidence.

Facts Related to the Prior Art

Killian

01. Killian is directed to monitoring and improving performance of client-server hypermedia, such as, for example, the downloading of pages and page components on the World Wide Web. Killian 1:6-10.

02. Killian provides client-server hypermedia using a Web server comprised of one or more computers connected to each other and to client computers by a computer network. It receives URL requests from individual client computers requesting the transmission by the server to the client computers of individual data objects, such as Web pages or images, identified by URL. Such a server responds to each URL request by transmitting information including the requested data object to the client from which the request came. Killian 3:20-32.

03. A URL request in Killian includes performance monitoring instructions which instruct the client computer to send to the server a performance message indicating the length of time required on the client for performing an act associated with one or more of the transmitted data objects. Such performance monitoring instructions include JavaScript contained in downloaded Web documents and cookies sent to client Web browsers in the HTTP headers with which such data objects are sent. Other forms of instructions, such as, for example, Java or ActiveX applets, can be used. Killian 3:33-45.

04. Often the performance monitoring instructions instruct a client to monitor the time required to download a composite data object. They can also instruct the client to measure the time associated with other events, such as the time required to download individual component data objects; to execute a given segment of code, such as Java, JavaScript, or ActiveX code; or to execute a transaction which includes both the uploading and downloading of information, such as the submission of an HTML form and the downloading of a confirmation page. Performance monitoring instructions can also instruct a client to measure when a user aborts the downloading of a Web page. Commonly such instructions measure the time between events occurring in different downloaded objects. For example, download times are commonly measured from the time when a user clicks, within a first Web page, on a link for a second Web page to the time when that second Web page has been downloaded. Killian 3:46-63.
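As a rough illustration of the client-side timing and reporting described in FF 03 and FF 04, the following sketch uses present-day browser APIs (performance.now and navigator.sendBeacon) rather than the JavaScript, cookie, and applet mechanisms Killian actually recites; the reporting endpoint is hypothetical.

```typescript
// Record a start time when the monitoring script at the top of the page runs.
const startMs = performance.now();

// When the page (the composite data object) has finished loading, send the
// elapsed time back to the server as a performance message.
window.addEventListener("load", () => {
  const elapsedMs = performance.now() - startMs;
  // "/performance-report" is a hypothetical reporting endpoint.
  navigator.sendBeacon("/performance-report", JSON.stringify({ elapsedMs }));
});
```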
05. The server creates a representation of the distribution of the client addresses from which performance messages indicating problems come. If a large number of such problem messages are sent from clients all across the portion of the network to which the server has sent data objects, the server generates an indication that there is a problem with the server itself. If the addresses from which problem messages are sent are concentrated in a portion of the network address space associated with a network entity, the server generates an indication that there is a problem associated with that entity. If a given client computer generates a sufficient number of problem messages but there is not an abnormal number of problem messages associated either with the population of clients to which data objects have been sent or with the population of clients connected to the server through the same network entity as the given client, the server generates an indication that the individual given client is having a problem. Killian 3:64 – 4:19.

06. When the server detects a problem with the server itself, with a network entity, or with an individual client, it automatically takes appropriate action. For example, when a problem is detected with the server, the server automatically informs both clients and its operators, and it automatically starts sending out lighter versions of data objects, such as Web pages with fewer or smaller images. If the problem appears to be with a network entity or an individual client, it informs affected clients of such problems and offers them the choice of receiving lighter data objects. The server uses indications of download times provided by performance messages to provide an indication to clients of how long the completion of future downloads is likely to take. Killian 4:20-34.

Mangipudi

07. Mangipudi is directed to establishing, measuring, and reporting service attributes for network communications. Mangipudi 1:18-20.

08. Mangipudi performs generating, collecting, and manipulating useful information for validating or defining Service Level Agreements (SLAs) of web servers on a network. Web servers comprising a web farm on the network are adapted for logging detailed runtime information regarding user transactions and performance parameters. Mangipudi 3:12-17.

09. An Accumulator device on the network interacts with an intelligent agent on each web server to collect and combine their log files, process the combined file, and post information into a database. An operator enters committed performance parameters into an SLA Reporter system according to classes of users, URLs, transactions, content or file type, or classes of web sites being hosted on the web servers. When compared with the actual data from the database, processing of SLA reports indicates how well the parameters of the SLAs are being met for users, virtual sites, classes, URLs and transactions, or other measurable elements. By generating, collecting, combining, and processing in this manner, application-specific performance can be quickly and automatically evaluated with respect to parameters related to user satisfaction, and detailed signals can be issued for cases in which remedial steps should be undertaken. Mangipudi 3:18-35.
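The SLA-report processing described in FF 09 can be loosely illustrated by the following sketch, which compares hypothetical committed response-time parameters against hypothetical logged transaction data by URL class; the field names and the particular compliance measure are assumptions for illustration, not Mangipudi's disclosure.

```typescript
interface SlaCommitment {
  urlClass: string;       // hypothetical class of URLs covered by the commitment
  maxResponseMs: number;  // committed performance parameter
}

interface LoggedTransaction {
  urlClass: string;
  responseMs: number;
}

// For each committed parameter, compare the logged (actual) data against the
// commitment and report what fraction of transactions in that class met it.
function slaComplianceReport(
  commitments: SlaCommitment[],
  log: LoggedTransaction[],
): { urlClass: string; fractionMet: number }[] {
  return commitments.map(c => {
    const relevant = log.filter(t => t.urlClass === c.urlClass);
    const met = relevant.filter(t => t.responseMs <= c.maxResponseMs).length;
    return {
      urlClass: c.urlClass,
      fractionMet: relevant.length > 0 ? met / relevant.length : 1,
    };
  });
}
```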
10. A network model includes a plurality of computer systems clustered into a web farm, in which a front-end system distributes service requests to one or more back-end web servers. The front-end system receives a service request from a user outside the web farm, selects one of the back-end servers to service the request, and forwards (routes) the request to the selected back-end server. Each back-end server then services the user request and generates any necessary response directly to the requesting user. A given back-end server may be requested to process transactions destined for any of a number of "virtual sites" that the back-end server is hosting as a surrogate for the "hosted site" addressed in the transaction. Mangipudi 3:35-49.

11. An SLA defines the operational parameters of the web server that the operators will monitor, and the relative acceptability of the server services for each parameter. Mangipudi 5:43-57.

Olsson

12. Olsson is directed to evaluating the performance of information handling in a network environment. Olsson para. 0002.

13. Application Probes measure availability and performance of applications and business processes. Olsson para. 0046.

14. FIG. 3A and FIG. 3B illustrate an example of a report with data from remote probes, and statistics, resulting from probing a web site. Similar reports could be produced in connection with probing other kinds of web sites, or probing other kinds of applications. Olsson para. 0052.

15. FIG. 3A and FIG. 3B involve comparing data and statistics with threshold values. Olsson para. 0054.

16. This example involves calculating and outputting statistics. In each of cells 331-369, a statistic is aligned with a corresponding threshold value in row 322. Cells 331-369 reflect calculating, mapping, and outputting, for statistics. In row 330, cells 331-339 display average performance values. This statistic involves utilizing successful executions of a transaction step, utilizing response times for the transaction step, calculating an average performance value, and outputting the average performance value (in row 330). Failed executions and executions that timed out are not included in calculating an average performance value, but are represented in ratios in row 350, and affect availability results, in this example. This example also involves comparing the average performance value with a corresponding threshold value (in row 322) and reporting the results (in row 330) of the comparison. Olsson para. 0055.

17. This example involves calculating a transaction step's availability proportion, and outputting the transaction step's availability proportion (in rows 350 and 360). The proportion is expressed as a ratio of successful executions to attempts, in row 350, cells 351-359. The proportion is expressed as a percentage of successful executions in row 360, cells 361-369 (the transaction step's "aggregate" percentage). Olsson para. 0057.
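The statistics described in FF 16 and FF 17 can be illustrated by the following sketch: an average response time computed over successful executions only, compared with a threshold, and an availability proportion computed as successful executions divided by attempts. The outcome encoding and function shape are assumptions for illustration, not Olsson's actual report format.

```typescript
type ExecutionOutcome =
  | { status: "success"; responseMs: number }
  | { status: "failed" }
  | { status: "timeout" };

// Average response time over successful executions only (failed and timed-out
// executions are excluded from the average but still count against availability),
// plus availability as the ratio of successful executions to attempts.
// Assumes at least one attempt and at least one successful execution.
function transactionStepStatistics(executions: ExecutionOutcome[], thresholdMs: number) {
  const successes = executions.filter(
    (e): e is { status: "success"; responseMs: number } => e.status === "success",
  );
  const averageMs =
    successes.reduce((sum, e) => sum + e.responseMs, 0) / successes.length;
  return {
    averageMs,
    withinThreshold: averageMs <= thresholdMs,
    availability: successes.length / executions.length,
  };
}
```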
ANALYSIS

We are not persuaded by the Appellants' argument that the art fails to describe both measuring and comparing a performance value for each of the key components of the web page, and defining, for each of the key components, an acceptable performance value, when the measured performance value is less than or equal to the target performance value.

While the first step of "receiving a selection of a set of key components of a web page, the set of key components being a proper subset of a plurality of components of the web page" only requires receiving data, the next two steps of measuring and comparing do so with respect to the data that was so received. As the measuring and comparing is "for each of the key components" so received, the data received must contain those key components. Those key components are defined in the receiving step as being a proper subset of the components of the web page.

The Examiner found that Killian 3:47-51 described this. Ans. 6. We agree with the Examiner that this describes measuring and comparing download and execution times for a proper subset of all the downloaded components. The receiving step does not preclude receiving an entire set, so long as what is received contains a proper subset that is then tested in the measuring and comparing steps.

Appellants argue that the Examiner admits that Killian fails to describe the measuring and comparing steps. Appeal Br. 13. This is incorrect. The Examiner found that Killian did not explicitly refer to a target performance value as such. Ans. 6. Accordingly, the Examiner applied Mangipudi to show that Killian's comparison was necessarily with respect to some standard, which would be a target.

Here Appellants argue that Mangipudi fails to present a target for a key component. Appeal Br. 14. The Appellants thus respond to the rejection by attacking the references separately, even though the rejection is based on the combined teachings of the references. Nonobviousness cannot be established by attacking the references individually when the rejection is predicated upon a combination of prior art disclosures. See In re Merck & Co. Inc., 800 F.2d 1091, 1097 (Fed. Cir. 1986). Killian provided the key component.

Appellants next argue that the Examiner failed even to make findings as to the limitation "defining, for each of the key components, an acceptable performance value, when the measured performance value is less than or equal to target performance value." Appeal Br. 15. This is in error. The Examiner cited Mangipudi 3:22-30 and 5:43-57 as describing this limitation. Ans. 7.

These portions of Mangipudi describe an operator entering committed performance parameters into an SLA Reporter system and that, when compared with the actual data from the database, processing of SLA reports indicates how well the parameters of the SLAs are being met for users. An SLA defines the operational parameters of the web server that the operators will monitor, and the relative acceptability of the server services for each parameter. FF 09 and 11. This clearly describes defining acceptable values at the time of processing SLA reports, which are processed coincident with the comparison.

But this finding is to some extent overkill, as the defining step in the claim is no more than a recognition step. The disclosed embodiment of this defining step is at Specification 10:3-8, which merely defines the actual value measured as acceptable when it is within the range defined by the target value. This is consistent with Appellants' citation. Appeal Br. 3; footnote 5.

Thus, measuring and recognizing the result of the comparison as performed in the second and third claimed steps inherently results in the defining step. This leads us to conclude that the Appellants' argument that the claimed "acceptable performance value" is a defined value that is different from the claimed "measured performance value" (Appeal Br. 16) is erroneous, as the disclosed embodiment is one in which the acceptable performance value is identical with the measured performance value.

This leads us to Appellants' final argument regarding the step of generating a customer service level value based on some division between the number of acceptable values and the number of total values. The name of the data item generated is not worthy of patentable weight, as the claim never uses the data functionally. The only issue is whether it was predictable for one of ordinary skill to divide two such numbers. The easy answer is yes, as it presents a well known and widely used fractional or percentage calculation of a portion relative to the whole in which the portion is contained. There are innumerable reasons to do so. Thus it was predictable. The Examiner presents evidence to support this finding in the form of Olsson's program for evaluating network performance, which shows actual industry practice of dividing the number of acceptable instances by the total number.

We are not persuaded by the Appellants' argument that claim 7 is patentable, as Appellants make an argument regarding Mangipudi as to a limitation that the Examiner, as with claim 1, found in Killian.

We are not persuaded by the Appellants' argument that claim 9 is patentable because, again, Appellants argue the references separately. The Examiner applies Sarrel for the additional limitation of weighting the values. Sarrel is no more than evidence that it was well known to weight different portions of a formula to give greater weight to certain portions. Thus it was at least predictable to do so.
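To illustrate the kind of weighting the Examiner relied on Sarrel for, the following sketch weights the service-level calculation so that certain components count more heavily than others; the weights and the particular formula are hypothetical and are not drawn from Sarrel or from the claims.

```typescript
interface WeightedKeyComponent {
  measuredMs: number;  // measured performance value
  targetMs: number;    // target performance value
  weight: number;      // hypothetical relative importance assigned to the component
}

// A weighted variant of the service-level calculation: each acceptable component
// contributes its weight, and the result is the acceptable weight divided by the
// total weight rather than a simple count divided by a count.
function weightedServiceLevel(components: WeightedKeyComponent[]): number {
  const totalWeight = components.reduce((sum, c) => sum + c.weight, 0);
  const acceptableWeight = components
    .filter(c => c.measuredMs <= c.targetMs)
    .reduce((sum, c) => sum + c.weight, 0);
  return totalWeight > 0 ? acceptableWeight / totalWeight : 0;
}
```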
CONCLUSIONS OF LAW

The rejection of claims 1, 5-7, and 9 under 35 U.S.C. § 103(a) as unpatentable over Killian, Mangipudi, and Olsson is proper.

The rejection of claims 9, 10, 14-16, 18, 19, 23-25, and 27-30 under 35 U.S.C. § 103(a) as unpatentable over Killian, Mangipudi, Olsson, and Sarrel is proper.

DECISION

The rejection of claims 1, 5-7, 9, 10, 14-16, 18, 19, 23-25, and 27-30 is affirmed.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv) (2011).

AFFIRMED

JRG