UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 13/659,349
FILING DATE: 10/24/2012
FIRST NAMED INVENTOR: DAVID H. GOTZ
ATTORNEY DOCKET NO.: YOR920120550US2 (533CON)
CONFIRMATION NO.: 3420

49267 7590 04/29/2020
Tutunjian & Bitetto, P.C.
401 Broadhollow Road, Suite 402
Melville, NY 11747

EXAMINER: PAULSON, SHEETAL R.
ART UNIT: 3626
NOTIFICATION DATE: 04/29/2020
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): docketing@tb-iplaw.com

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte DAVID H. GOTZ, PEI-YUN S. HSUEH, JIANYING HU, and JIMENG SUN

Appeal 2019-004932
Application 13/659,349
Technology Center 3600

Before MURRIEL E. CRAWFORD, PHILIP J. HOFFMANN, and BRADLEY B. BAYAT, Administrative Patent Judges.

CRAWFORD, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant[1] appeals from the Examiner's decision to reject claims 1-20. We have jurisdiction under 35 U.S.C. § 6(b). We AFFIRM.

[1] We use the word "Appellant" to refer to "applicant" as defined in 37 C.F.R. § 1.42. Appellant identifies the real party in interest as International Business Machines Corporation. Appeal Br. 3.

CLAIMED SUBJECT MATTER

The claims are directed to "identifying group and individual-level risk factors via risk-driven patient stratification." Spec. ¶ 2. Claim 1, reproduced below, is illustrative of the claimed subject matter:
1. A system for individual risk factor identification, comprising:

a selection module configured to identify common risk factors for one or more risk targets from population data;

a clustering module configured to stratify individuals into clusters based upon the common risk factors, wherein a distance between each cluster represents a similarity based upon the common risk factors, and wherein a closest pair of clusters is iteratively merged into a single cluster until a threshold is reached;

a ranking module configured to determine, using a processor, a discriminability of each of the common risk factors for a target cluster using individual data of the target cluster to provide re-ranked common risk factors as individual risk factors for the target cluster, such that the discriminability is a measure of how a risk factor discriminates its cluster from other clusters;

a memory device for non-transitorily storing computer program instructions for the selection module, the clustering module, and the ranking module; and

at least one hardware-based care management machine, including a processor for executing the computer program instructions, the processor being configured to perform clinical decision support at a point-of-care by generating a customized healthcare plan for an individual patient, and to customize and perform, using a personalized user interface and dashboard display operatively coupled to the at least one hardware based care-management machine for personalized treatment of the individual patient based on the customized healthcare plan, the personalized user interface and dashboard display comprising a customized graphical user interface (GUI) configured for customizing healthcare plans, tailored to the individual patient based on the individual risk factors, the stratifying individuals into clusters, and the re-ranked common risk factors, and being configured to provide real-time clinical decision support at a point-of-care for the individual patient.

REJECTION

The Examiner rejects claims 1-20 under 35 U.S.C. § 101 as directed to a judicial exception without something more.

Claims Rejected   35 U.S.C. §   Reference(s)/Basis
1-20              101           Eligibility

OPINION

An invention is patent-eligible if it claims a "new and useful process, machine, manufacture, or composition of matter." 35 U.S.C. § 101. However, the Supreme Court has long interpreted 35 U.S.C. § 101 to include implicit exceptions: "[l]aws of nature, natural phenomena, and abstract ideas" are not patentable. E.g., Alice Corp. v. CLS Bank Int'l, 573 U.S. 208, 216 (2014).

In determining whether a claim falls within an excluded category, we are guided by the Supreme Court's two-step framework, described in Mayo and Alice. Id. at 217-18 (citing Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 75-77 (2012)). In accordance with that framework, we first determine what concept the claim is "directed to." See Alice, 573 U.S. at 219 ("On their face, the claims before us are drawn to the concept of intermediated settlement, i.e., the use of a third party to mitigate settlement risk."); see also Bilski v. Kappos, 561 U.S. 593, 611 (2010) ("Claims 1 and 4 in petitioners' application explain the basic concept of hedging, or protecting against risk.").
Concepts determined to be abstract ideas, and thus patent ineligible, include certain methods of organizing human activity, such as fundamental economic practices (Alice, 573 U.S. at 219-20; Bilski, 561 U.S. at 611); mathematical formulas (Parker v. Flook, 437 U.S. 584, 594-95 (1978)); and mental processes (Gottschalk v. Benson, 409 U.S. 63, 69 (1972)). Concepts determined to be patent eligible include physical and chemical processes, such as "molding rubber products" (Diamond v. Diehr, 450 U.S. 175, 191 (1981)); "tanning, dyeing, making water-proof cloth, vulcanizing India rubber, smelting ores" (id. at 182 n.7 (quoting Corning v. Burden, 56 U.S. 252, 267-68 (1854))); and manufacturing flour (Benson, 409 U.S. at 69 (citing Cochrane v. Deener, 94 U.S. 780, 785 (1876))).

In Diehr, the claim at issue recited a mathematical formula, but the Supreme Court held that "[a] claim drawn to subject matter otherwise statutory does not become nonstatutory simply because it uses a mathematical formula." Diehr, 450 U.S. at 187; see also id. at 191 ("We view respondents' claims as nothing more than a process for molding rubber products and not as an attempt to patent a mathematical formula."). Having said that, the Supreme Court also indicated that a claim "seeking patent protection for that formula in the abstract . . . is not accorded the protection of our patent laws, . . . and this principle cannot be circumvented by attempting to limit the use of the formula to a particular technological environment." Id. (citing Benson and Flook); see, e.g., id. at 187 ("It is now commonplace that an application of a law of nature or mathematical formula to a known structure or process may well be deserving of patent protection.").

If the claim is "directed to" an abstract idea, we turn to the second step of the Alice and Mayo framework, where "we must examine the elements of the claim to determine whether it contains an 'inventive concept' sufficient to 'transform' the claimed abstract idea into a patent-eligible application." Alice, 573 U.S. at 221 (quotation marks omitted). "A claim that recites an abstract idea must include 'additional features' to ensure 'that the [claim] is more than a drafting effort designed to monopolize the [abstract idea].'" Id. (quoting Mayo, 566 U.S. at 77). "[M]erely requir[ing] generic computer implementation[] fail[s] to transform that abstract idea into a patent-eligible invention." Id.

The PTO published revised guidance on the application of § 101. 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50 (Jan. 7, 2019) ("Guidance"). Under the Guidance, we first look to whether the claim recites: (1) any judicial exceptions, including certain groupings of abstract ideas (i.e., mathematical concepts, certain methods of organizing human activity such as a fundamental economic practice, or mental processes); and (2) additional elements that integrate the judicial exception into a practical application (see Manual of Patent Examining Procedure ("MPEP") § 2106.05(a)-(c), (e)-(h)). Only if a claim (1) recites a judicial exception and (2) does not integrate that exception into a practical application, do we then look to whether the claim: (3) adds a specific limitation beyond the judicial exception that is not "well-understood, routine, conventional" in the field (see MPEP § 2106.05(d)); or (4) simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception. See Guidance.
The Appellant generally argues claims 1-20 as a group. Appeal Br. 23-24, 34. We select claim 1 as representative. See 37 C.F.R. § 41.37(c)(1)(iv).

Prong One

In the Answer, which was mailed after publication of the Guidance, the Examiner finds that claim 1, which recites "identifying risk factors, stratifying individuals based on risk factors and providing a treatment plan based on the stratification," is directed to "certain methods of organizing human activity." Answer 4. In the Final Office Action, which pre-dates the Guidance, the Examiner finds the claims are directed to "organizing information," and "collecting information . . . analyzing it . . . and displaying certain results of the analysis," citing Electric Power Group, LLC v. Alstom, S.A., 830 F.3d 1350 (Fed. Cir. 2016). Final Act. 4.

Following the Guidance, we determine that claim 1 recites three "modules," including a "selection module," a "clustering module," and a "ranking module." The Specification describes that a module is computer logic. Spec. ¶ 19 (describing that the system may be hardware, software, or "an embodiment combining software and hardware aspects that may all generally be referred to herein as a 'circuit,' 'module' or 'system.'"). Each module is configured to perform a particular function.

The "selection module" receives risk targets and identifies "common risk factors." Id. ¶ 34. The Specification provides exemplary methods for the selection, but does not limit how the selection is accomplished. Id. ("Other selection methods and configurations are also contemplated."). This selection could be accomplished by a person mentally examining each received risk target, studying risk factors for each risk target, and selecting some of them. Id. ¶ 35 (after "ranking identified common risk factors . . . selection module 218 selects the top n risk factors, where n is any positive integer.").

The "clustering module" assigns individuals to clusters, though "[e]ach individual may be assigned as its own cluster." Id. ¶ 36. Then, using unspecified metrics ("[o]ther metrics are also contemplated"), the "closest pair of clusters" is merged. Id. This step could be performed mentally by a person examining individuals and grouping individuals with common risk factors together after mentally examining the data.

The "ranking module" is described as operating to:

re-rank the common risk factors (from selection module 218) based upon how important each risk factor is to its particular risk cluster. The importance can be quantified as how much a risk factor discriminates its local cluster from other clusters. The discriminability of each risk factor for its cluster may be measured by comparing its cluster (i.e., the target cluster) with other clusters or individuals based on a number of different comparison configurations.

Id. ¶ 38. No particular method for ranking is limited by the Specification, because "[o]ther comparison configurations are also contemplated." Id. This step could also be performed mentally by a human sorting the risk factors based on how often they appear in the clusters. As part of this, "risk factor discriminability" is measured and taken into account. Id. ¶ 39. The Specification describes that "discriminability is a measure of how a risk factor discriminates its cluster from other clusters" (id. ¶ 5), but the Specification does not limit the method of measurement used. Id. ¶ 40 ("Other methods are also contemplated."). In ranking risk factors, a person could mentally take "discriminability" into account and sort factors based on a numeric measurement.

Aside from the generally computer-related nature of a "module," the three modules of claim 1 thus describe steps capable of being performed through abstract mental thought concerning data. Guidance at 52; see also MPEP § 2106.04(a)(2)(III)(A)-(B). If a claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components, then it is still in the mental processes category unless the claim cannot practically be performed in the mind. See Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307, 1318 (Fed. Cir. 2016).
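By way of illustration only, and not as part of the application record, the stratification and re-ranking operations described above reduce to a few lines of generic code. The particular choices made in this sketch (a Euclidean distance between cluster centroids, merging until the smallest inter-cluster distance exceeds a threshold, and a mean-difference discriminability score) are assumptions supplied for the example; as noted, the Specification does not prescribe any of them.

```python
# Illustrative sketch only; not from the application or the record.
# Assumed choices: Euclidean distance between cluster centroids and a
# mean-difference "discriminability" score, neither required by the Specification.
import numpy as np

def stratify(individuals, threshold):
    """Start with one cluster per individual; iteratively merge the closest
    pair of clusters until the smallest inter-cluster distance exceeds the
    threshold (agglomerative clustering over common risk factor values)."""
    clusters = [[i] for i in range(len(individuals))]
    while len(clusters) > 1:
        centroids = [individuals[c].mean(axis=0) for c in clusters]
        best = None  # (distance, index_a, index_b) of the closest pair so far
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.linalg.norm(centroids[a] - centroids[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        if best[0] > threshold:  # no remaining pair is close enough to merge
            break
        _, a, b = best
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

def rerank_risk_factors(individuals, clusters, target):
    """Re-rank the common risk factors for one target cluster by a crude
    discriminability score: how far each factor's mean inside the target
    cluster departs from its mean in the remaining clusters."""
    target_rows = individuals[clusters[target]]
    other_idx = [i for k, c in enumerate(clusters) if k != target for i in c]
    other_rows = individuals[other_idx]
    scores = np.abs(target_rows.mean(axis=0) - other_rows.mean(axis=0))
    return list(np.argsort(-scores))  # factor indices, most discriminative first

# Toy usage: rows are individuals, columns are common risk factor values.
data = np.array([[1.0, 0.2], [0.9, 0.3], [0.1, 0.9], [0.2, 1.0]])
groups = stratify(data, threshold=0.5)                 # -> [[0, 1], [2, 3]]
ranked = rerank_risk_factors(data, groups, target=0)   # -> [0, 1]
```

Nothing in this sketch goes beyond the averaging, distance comparison, and sorting operations discussed above.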
Prong Two

Next, we determine if the claim integrates the abstract idea into a "practical application" by examining "additional elements" beyond the abstract idea. Guidance at 54-55. In addition to the language of claim 1 corresponding to the abstract mental steps, the claim recites that the "ranking" module is performed "using a processor." Claim 1 also recites "a memory device for non-transitorily storing computer program instructions for the selection module, the clustering module, and the ranking module." Finally, claim 1 recites an extensive "hardware-based care management machine" limitation.

The "hardware-based care management machine" is recited as including a "processor," being "configured to perform clinical decision support at a point-of-care," and configured "to customize and perform." It is not clear, from our initial reading of the claim, what the processor is customizing and performing. The claim recites that the customizing and performing uses "a personalized user interface and dashboard display operatively coupled to the at least one hardware based care-management machine for personalized treatment of the individual patient." This seems to mean that the processor uses a display and user interface to display information used in the personalized treatment of a patient, as we explain next.

The Specification is completely silent on the nature and operation of the "hardware-based care management machine." The Appellant directs us (Appeal Br. 5-6) to Figure 2, elements 202, 210, 212, and 214, which point to the "processor 210," "display 212," and "user interface 214" of a "risk factor identification system 202." Spec., Fig. 2. The Specification describes that the "system may include a workstation or console 202 from which procedures (e.g., medical examination) may be performed." Spec. ¶ 31 (cited at Appeal Br. 6). This description does not define the nature of the "machine." The Specification does describe that "display 212 may also permit a user (e.g., physician, care coordinator, care giver, etc.) to interact with the system 202 and its components and functions," and that "user interface 214, . . . may include a keyboard, mouse, joystick, or any other peripheral or control to permit user interaction with the system 202." Id. ¶ 32 (cited at Appeal Br. 6). The Specification further describes:
The risk factor identification system 202 may provide output 228. Output 228 may include individual risk factors 230 for a cluster of individuals, which may represent a target individual or patient. In one application, individual risk factors 230 may be applied in a personal care management process for, e.g., clinical decision support at the point-of-care. In another application, individual risk factors 230 may be displayed using display 212 and/or user interface 214 to, e.g., customize healthcare plans or tailor patient education.

Id. ¶ 43 (cited at Appeal Br. 6). The last paragraph cited by the Appellant in support of the "machine" explains that "individual risk factors are outputted for the target cluster." Id. ¶ 52 (cited at Appeal Br. 6). The Specification also describes that "risk factors are outputted as individual risk factors," and "the individual risk factors may be utilized to, e.g., customize a personalized care management process or may be displayed for clinical decision support at the point-of-care or for patient education." Id. ¶ 18.

Based on these descriptions, especially the use of general-purpose computer components such as a processor, display, and user interface, and because the Specification generally describes the "output" of information, which "may be utilized" to customize personal care, or "may be displayed for clinical decision support," we construe the entire "hardware-based care management machine" limitation as a general-purpose computer that outputs the results of the mental analysis steps recited in the three "modules," and permits a user to input and control other applications.

To the extent that the machine limitation essentially involves output, this is not given weight in a § 101 analysis. See Bilski v. Kappos, 561 U.S. 593, 610-11 (2010) ("Flook stands for the proposition that the prohibition against patenting abstract ideas 'cannot be circumvented by attempting to limit the use of the formula to a particular technological environment' or adding 'insignificant postsolution activity.'") (quoting Diehr, 450 U.S. at 191-92); see also Guidance at 55, n.31. This is true also for the use of the display and user interface to input and edit data "for customizing healthcare plans." Guidance at 55, n.31. This is supported by the lack of description, or algorithm, in the Specification for either "generating a customized healthcare plan for an individual patient," or "provid[ing] real-time clinical decision support at a point-of-care for the individual patient."

In examining these "additional elements" (Guidance at 54-55), we determine that the claimed system does not improve the underlying "memory" or "processor" recited as performing the limitations of claim 1, because any computer can be used to execute the claimed method. See Spec. ¶ 23 ("instructions may be provided to a processor of a general purpose computer . . . to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts"). In addition, the system is for "identifying group-level and individual-level risk factors" (Spec. ¶ 2), and, as such, the claimed method does not improve another technology. Guidance at 55; see also MPEP § 2106.05(a). Because a particular computer is not required, the claim also does not define or rely on a "particular machine." Id.; see also MPEP § 2106.05(b). Further, the method does not transform matter. Id.; see also MPEP § 2106.05(c). Instead, the claim analyzes data and outputs results. As such, the method has no other meaningful limitations (MPEP § 2106.05(e)), and thus merely recites instructions to execute the abstract idea on a computer (MPEP § 2106.05(f)).
Although the claim is drawn to a "system," and involves components such as a "processor," "memory," "display," and "user interface," these components do not significantly alter our analysis. As the Federal Circuit has made clear, "the basic character of a process claim drawn to an abstract idea is not changed by claiming only its performance by computers, or by claiming the process embodied in program instructions on a computer readable medium." See CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1375-76 (Fed. Cir. 2011) (citing In re Abele, 684 F.2d 902 (CCPA 1982)). As such, we determine that claim 1 does not integrate the judicial exception into a "practical application."

Step 2B

Next, we "evaluate the additional elements individually and in combination under Step 2B to determine whether they provide an inventive concept (i.e., whether the additional elements amount to significantly more than the exception itself)." Guidance at 56. We consider whether an additional element or combination of elements adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, or simply appends well-understood, routine, conventional activities previously known to the industry. Id.

As developed above, in addition to the language of claim 1 corresponding to the abstract mental steps, the claim recites that the "ranking" module is performed "using a processor." Claim 1 also recites "a memory device for non-transitorily storing computer program instructions for the selection module, the clustering module, and the ranking module," and a "hardware-based care management machine" limitation.

The ranking using the processor, which could involve sorting based on a number (see Spec. ¶¶ 38-40), involves basic computer tasks. The operations of storing, analyzing, receiving, and writing data are primitive computer operations found in any computer system. See In re Katz Interactive Call Processing Patent Litig., 639 F.3d 1303, 1316 (Fed. Cir. 2011) ("Absent a possible narrower construction of the terms 'processing,' 'receiving,' and 'storing,' discussed below, those functions can be achieved by any general purpose computer without special programming."). This also means the use of a memory to store computer program instructions is a conventional use of computers.

Finally, we consider the "hardware-based care management machine." Above, we construed this as a general-purpose computer that has a display and user interface to output the results of the three claimed modules' data processing, and determined that the data output and input was insignificant extra-solution activity. Using a display and user interface to provide data input and output is a conventional computer operation. See In re Katz, 639 F.3d at 1316. We therefore conclude that the additional elements of claim 1, beyond the abstract idea of steps able to be performed mentally, do not recite "significantly more."

Appellant's Arguments

We are not persuaded by the Appellant's argument that "the functions recited in claims 1, 12, and 20 perform[] functions which are not generic computer functions, but instead include unconventional steps previously unknown to the industry to which the inventions of these claims pertain (noting the lack of any obviousness-type rejections in the Office Action dated August 27, 2018)." Appeal Br. 20.
Even unconventional abstract ideas are still unpatentable. See SAP America, Inc. v. Investpic, LLC, 890 F.3d 1016, 1018 (Fed. Cir. 2018). "What is needed is an inventive concept in the non-abstract application realm." SAP Am., Inc. v. InvestPic, LLC, 898 F.3d 1161, 1168 (Fed. Cir. 2018). In addition, analyzing data and outputting results are conventional computer operations. See In re Katz, 639 F.3d at 1316.

We are unpersuaded by the Appellant's argument that the claims "would 'not preempt approaches that use rules of a different structure or different techniques.'" Appeal Br. 20. "While preemption may signal patent ineligible subject matter, the absence of complete preemption does not demonstrate patent eligibility." Ariosa Diagnostics, Inc. v. Sequenom, Inc., 788 F.3d 1371, 1379 (Fed. Cir. 2015).

The Appellant argues the independent claims are "directed to a customized graphical user interface (GUI) configured for customizing healthcare plans, . . . and for providing real-time clinical decision support," which "requires fast and accurate functionality," and is thus "not simply a generic display or interface." Appeal Br. 20-23. We disagree. Claim 1 recites that it includes a GUI, but neither the claims nor the Specification provide any details about customizing the user interface, how healthcare plans are customized, or how real-time clinical decision support is implemented. The claim preamble itself, a "system for individual risk factor identification," gives no indication the GUI is critical to the claim. We discern no mention in the originally-filed claims of any user interface, customizing of healthcare plans, or decision support functions, which corresponds to a total lack of detail on these currently-claimed functions or components. The drawings only show "block/flow diagrams" about the data analysis, all ending with "output," and do not show any features related to a GUI, healthcare plan, or decision support. See Spec. ¶¶ 12-14, Figs. 1-3. The Specification barely mentions a user interface, decision support, or healthcare plan. See Spec. ¶ 18 ("the individual risk factors may be utilized to, e.g., customize a personalized care management process or may be displayed for clinical decision support at the point-of-care or for patient education"), ¶ 30 ("risk factors may be used, e.g., in a personalized care management process or may be displayed in a user interface or dashboard"), ¶ 32 ("user interface 214, which may include a keyboard, mouse, joystick, or any other peripheral or control to permit user interaction with the system 202"), ¶ 43 ("individual risk factors 230 may be displayed using display 212 and/or user interface 214 to, e.g., customize healthcare plans or tailor patient education"). The terms "healthcare plan" and "decision support" do not seem to appear anywhere else in the Specification.

The Appellant argues "the user dashboards recited in the present claims also function as user interfaces to interact with, and to control the hardware care management machines to actually perform treatment on a living patient based on the rules recited in the amended claims." Appeal Br. 21. Once again, we disagree. The "hardware-based care management machine" of claim 1 uses a "personalized user interface and dashboard display operatively coupled to the at least one hardware based care-management machine." The machine is thus coupled to its user interface and display. There is no language that the machine is coupled to any other device, including one that performs treatment.
Further, we do not discern any description in the Specification otherwise.

Citing Trading Technologies International, Inc. v. CQG, Inc., 675 F. App'x 1001 (Fed. Cir. 2017) (Appeal Br. 21), the Appellant argues the claims are not abstract because the claims "include a specifically structured graphical user interface paired with functionality related to the structure of the graphical user interface (e.g., control hardware care management machines)." Id. at 23. We disagree with the Appellant, because the distinguishing feature of the claims in Trading Technologies was an advance in efficiency provided by an improved graphical user interface as compared to other computer processes. Trading Technologies, 675 F. App'x at 1004. In contrast to Trading Technologies, no such distinguishing features are recited in claim 1. In the claimed "personalized user interface and dashboard display comprising a customized graphical user interface" (emphasis added), the terms "personalized" and "customized" essentially have no meaning, and, we believe, can be ignored. This is because the Specification mentions a "personalized care management process" (Spec. ¶¶ 18, 30, 52), without describing it, but does not mention, or provide any description of, what is involved in personalizing a user interface. Similarly, the Specification mentions that "risk factors may be utilized to, e.g., customize a personalized care management process" (id. ¶ 18), and that risk factors may be displayed "to, e.g., customize healthcare plans" (id. ¶ 43), but does not mention, or provide any description of, customizing a graphical user interface. The Appellant's argument is thus unconvincing, because there is nothing revealed in the claims or Specification that suggests that the user interface of the claimed system is either personalized or customized in a manner comparable to the user interface of Trading Technologies.

The Appellant again misstates the claim language when arguing the claims "amount to significantly more," stating the "claims relate directly to real-world physical items (e.g., hardware-based care management machine performing treatments on patients and a personalized GUI for customizing healthcare plans and to provide real-time clinical decision support at a point of care)." Appeal Br. 25; see also id. at 26 ("the GUI of the present claims is customized and improves timeliness, accuracy, and/or effectiveness of real-time diagnosis and treatment"). We do not construe the claims to perform any treatments on patients, nor to recite anything more than conventional user interfaces or data output functions. The user interface is "coupled" to the "hardware-based care management machine," but that "machine" is merely a vehicle to provide output about risk factors for display on the user interface. There is nothing in the claim or Specification that supports that the claimed system performs treatments on patients. As noted above, the scope of the original invention ended at displaying the risk factors that resulted from the claimed analysis. The remaining features being argued here were contrived later.
Next, citing a 2018 USPTO memo, the Appellant asserts that the law requires that the Examiner "must prove that the present claims are not routine and well understood in the art, and that claimed advantages of the present invention are not in fact, advantages over the art," and argues "the Examiner has not provided the above support to conclude that any of the elements of the present claims constitute functions that are well-understood, routine, and conventional activities previously known to the pertinent industry." Appeal Br. 27-28; see also id. at 29-30 (citing Berkheimer v. HP Inc., 881 F.3d 1360 (Fed. Cir. 2018)). This argument follows the Appellant's quotations of rules that the analysis in question relates to "additional" elements recited in the claims, which are elements beyond the abstract idea. See Guidance at 56.

We find this argument unpersuasive, because the analysis we provided above shows there are no additional elements that are not well-understood, routine, and conventional. That the abstract idea itself may not be proven to be well-understood, routine, and conventional is beside the point. See Mayo, 566 U.S. at 90 (holding that a novel and nonobvious claim directed to a purely abstract idea is, nonetheless, patent-ineligible); see also Synopsys Inc. v. Mentor Graphics Corp., 839 F.3d 1138, 1151 (Fed. Cir. 2016) ("[A] claim for a new abstract idea is still an abstract idea.") (emphasis omitted).

The Appellant next argues that, similar to the claims in Core Wireless Licensing S.A.R.L. v. LG Electronics, Inc., LG Electronics Mobilecomm U.S.A., Inc., 880 F.3d 1356 (Fed. Cir. 2018), the "customized, interactive interface" has several advantages in time spent by a user. Appeal Br. 31-32; see also Reply Br. 4-5 ("customized, interactive interface also enables users to make selections and interact with care management systems to plan and/or execute customized health-care plans for individual patients, tailored to their particular needs, which very clearly is advantageous over requiring a user and/or caregiver to search through multiple databases, websites, patent records").

We do not agree that the claims here are similar to those in Core Wireless. In Core Wireless, the court held that claims which recited an interface were patent eligible because the claims recited specific limitations of the interface, such as an application summary that can be reached through a menu, the data being in a list and being selectable to launch an application, and additional limitations directed to the actual user interface displayed and how it functions. Core Wireless, 880 F.3d at 1363. The court found that the claims were directed to an improved user interface and not the abstract concept of an index, as the claim "limitations disclose a specific manner of displaying a limited set of information to the user, rather than using conventional user interface methods to display a generic index on a computer." Id. There is no improvement to technology recited in claim 1.

Further, an improvement in speed alone does not render a claim patent eligible. While the claimed system and method certainly purport to accelerate the process of seeing analyzed risk factor data, the speed increase comes from the capabilities of a general-purpose computer, rather than from the claimed system itself.
See Bancorp Servs., L.L.C. v. Sun Life Assurance Co. of Can. (U.S.), 687 F.3d 1266, 1278 (Fed. Cir. 2012) ("[T]he fact that the required calculations could be performed more efficiently via a computer does not materially alter the patent eligibility of the claimed subject matter.").

Of interest, the Specification describes in background that the "ability to identify risk factors related to an adverse health condition (e.g., congestive heart failure) is very important for improving healthcare quality and reducing cost," and that "methods based on general population data will only yield common risk factors and do not address individual differences of patients." Spec. ¶¶ 3-4. Missing, though, is any description of a problem such as "requiring a user and/or caregiver to search through multiple databases," as argued.

Finally, we are unpersuaded by the Appellant's assertion that, under the Guidance, the claims are "very clearly directed to a practical application," quoting only the last limitation of claim 1. Appeal Br. 32-33. We have already addressed whether the claim integrates the abstract idea into a practical application, above, in the Prong Two analysis.

For the preceding reasons, we do not find error in the Examiner's analysis. Therefore, we sustain the 35 U.S.C. § 101 rejection of claims 1-20.

CONCLUSION

The Examiner's rejection is AFFIRMED.

DECISION SUMMARY

Claims Rejected   35 U.S.C. §   Reference(s)/Basis   Affirmed   Reversed
1-20              101           Eligibility          1-20

TIME PERIOD FOR RESPONSE

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED