Ex parte Decre et al., Appeal 2016-008644, Application 11/997,174 (P.T.A.B. Apr. 1, 2019)

UNITED STATES PATENT AND TRADEMARK OFFICE

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 11/997,174    FILING DATE: 01/29/2008
FIRST NAMED INVENTOR: Michel Marcel Jose Decre
ATTORNEY DOCKET NO.: 2005P02185WOUS    CONFIRMATION NO.: 3761
EXAMINER: OJIAKU, CHIKAODINAKA    ART UNIT: 3696
NOTIFICATION DATE: 04/03/2019    DELIVERY MODE: ELECTRONIC

138325 7590 04/03/2019
Signify Holding B.V.
465 Columbus Avenue, Suite 330
Valhalla, NY 10595

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): kim.larocca@signify.com, jo.cangelosi@signify.com, Gigi.Miller@signify.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte MICHEL MARCEL JOSE DECRE, EVERT JAN VAN LOENEN, and BARTEL MARINUS VAN DE SLUIS

Appeal 2016-008644
Application 11/997,174
Technology Center 3600

Before JOHN A. JEFFERY, DENISE M. POTHIER, and JUSTIN BUSCH, Administrative Patent Judges.

Opinion for the Board filed by Administrative Patent Judge POTHIER.
Opinion Concurring-in-Part and Dissenting-in-Part filed by Administrative Patent Judge JEFFERY.
Opinion Dissenting-in-Part filed by Administrative Patent Judge BUSCH.

POTHIER, Administrative Patent Judge.

DECISION ON APPEAL

Pursuant to 35 U.S.C. § 134(a), Appellants1 appeal from the Examiner's decision to reject claims 1-3, 5, 7-13, and 17, which constitute all the claims pending in this application. Claims 4, 6, and 14-16 were cancelled. We have jurisdiction under 35 U.S.C. § 6(b). We affirm-in-part.

1 Appellants identify the real party in interest as Koninklijke Philips Electronics N.V. App. Br. 2.

STATEMENT OF THE CASE

Appellants' invention monitors a person having an interest in an object. See generally Abstract; Spec. 1:2-5. According to the Specification, the person is a potential customer of a shop and is located either inside the shop or near the shop window. Spec. 3:15-21. In one embodiment, an intersection of the direction in which the person is looking with a location of a specific product is determined. Spec. 4:22-24. In another embodiment, touches of the person with the shop window are projected onto coordinates of products to determine the closest product to the touches. Spec. 6:3-10. Once the particular object is identified, corresponding information about the particular object is provided to the person. Spec. 3:3-5.
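For orientation, the first of these techniques (intersecting the direction of a person's look with product locations) can be illustrated with a brief sketch. The following Python fragment is purely illustrative and forms no part of the application or the record; the product names, floor coordinates, eye position, look direction, and tolerance are hypothetical values chosen only to show how a look direction might be intersected with stored object locations.

```python
# Illustrative sketch only (not from the application): identify which product a
# person's look direction points at, in the spirit of the gaze-intersection
# embodiment described in the Specification.
import math

# Hypothetical shop-floor coordinates (metres) for three displayed products.
PRODUCTS = {"product_131": (1.0, 4.0), "product_132": (2.5, 4.0), "product_133": (4.0, 4.0)}

def identify_looked_at_product(eye_pos, look_dir, max_offset=0.3):
    """Return the product whose location the look direction intersects.

    eye_pos    -- (x, y) position of the person's eyes
    look_dir   -- (dx, dy) direction of the person's look
    max_offset -- how far (metres) a product may lie from the look ray
    """
    dx, dy = look_dir
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm        # unit vector along the look direction
    best, best_dist = None, float("inf")
    for name, (px, py) in PRODUCTS.items():
        vx, vy = px - eye_pos[0], py - eye_pos[1]
        along = vx * dx + vy * dy        # distance along the look ray
        if along <= 0:                   # product is behind the person
            continue
        offset = abs(vx * dy - vy * dx)  # perpendicular distance from the ray
        if offset <= max_offset and along < best_dist:
            best, best_dist = name, along
    return best

# Example: a person standing at (2.4, 0.0) looking roughly straight ahead.
print(identify_looked_at_product((2.4, 0.0), (0.05, 1.0)))  # -> "product_132"
```

The second embodiment, in which window touches are projected onto product coordinates, is sketched later in the discussion of claim 17.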
Claims 1 and 17, reproduced below, are illustrative:

1. A method of monitoring a person having an interest in a physical object, the method comprising steps of:
monitoring the person by at least one video camera,
obtaining interaction data from said at least one video camera, the interaction data relating to an interaction between the person and at least one of a plurality of three-dimensional, physical objects, wherein the person does not have physical contact with the physical objects, and wherein each of said at least one cameras is physically detached from each of the plurality of three-dimensional physical objects,
analyzing the interaction data by a data processor to identify a particular one of the physical objects to which the person has the interest, wherein said analyzing step comprises determining a direction (D) of a person's look and an intersection of the direction with the particular object,
obtaining additional data related to the identified particular physical object for immediately informing the person about the identified particular physical object, wherein said obtaining additional data step is performed without any further action by the person, and
providing the additional data to the person.

17. A method of monitoring a person having an interest to an object, the method comprising steps of:
by a data processor, obtaining interaction data related to a remote interaction between the person and at least one of a plurality of objects, and analyzing the interaction data to identify a particular one of the objects to which the person has the interest, and
by the data processor, obtaining additional data related to the particular object to provide additional data to the person,
wherein the interaction data comprises touch data indicative of one or more touches by the person to a window located in front of the plurality of objects, and
wherein identifying the particular object comprises detecting a position of touches on the window based on the touch data and then projecting the touch position onto coordinates of the objects to identify the particular object.

THE REJECTION

The Examiner rejected claims 1-3, 5, 7-13, and 17 under 35 U.S.C. § 101 as directed to ineligible subject matter. Final Act. 5-6.2

2 Throughout this opinion, we refer to (1) the Final Rejection mailed July 1, 2015 ("Final Act."); (2) the Appeal Brief filed Jan. 4, 2016 ("App. Br."); (3) the Examiner's Answer mailed July 18, 2016 ("Ans."); and (4) the Reply Brief filed Sept. 19, 2016 ("Reply Br.").

CONTENTIONS

The Examiner determines that the claims constitute a method of monitoring or tracking a person having an interest in an object and, therefore, concludes the claims are directed to an abstract idea. Ans. 2-4. The Examiner further concludes the claims do not include additional elements that amount to significantly more than the abstract idea. Id. at 4-12. Accordingly, the Examiner concludes the claims are ineligible under § 101. Id. at 2-12.

Appellants argue that the claimed invention is not directed to an abstract idea. App. Br. 5-6; Reply Br. 2-3. Appellants further argue the claims' elements amount to significantly more than an abstract idea. App. Br. 6-10; Reply Br. 3-7. According to Appellants, a physical transformation occurs when a person's direction of interest is determined by analyzing a video camera's image data, as in claims 1 and 12, or touch sensors, as in claim 17. App. Br. 5-6; Reply Br. 2, 5-6.

ISSUE

Has the Examiner erred in determining claims 1-3, 5, 7-13, and 17 are patent ineligible under 35 U.S.C. § 101? This issue turns on whether claims 1-3, 5, 7-13, and 17 are directed to an abstract idea. If so, we then determine whether the claims' elements, considered individually and as an ordered combination, contain an inventive concept sufficient to transform the nature of the claimed abstract idea into a patent-eligible application.
PRINCIPLES OF LAW

An invention is patent eligible if it claims a "new and useful process, machine, manufacture, or composition of matter." 35 U.S.C. § 101. However, the Supreme Court has long interpreted 35 U.S.C. § 101 to include implicit exceptions: "[l]aws of nature, natural phenomena, and abstract ideas" are not patentable. Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 70 (2012) (brackets in original) (citing Diamond v. Diehr, 450 U.S. 175, 185 (1981)).

In determining whether a claim falls within an excluded category, we are guided by the Supreme Court's two-step framework, described in Mayo and Alice. Alice Corp. v. CLS Bank Int'l, 573 U.S. 208, 217-18 (2014) (citing Mayo, 566 U.S. at 75-77). In accordance with that framework, we first determine what concept the claim is "directed to." See Alice, 573 U.S. at 219 ("On their face, the claims before us are drawn to the concept of intermediated settlement, i.e., the use of a third party to mitigate settlement risk."); see also Bilski v. Kappos, 561 U.S. 593, 611 (2010) ("Claims 1 and 4 in petitioners' application explain the basic concept of hedging, or protecting against risk.").

Concepts determined to be abstract ideas, and thus patent ineligible, include certain methods of organizing human activity, such as fundamental economic practices (Alice, 573 U.S. at 219-20; Bilski, 561 U.S. at 611); mathematical formulas (Parker v. Flook, 437 U.S. 584, 594-95 (1978)); and mental processes (Gottschalk v. Benson, 409 U.S. 63, 69 (1972)). Concepts determined to be patent eligible include physical and chemical processes, such as "molding of rubber products" (Diamond v. Diehr, 450 U.S. 175, 193 (1981)); "tanning, dyeing, making water-proof cloth, vulcanizing India rubber, smelting ores" (id. at 184 n.7 (quoting Corning v. Burden, 56 U.S. (15 How.) 252, 267-68 (1854))); and manufacturing flour (Benson, 409 U.S. at 69 (citing Cochrane v. Deener, 94 U.S. 780, 785 (1876))).

In Diehr, the claim at issue recited a mathematical formula, but the Supreme Court held that "[a] claim drawn to subject matter otherwise statutory does not become nonstatutory simply because it uses a mathematical formula." Diehr, 450 U.S. at 187; see also id. at 191 ("We view respondents' claims as nothing more than a process for molding rubber products and not as an attempt to patent a mathematical formula."). That said, the Supreme Court also indicated that a claim "seeking patent protection for that formula in the abstract ... is not accorded the protection of our patent laws, ... and this principle cannot be circumvented by attempting to limit the use of the formula to a particular technological environment." Id. (citing Benson and Flook); see, e.g., id. at 187 ("It is now commonplace that an application of a law of nature or mathematical formula to a known structure or process may well be deserving of patent protection.").

If the claim is "directed to" an abstract idea, we turn to the second step of the Alice and Mayo framework, where "we must examine the elements of the claim to determine whether it contains an 'inventive concept' sufficient to 'transform' the claimed abstract idea into a patent-eligible application." Alice, 573 U.S. at 221 (quotation marks omitted). "A claim that recites an abstract idea must include 'additional features' to ensure 'that the [claim] is more than a drafting effort designed to monopolize the [abstract idea].'" Id. (quoting Mayo, 566 U.S. at 77).
"[M]erely requir[ing] generic computer implementation[] fail[s] to transform that abstract idea into a patent-eligible invention." Id.

In January 2019, the USPTO published revised guidance on the application of § 101. 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50 (Jan. 7, 2019) ("Guidance"). Under that guidance, we first look to whether the claim recites: (1) any judicial exceptions, including certain groupings of abstract ideas (i.e., mathematical concepts, certain methods of organizing human activities such as a fundamental economic practice, or mental processes) (Guidance, 84 Fed. Reg. 50, 52-54) ("Revised Step 2A - Prong 1"); and (2) additional elements that integrate the judicial exception into a practical application (see MPEP §§ 2106.05(a)-(c), (e)-(h)) (Guidance, 84 Fed. Reg. 50, 53-55) ("Revised Step 2A - Prong 2"). Only if a claim (1) recites a judicial exception, and (2) does not integrate that exception into a practical application, do we then look to whether the claim: (3) adds a specific limitation beyond the judicial exception that is not well-understood, routine, and conventional in the field (see MPEP § 2106.05(d)); or (4) simply appends well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception. See Guidance, 84 Fed. Reg. 50, 56 ("Step 2B").

ANALYSIS

CLAIMS 1-3, 5, AND 7-13

Claims 1-3, 5, and 7-13 are argued as a group. App. Br. 5-10. We select claim 1 as representative. See 37 C.F.R. § 41.37(c)(1)(iv).

Alice Step One, Revised Step 2A - Prong 1

At the outset, we note that the claims recite a series of steps for "monitoring a person having an interest in a physical object" (App. Br. 11 (Claims App'x)) and, therefore, fall within the process category of § 101. Even so, we must still determine whether the claims are directed to a judicial exception, namely an abstract idea. See Alice, 573 U.S. at 217. To this end, we must determine whether the claims (1) recite a judicial exception, and (2) fail to integrate the exception into a practical application. See Guidance, 84 Fed. Reg. 50, 52-55. If both elements are satisfied, the claims are directed to a judicial exception under the first step of the Alice/Mayo test. See id.

In the rejection, the Examiner finds that claim 1 is directed to the abstract idea of "obtaining interaction data" and "monitoring a person." Final Act. 5. In the Answer, the Examiner describes the abstract idea in more detail as "monitoring or tracking a person having an interest in an object." Ans. 4. We agree with the Examiner for the following reasons.

To determine whether a claim recites an abstract idea, we (1) identify the claim's specific limitations that recite an abstract idea, and (2) determine whether the identified limitations fall within certain subject matter groupings, namely (a) mathematical concepts;3 (b) certain methods of organizing human activity;4 or (c) mental processes.5 Upon review, we agree with the Examiner that claim 1 is directed to an abstract idea, reciting at least one judicial exception.

3 Mathematical concepts include mathematical relationships, mathematical formulas or equations, and mathematical calculations. See Guidance, 84 Fed. Reg. 50, 52.
4 Certain methods of organizing human activity include fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); and managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). See id.
5 Mental processes are concepts performed in the human mind, including an observation, evaluation, judgment, or opinion. See id.
The recited judicial exception can be categorized as either certain methods of organizing human activity (e.g., a commercial interaction, such as marketing or sales activities or behaviors) or mental processes (e.g., observing and judging where a person is looking and providing more information by evaluating where the human is looking).

Independent claim 1 recites a method of monitoring a person having an interest in a physical object, where the method comprises five steps: (1) monitoring a person by at least one video camera ("step 1"); (2) obtaining interaction data from the at least one camera that relates to an interaction between physical objects and a monitored person, where the person does not have physical contact with the objects, and each camera is physically detached from each object ("step 2"); (3) analyzing the interaction data by a data processor to identify a particular object among the objects, where the analyzing comprises determining a direction of the person's look and an intersection of the direction with the particular object ("step 3"); (4) obtaining additional data related to the identified particular object for immediately informing the person about the identified particular object ("step 4"); and (5) providing the additional data to the person ("step 5").

This claimed invention provides information related to an object based on a person's observed interest in the object and, therefore, at least recites a certain method of organizing human activity, such as a commercial interaction employed in everyday commerce related to sales activities or behaviors. See Guidance, 84 Fed. Reg. 50, 52. For example, sales representatives often provide additional information for a particular product once they see a customer has visual interest in a particular product. To illustrate, a car salesperson sees a potential customer (e.g., step 1 or monitoring a person) looking at a particular car on a car lot (e.g., step 2 or obtaining interaction data related to an interaction between the person and a 3-D physical object without having physical contact with the object) and identifies the particular car that the potential customer is interested in based at least partly on where the customer looks (e.g., step 3 or analyzing the interaction data to identify a particular object in which the person has an interest by determining the direction of a person's look and an intersection of the direction with the particular object).
Given the customer's perceived interest in a particular car, the salesperson approaches the potential customer and provides the customer with additional information related to the car, such as the car's features, technical specifications, and price (e.g., steps 4 and 5, or obtaining additional data related to the identified particular object for immediately informing the person about the identified particular physical object and providing the additional data to the person).

Additionally, claim 1's steps 1 through 5 include mental processes performed in the human mind as illustrated by the salesperson example above, including performing the steps of observation (e.g., steps 1 and 2), evaluation (e.g., step 3), and judgment (e.g., steps 4 and 5). See Guidance, 84 Fed. Reg. 50, 52. Specifically, a person can be monitored by another person solely using the other person's senses, such as visually observing a person and the direction of the person's looks, and providing related information to the monitored person based on what the other person observes. We note that, unlike the first three steps of claim 1, the last two steps, namely steps 4 and 5, do not require any machine to perform those steps.

Accordingly, we determine claim 1 recites a judicial exception.

Alice Step One, Revised Step 2A - Prong 2

Although claim 1 recites an abstract idea, we must still determine whether the abstract idea is integrated into a practical application, namely whether the claim applies, relies on, or uses the abstract idea in a manner that imposes a meaningful limit on the abstract idea, such that the claim is more than a drafting effort designed to monopolize the abstract idea. See Guidance, 84 Fed. Reg. 50, 53. To this end, we (1) identify whether there are any additional recited elements beyond the judicial exception, and (2) evaluate those elements individually and collectively to determine whether they integrate the exception into a practical application. See id., 84 Fed. Reg. 50, 54-55.

The additionally recited elements beyond the above-identified judicial exception(s) in claim 1 are "at least one video camera" and "a data processor." App. Br. 11 (Claims App'x). When considering these elements individually or in combination, we determine they do not integrate a judicial exception into a practical application for the below-stated reasons.

First, the additional elements do not reflect an improvement in a computer's functioning or an improvement to other technology or technical field as set forth in MPEP § 2106.05(a) and Guidance, 84 Fed. Reg. at 55. Instead, the claimed video camera and data processor merely automate a manual process using generic computer elements as tools to perform the judicial exception(s), which does not constitute a patentable improvement in computer technology. See Credit Acceptance Corp. v. Westlake Servs., 859 F.3d 1044, 1055 (Fed. Cir. 2017); see also Alice, 573 U.S. at 221. The claimed camera is a tool to monitor and obtain the person's interactions with objects in steps 1 and 2, and the claimed data processor is a tool to analyze the interactions to determine what direction the person is looking and where this direction intersects with a particular object in step 3. See App. Br. 11 (Claims App'x). Thus, the recited sales activities or commercial interactions or instructions in steps 1 through 5 are being applied using general computer elements. See Alice, 573 U.S. at 221; see also MPEP § 2106.05(f).
Unlike the dissent for claim 1, we do not see an improvement in a computer's functioning or an improvement to other technology or a technical field. Granted, the Specification describes the instant method as an improvement over the manner in which a published Japanese patent application determines whether a person is looking at a display and what information to present to that person. Spec. 1:6-2:28. Specifically, the Japanese application already uses the general direction of the person's look to recognize that the person is looking in the direction of the display, and displays information based on the person's gender and age, whereas Appellants prefer to use different information to identify the relevant information to display. Id. But, as Appellants state, "the current invention provides an improved marketing system for potential shoppers" (Reply Br. 7) (emphasis added), not an improvement in how the video camera or data processor functions or an improvement to other technology or a technical field. In other words, Appellants do not allege an improvement to the underlying technology (i.e., video camera or data processor), but rather to the information used to identify appropriate items to market to the person. See Spec. 1:19-25 (describing the Japanese published application's disadvantage as using only age and sex to identify the relevant information). At best, claim 1 improves on how to market or sell an object more effectively by automating a business practice using computers as tools.

Also, we disagree that claim 1 is sufficiently similar to Thales Visionix Inc. v. United States, 850 F.3d 1343, 1345-49 (Fed. Cir. 2017), to conclude claim 1 is patent eligible. Contrary to any assertions by Appellants (see Reply Br. 4), claim 1 does not recite a sensor determining a shopper's interaction with a physical object. Granted, a "video camera," which is recited in claim 1, includes some sensor to capture images, but claim 1 recites no details related to this underlying sensor and further recites that the data processor, not the video camera, analyzes the interaction data to identify a particular object. App. Br. 11 (Claims App'x). Also, unlike here, the court in Thales concluded its claims were not directed to an abstract idea and did not recite a judicial exception as set forth in the recent Guidance. Even more, claim 1 improves on a sales or marketing strategy as noted above, which is not a technological improvement or an improvement in a computer's functioning.

Second, claim 1's additional elements do not implement the identified judicial exception(s) with, or use the judicial exception(s) in conjunction with, a particular machine as set forth in MPEP § 2106.05(b). For example, the claimed video camera and data processor are not recited as a particular type of video camera or data processor. App. Br. 11 (Claims App'x). Although claim 1 recites the camera "is physically detached from each of said at least plurality of three-dimensional physical objects" (id.), this limitation alone or in combination with the additional elements does not sufficiently contribute to claim 1 such that the additional elements are arranged in a particular way to create a particular machine. Id. Also, the additional features in claim 1 merely recite generic features of a video camera or data processor. Id.
The claimed video camera performs generic functions of monitoring and obtaining data related to the direction a monitored person is looking (i.e., steps 1 and 2), and the claimed processor performs a generic function of analyzing received data to identify related information (i.e., step 3). See Alice, 573 U.S. at 223, 225; see also Bilski, 561 U.S. at 604. Accordingly, we disagree with the dissent in this regard because claim 1 does not recite explicitly or effectively determining the "azimuth of a user's attention, and a physical floor space that has been mapped in such a manner that the azimuth can be used to identify a specific object of interest present on that floor space." Reply Br. 3.

Third, we disagree with Appellants and the dissent that the additional elements in claim 1 effect a physical transformation of a particular article into a different state or thing as set forth in MPEP § 2106.05(c). Specifically, Appellants contend "a physical transformation occurs between an action of a person and an electronic signal that is provided to a processor. [T]he processor then uses this information to determine a particular object that is present in the derived direction." See App. Br. 6; see also id. at 7; Reply Br. 2-3. Notably, claim 1 recites in step 3 "analyzing the interaction data ... to identify a particular one of the physical objects to which the person has the interest." But merely manipulating or reorganizing data, similar to claim 1's step 3, is not enough to satisfy the transformation test. See CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1375 (Fed. Cir. 2011); see also Ans. 6-7. No new or different function has been created using the additional elements in claim 1, such that an article has been transformed physically into a different state or thing. To the extent claim 1 recites any transformation, such a transformation is of an intangible concept to a commercial interaction as previously discussed. Also, any such transformation is extra-solution activity that does not impose meaningful limits on claim 1's method steps.

At best, the additional elements contribute only nominally or insignificantly to the claimed judicial exception(s) and generally link the judicial exception(s) to a particular field of use. See Final Act. 5 (discussing the field of "eye tracking"); see also MPEP § 2106.05(h). For example, the video camera is involved in mere data gathering steps, including steps 1 and 2, the data to be analyzed or manipulated is merely limited to a particular type (i.e., interaction data), and the obtained and provided data related to the identified object in steps 4 and 5 are, at best, insignificant extra-solution activity. See MPEP § 2106.05(g).

For the above-stated reasons, we determine the additional elements beyond the judicial exception(s) in claim 1 are not integrated into a practical application.

Alice/Mayo Step Two, Step 2B

Because we determine claim 1 does not integrate its recited judicial exception(s) into a practical application, we need to consider whether the additional elements add a specific limitation or combination of limitations that are not well-understood, routine, or conventional activity in the field. Guidance, 84 Fed. Reg. 50, 56. If so, this indicates that an inventive concept may be present.
If, instead, the additional elements simply append well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception(s), this indicates that an inventive concept may not be present.

The Examiner concluded that the additional elements in claim 1 perform generic computer functions that are well-understood, routine, and conventional activities previously known in the industry, including monitoring a person, obtaining data, analyzing data, obtaining additional data, and providing data. Final Act. 5; Ans. 7. Appellants do not dispute these determinations made by the Examiner. See, e.g., App. Br. 9-10; see also, e.g., Reply Br. 5-7. Thus, on the record, we agree with the Examiner. Additionally, we note the Specification supports the Examiner's determination, describing a video camera/processor system that monitors, captures, and processes the direction a customer is looking similar to claim 1's steps 1 through 3. Spec. 1:6-18, 2:23-24.

Appellants state "novel and or [sic] non-obvious features are acknowledged," asserting claim 1 is patent eligible. App. Br. 10. We are not persuaded. Appellants have not identified clearly or specifically which claimed features are novel and non-obvious, let alone what claimed additional elements are not well-understood, routine, and conventional. Moreover, despite the Examiner not presenting a prior art rejection (see Reply Br. 6 (stating "these features are novel over the prior art")), "§ 101 subject-matter eligibility is a requirement separate from other patentability inquiries." See Return Mail, Inc. v. U.S. Postal Serv., 868 F.3d 1350, 1370 (Fed. Cir. 2017); see also Mayo, 566 U.S. at 90; Diehr, 450 U.S. at 190 (1981) ("The question ... of whether a particular invention is novel is 'wholly apart from whether the invention falls into a category of statutory subject matter.'").

Thus, we conclude the additional elements in claim 1 (i.e., the video camera and data processor) simply append well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception(s). As such, we determine the additional elements do not provide an inventive concept in claim 1.

For the above reasons, we are not persuaded that the Examiner erred in rejecting (1) independent claim 1; (2) independent claim 12, which recites similar limitations; and (3) dependent claims 2, 3, 7-11, and 13 for similar reasons under 35 U.S.C. § 101.

CLAIM 17

The majority reaches the opposite conclusion for independent claim 17. Claim 17 recites a series of steps and, therefore, falls within the process category of § 101. App. Br. 14-15 (Claims App'x). However, we must still determine whether the claim is directed to a judicial exception, namely an abstract idea. See Alice, 573 U.S. at 217. To this end, we must determine whether the claim (1) recites a judicial exception and (2) fails to integrate the exception into a practical application as previously discussed. See Guidance, 84 Fed. Reg. 50, 53-55.
Alice/Mayo Step One, Revised Step 2A, Prong 1

Claim 17 recites a method of monitoring a person having an interest in an object where a data processor (1) obtains interaction data related to a remote interaction between the person and at least one of plural objects, (2) analyzes the interaction data to identify a particular one of the objects to which the person has the interest, and (3) obtains additional data related to the particular identified object to provide additional data to the person. App. Br. 14-15 (Claims App'x). The claim further recites that the interaction data comprises touch data indicative of one or more touches by the person to a window located in front of the objects. Id. at 15 (Claims App'x). The claim adds that "identifying the particular object" includes (1) detecting a position of touches on the window based on the touch data, and then (2) projecting the touch position onto coordinates of the objects to identify the particular object. Id.

The Examiner makes the same findings and conclusions for claims 1 and 17 under § 101. Final Act. 5-6. To summarize, the Examiner finds that claim 17 is directed to the abstract idea of "obtaining interaction data" and "monitoring a person." Final Act. 5. In the Answer, the Examiner describes the abstract idea in more detail as "monitoring or tracking a person having an interest in an object." Ans. 4.

Similar to the analysis of claim 1 above, we agree claim 17 recites at least one judicial exception, including certain methods of organizing human activity (e.g., a commercial interaction, such as marketing or sales activities or behaviors) or mental processes (e.g., observing and judging where a person is pointing and providing more information by evaluating where the human is pointing). We refer to the previous discussion related to claim 1 for more explanation. But, as an example, sales representatives often provide additional information for a particular product once they see that a customer shows interest in a particular product by pointing to the particular object located behind a display case (e.g., a window) in a jewelry store or a protective window in a bakery. Accordingly, we determine claim 17 recites at least one judicial exception.

Alice Step One, Revised Step 2A, Prong 2

Next, we determine whether claim 17 integrates the exception into a practical application. Claim 17 recites at least four additional elements beyond the judicial exception, namely a data processor, a window, a person's touch, and a plurality of objects. We consider whether those additional elements individually or in combination integrate the judicial exception(s) into a practical application.

Unlike claim 1, claim 17 identifies a particular object by detecting the person's touch to a window located in front of objects, which, according to the Specification, can be a shop window that incorporates a grid of touch detectors (e.g., capacitive touch detectors) that send touch data with the touch position to a processor. Spec. 6:4-6. After the window touch position is detected, claim 17 then projects the touch position onto objects' coordinates to identify the particular object. See Spec. 6:7-10. Notably, this projection uses the products' coordinates, which are described as being found through an interaction map that defines the products' locations, to determine which product (e.g., 131, 132, or 133) is closest to the touch on the window and, based on this physical proximity, identifies the particular product. See Spec. 5:1-5, 6:7-10, Fig. 1.
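To make this projection concrete, the following minimal sketch shows one way a detected window-touch position could be projected onto stored object coordinates to pick the closest product. It is purely illustrative and not taken from the application or the record; the interaction-map coordinates, product names, and touch position are hypothetical values chosen only to show the nearest-object lookup that the Specification's description implies.

```python
# Illustrative sketch only (not from the record): project a detected window-touch
# position onto a hypothetical interaction map of product coordinates and pick
# the closest product, per the window-touch embodiment described for claim 17.
import math

# Hypothetical interaction map: window-plane coordinates (metres) of each product.
INTERACTION_MAP = {"product_131": (0.4, 1.2), "product_132": (1.1, 1.0), "product_133": (1.8, 1.3)}

def identify_touched_product(touch_pos, interaction_map=INTERACTION_MAP):
    """Return the product whose mapped coordinates lie closest to the touch."""
    tx, ty = touch_pos  # position reported by the grid of touch detectors
    return min(interaction_map,
               key=lambda name: math.hypot(interaction_map[name][0] - tx,
                                           interaction_map[name][1] - ty))

# Example: a touch detected at (1.0, 1.1) on the shop window.
print(identify_touched_product((1.0, 1.1)))  # -> "product_132"
```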
A key aspect of the claimed invention is that the "analyzing the interaction data to identify a particular one of the objects" step further involves "identifying the particular object compris[ing] detecting a position of touches on the window based on the touch data [indicative of one or more touches by the person to the window]" and "projecting the touch position onto coordinates of the objects to identify the particular object." App. Br. 14-15 (Claims App'x). That is, the claim specifies a particular way to identify the particular object using a person's touch, a data processor, and a window, and how the particular object is identified based on this touch data, namely by detecting a position on the window based on the touch data indicative of a person's touch to a window and projecting the touch position onto various objects' coordinates.

In this regard, claim 17 differs from claim 1 and is more than simply using the additional elements (e.g., a window and data processor) as tools to automate a task previously performed by humans. To illustrate, a salesperson analyzing a person's interactive touch data to identify the particular object the person is pointing at behind a display case (i.e., analyzing the interaction data, which comprise touch data indicative of one or more touches by the person to a window, to identify one of the objects to which the person has the interest as recited in claim 17) is not detecting the position where the person touches the window as claimed, but rather the direction the person is pointing, and then projecting this direction onto an object to identify the particular object. Moreover, the salesperson is not projecting the direction onto a plurality of objects' coordinates as recited in claim 17 but rather onto a single object. Thus, in this manner, the recited window and data processor in claim 17 are employed to perform a process distinct from automating tasks previously performed by humans. See McRO, Inc. v. Bandai Namco Am. Inc., 837 F.3d 1299, 1314 (Fed. Cir. 2016).

Given this machine-based functionality, claim 17, when read in light of the Specification, recites a technological improvement at least to the extent that a person's interest in a particular object is identified automatically through a window and data processor, and associated information is provided to that person. The claimed invention also improves on known camera-based systems (see Spec. 1-3) and technology related to image analysis by further reciting the above-noted limitations that identify the particular object in which the person has an interest from a plurality of objects. That is, claim 17's additional elements in combination reflect an improvement to other technology or a technical field. See MPEP § 2106.05(a).

In reaching our conclusion, we emphasize that claim 17 is unlike claim 1, which identifies a particular object by determining a direction of a person's look and an intersection of the direction with the particular object. Indeed, Appellants' Specification acknowledges that the processor-based functions in claim 1 are known. See Spec.
1:6-16. Claim 17, however, identifies the particular object by detecting a person's touch position based on touch data and projecting the touch position onto the coordinates of physical objects, a distinct process that involves a technical improvement. Unlike the camera-based functions in claim 1, the Specification does not describe that window-touch-and-projection identification techniques are known. This omission, although not dispositive, nevertheless weighs against the Examiner's determination, particularly given the lack of evidence on this record.

Claim 17's method also goes beyond merely organizing data or information in a new form, but rather recites a specific manner for obtaining and projecting the interactive data. See McRO, 837 F.3d at 1315. As previously noted, this is achieved by detecting "a position of touches on the window" based on the touch data indicative of one or more touches, and this information is used and applied through "projecting the touch position" onto objects' coordinates to identify the particular object as claimed. In this manner, the additional elements collectively (i.e., the window, data processor, person's touch, and plurality of objects) apply the noted judicial exception in a meaningful way beyond merely presenting data in a new form or generally linking the judicial exception's use to a particular environment. Further, claim 17 does not preempt all methods of monitoring a person's interest in an object, as evidenced by the distinct methods of claims 1 and 17.

Although the claimed "detecting a position of touches on the window based on the touch data" in claim 17 does not recite explicitly a sensor as asserted by Appellants (App. Br. 6; Reply Br. 2), this limitation provides more details related to the sensor's functionality (i.e., detects the touch position on the window using data indicative of the person's touch on the window) than claim 1. Because this technological improvement is achieved by the recited data processor detecting a person's touch position on the window (i.e., using the window) and projecting this touch position onto the objects' coordinates, claim 17's additional elements go beyond the judicial exception and integrate the exception into a practical application.

To be sure, merely reciting a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention. Alice, 573 U.S. at 223. Also, mere manipulation or reorganization of data is not enough to satisfy the transformation prong of the machine-or-transformation test. CyberSource, 654 F.3d at 1375. But, as noted above, claim 17's "analyzing" step, which involves "detecting" a person's touch on the window based on interaction data comprising touch data indicative of a person's touch to a window, as previously discussed, recites more than manipulating or reorganizing data. Here, the claimed invention does not merely implement an abstract idea on a generic computer (e.g., data processor), but rather recites an improvement in existing technology through analyzing touch data indicating a person's touches to a window located in front of objects to identify a particular object and projecting a detected touch position onto objects' coordinates to identify the particular object of interest. See MPEP § 2106.05(a)(II).
Additionally, the machine-or-transformation test, although not the only test for eligibility (see MPEP § 2106.05(c)), can nevertheless indicate whether claim 17's additional elements integrate the exception into a practical application. See Guidance, 84 Fed. Reg. at 55 n.28 (citing MPEP § 2106.05(c)). Accord Ultramercial, Inc. v. Hulu, LLC, 772 F.3d 709, 716 (Fed. Cir. 2014) (quoting Bilski v. Kappos, 561 U.S. 593, 594 (2010)). Under the machine-or-transformation test, a claimed process is patent eligible if: (1) it is tied to a particular machine or apparatus, or (2) it transforms a particular article into a different state or thing. In re Bilski, 545 F.3d 943, 954 (Fed. Cir. 2008), aff'd sub nom. Bilski, 561 U.S. 593. According to the Federal Circuit in Bilski, the transformation part of the machine-or-transformation test (1) must involve transforming an underlying article from one state or thing to a different state or thing, and (2) must be central to the purpose of Appellants' claimed process. Id. at 962. Although an underlying article can be intangible, such as electrical signals, and transformation can include data transformation, the data must represent a physical object or an article. Id. at 962-63 (citing In re Abele, 684 F.2d 902, 908-09 (CCPA 1982)).

The limitations of claim 17, when considered as a whole, effect a transformation or reduce an article into a different state or thing, thus reciting additional elements that integrate the exception into a practical application. As previously explained, the claim recites a window located in front of the objects, and when the person touches the window, the position of the touch is detected. The Specification explains that this window can be a shop window that incorporates a grid of touch detectors (e.g., capacitive touch detectors) that send touch data with the touch position to a processor. Spec. 6:4-6. After the window touch position is detected, claim 17 then projects the touch position onto coordinates of the objects to identify the particular object of interest. According to the Specification and as noted previously, this projection determines which product is closest to the touch on the window and, based on this physical proximity, identifies the particular product. Spec. 5:1-5, 6:7-10.

This projection of detected touch positions transforms the physical touch on the window and its associated data into a different state or thing. In particular, the physical position of a person's touch is projected onto the objects' coordinates in claim 17 so as to identify a particular interesting object, thus resulting in a transformation. See Spec. 6:3-10. Notably, the recited touch data represents at least one physical object, such as a finger touching a window, and, in that sense, is analogous to Abele's X-ray attenuation data that likewise represented physical objects, namely the internal structure of bones, organs, and other tissues. See Bilski, 545 F.3d at 962-63 (citing Abele, 684 F.2d at 908-09). To be sure, Abele's system transformed raw data acquired by a computed tomography scanner into a particular visual depiction of a physical object on a display, which is not present in claim 17. Bilski, 545 F.3d at 962-63.
Nevertheless, the acquired data in claim 17 (data that represents a physical object) is transformed into a different state or thing, namely touch position data that is projected onto objects' coordinates, which indicates the touch's physical proximity to a particular physical object and identifies the particular interesting object. This transformation (e.g., from a person's touch to an interested object) goes beyond merely gathering or manipulating data, and is critical in identifying a particular object behind a window as recited in the claim. In this regard, we agree with Appellants that claim 17 effects a transformation or reduces a particular article to a different state or thing (see App. Br. 6; see also Reply Br. 2), such that the additional elements integrate the judicial exception into a practical application and are sufficient to transform the claimed abstract idea into a patent-eligible application. Alternatively, claim 17's additional elements noted above collectively recite components or techniques (e.g., measurement techniques) that generate new data sufficient to improve existing technologies. See MPEP § 2106.05(a)(II)(vi).

The Examiner's finding that there is no "support" that specialized computer hardware is necessary to implement the invention recited in claim 17 is likewise unavailing. See Ans. 12 (citing "Alice Corp., 134 S.Ct. at 2360"). The fact remains that the recited projection of detected touch positions transforms the physical touch on the window and its associated data into a different state or thing, namely detected position data that is projected onto the objects' coordinates, which, by virtue of this transformation, identifies a particular interesting object. So even assuming, without deciding, that "specialized computer hardware" is unnecessary to implement the claimed invention as the Examiner contends, the claim as a whole still recites a transformation or devices/techniques that integrate the judicial exception into a practical application and adds significantly more to the abstract idea. Accord App. Br. 10 ("[W]hen the purported 'abstract idea' involves physical transformations in addition to computer processing and interactions with a user in a process, the 'substantially more' requirement should be less.") (emphasis added).

In conclusion, although claim 17 recites a judicial exception, the claim nevertheless integrates the exception into a practical application. Because this issue is dispositive of the ineligibility rejection of claim 17, we need not address further whether the additional recited elements add significantly more to the abstract idea to provide an inventive concept under Alice/Mayo step two.

Accordingly, we do not sustain the rejection of claim 17 under 35 U.S.C. § 101.

CONCLUSION

Under § 101, the Examiner did not err in rejecting claims 1-3, 5, and 7-13, but erred in rejecting claim 17.

DECISION

The Examiner's decision to reject claims 1-3, 5, and 7-13 is affirmed. The Examiner's decision to reject claim 17 is reversed.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 41.50(f).

AFFIRMED-IN-PART

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte MICHEL MARCEL JOSE DECRE, EVERT JAN VAN LOENEN, and BARTEL MARINUS VAN DE SLUIS

Appeal 2016-008644
Application 11/997,174
Technology Center 3600

Before JOHN A.
JEFFERY, DENISE M. POTHIER, and JUSTIN BUSCH, Administrative Patent Judges.

JEFFERY, Administrative Patent Judge, Concurring-in-Part and Dissenting-in-Part.

I concur with Judge Pothier's decision to reverse the Examiner's ineligibility rejection of independent claim 17, but I would reverse the Examiner's rejection of independent claims 1 and 12 as well for the following reasons.

Independent claim 1 recites a method of monitoring a person having an interest in a physical object, where the method comprises five steps: (1) monitoring a person by at least one video camera; (2) obtaining interaction data from the at least one camera that relates to an interaction between physical objects and a monitored person, where the person does not have physical contact with the objects, and each camera is physically detached from each object; (3) analyzing the interaction data by a data processor to identify a particular object among the objects, where the analyzing comprises determining a direction of the person's look and an intersection of the direction with the particular object; (4) obtaining additional data related to the identified particular object for immediately informing the person about the identified particular object; and (5) providing the additional data to the person.

According to Appellants' Specification, known systems (1) acquire images to determine a potential customer's age and sex, and (2) provide pre-selected information based on that determination. Spec. 1:11-18. The drawback of this system, however, is that the customer merely sees the information that was pre-selected for the determined age and sex, a process that may result in determining age incorrectly or otherwise provide information that the potential customer does not prefer. See id. at 1:24-25. To overcome this problem, the present invention monitors a person having interest in an object to provide more accurate and suitable information to that person about the particular object of interest. See id. at 2:1-3.

Turning to claim 1, I first note that the claim recites a series of steps and, therefore, falls within the process category of § 101. But despite falling within this statutory category, the key question is whether the claim is directed to a judicial exception, namely an abstract idea. See Alice, 573 U.S. at 217. To this end, it must be determined whether (1) the claim recites a judicial exception, and (2) fails to integrate the exception into a practical application. See Guidance, 84 Fed. Reg. at 54-55. If both elements are satisfied, the claim is directed to a judicial exception under the first step of the Alice/Mayo test. See id.

In the rejection, the Examiner determines that claim 1 is directed to the abstract idea of "obtaining interaction data" (Final Act. 5), a determination that merely quotes three words from the claim. In the Answer, however, the Examiner restates the abstract idea as "monitoring or tracking a person having an interest in an object." Ans. 4. Despite this more refined restatement, the Examiner's determination in this regard is problematic on this record.

To determine whether a claim recites an abstract idea, we (1) identify the claim's specific limitations that recite an abstract idea, and (2) determine whether the identified limitations fall within certain subject matter groupings, namely (a) mathematical concepts; (b) certain methods of organizing human activity; or (c) mental processes.
See Guidance, 84 Fed. Reg. at 52.

Turning to claim 1, the first clause recites monitoring a person by at least one video camera. Although a person can be monitored solely using human senses by, for example, visually observing a person, listening to sounds associated with that person, etc., the claim nonetheless requires at least one machine, namely a video camera, to achieve this end. The next clause of claim 1 obtains interaction data from the camera, where the interaction data relates to an interaction between the person and at least one of plural three-dimensional, physical objects, where (1) the person does not have physical contact with the objects, and (2) each camera is physically detached from each object.

Leaving aside the recitation of a video camera in these two clauses (a particular type of camera that captures moving pictures), the claim further recites that the interaction data obtained from the camera is analyzed by a data processor to identify a particular one of the objects to which the person has an interest, where the analyzing comprises determining (1) a direction of the person's look, and (2) an intersection of the direction with the particular object. Unlike the first three clauses of claim 1, however, the last two clauses, namely (1) obtaining additional data related to the identified particular object for immediately informing the person about the identified particular object, and (2) providing the additional data to the person, do not require any machine, let alone a particular machine, to perform those steps.

Nevertheless, the claimed invention, in essence, provides information related to an object based on a person's observed interest in the object and, therefore, at least fundamentally recites a method of organizing human activity, namely a fundamental business practice employed in everyday commerce. See Guidance, 84 Fed. Reg. at 52. For example, sales representatives are often eager to provide additional information for a particular product once they see that customers show visual interest in a particular product among other products. In one typical scenario, a used car salesman, after seeing a potential customer looking at (and perhaps admiring) a particular car on a used car lot, would identify a particular object (i.e., a used car) that the potential customer is interested in based at least partly on where the customer looks, namely in the direction of the car. Given this perceived interest in a particular car, and the possible opportunity to sell the car, the salesman would then approach the potential customer and provide additional data related to the identified car to the potential customer, which could include a wide variety of related information including the car's features, technical specifications, price, etc.

Although the claim recites an abstract idea based on this fundamental business practice, the key question is whether the abstract idea is integrated into a practical application, namely whether the claim applies, relies on, or uses the abstract idea in a manner that imposes a meaningful limit on the abstract idea, such that the claim is more than a drafting effort designed to monopolize the abstract idea. See Guidance, 84 Fed. Reg. at 54-55.
To this end, we (1) identify whether there are any additional recited elements beyond the abstract idea, and (2) evaluate those elements individually and collectively to determine whether they integrate the exception into a practical application. See id.

A key aspect of the claimed invention is that a data processor analyzes data obtained from a video camera to identify a particular one of multiple objects to which the person has an interest, by determining (1) a direction of the person's look, and (2) an intersection of the direction with the particular object. To be sure, but for the camera and data processor, the recited process could otherwise be done by a human by mere visual observation as noted previously in connection with the used car salesman example. But the claimed invention uses particular components, namely a video camera and a data processor, that are arranged and used in a particular way that effectively determines the azimuth of the user's attention with respect to a particular object automatically. Accord Reply Br. 3 (noting this azimuth determination). Notably, this particular machine-based determination improves the known camera-based systems that, despite determining a direction of a customer's look (e.g., in the direction of a bookstore window), nonetheless display information based on the customer's perceived sex and age, a determination that may be incorrect or yield irrelevant results. See Spec. 1-3.

Because this technological improvement is achieved by the recited video camera and data processor functionality, they are additional elements beyond the abstract idea that integrate the abstract idea into a practical application. See Guidance, 84 Fed. Reg. at 55 (citing MPEP § 2106.05(a)). Cf. Thales Visionix Inc. v. United States, 850 F.3d 1343, 1345-49 (Fed. Cir. 2017) (holding eligible claims directed to a technique for using sensors to track an object on a moving platform more efficiently).

To be sure, merely reciting generic computer components cannot transform a patent-ineligible abstract idea into a patent-eligible invention. See Alice, 573 U.S. at 223-24. But here, the claimed invention does not merely implement an abstract idea on a generic computer, but rather analyzes data obtained from a video camera to determine not only a particular direction in which a person looks, but also the intersection of that direction with a particular object to identify that object. This particular visual analysis technique goes well beyond mere generic computing functionality and yields a technological improvement in analyzing video data to identify a particular object in which a person has an interest by, among other things, using a machine to (1) assess the particular direction that a person looks with respect to the object, and then (2) pinpoint the exact location where that direction intersects a particular object.

To achieve this end, the Specification indicates that the processor's analysis may (1) detect a position and inclination of a person's head as disclosed in the Japanese published application JP2003/271084, or (2) track the user's eyes. Spec. 4.
In the first case, the machine would have to analyze the data from the video camera to (1) identify a person's head; (2) determine the head's particular position relative to other features in the video data, including a particular distant object; (3) determine the head's particular angle of inclination at that particular position with respect to a particular distant object in the video data; (4) determine the particular direction associated with this angle of inclination at that particular head position; and (5) determine that this direction intersects with the distant object in the video data. To do that, the machine would presumably have to analyze the shape and structure of the head itself to not only distinguish it from other parts of the person's body, but also determine its angle of inclination with respect to the distant object. Based on this determined head position and angle, the machine would then have to determine not only the direction corresponding to this particular head position and angle, but also assess whether that determined direction intersects with a distant object in the video data.

At a minimum, this machine-based determination involves, among other things, physiological analysis techniques including recognizing and distinguishing particular anatomical characteristics of the human body, including the head, from the video data. This analysis also involves algorithms that assess particular geometrical and spatial relationships between the person's head and other objects in the video data. These algorithms would, among other things, estimate and extrapolate data based on these geometrical and spatial relationships by extending the direction of the person's look to intersect with a certain distant object, or otherwise infer this intersection from the head's position and inclination relative to the object.

Similarly, for eye-tracking, the machine would have to analyze the data from the video camera to (1) identify a person's head; (2) identify the facial region of the head where the eyes are located; (3) identify the person's eyes (which presupposes that the video camera captured an image of the person's face so that the eyes can be discerned); and (4) determine the particular direction that the person's eyes are gazing. To do that, the machine would presumably have to analyze the structure of the eye itself, including the iris, pupil, etc., to ascertain the particular direction of the person's look based on this eye structure, and then extend a line corresponding to this direction to some distant object such that the line intersects with that object. Accord Spec. 4 (noting that eye-tracking methods may be used to exploit a relation of a person's gaze to the relative position of the pupil of that person's eye); see also id. (discussing using a "bright eye effect" that is a result of the highly-reflective nature of the eye's retina).

These machine-based determinations strike me as non-trivial and as yielding a technological improvement. At a minimum, these techniques involve not only machine-based anatomical and facial analysis processes including recognizing particular parts of the human anatomy, including the head and eyes, but also algorithms that determine and assess particular geometrical and spatial relationships between the person's head or eyes and other objects in the video data.
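Purely for illustration, the kind of geometric determination described above might be sketched as follows: an estimated head position and inclination is extended as a look direction, and that direction is tested for intersection with known product locations. The coordinate conventions, angle parameters, tolerance, and product positions below are assumptions introduced here for illustration only; they are not details disclosed in the application or the Specification.

# Minimal, hypothetical sketch (not the application's disclosed code) of extending a
# look direction from an estimated head pose and finding which known product location
# that direction intersects. All names, coordinates, and thresholds are illustrative.
import math
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    position: tuple  # (x, y, z) location of the product, in meters

def look_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert an estimated head yaw/pitch (degrees) into a unit direction vector."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def closest_intersected_product(head_pos, yaw_deg, pitch_deg, products, tolerance=0.3):
    """Return the product whose location lies nearest the look ray, within a tolerance."""
    d = look_direction(yaw_deg, pitch_deg)
    best, best_dist = None, tolerance
    for p in products:
        # Vector from the head to the product.
        v = tuple(p.position[i] - head_pos[i] for i in range(3))
        t = sum(v[i] * d[i] for i in range(3))  # projection of that vector onto the look ray
        if t <= 0:
            continue  # the product is behind the person
        # Perpendicular distance from the product to the look ray.
        closest_point = tuple(head_pos[i] + t * d[i] for i in range(3))
        dist = math.dist(closest_point, p.position)
        if dist < best_dist:
            best, best_dist = p, dist
    return best

# Example: a person two meters from a shop window, looking slightly left and down.
products = [Product("red handbag", (-0.5, 1.2, 2.0)), Product("blue scarf", (0.6, 1.0, 2.0))]
print(closest_intersected_product((0.0, 1.6, 0.0), yaw_deg=-14.0, pitch_deg=-11.0, products=products))

Nothing turns on these particular values; the point of the sketch is simply that the determination extends the look direction as a ray and assesses its spatial relationship to distant objects.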
Moreover, these algorithms would, among other things, estimate and extrapolate data based on these geometrical and spatial relationships by extending the direction of the person's look to intersect with a certain distant object. For this reason alone, the invention recited in claim 1 yields a technological improvement.

In addition, the machine-or-transformation test, although not the only test, can nevertheless indicate whether additional elements integrate the exception into a practical application. See Guidance, 84 Fed. Reg. at 55 (citing MPEP §§ 2106.05(b), (c)). Accord Ultramercial, Inc. v. Hulu, LLC, 772 F.3d 709, 716 (Fed. Cir. 2014) (quoting Bilski v. Kappos, 561 U.S. 593, 594 (2010)). Under the machine-or-transformation test, a claimed process is patent eligible if: (1) it is tied to a particular machine or apparatus, or (2) it transforms a particular article into a different state or thing. In re Bilski, 545 F.3d 943, 954 (Fed. Cir. 2008), aff'd sub nom. Bilski, 561 U.S. at 593. According to the Federal Circuit in Bilski, the transformation (1) must involve transforming an underlying article from one state or thing to a different state or thing, and (2) must be central to the purpose of Appellants' claimed process. Id. at 962. Although an underlying article can be intangible, such as electrical signals, and transformation can include data transformation, the data must represent a physical object or an article. Id. at 962-63 (citing In re Abele, 684 F.2d 902, 908-09 (CCPA 1982)).

Claim 1 recites a transformation, namely transforming the interaction data obtained from the video camera (data that represents a physical object, i.e., the monitored person) into an intersection of a determined direction of that person's look with a particular object. Although this data transformation differs somewhat from Appellants' characterization of the transformation of a person's action and a resulting electronic signal in the recited process (App. Br. 6), I nonetheless agree with Appellants at least to the extent that some transformation occurs beyond merely manipulating or reorganizing data. See CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1375 (Fed. Cir. 2011) (noting that merely manipulating or reorganizing data does not satisfy the transformation prong of the machine-or-transformation test).

To be sure, the video camera provides the data that is analyzed to determine the look direction and object intersection, and, therefore, arguably merely gathers data, an insignificant pre-solution activity. See Bilski, 545 F.3d at 963 (characterizing data gathering steps as insignificant extra-solution activity). But this pre-solution activity is significant in the context of the claimed invention because the camera not only monitors the person, an essential step of the process, but also provides interaction data based on this video-based monitoring.

I recognize that video cameras are used routinely to input video data into computers for processing, and in those applications, are conventional computer peripherals that are themselves generic computing components. See, e.g., WEBSTER'S NEW WORLD COMPUTER DICTIONARY 402 (10th ed. 2003) (defining "webcam," in pertinent part, as "[a] low-cost video camera used for low-resolution videoconferencing on the Internet"); see also MICROSOFT COMPUTER DICTIONARY 562 (5th ed. 2002) (noting that webcams allow customers and users to observe current activities at the site owner's business or home); see also Spec. 1:6-16 (describing a known monitoring system with a camera and display connected to a processor). That these components are conventional, however, is not a factor in determining whether a judicial exception is integrated into a practical application. See Guidance, 84 Fed. Reg. at 55. Rather, such considerations are relevant under step two of the Alice/Mayo test, not step one. See id.

Although claim 1 recites an abstract idea, namely a fundamental business practice that is a method of organizing human activity, the claim nevertheless integrates that abstract idea into a practical application by applying, relying on, or using the abstract idea in a manner that imposes a meaningful limit on the abstract idea. See id. at 54-55. As with claim 17, claim 1 does not preempt all methods of monitoring a person's interest in an object. Nor does claim 1 preempt all ways of identifying physical objects in which a person has an interest, even techniques involving image analysis. 6 Rather, claim 1 is limited to a particular way to achieve this end, namely by analyzing data from a video camera by using a processor that determines (1) a direction of the person's look, and (2) an intersection of that direction with a particular object.

In short, claim 1 integrates an abstract idea into a practical application. Therefore, I would reverse the Examiner's rejection of independent claim 1. I would also reverse the rejection of independent claim 12 for similar reasons given its commensurate limitations. Although the machine-or-transformation test applies generally to process claims, independent system claim 12 is nevertheless no different substantively from method claim 1. Therefore, I see no reason to treat claim 12 any differently with respect to its eligibility, just as the U.S. Supreme Court did for the system claims in Alice. See Alice, 573 U.S. at 208 ("[T]he system claims are no different from the method claims in substance."). This approach is also consistent with the Federal Circuit's approach for the computer-readable medium claims in CyberSource, as well as its predecessor court's treating an apparatus claim as a method claim for eligibility purposes in Abele. See CyberSource, 654 F.3d at 1374 (noting that the underlying invention for both the method and computer-readable medium claims was a method for detecting credit card fraud, not a manufacture for storing computer-readable information); see also id. (noting that the Abele court treated apparatus claim 7 as a method claim for purposes of determining eligibility).

6 For example, another possible way of using image analysis to identify an object in which a person has an interest would be to determine a person's relative physical proximity to various objects, and conclude from this determination that the person has an interest in the object that is closest to that person. Such an analysis would not involve a person's look whatsoever, let alone determining a direction of that look, much less an intersection of that direction with the object. Rather, this analysis would be based on other relationships between the person and the object discernible from the image data.
On this record, I find that system claim 12's additional elements, namely the limitations reciting that the data processor's analyzing function determines (1) a direction of a person's look, and (2) an intersection of the direction with a particular object, apply or use the abstract idea in some other meaningful way beyond generally linking the use of the abstract idea to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the abstract idea. See Guidance, 84 Fed. Reg. at 55 (citing MPEP § 2106.05(e)). As noted previously in connection with method claim 1, independent system claim 12 does not preempt all ways of identifying physical objects in which a person has an interest using a data processor, even those involving image analysis. Rather, claim 12 recites a data processor whose functionality is limited to a particular way to achieve this end, namely by analyzing data from a video camera by using a processor that determines (1) a direction of the person's look, and (2) an intersection of that direction with a particular object.

In short, claim 12 integrates an abstract idea into a practical application, as does claim 1. Because this issue is dispositive of error in the Examiner's ineligibility rejection, it is unnecessary to address whether these claims' additional elements add significantly more to the abstract idea to provide an inventive concept under Alice/Mayo step two.

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD
Ex parte MICHEL MARCEL JOSE DECRE, EVERT JAN VAN LOENEN, and BARTEL MARINUS VAN DE SLUIS
Appeal 2016-008644
Application 11/997,174
Technology Center 3600
Before JOHN A. JEFFERY, DENISE M. POTHIER, and JUSTIN BUSCH, Administrative Patent Judges.
BUSCH, Administrative Patent Judge, dissenting-in-part.

I join the holding and analysis of the principal opinion ("Decision" or "Dec.") determining that claims 1-3, 5, and 7-13 are directed to patent-ineligible subject matter under the analysis set forth in Mayo Collaborative Services v. Prometheus Laboratories, Inc., 566 U.S. 66 (2012) and Alice Corp. Proprietary Ltd. v. CLS Bank International, 573 U.S. 208 (2014). Dec. 7-17. I also agree with the Decision that claim 17 recites an abstract idea. See Dec. 17-19. I initially note that Appellants argue the claims as a group. App. Br. 5-10. 1 Accordingly, for that reason alone, because I agree that claim 1 is patent-ineligible, I would determine claim 17 falls with claim 1. However, I disagree with the majority view that claim 17 is patent eligible because the additional limitations are sufficient to integrate the abstract idea into a practical application under Revised Step 2A - Prong 2. 2 See Dec. 19-25.

1 For example, the Appellants state "as independent claims 12 and 17 recite similar features, they, as well as each of the dependent claims, meet[] those eligibility standards as well." App. Br. 8. As noted in the next paragraph, the closest Appellants come to arguing claim 17's features separately with particularity is a passing statement regarding Appellants' argument that all of the claims effect a transformation. App. Br. 5-6. Notably, as discussed further below, Appellants' statement that claim 17 recites determining a direction "by a touch sensor" is imprecise because it reads limitations not recited into the claim.

2 The Decision accurately summarizes the Guidance and how the Guidance directs us to analyze claims for eligibility under 35 U.S.C. § 101 in light of Alice and Mayo. See Dec. 8-17. Therefore, instead of reiterating the Guidance and principles of law here, I simply use the Decision's shorthand for referring to the various steps of the Alice analysis consistent with the Guidance.

Like Appellants, I see no distinction between claims 1 and 17 sufficient to reach a different result for claim 17 than the result reached regarding claim 1. See App. Br. 6 (arguing that the limitations in claims 1, 12, and 17 recite "a physical transformation occurs between an action of a person and an electronic signal that is provided to a processor" without distinguishing between the camera data recited in claims 1 and 12 and the touch data recited in claim 17). In fact, Appellants do not even allege that the touch input data provides an inventive concept or integrates the abstract idea into a practical application, other than asserting it is an aspect of the transformation. App. Br. 7-10 (arguing only that "the capability of a system to perform eye tracking is not an abstract idea if for no other reason than it entails hardware to obtain an image and the system's ability to perform digital image processing"). I would, therefore, affirm the Examiner's rejection under 35 U.S.C. § 101 of claim 17 for substantially the same reasons the Decision affirms the rejection of claims 1-3, 5, and 7-13.

In the Decision's exemplary bakery protective window scenario, claim 17 would be akin to a person placing her finger on a display case and asking what filling was in a particular baked good, e.g., a danish. The bakery worker may then attempt to identify a particular baked good in which the person is interested based on a determination of the location the person touched the protective window. The bakery worker may attempt to identify the baked good by mentally mapping that location (or an approximate location if the bakery employee cannot see exactly where on the protective window the customer touched or pointed) to the baked goods closest to where the user touched the protective window. The determination may also involve using both the position of the customer's finger and the direction the customer's finger is pointing to map this directional information to a location that may or may not be the closest position. See Spec. 3:33-4:4 ("The remote interaction may concern ... the person 101 pointing with a finger at one of the products ... and a person's touch of the shop window 105."). Regardless, in order to determine the baked good about which the customer inquired based on a user pointing at the baked goods, the bakery worker must at least identify the location of the customer's finger. Given this perceived interest in a particular baked good, the bakery worker may then point at or touch a particular baked good inside the case and provide additional data about the item to the customer, e.g., that the baked good is an apple danish.
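Purely for illustration, the automated counterpart of this mental mapping, that is, detecting a touch position on the window and projecting it onto the coordinates of the displayed objects to find the closest one, might be sketched as follows. The detector-grid resolution, window coordinates, and product layout are assumptions introduced here for illustration only; they are not details from the claims, the Specification, or the Decision.

# Minimal, hypothetical sketch of projecting a reported touch position onto product
# coordinates and selecting the nearest product. All values are illustrative assumptions.
import math

# Product display positions expressed in window coordinates (meters from the window's
# lower-left corner), i.e., where each item sits behind the glass.
PRODUCTS = {
    "apple danish":  (0.40, 0.35),
    "rye loaf":      (1.10, 0.30),
    "fruit tart":    (1.75, 0.45),
}

def touch_from_grid_cell(row: int, col: int, cell_size: float = 0.05) -> tuple:
    """Convert a reported detector-grid cell (row, col) into a window coordinate
    at the center of that cell."""
    return ((col + 0.5) * cell_size, (row + 0.5) * cell_size)

def closest_product(touch_xy: tuple) -> str:
    """Project the touch position onto the product coordinates and return the
    name of the nearest product."""
    return min(PRODUCTS, key=lambda name: math.dist(touch_xy, PRODUCTS[name]))

# Example: the detector grid reports a touch in row 6, column 9 of a 5 cm grid.
touch = touch_from_grid_cell(row=6, col=9)  # -> (0.475, 0.325)
print(closest_product(touch))               # -> "apple danish"

The sketch is only meant to make concrete the nature of the projection step; whether that step is performed by a grid of touch detectors, by cameras, or mentally by a bakery worker does not change the calculation being performed.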
Appellants' Specification generally describes using generic computing devices to receive interaction data, suggesting that any known input devices may be used, and discloses that a processor analyzes the input to identify the object in which a person is interested. Spec. 3:25-29 (explaining that video cameras may capture video of a monitored area, including people in the monitored area), 4:9-21 (explaining that a processor may analyze video data to determine the direction a person is looking or pointing), 5:25-6:2 (describing using microphones to capture audio data that a processor may use to identify products of interest), 6:4-10 (disclosing that "[t]he shop window may incorporate a grid with e.g. capacitive touch detectors (not shown) or another type of touch detector for detecting a position of the touches on the shop window" and the processor may use the touch positions to identify products of interest). 3

3 Notably, nothing in claim 17 requires the "touch data" to involve touch sensors. Claim 17 is broad enough to encompass, for example, using a camera or multiple cameras to determine that a user touched a window at least once. However that touch data was obtained, claim 17 then detects "a position of touches on the window based on the touch data and then project[s] the touch position onto coordinates of the objects to identify the particular object." See App. Br. 14-15 (Claims App'x).

In my view, both claim 1 and claim 17 identify a person's interest in an object and do not recite significantly more than method steps that are directed to mental processes or fundamental economic and business practices and, therefore, abstract ideas. Notably, Appellants' Specification provides no details regarding the elements that allegedly integrate the abstract idea of identifying a person's interest in an object into a practical application. The Specification simply states that one or more images may be used to determine a person's position and the direction in which they are looking and identify the object of interest based on that information. Spec. 2:11-15; see id. 3:30-4:4 (generically describing various types of input data that may be used to identify an object in which a person may be interested without details of how such input is generated or used). With respect to receiving or using the touch data, the Specification provides even less detail than with respect to the direction of look data. The Specification merely describes one example using a grid of touch detectors to detect touch positions and using that data "to project the touch position onto the coordinates of products ... to determine which product is closest." Spec. 6:3-10; see also id. 7:3-6 (generally indicating the system may obtain "touch input data").

The conclusion in the Decision that claims 1-3, 5, and 7-13 are not patent eligible rests on a determination that using a video camera and a processor to determine a direction in which a person is looking does not amount to significantly more than an abstract idea or integrate the abstract idea into a practical application. See Dec. 12-17. Specifically, the Decision determines claim 1 simply automates a manual process using generic computer elements as tools to perform the abstract idea of determining where a potential customer is looking and providing additional (sales or marketing) information. See Dec. 9-12.
The Decision also determines the additional elements (i.e., the video camera and processor) do not recite a transformation and, to the extent there is a transformation, it is merely extra-solution activity. Dec. 14-15. I agree with the Decision's analysis of claim 1, and I would only add that the alleged transformation of data is a part of the abstract idea and, therefore, not an element recited in addition to the abstract idea. See 84 Fed. Reg. 54 (We "evaluate integration into a practical application by: (a) Identifying whether there are any additional elements recited in the claim beyond the judicial exception(s); and (b) evaluating those additional elements individually and in combination to determine whether they integrate the exception into a practical application." (emphases added)). I discuss this aspect in more detail with respect to claim 17 below.

In contrast, the conclusion in the Decision that claim 17 is patent eligible rests on the determination that claim 17's steps of detecting a touch position on a window and projecting that position onto coordinates of the objects integrate the abstract idea into a practical application. 4 I disagree because claim 17 does not recite additional elements that integrate the abstract idea into a practical application. Claim 17's use of the touch data (i.e., the location on a window the person touches) is merely a different type of interaction data than the data (i.e., data relating to the direction of a person's look) obtained by a video camera in claim 1. Claim 1 recites obtaining the input data using a video camera and determining an intersection of a direction of a person's look and an object, whereas claim 17 recites using touch data to determine a touch position and projecting that position onto coordinates of the objects.

4 As noted above, the Decision determines claim 17 recites an abstract idea. See Dec. 17-19.

The Decision concludes (1) there is no transformation in claim 1, and (2) any potential transformation in claim 1 is merely extra-solution activity. Dec. 14-15. I agree there is no transformation and, for similar reasons, I conclude obtaining interaction data using a touch sensitive window does not involve a transformation, and, to the extent there is a transformation, it is merely insignificant extra-solution activity. The process described in claim 17 is no more transformative of the touch data obtained from claim 17's window than "determining a direction (D) of a person's look and an intersection of the direction with the particular object" is transformative of the interaction data obtained from claim 1's video camera. Indeed, "determining a direction (D) of a person's look" (claim 1) is no different in principle than "obtaining interaction data" that "comprises touch data indicative of one or more touches by the person to a window" (claim 17); and an "intersection ... with the particular object" (claim 1) is no different in principle than "identify[ing] a particular object" by "detecting a position of touches on the window based on the touch data and then projecting the touch position onto coordinates of the objects" (claim 17). The processes in claims 1 and 17 both analyze input data to determine an object in which a person is interested.
Specifically, these processes receive input (either data regarding the direction in which a person is looking or data regarding the location of a window that a user touches) and calculate an output based on that received input. Although both processes require physical objects (a camera and a person's head and eyes in claim 1 or a window and a user's body part that contacts the window in claim 17) to generate the input, the calculations do not transform data representing those physical objects. Contra In re Abele, 684 F.2d 902, 908-09 (CCPA 1982). In Abele, the Court appeared to focus on the fact that the raw data and the transformed data (i.e., the visual depiction) both represented the same physical object and that the process "transformed" the X-ray attenuation data representing the object into the visual representation of the object. In other words, Abele's eligible claim transformed one representation of a physical object (e.g., bones, organs or other tissue) into a different representation of the same object. Here, the claimed processes use the location information input from the physical objects to calculate a location separate and apart from the physical objects used for input. That claim 17 uses touch data rather than the direction of a person's look should not be determinative of patent eligibility. Accordingly, I disagree with the majority that claim 17 recites a transformation.

Moreover, even to the extent claim 17 (or claim 1) passes the machine or transformation test, I believe such a transformation of video or touch data into data identifying an object still fails to provide an inventive concept for at least two reasons. First, the transformation merely involves well-understood, routine, and conventional components to perform their known functions. See DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1256 (Fed. Cir. 2014) ("[I]n Mayo, the Supreme Court emphasized that satisfying the machine-or-transformation test, by itself, is not sufficient to render a claim patent-eligible, as not all transformations or machine implementations infuse an otherwise ineligible claim with an 'inventive concept.'"). Second, the alleged transformation is, itself, a part of the abstract idea and, therefore, insufficient to provide the inventive concept in addition to the claimed abstract idea. See BSG Tech LLC v. BuySeasons, Inc., 899 F.3d 1281, 1291 (Fed. Cir. 2018).

Of particular relevance to Appellants' claims and arguments, I note, an "inventive concept" is furnished by an element or combination of elements that is recited in the claim in addition to the judicial exception and sufficient to ensure the claim as a whole amounts to significantly more than the judicial exception itself. Alice Corp., 573 U.S. at 217-18 (citing Mayo, 566 U.S. at 72-73); see BSG Tech, 899 F.3d at 1290 (explaining that the Supreme Court in Alice "only assessed whether the claim limitations other than the invention's use of the ineligible concept to which it was directed were well-understood, routine and conventional," (emphasis added)). On the other hand, "[i]f a claim's only 'inventive concept' is the application of an abstract idea using conventional and well-understood techniques, the claim has not been transformed into a patent-eligible application of an abstract idea." BSG Tech, 899 F.3d at 1290-91 (citing Berkheimer v. HP Inc., 881 F.3d 1360, 1370 (Fed. Cir. 2018)). "[I]t is irrelevant whether [the claimed abstract idea] may have been non-routine or unconventional as a factual matter ... narrowing or reformulating an abstract idea does not add 'significantly more' to it." BSG Tech, 899 F.3d at 1291.

Similarly, the Guidance directs us to evaluate whether the claim's additional elements integrate the abstract idea into a practical application. 84 Fed. Reg. 54-55. Thus, we evaluate only whether the elements in addition to the recited abstract idea integrate the abstract idea into a practical application. As discussed in the Decision, the step of identifying the particular object, which includes projecting the touch position onto coordinates of the objects, is the abstract idea, i.e., the mental process or commercial interaction of determining the object in which the customer expresses an interest in order to provide additional information. See Dec. 18-19 (explaining that "observing and judging where a person is pointing and providing more information by evaluating where the human is pointing" is a mental process). As explained above with respect to the bakery example, identifying an object to which a person is pointing necessarily involves determining where the person's finger is located and may include identifying a location where the user touches the window. The only additional element recited in claim 17 is the data processor. Accordingly, claim 17 simply uses a computer as a tool and recites no more than simply applying the abstract idea on a computer. See 84 Fed. Reg. 55 (citing MPEP § 2106.05(f)).

As discussed above, the determination performed in claims 1 and 17 is simply an automation of a mental process or a commercial interaction using generic computing components recited at a high level of generality. Alice, 573 U.S. at 223 ("[T]he mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention."); see also Bancorp Services, L.L.C. v. Sun Life Assur. Co. of Canada (U.S.), 687 F.3d 1266, 1278 (Fed. Cir. 2012) ("[T]he use of a computer in an otherwise patent-ineligible process for no more than its most basic function-making calculations or computations-fails to circumvent the prohibition against patenting abstract ideas and mental processes."). Specifically, as discussed above, claims 1 and 17 use generic computer components, recited at a high level of generality, in place of a salesperson to automate the mental process (and commercial interaction) of identifying a potential buyer's interest in an item in order to provide more information about that item of interest.

Even reading certain embodiments from Appellants' Specification into claim 17, such as a window with touch sensors, claim 17 still simply implements a mental process on generic computer components because the touch sensors are generic computer components recited at a high level of generality used in their ordinary way to provide data regarding a location that the window was touched. This input merely replaces the visual observation of the touch location by a human and constitutes insignificant pre-solution data-gathering activity. See 84 Fed. Reg. 55 (citing MPEP § 2106.05(g)). Unless the claim cannot practically be performed in the mind, the claim is in the mental process category if the claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components. 84 Fed. Reg. 52 n.14.
Furthermore, to the extent Appellants argue claim 17 improves a computer or other technology (see App. Br. 6-7; Reply Br. 6), I disagree. As the Examiner explains, the alleged improvement in Appellants' claims lies not in the technology, but in the selection of the information used to perform improved marketing techniques. Ans. 6-7. Notably, there is no argument that the claimed touch data is obtained in a way that improves a computer or other technology, and Appellants have not persuasively explained how simply implementing the abstract idea using a "data processor" constitutes an improvement to a computer or other technology sufficient to integrate the abstract idea into a practical application.

Finally, I would determine claim 17's additional elements fail to add limitations that are more than well-understood, routine, and conventional activity in the field for the same reasons explained in the Decision with respect to claim 1 and for the same reasons I would determine the additional limitations do not integrate the abstract idea into a practical application discussed above. See Dec. 15-17. In particular, the only additional limitation recited is a data processor, which is generically recited and performs generic processing steps of obtaining and analyzing data. Even to the extent the touch data requires touch sensors in a window, Appellants do not contend, and the Specification does not support, that the touch sensors and detecting touch positions are more than well-understood, routine, and conventional elements performing their well-understood, routine, and conventional functions. See App. Br. 7-10; Reply Br. 5-7; Spec. 6:4-10.

For the above reasons, I respectfully dissent from the majority's determination that claim 17 recites patent eligible subject matter.