Waymo LLC, Appl. No. 15/847,064 (P.T.A.B. Nov. 4, 2020)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 15/847,064    FILING DATE: 12/19/2017    FIRST NAMED INVENTOR: Jiajun Zhu    ATTORNEY DOCKET NO.: XSDV 3.0-460 CON CON RE I    CONFIRMATION NO.: 6919
146033 7590 11/04/2020
WAYMO LLC, BOTOS CHURCHILL IP LAW, 430 Mountain Ave Suite 401, New Providence, NJ 07974
EXAMINER: LILLIS, EILEEN DUNN    ART UNIT: 3993    NOTIFICATION DATE: 11/04/2020    DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated “Notification Date” to the following e-mail address(es): pto@bciplaw.com, waymo@bciplaw.com. PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte JIAJUN ZHU, DAVID I. FERGUSON, and DMITRI A. DOLGOV

Appeal 2020-005723
Application 15/847,064
Patent US 9,216,737 B1
Technology Center 3900

Before MICHELLE R. OSINSKI, JILL D. HILL, and CYNTHIA L. MURPHY, Administrative Patent Judges.

OSINSKI, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

The above-identified application seeks reissue of US 9,216,737 B1. Appellant1 appeals under 35 U.S.C. § 134(a) from the Examiner’s decision rejecting claims 1–49 in this reissue application. We have jurisdiction over the appeal under 35 U.S.C. § 6(b). A telephonic oral hearing was held on September 30, 2020.2

1 We use the term “Appellant” to refer to “applicant” as defined in 37 C.F.R. § 1.42. Appellant identifies the real party in interest as Waymo LLC. Appeal Br. 2.
2 The record includes a transcript of the oral hearing.

We AFFIRM.

THE CLAIMED SUBJECT MATTER

Claims 1, 8, 13, 21, 28, 33, and 41 are independent. Claims 1 and 8 are reproduced below.

1. A method comprising:
controlling, by one or more computing devices, an autonomous vehicle in accordance with a first control strategy;
receiving, by the one or more computing devices, sensor data indicating a detection of a first object;
classifying, by the one or more computing devices, the first object based on the sensor data;
accessing, by the one or more computing devices, behavior data based on a classification of the first object, wherein the behavior data identifies potential actions of the first object that are to result in a change in control strategy, and wherein at least one of the potential actions identified in the behavior data is the action of changing from traveling on a first road element to travelling on a second road element;
determining, by the one or more computing devices, that the first object has performed an action identified in the behavior data; and
based on the determination, altering the control strategy of the autonomous vehicle by the one or more computing devices.

Appeal Br. 56 (Claims App.).

8.
A method comprising:
controlling, by one or more computing devices, an autonomous vehicle;
receiving, by the one or more computing devices, sensor data indicating a position of a first object external to the autonomous vehicle;
classifying, by the one or more computing devices, the first object based on the sensor data;
accessing, by the one or more computing devices, map data having a plurality of road elements;
comparing the sensor data with the map data;
identifying, by the one or more computing devices, that the first object is travelling on a first road element from the plurality of road elements;
determining, by the one or more computing devices, that based on the comparison of the sensor data with the map data, the first object has travelled from the first road element to a second road element; and
altering, by the one or more computing devices, at least one of a position, heading, speed, and acceleration of the autonomous vehicle based on the determination that the first object has travelled from the first road element to the second road element.

Appeal Br. 57 (Claims App.).

EVIDENCE

The Examiner relied on the following evidence in rejecting the claims on appeal:

Sakai, US 8,676,487 B2, Mar. 18, 2014
Kitahama, US 8,983,679 B2, Mar. 17, 2015

THE REJECTIONS3

I. Claims 1–49 stand rejected under 35 U.S.C. § 112, first paragraph, as failing to comply with the written description requirement. Final Act. 2–4.
II. Claims 13–20 and 33–49 stand rejected under 35 U.S.C. § 102(b) as anticipated by Kitahama. Id. at 5–7.
III. Claims 1–49 stand rejected under 35 U.S.C. § 103(a) as unpatentable over Kitahama and Sakai. Id. at 7–12.

3 Rejections of various claims on the grounds of nonstatutory double patenting (Final Act. 12–26) have been withdrawn (Ans. 3) and are not before us on appeal.
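For technical orientation only, the flow of the method of claim 1 can be sketched as follows. Every name, data structure, and rule below is a hypothetical illustration, not anything disclosed in the Specification at issue; whether the Specification discloses any such program is precisely the question raised by Rejection I.

```python
# Hypothetical sketch of the control loop recited in claim 1; nothing
# here is drawn from the Specification at issue.

# "Behavior data": classification -> potential actions that are to result
# in a change in control strategy (cf. changing from a first road element
# to a second road element).
BEHAVIOR_DATA = {
    "vehicle": {"change_road_element"},
    "pedestrian": {"enter_roadway"},
}

def classify(sensor_reading):
    """Toy stand-in for the 'classifying' step."""
    return sensor_reading["kind"]

def control_step(sensor_reading, strategy):
    """One pass through the steps recited in claim 1."""
    classification = classify(sensor_reading)           # classifying
    actions = BEHAVIOR_DATA.get(classification, set())  # accessing behavior data
    performed = sensor_reading["observed_action"]       # determining
    if performed in actions:
        return "cautious"                               # altering the strategy
    return strategy
```

The sketch makes concrete why the parties dispute the "accessing" step: the lookup keyed by classification is a structure the claims presuppose but, per the Examiner, the Specification never describes.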
OPINION

Rejection I

The Examiner finds that because there is no algorithm or flow of logic described in the Specification to guide one of ordinary skill in the art as to how to program a computer to perform at least the steps of “classifying . . . [by one or more computing devices] the first object” and “accessing . . . [by one or more computing devices] behavior data” as recited in claim 1, for example, the Specification does not include written description support for this subject matter. Final Act. 3–4.

More particularly, with respect to “classifying . . . [by one or more computing devices] the first object,” as claimed, the Examiner explains that, although the “[S]pecification indicates that ‘the computer 110 may classify the object based on the data received by the vehicle 101’s sensors,’” “[m]issing here is an explanation as to how the computer performs that function or achieves that desired result of classifying the object.” Final Act. 30. The Examiner takes the position that the disclosure “that ‘the sensor data could be used to classify objects as being a pedestrian, bicycle, or vehicle’ . . . is . . . insufficient to describe how the computer performs the function/result of ‘classifying’ in the context of claim 6 (for example).” Id. The Examiner further explains:

Object classification (in context) would appear to require more than simply determining the speed, heading and acceleration that is associated with an object, because raw sensor data does not inherently distinguish between detected objects. In addition, the “sensors” identified in [the Specification] at columns 5–7, lines 18–9, are described as providing distance and speed data, but such data would appear to require computer processing which is not described.

Id.

More particularly, with respect to “accessing . . . [by one or more computing devices] behavior data,” as claimed, the Examiner explains:

The . . .
[S]pecification indicates that “database 138 may include a set of actions or behaviors of interest”. Response at 15, citing to column 7, lines 41–50. Missing here is an explanation as to how the computer performs the function or achieves the desired result of accessing behavior data. In the [E]xaminer’s opinion, the suggestion in [the Specification] that “instructions 132 may allow for computer system 110 to identify when a detected vehicle has performed one or more of the behaviors of interest” ([i]d[.]), is insufficient to describe how the computer performs the claimed function/result of “accessing ... behavior data”. Moreover, given a database (137) that includes the detected position and movement associated with a vehicle (as opposed to the claimed “object”) (col. 7, ll. 33–40), and a database (136) that includes a road graph or “map”, simply “combining both sets of data” ([i]d[.]) would not appear to inherently result in the access of “behavior data” associated with that detected vehicle. In this regard, the detected vehicle data is local (i.e. relative to vehicle 101: cols. 7–8, ll. 51–2), but the map data would appear to be global. [The Specification] does not describe how the local/global data would be reconciled with “a set of actions or behaviors of interest” contained in database 138.

Final Act. 31.

Whether a specification complies with the written description requirement of 35 U.S.C. § 112, first paragraph, is a question of fact and is assessed on a case-by-case basis. See, e.g., Purdue Pharma L.P. v. Faulding, Inc., 230 F.3d 1320, 1323 (Fed. Cir. 2000) (citing Vas-Cath Inc. v. Mahurkar, 935 F.2d 1555, 1561 (Fed. Cir. 1991)). The disclosure, as originally filed, need not literally describe the claimed subject matter (i.e., using the same terms or in haec verba) in order to satisfy the written description requirement.
The specification, however, must convey with reasonable clarity to those skilled in the art that, as of the filing date, Appellant was in possession of the claimed invention. See id. When examining computer-implemented functional claims, such as claim 1, the Examiner should determine whether Appellant’s Specification discloses the computer and the algorithm (e.g., the necessary steps and/or flowcharts) that perform the claimed function in sufficient detail such that one of ordinary skill in the art can reasonably conclude that the inventor possessed the claimed subject matter at the time of filing. It is not enough that one skilled in the art could write a program to achieve the claimed function because a specification must explain how the inventor intends to achieve the claimed function to satisfy the written description requirement. See, e.g., Vasudevan Software, Inc. v. MicroStrategy, Inc., 782 F.3d 671, 681–83 (Fed. Cir. 2015). “The written description requirement is not met if the specification merely describes a ‘desired result.’” Id.

The Office provides guidance (“2019 § 112 Guidance”) for addressing whether a claim meets the requirements of 35 U.S.C. § 112 “where functional language is used to claim [a] computer-implemented invention.” (Computer-Implemented Functional Claim Limitations for Compliance With 35 U.S.C. 112, 84 Fed. Reg. 57 (Jan. 7, 2019)). The 2019 § 112 Guidance incorporates Federal Circuit precedent pertaining specifically to computer-implemented inventions, such as Vasudevan, and particularly addresses written-description issues “related to the examination of computer-implemented inventions that recite only the idea of a solution or outcome to a problem but fail to recite details of how the solution or outcome is accomplished.” 2019 § 112 Guidance, 84 Fed. Reg. at 61.
The 2019 § 112 Guidance instructs examiners that, “[w]hen examining computer implemented, software-related claims,” they “should determine whether the specification discloses the computer and the algorithm(s) that achieve the claimed function in sufficient detail that one of ordinary skill in the art can reasonably conclude that the inventor possessed the claimed subject matter at the time of filing.” 2019 § 112 Guidance, 84 Fed. Reg. at 61. An algorithm can be expressed in “any understandable terms,” such as “in prose” or “as a flow chart.” Id. at 62. It is not sufficient, however, “that one skilled in the art could theoretically write a program to achieve the claimed function, rather the specification itself must explain how the claimed function is achieved to demonstrate that the applicant had possession of it.” Id.; see also Vasudevan, 782 F.3d at 682–83.

The 2019 § 112 Guidance tells examiners that “[t]he level of detail required to satisfy the written description requirement varies depending on the nature and scope of the claims and on the complexity and predictability of the relevant technology,” and “[i]nformation that is well known in the art need not be described in detail in the specification.” 2019 § 112 Guidance, 84 Fed. Reg. at 61. Here, for example, a recital of actual “computer code” and/or details about “electrical signals” (see Appeal Br. 13) would not be necessary to satisfy the written description requirement. On the other hand, “sufficient information must be provided to show that the inventor had possession of the invention as claimed.” 2019 § 112 Guidance, 84 Fed. Reg. at 61. Here, this “sufficient information” would be details about how disclosed computer 110 is programmed to achieve the desired functions of classifying objects and accessing behavior data.
The 2019 § 112 Guidance instructs examiners “to compare the scope of the claim with the scope of the description to determine whether applicant has demonstrated possession of the claimed invention.” 2019 § 112 Guidance, 84 Fed. Reg. at 61. This is not to say that written-description support for a claim extends only to details expressly set forth in the specification. See id. However, details set forth in the specification must convey possession of the full scope of the claimed invention. Here, the details set forth in Appellant’s Specification must convey possession of programming the computer 110 to achieve the claimed functions of classifying objects and accessing behavior data.

Classifying

Claims 1–20 require a computer (e.g., “one or more computing devices” and/or “one or more processors”) to classify “[a] first object based on the sensor data.” Appeal Br. (Claims App.). Claims 6–7 (which depend from claim 1) and claims 26–27 (which depend from claim 21) require the first object to be “classified as a vehicle.” (Id.)

When a human is driving a vehicle, he/she sees “objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc.” Spec. 5:18–20. The human driver’s thought process with respect to a detected object depends, of course, upon what the detected object is. If a human driver sees “a sign indicating that the speed limit is 35 mph,” and the vehicle’s current speed is 50 mph, the driver knows that he/she needs to “slow down” the vehicle. Spec. 7:13–15. If a human driver sees that “an object is obstructing the intended path of the vehicle,” the driver knows that he/she needs to “maneuver the vehicle around the obstruction.” Id. 7:16–19. Thus, a human driver’s thought process while driving a vehicle invariably includes classifying a detected object.
The classifying function recited in claims 1–20, “without the computer, amount[s] to normal human perception/judgement” that would be employed when a human driver navigates a vehicle. Ans. 14 n.2. The Specification may convey possession of the Appellant’s idea to computerize a human driver’s classification of detected objects via the computer 110, but the Specification does not convey possession of the necessary steps of a program to perform such a classification of detected objects.

The functional diagram depicted in Figure 1 shows, schematically, a database 137 containing “[o]bject [d]ata” and “classification data,” but there is no discussion in the Specification as to what is meant, even generally, by “object data” and “classification data.” The flow chart depicted in Figure 7 does not mention “object data” or “classification data,” but instead only includes a block 720 labeled as “Identify classification and state of each detected object.” The Specification does say that “the computer 110 may classify [a detected] object based on the data received by vehicle 101’s sensors” and that, “[f]or example, the sensor data could be used to classify objects as being a pedestrian, a bicycle or vehicle.” Spec. 9:4–7.

As noted by the Examiner, there is no indication in the rejection that the Specification lacked adequate details regarding the raw sensor data received by the controller 110. See Ans. 19–20. Rather, the Examiner’s written description concerns reside solely in what happens after the computer 110 receives sensor data. The raw sensor data received by the computer 110 would “require additional computer processing to distinguish between detected objects,” and “the algorithm for performing that additional processing has not been adequately described or otherwise identified” in the Specification. Id. at 23.
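To illustrate the kind of algorithm, expressible “in prose” or “as a flow chart,” that the Board finds missing, a rule-based classifier over sensor-derived features could be as simple as the following. The features, thresholds, and labels are entirely hypothetical assumptions; the point of the rejection is that the Specification supplies nothing of this sort.

```python
def classify_object(length_m, speed_mps):
    """Hypothetical rule-based classification from sensor-derived size
    and speed features; no such rules appear in the Specification."""
    if length_m > 3.0:
        return "vehicle"      # large objects treated as vehicles
    if speed_mps > 3.0 or length_m > 1.0:
        return "bicycle"      # mid-size or fast-moving objects
    return "pedestrian"       # small, slow objects
```

Even a few lines like these, had they appeared in prose in the Specification, would be the sort of “flow of logic” the rejection looks for and does not find.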
Accessing Behavior Data

Claims 1–7 and 13–20 require the computer 110 to access “behavior data” that is “based on a classification of the first object” and define the behavior data in terms of “a first object.” Appeal Br. (Claims App.). Claims 21–27 and 33–40 do not require the computer 110 to classify a first object based on sensor data, but they also define the behavior data in terms of “a first object.” Id. Specifically, these claims require the behavior data to identify “potential actions of the first object that are to result in a change in control strategy,” and require that “at least one of the potential actions of the first object is the first object changing from traveling on a first road element [(e.g., a first lane)] to travelling on a second road element [(e.g., a second lane)].” Id.

When a human driver is driving a vehicle on a multi-lane roadway, he/she is aware of the lane arrangement of the roadway, his/her vehicle’s position in this lane arrangement, and other nearby vehicles’ positions in this lane arrangement. The human driver also knows, due to his/her driving experience, that a nearby vehicle could behave in certain ways, including changing lanes on the roadway. The human driver further knows that if a nearby vehicle changes lanes, it might be necessary to alter the position, heading, speed, and acceleration of his/her vehicle. Thus, the human driver is on the lookout for this expected behavior by a nearby vehicle so that, when it occurs, he/she can respond appropriately. The “behavior data” recited in claims 1–7, 13–27, and 33–40, “without the computer, amount[s] to normal human perception/judgement” that would be employed when a human driver is driving a non-autonomous vehicle. Ans. 14 n.2.
The Specification may convey possession of the Appellant’s idea of the computer 110 storing a human driver’s knowledge about potential behaviors of nearby objects for later access, but the Specification does not convey possession of the necessary steps of a program to store and access such potential behaviors of nearby objects. The functional diagram depicted in Figure 1 shows, schematically, a database 138 containing “[b]ehaviors of interest.” The flow chart depicted in Figure 7 does not mention a database of behaviors of interest, but instead only includes a block 740 labeled as “Has a detected vehicle performed a behavior of interest?”. The Specification describes the database 138 as including “a set of actions or behaviors of interest such as [a detected] vehicle changing lanes or routes,” so that the computer 110 can “identify when a detected vehicle has performed one or more behaviors of interest.” Spec. 7:41–43. Inasmuch as “behaviors of interest” are described elsewhere in the Specification, they are only described in the context of a detected vehicle traveling along a multi-lane highway (see id. 7:20–8:2), following a path 630 (see id. 8:10–16), or coming to a stop for a period of time (see id. 8:51–54).

Claims 1–5, 13–25, and 33–40 do not require the first object to be classified as a vehicle, and, therefore, have a claim scope encompassing a detected object that is not a vehicle. The Specification discloses that sensor data could be used to classify non-vehicle objects such as a “pedestrian” or a “bicycle.” Spec. 9:5–7. The Specification does not describe “behavior data” for a pedestrian and/or “behavior data” for a bicycle, which is accessible by the computer 110.
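The gap the Board describes can be made concrete with a hypothetical sketch of a behavior-data store keyed by classification. The structure and every entry below are assumptions for illustration; only vehicle behaviors (e.g., changing lanes or routes) have any counterpart in the Specification, and the empty result for non-vehicle classifications mirrors the cases the Specification never describes.

```python
# Hypothetical behavior-data store; only the "vehicle" entry has any
# counterpart in the Specification (cf. Spec. 7:41-43).
BEHAVIORS_OF_INTEREST = {
    "vehicle": {"change_lanes", "change_routes", "stop_for_period"},
}

def access_behavior_data(classification):
    """Return the behaviors of interest for a classified object; the
    empty set models the pedestrian/bicycle cases for which the
    Specification describes no behavior data."""
    return BEHAVIORS_OF_INTEREST.get(classification, set())
```

The empty-set default also surfaces the open question the Board notes in footnote 4: the claims and Specification are silent on what happens when there is no behavior data for a detected object.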
Even when the detected object is classified as a vehicle (claims 6, 7, 26, 27), the Specification does not describe an algorithm or flow of logic causing the computer 110 to access behavior data for a vehicle, as opposed to behavior data for a pedestrian or bicycle. Moreover, claims 1–5, 13–25, and 33–40 have a claim scope encompassing an almost infinite number of detected objects and behavior data associated therewith. The details set forth in the Specification about behavior data (i.e., behavior data is described only in the context of a nearby vehicle) do not convey possession of programming a computer commensurate with the scope of these claims.4

Accessing Map Data

Claims 8–12, 28–32, and 41–49 require the computer 110 to access “map data having a plurality of road elements.” Appeal Br. (Claims App.). Claims 8–12 require the computer 110 to classify a first object “based on the sensor data,” but they do not recite any relationship between the classified first object and the map data that is accessed by the computer 110. Id. Claims 28–32 and 41–49 only require the sensor data to indicate “a position of a first object,” and they likewise do not recite any relationship between the first object’s position and the map data that is accessed by the computer 110. The functional diagram depicted in Figure 1 shows, schematically, a database 136 containing, among other things, a “Detailed Map,” and the Specification says that “a road graph of the environment” is “stored in database 136.” Spec. 7:47–48.

4 We note that neither the claims nor the Specification describe what happens if the first object cannot be classified by the computer 110 and/or if there is no behavior data for the first object.
The flow chart depicted in Figure 7 includes a block 730 labeled “Access road graph data corresponding to objects’ environment,” and the Specification says that, when nearby vehicles 510 and 520 are detected, “road graph data” may be accessed that represents “the current environment of vehicles 510 and 520.” Spec. 7:55–58. The Specification shows, in Figure 6, “a road graph 600” that can be accessed by the computer 110. Spec. 7:56–59. The Specification, however, does not describe what “map data” is actually stored in the database 136.

According to the Examiner, “the road graph data would appear to be global” (Ans. 26), which would align with the computer 110 being in communication with “a geographic position component” that determines the vehicle 101’s “latitude” and “longitude” positions (Spec. 4:4–8). However, this same paragraph also says that “[t]he location of the vehicle,” determined by the geographic position component, can also include “relative location information, such as location relative to other cars immediately around it which can often be determined with less noise than absolute geographical location.” Spec. 4:17–20.

In any event, we agree with the Examiner that the Specification does not adequately describe the computer 110 “reconciling” the map data with the sensed data. Ans. 26. More particularly, Appellant’s Specification describes no algorithm or flow of logic explaining how the computer 110 is programmed to access (e.g., from the database 136 or elsewhere) not only map data, but map data that corresponds with the location of the vehicle 101 and/or objects detected thereby.5 The Examiner’s findings and determinations reflect that the § 112 rejection of claims 1–49 was formulated by following the 2019 § 112 Guidance. See Final Act. 2–4, Ans. 13–20.
We agree with the Examiner that, per this Guidance, and thus per the controlling case law that the Guidance assimilates, claims 1–49 do not satisfy the written-description requirement in that Appellant’s Specification describes “an outline of desired results that the computer is expected to achieve,” but “missing” from the Specification is “the algorithm or logic flow” which would convey possession of the computer being programmed to do so. Ans. 13–14.

In sum, we agree with the Examiner’s determination that there is no algorithm or flow of logic described in Appellant’s Specification sufficient to guide one of ordinary skill in the art to program a computer to perform at least the steps of (i) classifying, by one or more computing devices, an object based on sensor data, (ii) accessing, by one or more computing devices, behavior data, and (iii) accessing, by one or more computing devices, map data. Each of independent claims 1, 8, 13, 21, 28, 33, and 41 recites at least one of these steps. We sustain the rejection of claims 1–49 under 35 U.S.C. § 112, first paragraph, as failing to comply with the written description requirement.

5 As indicated above, claims 8–12, 28–32, and 41–49 do not expressly require a relationship between the detected first object and the map data, but without such a relationship, the later comparison of sensor data and map data would seem to be nonsensible.

Rejection II

Claim 13

The Examiner finds that Kitahama discloses all of the limitations of independent claim 13, including, among other things, a processor configured to perform all of the recited functions. Final Act. 5–6 (citing Kitahama 5:46–56, 7:17–27, 47–59, 8:29–46); see also id.
at 33 (citing Kitahama 5:62–6:12, 8:29–46) (“[T]he processor in Kitahama is capable of performing the function(s) recited in claim 13 because that processor actually does utilize the classification of an object, such as ‘attributes of the recognized object’ (Kitahama), to ‘access behavior data’ including ‘potential actions’, such as ‘states of the recognized objects’ (Kitahama). Similarly, the processor in Kitahama is capable of performing the ‘determine . . . an action’ and ‘alter the control strategy’ functions recited in claim 13, because the processor actually does recognize or ‘determine’ an object state or ‘action’, and can correct or ‘alter’ the movement trajectory or ‘control strategy’ using movement strategies.”).

More particularly, as to the processor being configured to “classify the first object based on the sensor data,” as claimed, the Examiner takes the position that Kitahama “recogniz[es] the physical attributes of the detected objects.” Ans. 33.

More particularly, as to the processor being configured to “access behavior data based on a classification of the first object,” as claimed, the Examiner points to Kitahama “recogniz[ing] the movement states of the detected objects.” Ans. 33. The Examiner takes the position that claim 13 “do[es] not recite a database,” but “even if claim 13 was interpreted as requiring the structure of a database, the actual contents of that database would appear immaterial to the processor being capable of functioning to broadly ‘access’ the database.” Id. at 31–32.

More particularly, as to the recitation that “the behavior data identifies potential actions of the first object,” as claimed, the Examiner takes the position that “the ‘potential actions’ and ‘behavior data’ claim recitations are satisfied by Kitahama.” Ans. 34. The Examiner states in particular:

In Kitahama, the ECU 20 actually does access behavior data associated with a detected object.
October 2019 Office [A]ction at 32–33. Kitahama indicates that the environment recognition unit 21 “recognizes the states of the recognized objects”. Such recognition is not only based upon the recognized attributes (i.e. “classification”) of the detected object, but also includes the recognition of different states of movement, such as: “a moving state (particularly, a traveling state), a crossing state, a moving speed, a moving direction, or unknown”. Kitahama at column 7, lines 23–27. These states of movement can also be characterized as “potential actions” and/or “behavior data”, and further must be determined by comparison with known behavior data in order for a state of movement to potentially be “unknown”. Moreover, the ECU 20 in Kitahama is structurally capable of accessing database hardware that contains information relating to potential actions of a detected object that are to result in a change in control strategy. The ECU 20 actually does access at least two databases (col. 6, ll. 6–12), so the structural capability is clearly demonstrated in Kitahama.

Ans. 34–35.

More particularly, as to the processor being configured to “determine that the first object has performed an action identified in the behavior data,” as claimed, the Examiner points to Kitahama “determin[ing] whether the attribute and state associated with a detected object may result in the object being an obstacle to the host/automatic vehicle.” Ans. 33. The Examiner takes the position that “th[e] broad functional statement [of this claim limitation] simply refers to ‘an action identified in the behavior data’, which may or may not be one of the ‘potential actions’ that is recited together with the access function.” Ans. 32.
More particularly, as to the processor being configured to “alter the control strategy of the autonomous vehicle based on the determination,” as claimed, the Examiner points to Kitahama “correct[ing] the movement trajectory using a movement strategy.” Ans. 33.

Appellant argues that when “Kitahama ‘automatically generates movement trajectory based on movement strategies according to the traveling scene’ (Kitahama col. 5[,] ll. 64–65), that applies to a current situation in the environment and has nothing to do with identifying ‘potential actions’ from ‘behavior data’ in the manner claimed in Appellant’s claim 13,” and “[t]he rejection fails to assert or to explain where Kitahama discloses that ‘behavior data identifies potential actions.’” Appeal Br. 27–28; see also id. at 29 (boldface omitted) (“The ‘surrounding environment recognition unit’ ‘recognizes attributes of the recognized objects’ (id. col.[ ]7[,] ll. 14–16), which as best understood is currently observed information [regarding] the ‘traveling state’ of the object. This is irrelevant to behavior data that ‘identifies potential actions of the first object that are to result in a change in control strategy’ as claimed.”).6

The Examiner’s position is that Kitahama’s surrounding environment recognition unit 21 recognizes the states of recognized objects (Kitahama 7:23–27), and these recognized states (e.g., stop state, moving state, crossing state) of recognized objects may be characterized as potential (or possible) actions of recognized objects. Ans. 34.
Moreover, the Examiner’s position is that in order for Kitahama’s surrounding environment recognition unit 21 to be able to recognize a state of a recognized object (e.g., stop state, moving state, crossing state) rather than having to label the state of the recognized object as “unknown,” there must be a comparison with known data. Id. In other words, the Examiner takes the position that Kitahama’s programming with respect to recognizing and labeling of a current state of a recognized object necessarily involves accessing known behavior data for a recognized object (i.e., potential or possible actions of the recognized object such as stopping, moving, crossing, etc.) for comparison in order to be able to accurately label the current state of the recognized object. Appellant’s argument does not respond with sufficient particularity to the findings by the Examiner to persuade us of error by the Examiner.

Appellant also argues that “[a]nother deficiency of Kitahama is that it does not teach ‘determine that the first object has performed an action identified in the behavior data.’” Appeal Br. 29. More particularly, Appellant argues that Kitahama’s “movement strategy candidates are based on observed actions by the detected objects, which is not a determination ‘that the first object has performed an action identified in the behavior data’” as claimed. Id. (boldface omitted).

The Examiner’s position is that Kitahama discloses that in the case where an object is an obstacle to the traveling of the host vehicle, the surrounding environment recognition unit determines the traveling scene, which involves recognizing a state of a recognized object, and when the state of a recognized object is determined, this constitutes a determination that a recognized object has performed a potential (or possible) action (i.e., at least one of the recognized states of a stop state, moving state, or crossing state). Ans. 33. Appellant has not persuaded us of error in the Examiner’s findings.

6 Appellant also argues that “[i]t is improper (and erroneous) for the Examiner to assert that a limitation is missing in Kitahama for one claim but is present in that reference for another claim.” Appeal Br. 27. Although the Examiner has stated that “Kitahama does not explicitly describe performing the action of accessing behavior data based upon the object classification” in connection with an alternative obviousness rejection (Final Act. 7–8), the Examiner has explicitly stated in connection with the anticipation rejection of claim 13 before us that “Kitahama discloses an electronic control unit . . . which functions to . . . recognize the movement states of the detected objects (or ‘access behavior data based on a classification’).” Ans. 33; see also id. at 34 (“In Kitahama, the ECU 20 actually does access behavior data associated with a detected object. . . . Kitahama indicates that the environment recognition unit 21 ‘recognizes the states of the recognized objects’. Such recognition is not only based upon the recognized attributes (i.e. ‘classification’) of the detected object, but also includes the recognition of different states of movement, such as: ‘a moving state (particularly, a traveling state), a crossing state, a moving speed, a moving direction, or unknown’. Kitahama at column 7, lines 23–27. These states of movement can also be characterized as ‘potential actions’ and/or ‘behavior data’, and further must be determined by comparison with known behavior data in order for a state of movement to potentially be ‘unknown’.”). We do not find the alternative rejections articulated by the Examiner to constitute a basis for a finding of reversible error.
Appellant also argues that “because there is no teaching in Kitahama regarding behavior data, Kitahama necessarily does not disclose alteration of ‘the control strategy of the autonomous vehicle based on the determination’ in the manner claimed.” Appeal Br. 29. The Examiner’s position is that Kitahama “can correct or ‘alter’ the movement trajectory or ‘control strategy’ using movement strategies.” Final Act. 33 (citing Kitahama 5:62–6:12, 8:29–46). Kitahama describes movement strategies being set according to a determined traveling scene. Kitahama 8:29–46. According to the Examiner, a change in the current traveling scene will result in a change in movement strategy so as to constitute an alteration of control strategy based on a determination of a state of a recognized object (i.e., at least one of the recognized states of a stop state, moving state, or crossing state), which is itself a determination that the recognized object has performed a potential (or possible) action. Appellant has not persuaded us of error in the Examiner’s findings.

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s determination that Kitahama discloses all of the limitations of independent claim 13. Accordingly, we sustain the rejection of claim 13 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Claims 14–20

In rejecting dependent claims 14–20, the Examiner stated in the Final Action that “claims 14–19, reciting a further limitation(s) related to a prior functional requirement would not appear to define a new structural requirement(s) of the claimed system that would distinguish the processor(s) disclosed in Kitahama” and, with respect to claim 20, that “in Kitahama the processor(s) further functions to control an autonomous vehicle in the recited manner, as indicated at column 12, lines 9–24.” Final Act. 6.
Appellant argues that “[t]he Examiner has failed to meet his burden of setting forth a prima facie case for anticipation with regard to the rejections of claims 14–19.” Appeal Br. 30. The Examiner responds in the Answer by providing additional explanation and including citation to specific passages within Kitahama with respect to claims 14–19. Ans. 36–37 (citing Kitahama 5:65–6:4, 7:14–59, 8:31–38, Fig. 2). Appellant does not respond with sufficient particularity in the Reply Brief to the specific findings made in the Answer. See Reply Br. 11 (solely arguing the purported deficiencies with respect to independent claim 13).

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s determination that Kitahama discloses all of the limitations of claims 14–20. Accordingly, we sustain the rejection of claims 14–20 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Claims 33–40

The Examiner finds that claim 33 “recites the subject matter of claim 13,” but lacks a recitation relating to “classification of a sensed object.” Final Act. 6. The Examiner also finds that “claims 34–40 recite the same subject matter as that of claims 14–20.” Id. Accordingly, the Examiner relies on the previous analysis provided in connection with independent claim 13 and its dependent claims. Id. Appellant essentially relies on the same arguments and reasoning presented in connection with the rejection of independent claim 13 (Appeal Br. 31–35), and for similar reasons discussed above in connection with claim 13, we do not find such arguments and reasoning persuasive of error. Accordingly, we sustain the rejection of claims 33–40 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Claim 41

The Examiner finds that Kitahama discloses all of the limitations of independent claim 41, including, among other things, that the processor is “configured to . . . access map data having a plurality of road elements; [and] determine, based on a difference between the sensor data and the map data, that a first object has travelled from the first road element to a second road element,” as claimed. Final Act. 6. More particularly, the Examiner finds that “in Kitahama the processor(s) further functions to access map data, as indicated at column 10, lines 12–31 (‘navigation device’).” Id. The Examiner further explains the position taken in the rejection:

In Kitahama, the processor can utilize information that is obtained from a navigation device, which would inherently include “map data”, in generating a display image of the vehicle traveling scene (“[t]he basic image . . . may be an image that is obtained by adding the recognized object to an image (CG image) that is generated by a navigation device”). Moreover, the processor in Kitahama is capable of performing the road element function recited in claim 41, because that processor actually can account for detected object movement (col. 7, ll. 14–27: “moving state . . . crossing state . . . moving speed . . . moving direction”).

Id. at 34. The Examiner further explains the position taken in the rejection in the Answer:

Considering the map data together with the “determine” function recited in claim 41, Kitahama clearly demonstrates the structural capability of the ECU 20 (processor) to access map data and determine whether a detected object has travelled on two road elements. Kitahama at column 10, lines 12–31. Brief at 36. As explained in these passages, the ECU 20 can utilize information that is obtained from a navigation device (which would inherently include “map data”) that generates a basic display image of the autonomous vehicle traveling scene.
As further explained in Kitahama, a display image representing a potential corrective movement strategy (as generated by the display control unit 27) is added to the basic image generated by the navigation device, in order to display the detected object information (including movement) relative to the traveling scene (and the autonomous vehicle). Thus, the broad functional recitations of claim 41: “access map data having a plurality of road elements”; and “determine, based on a difference between the sensor data and the map data, that a first object has travelled from the first road element to the second road element”, does not structurally distinguish the capability of the ECU 20 disclosed in Kitahama.

Ans. 38–39. Thus, the Examiner explains in the Final Action and Answer that the addition of a “recognized object to an image (CG image) that is generated by a navigation device” as set forth in Kitahama (Kitahama 10:28–29) would necessarily require accessing map data having a plurality of road elements by virtue of the reference to a navigation device, and would necessarily require determining that a recognized object has traveled from the first road element to a second road element by virtue of the recognized object being displayed accurately on the navigation device image.

As to altering at least one of the position, heading, speed, and acceleration of the autonomous vehicle based on the determination that the first object has travelled from the first road element to the second road element, the Examiner takes the position that “Kitahama clearly demonstrates the structural capability of the ECU 20 (processor) to implement a corrective movement strategy.” Ans. 39 (citing Kitahama 5:65–6:4).
For example, as Kitahama describes recognition of “a slow vehicle ahead,” “a pedestrian on the street,” or “a bicycle on the street” (Kitahama 7:55–60), the particular lane in which the slow vehicle, pedestrian, or bicycle is located would have bearing on a desired movement strategy (id. at 8:35–38 (“movement strategies for changing lanes” “or movement strategies for going around a pedestrian or a bicycle”)). Appellant argues that Kitahama’s “risk map” shown in Figure 6 is “irrelevant to the claimed features” and that the Examiner has not explained adequately how Kitahama teaches the claimed limitations. Appeal Br. 36–37; Reply Br. 12–13. Appellant does not respond with sufficient particularity to the explanation provided by the Examiner in the Final Action and Answer, and thus we are not persuaded of error in the rejection articulated by the Examiner.

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s finding that Kitahama discloses all of the limitations of independent claim 41. Accordingly, we sustain the rejection of claim 41 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Claims 42–45, 48, and 49

The Examiner finds that Kitahama discloses all of the limitations of dependent claims 42–45, 48, and 49. Final Act. 6–7 (citing Kitahama 7:60–8:16, 10:12–31, 12:9–24). The Examiner provides additional clarification in the Answer as to the basis for the rejection of these claims. Ans. 40–42 (citing Kitahama 5:65–6:4, 7:14–27, 7:60–8:16, 12:9–24). Appellant argues the Examiner improperly relies on pro forma rejections of other claims, rather than addressing the limitations of claims 42–45, 48, and 49. Appeal Br. 37. We do not find this argument persuasive in light of the clarifying statements made by the Examiner in the Answer. In addition, Appellant does not respond with sufficient particularity in the Reply Brief to the specific findings made in the Answer. See Reply Br. 13 (solely arguing the purported deficiencies with respect to independent claim 41).

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s finding that Kitahama discloses all of the limitations of dependent claims 42–45, 48, and 49. Accordingly, we sustain the rejection of claims 42–45, 48, and 49 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Claims 46 and 47

With respect to claims 46 and 47, the Examiner finds that “Kitahama discloses a partially autonomous mode, as indicated at column 8, lines 47–60.” Final Act. 6. Appellant argues that “[w]hile the passenger of Kitahama can use a ‘movement strategy candidate[]’ to ‘change the movement strategies’, that is not a partially autonomous mode that enables a driver to directly ‘control one or more of steering, acceleration, braking and emergency braking’ in the manner claimed.” Appeal Br. 38 (boldface omitted). Appellant continues that “[i]n contrast, as explained by Kitahama, the system ‘receives a passenger’s change operation of the movement strategies, and if the movement strategies have been changed, it regenerates the movement trajectory based on the movement strategies.’ (Kitahama col.[ ]6[,] ll. 1–4).” Id.

With respect to claim 46, the Examiner responds that “Kitahama clearly discloses the structural capability of the ECU 20 (processor) to fully control the vehicle or partially control the vehicle.” Ans. 41 (citing Kitahama 8:47–60). Kitahama discloses a “movement strategy setting unit 23 [that] sets movement strategy candidates for the passenger to change the movement strategies.” Kitahama 8:47–50. In the absence of a specific explanation from Appellant as to why Kitahama’s disclosure of passenger involvement and/or control in the selection of a movement strategy does not constitute a partially autonomous mode, we are not persuaded of error in the Examiner’s findings regarding dependent claim 46.
With respect to claim 47, the Examiner responds that:

[i]n Kitahama, the structural capability is clearly disclosed because the partial autonomous mode includes passenger control of at least steering and braking with the selection of a corrective movement strategy (col. 8, ll. 47–60). The claim does not recite “directly control”, and even if it did, the passenger can still effect the change of movement strategy which results in control of steering and/or braking. Brief at 38.

Ans. 41. Again, in the absence of a specific explanation from Appellant as to why Kitahama’s disclosure of passenger involvement and/or control in the selection of a movement strategy does not result, through the selection of a particular movement strategy, in at least indirect control by the vehicle operator of one or more of steering, acceleration, braking, and emergency braking, we are not persuaded of error in the Examiner’s findings regarding dependent claim 47. That is, Appellant does not explain adequately why the Examiner’s position articulated in the Answer is in error.

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s finding that Kitahama discloses all of the limitations of dependent claims 46 and 47. Accordingly, we sustain the rejection of claims 46 and 47 under 35 U.S.C. § 102(b) as anticipated by Kitahama.

Rejection III

Claims 1–7

In connection with an obviousness rejection of all of the pending claims, the Examiner finds that Kitahama discloses most of the limitations of independent claim 1, but takes the position that “Kitahama does not explicitly describe performing the action of accessing behavior data based upon the object classification.” Final Act. 7–8.
The Examiner turns to Sakai, finding that Sakai teaches “a motion prediction apparatus that predicts motion of a mobile body (object) which is detected near a host (autonomous) vehicle” and that “[t]he motion prediction apparatus determines a degree of normality associated with the movement of the object, and then selects a motion prediction model based upon the determined degree of normality.” Id. at 8 (citing Sakai 1:65–2:22). The Examiner concludes that it would have been obvious “to associate an object motion prediction apparatus, as taught in Sakai, with the movement trajectory generator disclosed in Kitahama” in order to “provide an improvement in travel safety and efficiency.” Id.; Sakai 2:37–43. The Examiner takes the position that:

the improved autonomous vehicle control would include performing the method step of accessing behavior data (i.e. “motion prediction models” in Sakai). In Sakai, the motion prediction apparatus determines a classification parameter commensurate with the degree of normality for the detected object movement, and uses that classification parameter to select the motion prediction model(s) (col. 3, ll. 1–19). Considering Kitahama, the classification of the detected objects (col. 7, ll. 14–27) would have been recognized as being consistent with the teachings in Sakai, because the classification also includes object movement (“moving state ... crossing state ... moving speed ... moving direction”).

Final Act. 8–9. As to the claimed step of “accessing, by the one or more computing devices, behavior data based on a classification of the first object, wherein the behavior data identifies potential actions of the first object that are to result in a change in control strategy” (Appeal Br. 56 (Claims App.)), the Examiner finds that “[t]he identified teaching in Sakai that is related to prediction models corresponds directly to the claim 1 recitation: ‘potential actions of the first object that are to result in a change in control strategy’.” Final Act. 34. The Examiner further finds:

Sakai implements the object motion prediction apparatus with traveling information acquisition means 10, which utilizes sources of prior knowledge such as road maps and traffic rules in combination with sensor data, to establish the environment surrounding the autonomous vehicle. Sakai at column 9, lines 5–51. As explained in Sakai, the prior knowledge is stored in a database that can be accessed by the motion prediction apparatus according to the present location of the autonomous vehicle. Id[.] at lines 24–51. The database includes information such as road arrangements that are relative to the autonomous vehicle, and the sensor data is used to calculate information about a detected object that is relative to the autonomous vehicle (such as position and speed information). As further explained in Sakai, the prediction of a potential action(s) of a detected object involves estimating a degree of normality associated with the behavior, or state of movement, of the object. Sakai at column 9, lines 52–66. Thus, Sakai reasonably teaches the claim 1 terminology: “accessing ... behavior data based on a classification of the first object”, because the information that is used by the object motion prediction apparatus is stated to include “behavior” data associated with the object.

Ans. 45–46. As to the claimed step of “determining, by the one or more computing devices, that the first object has performed an action identified in the behavior data” (Appeal Br. 56 (Claims App.)), the Examiner finds that “Kitahama . . . teaches the claimed step involving ‘determining’ . . .
in response to a classified object performing actions including moving and/or changing direction” and “[t]he determination in Kitahama that a classified object has performed an action, is entirely consistent with the terminology of claim 1.” Ans. 44 (citing Appeal Br. 39) (underlining omitted). The Examiner explains “that the ‘action identified in the behavior data’, as recited in the ‘determining’ step of claim 1, is not even required by the claim language to be the ‘at least one action’ that was defined in the ‘accessing’ step.” Final Act. 35.

Appellant argues that “predicting motion of another object based on ‘past time motions’ of that object is not what is claimed in claim 1.” Appeal Br. 39. Thus, “Appellant submits that Sakai does not teach or suggest ‘accessing, by the one or more computing devices, behavior data based on a classification of the first object, wherein the behavior data identifies potential actions of the first object that are to result in a change in control strategy, and wherein at least one of the potential actions identified in the behavior data is the action of changing from traveling on a first road element to travelling on a second road element.’” Id. (boldface omitted).

Appellant does not explain adequately how Kitahama’s surrounding environment recognition unit recognizing a state of a recognized object (e.g., stop state, moving state, crossing state), and having been modified in accordance with the teachings of Sakai so as to select for the recognized object a movement prediction model for predicting the movement region of the recognized object, fails to teach accessing behavior data based on a classification of the recognized object.
We, thus, are not persuaded that the Examiner erred in finding that modified Kitahama’s movement prediction model identifies potential actions of the recognized object (e.g., a range of movement for the recognized object) and is accessed when one of a plurality of movement prediction models is selected for a recognized object. Although a “degree-of-normality acquisition device may acquire the degree of normality based on a history of movement of the mobile body” (Sakai 3:33–35), and this degree of normality may be used to select a movement prediction model from the plurality of movement prediction models (id. at 2:4–11), this does not prevent Sakai from accessing a plurality of movement prediction models that identify potential actions of the object (i.e., a range of movement of a recognized object). Appellant’s argument does not respond with sufficient particularity to the findings by the Examiner to persuade us of error.

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s determination that Kitahama and Sakai render obvious the subject matter of independent claim 1. Accordingly, we sustain the rejection of claim 1, and of claims 2–7, which depend therefrom and for which Appellant relies on the same arguments and reasoning (Appeal Br. 38–39), under 35 U.S.C. § 103(a) as unpatentable over Kitahama and Sakai.

Claims 13–27 and 33–40

In rejecting claims 13–27 and 33–40, the Examiner relies directly or indirectly on the previous analysis provided in connection with independent claim 1 and its dependent claims. Final Act. 10–11. Appellant essentially relies on the same arguments and reasoning presented in connection with the rejection of independent claim 1. Appeal Br. 41–47, 49–52.
For similar reasons discussed above in connection with Rejection II and the rejection of claim 1 under Rejection III, we do not find such arguments and reasoning persuasive of error. Accordingly, we sustain the rejection of claims 13–27 and 33–40 under 35 U.S.C. § 103(a) as unpatentable over Kitahama and Sakai.

Claims 8–12

With respect to independent claim 8, the Examiner states:

in Kitahama the recognition unit accounts for traffic rules in recognizing the surrounding environment, as noted above (col. 7, ll. 34–46). Kitahama also suggests that “other means may be used” to recognize the surrounding environment (col. 14, ll. 7–9). Sakai implements the detected object motion prediction apparatus with traveling information acquisition means 10, which utilizes map data and traffic rules as sources of prior knowledge, in combination with sensor data, to establish the surrounding environment (col. 9, ll. 5–51). Thus, the reasonable combination of the Kitahama and Sakai disclosures would obviously include the use of map data in defining the surrounding environment, where that environment includes a travelling object (i.e. vehicle).

Final Act. 10. The Examiner adds that:

at least Sakai teaches that the detected mobile body (object) could be “weaving” (e.g. col. 3, ll. 36–49), and this action would obviously involve “changing from traveling on a first road element to traveling on a second road element” . . . Sakai expressly discloses traveling information acquisition means 10 which uses prior knowledge that include “road maps” stored in a database (col. 9, ll. 5–34). Further in Sakai, an object detection means 11 uses the information in the traveling information acquisition means 10 to determine the past/current position, speed and travel direction of a mobile body relative to the host or “autonomous vehicle” (col. 9, ll. 35–51).
Once the motion or movement (or “behavior”) associated with a mobile body (object vehicle) is established, the motion prediction apparatus determines a degree of normality thereof, and uses the degree of normality to select a prediction model(s) of future motion or movement for that detected mobile body (cols. 1–2, ll. 65–22). Thus, in addition to the detected “weaving” on road elements discussed above (cl. 1), if the object vehicle effected a (complete) travel lane change, the selection of the motion prediction model would have logically been made by “comparing the sensor data with the map data”, as recited in claim 8. This is evident because the road maps stored in the prior knowledge database would obviously have been accessed in order for the motion prediction apparatus to determine the movement associated with the detected object vehicle (col. 9, ll. 52–66: traffic rules/lane compliance).

Id. at 34–36. The Examiner also explains in the Answer that Sakai’s teaching of “estimating a degree of normality associated with the behavior, or state of movement, of the detected object . . . would obviously involve a ‘comparison’ of the sensor data and the map data, in order to predict whether a detected object will become an obstacle relative to the autonomous vehicle.” Ans. 46.

Appellant argues that Kitahama’s “risk map” shown in Figure 6 is “irrelevant to the claimed features” and that the Examiner has not explained adequately how Kitahama teaches the claimed limitations. Appeal Br. 40; Reply Br. 15. Appellant also argues that “[a]s best understood, there is no enabling teaching in Sakai of ‘comparing the sensor data with the map data’ as claimed.” Appeal Br. 41; Reply Br. 15. Accordingly, Appellant argues, “the applied combination” fails to teach the claimed “determining” and “altering” steps. Id.
We understand the Examiner to be explaining that Sakai’s disclosure of the storage of “road maps” and “various traffic rules . . . (e.g., . . . dividing lines (yellow lines, white lines))” along with a determination of “the degree of weaving of [a] nearby object” (Sakai 9:26–29, 61–62) would necessarily implicate a comparison of sensor data with map data in order to determine the degree of weaving (or, conversely, traffic rule/lane compliance) in Sakai, and that a determination of weaving necessarily implicates a determination that an object has traveled from a first road element to a second road element. We also understand the Examiner to be explaining that the degree of weaving by a detected mobile body would be utilized in Kitahama in connection with ensuring a desirable movement strategy. This explanation from the Examiner remains unaddressed by Appellant, and Appellant therefore fails to respond with sufficient particularity to the explanation made by the Examiner to persuade us of error in the rejection articulated by the Examiner. Appeal Br. 40–41; Reply Br. 15–16.

For the foregoing reasons, Appellant does not apprise us of error in the Examiner’s determination that Kitahama and Sakai render obvious the subject matter of claim 8. Accordingly, we sustain the rejection of claim 8, and of claims 9–12, which depend therefrom and for which Appellant relies on the same arguments (Appeal Br. 41; Reply Br. 16), under 35 U.S.C. § 103(a) as unpatentable over Kitahama and Sakai.

Claims 28–32 and 41–49

In rejecting claims 28–32 and 41–49, the Examiner relies directly or indirectly on the previous analysis provided in connection with independent claim 8 and its dependent claims. Final Act. 10–12. Appellant essentially relies on the same arguments and reasoning presented in connection with the rejection of independent claim 8 (Appeal Br. 47–49, 52–55), and for similar reasons discussed above in connection with Rejection II and/or the rejection of claim 8 under Rejection III, we do not find such arguments and reasoning persuasive of error. Accordingly, we sustain the rejection of claims 28–32 and 41–49 under 35 U.S.C. § 103(a) as unpatentable over Kitahama and Sakai.

CONCLUSION

In summary:

Claims Rejected | 35 U.S.C. § | Reference(s)/Basis | Affirmed | Reversed
1–49 | 112, first paragraph | Written Description | 1–49 |
13–20, 33–49 | 102(b) | Kitahama | 13–20, 33–49 |
1–49 | 103(a) | Kitahama, Sakai | 1–49 |
Overall Outcome | | | 1–49 |

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED