Trials@uspto.gov                                                     Paper 13
571-272-7822                                         Date: January 13, 2022

UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

SAMSUNG ELECTRONICS CO., LTD. and SAMSUNG ELECTRONICS AMERICA, INC., Petitioner,

v.

SOLAS OLED LTD., Patent Owner.
____________

IPR2021-01254
Patent 8,526,767 B2
____________

Before SALLY C. MEDLEY, JOHN A. HUDALLA, and JULIA HEANEY, Administrative Patent Judges.

MEDLEY, Administrative Patent Judge.

DECISION
Denying Institution of Inter Partes Review
35 U.S.C. § 314

I. INTRODUCTION

Samsung Electronics Co., Ltd. and Samsung Electronics America, Inc. (collectively “Petitioner”) filed a Petition for inter partes review of claims 1-14 of U.S. Patent No. 8,526,767 B2 (Ex. 1001, “the ’767 patent”). Paper 3 (“Pet.”). Solas OLED, Ltd. (“Patent Owner”) filed a Preliminary Response. Paper 8 (“Prelim. Resp.”). Subsequently, we authorized the parties to file replies limited to the issue of discretionary denial under 35 U.S.C. § 314(a). Paper 9. Petitioner filed a Reply to the Preliminary Response (Paper 10; “Pet. Reply”) and Patent Owner filed a Sur-reply (Paper 12; “PO Sur-reply”).

Institution of an inter partes review is authorized by statute when “the information presented in the petition . . . and any response . . . shows that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a). Upon consideration of the Petition, the Preliminary Response, the Reply, the Sur-reply, and the evidence of record, we decline to institute review of the challenged claims of the ’767 patent.

A. Related Matters

The parties indicate that related district court litigations are: Solas OLED Ltd. v. Samsung Electronics Co., Ltd. et al., No. 2:21-cv-00105-JRG (E.D. Tex.) and Samsung Electronics Co. Ltd. et al. v. Solas OLED Ltd. et al., No. 1:21-cv-05205 (S.D.N.Y.). Pet. 2; Papers 5, 6.

B. The ’767 Patent1

The ’767 patent describes how touch sensors are used to recognize gestures, such as those by a human finger or a stylus, on sensing surfaces. Ex. 1001, 1:13-22. The ’767 patent addresses the difficulty in reliably and efficiently distinguishing between a significant number of gestures, including complex gesture combinations that arise in multi-touch input. Id. at 2:66-3:2, 13:64-14:5. The ’767 patent purports to solve this problem “by adopting a state machine approach,” in which a touch sensor device comprises an at least one-dimensional sensor to output a sense signal and a gesture processing unit comprising a plurality of linked state modules operable to analyze the time series data to distinguish gesture inputs. Id. at 3:11-23. In one embodiment, a touch sensor device has two one-touch state machines for generating two-touch events. Id. at 6:60-62, Fig. 6. Figure 6 of the ’767 patent is illustrative and is reproduced below.

1 Petitioner contends that “the earliest priority date to which Claims 1-14 are entitled is October 20, 2008.” Pet. 8-9. At this juncture of the proceeding, Patent Owner does not contest Petitioner’s assertion regarding the “priority date.” See generally Prelim. Resp.

Figure 6 shows a simple scalable architecture that includes state machines. Id. at 14:34-37.
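The linked-state-machine arrangement that Figure 6 illustrates, and that the following paragraph describes in more detail, can be sketched in simplified form as follows. The sketch is illustrative only and is not drawn from the ’767 patent or the record: the class names, the frame-count threshold separating a tap from a press, and the event handling are hypothetical, and the two-touch module’s tracking of separation and angle (for stretch, pinch, and rotate events) is omitted.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple


class TouchState(Enum):
    IDLE = auto()
    PRESSED = auto()


@dataclass
class GestureEvent:
    name: str                      # e.g., "Tap", "Press", "Press and Tap"
    position: Tuple[float, float]  # interaction position taken from the time series


class OneTouchFSM:
    """Tracks a single touch from the position time series and emits one-touch events."""

    TAP_FRAMES = 5  # hypothetical threshold separating a quick tap from a press

    def __init__(self) -> None:
        self.state = TouchState.IDLE
        self.frames_down = 0
        self.last_pos: Tuple[float, float] = (0.0, 0.0)

    def update(self, pos: Optional[Tuple[float, float]]) -> Optional[GestureEvent]:
        if pos is not None:                          # touch is on the surface
            self.state = TouchState.PRESSED
            self.frames_down += 1
            self.last_pos = pos
            if self.frames_down == self.TAP_FRAMES + 1:
                return GestureEvent("Press", pos)    # held long enough to be a press
            return None
        if self.state is TouchState.PRESSED:         # touch just lifted
            quick_release = self.frames_down <= self.TAP_FRAMES
            self.state, self.frames_down = TouchState.IDLE, 0
            if quick_release:
                return GestureEvent("Tap", self.last_pos)
        return None


class TwoTouchFSM:
    """Combines the outputs of two distinct one-touch FSMs into two-touch events."""

    def update(self,
               first_state: TouchState, first_event: Optional[GestureEvent],
               second_state: TouchState, second_event: Optional[GestureEvent],
               ) -> Optional[GestureEvent]:
        # One touch held down while the other touch taps -> "Press and Tap".
        if (first_state is TouchState.PRESSED
                and second_event is not None and second_event.name == "Tap"):
            return GestureEvent("Press and Tap", second_event.position)
        return None


# Feeding one frame of the time series at a time: each one-touch FSM sees only its
# own touch, and the two-touch FSM sees only what the one-touch FSMs output.
fsm1, fsm2, two_touch = OneTouchFSM(), OneTouchFSM(), TwoTouchFSM()
frames = [((10.0, 20.0), (50.0, 60.0))] * 3 + [((10.0, 20.0), None)]
for touch1, touch2 in frames:
    e1, e2 = fsm1.update(touch1), fsm2.update(touch2)
    combined = two_touch.update(fsm1.state, e1, fsm2.state, e2)
    if combined is not None:
        print(combined.name)  # prints "Press and Tap" on the frame where touch 2 lifts
```

In this sketch, each one-touch module tracks only its own touch, and the two-touch module consumes only the states and events those modules produce rather than the raw time series.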
Each one-touch finite state machine (FSM) represents a single-touch state machine that generates a single-touch gesture, and a two-touch FSM represents a two-touch state machine that generates two-touch gestures. Id. at 14:20-24. Touch 1 and Touch 2 are processed by the two-touch FSM, “which tracks the separation and angle between the touches, and generates stretch, pinch, and rotate events as the distance and/or angle between the touches changes.” Id. at 14:38-42. An FSM can also generate complex gestures. Id. at 14:43-44. For instance, if a one-touch FSM is in a “Pressed” state, and another one-touch FSM has just generated a “Tap” event, then the two-touch FSM can generate a “Press and Tap” event. Id. at 14:45-49.

C. Illustrative Claim

Petitioner challenges claims 1-14 of the ’767 patent. Claims 1 and 12-14 are independent claims, and claims 2-11 depend from claim 1. Claim 1 is reproduced below.

1. A touch sensor device comprising:
  a sensor having a sensitive area extending in at least one dimension and arranged to output sense signals responsive to proximity of an object to the sensitive area;
  a processor operable to execute position-processing logic stored in one or more tangible media, the position-processing logic, when executed by the processor, configured to:
    calculate positions of interactions with the sensitive area from an analysis of the sense signals; and
    output a time series of data indicative of the interaction positions on the sensor, the interaction positions corresponding to touches; and
  a processor operable to execute gesture-processing logic stored in one or more tangible media, the gesture-processing logic, when executed by the processor, configured to analyze the time series of data to distinguish one or more gesture inputs from the time series of data, the gesture-processing logic being coded with gesture-recognition code comprising a plurality of state-machine modules, the plurality of state-machine modules comprising:
    a first one-touch state-machine module, the first one-touch state-machine module being operable to recognize at least a first one-touch gesture and generate a first output based on the first one-touch gesture;
    a second one-touch state-machine module, the second one-touch state-machine module being operable to recognize at least a second one-touch gesture and generate a second output based on the second one-touch gesture; and
    a multi-touch state-machine module operable to:
      receive, directly from the first one-touch state-machine module, the first output;
      receive, directly from the second one-touch state-machine module, the second output; and
      recognize, based on at least the first and second outputs, at least one multi-touch gesture, the first one-touch state-machine module, the second one-touch state-machine module, and the multi-touch state-machine module being distinct state-machine modules; and
    output the recognized multi-touch gesture.

Ex. 1001, 20:51-21:26.

D. Asserted Grounds of Unpatentability

Petitioner asserts that claims 1-14 are unpatentable based on the following grounds (Pet. 3-4):

Claim(s) Challenged    35 U.S.C. §    Reference(s)/Basis
1, 9-14                103(a)2        Baltierra3
1, 9-14                103(a)         Baltierra, Katou4
2-8                    103(a)         Baltierra, Katou, Warren5
1, 9-14                103(a)         Westerman6
1, 9-14                103(a)         Westerman, Katou
2-8                    103(a)         Westerman, Katou, Warren
2 The Leahy-Smith America Invents Act, Pub. L. No. 112-29, 125 Stat. 284 (2011) (“AIA”), amended 35 U.S.C. §§ 102 and 103. Here, Petitioner alleges that the ’767 patent has an October 20, 2008 effective filing date. Pet. 8-9. At this juncture of the proceeding, Patent Owner does not contest Petitioner’s assertions as to the October 20, 2008 effective filing date. Prelim. Resp. 16-26. Because the October 20, 2008 effective filing date is before the effective date of the applicable AIA amendments, we refer to the pre-AIA versions of 35 U.S.C. §§ 102 and 103.

3 U.S. Pat. Appl. Pub. No. US 2009/0284478 A1, published Nov. 19, 2009 (Ex. 1005, “Baltierra”).

4 Japanese Pat. Pub. No. 9-231004, published Sept. 5, 1997 (Ex. 1006, 1-19, “Katou”). Petitioner provides a certified English-language translation of Katou (Ex. 1006, 21-46). Any reference to Katou hereinafter will be to the English-language translation.

5 U.S. Pat. Appl. Pub. No. US 2007/0176906 A1, published Aug. 2, 2007 (Ex. 1007, “Warren”).

6 U.S. Pat. Appl. Pub. No. US 2008/0036743 A1, published Feb. 14, 2008 (Ex. 1008, “Westerman”).

II. DISCUSSION

A. Principles of Law

A claim is unpatentable under 35 U.S.C. § 103(a) if the differences between the claimed subject matter and the prior art are such that the subject matter, as a whole, would have been obvious at the time of the invention to a person having ordinary skill in the art. KSR Int’l Co. v. Teleflex, Inc., 550 U.S. 398, 406 (2007). The question of obviousness is resolved on the basis of underlying factual determinations including (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of ordinary skill in the art; and (4) objective evidence of nonobviousness. Graham v. John Deere Co., 383 U.S. 1, 17-18 (1966).

B. Level of Ordinary Skill in the Art

Relying on the testimony of Dr. Benjamin B. Bederson, Petitioner offers an assessment as to the level of ordinary skill in the art and the general knowledge of a person of ordinary skill at the time of the ’767 patent. Pet. 8 (citing Ex. 1002 ¶¶ 24, 30-32). For example, Dr. Bederson states that a person having ordinary skill in the art “would have been someone with at least a bachelor’s degree in electrical engineering, computer engineering, computer science, or a related field, plus at least two years of experience in the research, design, development, and/or testing of touch and/or proximity sensors, human-machine interaction and interfaces, and related firmware and software, or the equivalent, with additional education substituting for experience and vice versa.” Ex. 1002 ¶ 30. Patent Owner does not propose an alternative assessment. See generally Prelim. Resp. We adopt Petitioner’s definition of the level of skill for purposes of this Decision, except that we delete the phrase “at least” to avoid including ambiguity in the definition of the level of skill.

C. Claim Construction

In inter partes review, we construe claims using the same claim construction standard that would be used to construe the claims in a civil action under 35 U.S.C. § 282(b), including construing the claim in accordance with the ordinary and customary meaning of such claim as understood by one of ordinary skill in the art and the prosecution history pertaining to the patent. 37 C.F.R. § 42.100(b) (2020). Petitioner states that it “does not believe that any term requires explicit construction to resolve the issues presented in this Petition.” Pet. 9.
Patent Owner does not oppose. See generally Prelim. Resp. For purposes of this Decision, we need not expressly construe any claim terms. See Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999) (holding that “only those terms need be construed that are in controversy, and only to the extent necessary to resolve the IPR2021-01254 Patent 8,526,767 B2 9 controversy”); see also Nidec Motor Corp. v. Zhongshan Broad Ocean Motor Co. Matal, 868 F.3d 1013, 1017 (Fed. Cir. 2017) (citing Vivid Techs. in the context of an inter partes review). D. Asserted Obviousness of Claims 1 and 9-14 over Baltierra alone or over Baltierra and Katou Petitioner contends claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Baltierra alone or over Baltierra and Katou. Pet. 18-35. In support of its showing, Petitioner relies upon the declaration of Dr. Benjamin B. Bederson. Id. (citing Ex. 1002). 1. Baltierra Baltierra describes tools that initiate a function based on tactile contacts received through a contact detection device, such as a touch pad. Ex. 1005 ¶ 3. These tools use an input mode to identify gestures and switch input modes to determine accurately gestures based on tactile contacts. Id. Figure 1 of Baltierra is illustrative and is reproduced below. IPR2021-01254 Patent 8,526,767 B2 10 Figure 1 shows an environment with a computer and a contact detection device. Id. ¶ 6. Computing system 100 includes multi-input system 102, contact detection device 108, and application 110. Id. ¶ 16. Multi-input system 102 includes identifier module 104, contact state machine 120, and monitoring state machine 122. Id. Contact detection device 108 includes contact detectors 106. Id. Gestures are detected by contact detectors 106 and are identified by identifier module 104. Id. Once the gestures are identified, identifier module 104 initiates application 110 to provide a function, such as zooming. Id. Contact state machine 120 and monitoring state machine 122 switch or determine identifier module 104’s input mode. Id. ¶ 30. Each tactile contact has its own instance of contact state machine. Id. ¶ 31. For instance, “a first finger may have a first contact state machine and a second finger [may have] a second contact state machine.” Id. Monitoring state machine 122 monitors the change in the number of tactile contacts by monitoring the state of contact state machines 120. Id. ¶ 32. Upon determining that contact state machines 120 have changed their states and the changes in the number of tactile contacts, monitoring state machine 122 then “switches the identifier module’s input mode from a previous input mode to a current input mode.” Id. 2. Katou Katou describes an information processing device for operating commands corresponding to gesture information input via finger contact means, such as for an operator’s finger or a touch pen. Ex. 1006 ¶ 1. The information processing device includes gesture capture means such as a touch panel that is capable of capturing gesture information. Id. ¶ 2. The IPR2021-01254 Patent 8,526,767 B2 11 finger contact means inputs gesture information such as pressure signal that is transmitted to a gesture capture means. Id. ¶ 30. Figure 1 of Katou is illustrative and is reproduced below. Figure 1 shows a functional block diagram of an information processing device. Id. ¶ 126. Information processing device 10 includes finger contact means 12, gesture capture means 14, gesture decoding means 16, and composite gesture decoding means 18. Id. ¶¶ 40, 126. 
Gesture decoding means 16 is connected to gesture capture means 14 and composite gesture decoding means 18, to decode captured gesture information 12 and generate decoded gesture information 17. Id. ¶ 40. Decoded gesture information 17 and generated composite gesture information 18a are combined to generate a “composite meaning.” Id. ¶ 43. Command generating means 20 “judge[s]” this composite meaning. Id. IPR2021-01254 Patent 8,526,767 B2 12 3. Discussion Claim 1 includes the following limitations: a multi-touch state-machine module operable to: receive, directly from the first one-touch state-machine module, the first output; receive, directly from the second one-touch state-machine module, the second output; and recognize, based on at least the first and second outputs, at least one multi-touch gesture. Ex. 1001, 21:15-26 (emphasis added). Independent claims 12-14 include similar language. Id. at 22:32-43, 23:1-12, 24:11-22. Petitioner’s contentions for each independent claim with respect to the above claim language are the same. Pet. 30-32. For the challenge based on Baltierra alone, Petitioner contends that Baltierra’s “[m]onitoring state machine 122 is a multi-touch state-machine module because it determines when to switch input modes based on the output states of the first and second contact state machines” 120. Id. at 30 (citing Ex. 1005 ¶¶ 31-32, Fig. 1). Patent Owner argues that Petitioner identifies “Baltierra’s ‘monitoring state machine 122’ as the claimed ‘multi- touch state-machine module,’” but Petitioner fails to “show that Baltierra’s monitoring state machine 122 recognizes any multi-touch gesture” as required by the claims. Prelim. Resp. 17 (citing Pet. 30), id. at 19-20 (“Baltierra does not teach that monitoring state machine 122 is capable of identifying any gestures at all.”). Petitioner fails to explain, with supporting evidence, how Baltierra’s monitoring state machine 122 is operable to “recognize, based on at least the first and second outputs, at least one multi-touch gesture.” Baltierra describes that in some embodiments, each tactile contact has its own IPR2021-01254 Patent 8,526,767 B2 13 instance of a contact state machine, for example, a first finger may have a first contact state machine and a second finger a second contact state machine. Ex. 1005 ¶ 31. Monitoring state machine 122 “monitors the change in the number of tactile contacts by monitoring the state of the contact state machines 120.” Id. ¶ 32. Petitioner has not shown that monitoring the change of state or monitoring the change in the number of tactile contacts meets the claim limitation of a multi-touch state-machine module operable to “recognize, based on at least the first and second outputs, at least one multi-touch gesture.” As Patent Owner points out, Baltierra’s identifier module 104 is described as recognizing multi-touch gestures, not monitoring state machine 122. Id. ¶¶ 16, 22-25. In its Reply, Petitioner cites to passages in Baltierra that it did not include in the Petition and argues that Baltierra’s monitoring state machine 122 changes the identifier module’s input mode in response to state changes and number of contact changes. Pet. Reply 5 (citing Ex. 1005 ¶¶ 33, 39-41). Petitioner argues that such descriptions show how monitoring state machine 122 recognizes a multi-touch gesture. Id. Petitioner did not present these arguments in the Petition. Although we authorized Petitioner to file a reply to address whether we should exercise our discretion under 35 U.S.C. 
§ 314(a) to deny institution based on, inter alia, Apple Inc. v. Fintiv, Inc., IPR2020-00019 (PTAB Mar. 20, 2020) (precedential) (“Fintiv”), we did not authorize Petitioner to augment its showings on the merits presented in the Petition, as to how the challenged claims are unpatentable. Paper 9. Accordingly, we need not and do not address Petitioner’s new arguments and newly cited passages from Baltierra as to how Baltierra’s monitoring state machine 122 meets the claim limitation of a multi-touch state-machine module operable to “recognize, based on at least the first and second IPR2021-01254 Patent 8,526,767 B2 14 outputs, at least one multi-touch gesture.” As explained above, Petitioner fails to explain sufficiently how Baltierra’s monitoring state machine 122 that functions to monitor the change of state or monitors the change in the number of contacts meets the claim limitation of a multi-touch state-machine module operable to “recognize, based on at least the first and second outputs, at least one multi-touch gesture.” Petitioner alternatively contends that “Baltierra’s monitoring state machine 122 and identifier module 104 further recognizes multi-touch gestures based on input from the first and second contact state machines.” Pet. 30-31 (quoting various passages from Ex. 1005 ¶¶ 33, 39-40). Patent Owner argues that with respect to Petitioner’s “alternative theory in which the limitation is met by ‘monitoring state machine and identifier module 104,” “identifier module 104 recognizes gestures, but not based on the output of any single-touch state machine and not based on the output of monitoring state machine 122.” Prelim. Resp. 17-18 (citing Pet. 30). Patent Owner contends that “Baltierra’s identifier module 104 cannot disclose the ’767 patent’s ‘multi-touch state-machine module’ because it does not use the output of any one-touch state machine to recognize gestures.” Id. at 19. Lastly, Patent Owner argues that Petitioner fails to explain “how the two separate modules 122 and 104 could be combined to form the ’767 patent’s multi-touch state-machine module, much less why a POSITA7 would be motivated to or have a reasonable expectation of success in doing so.” Id. at 21. We agree with Patent Owner that Petitioner fails to show how Baltierra’s monitoring state machine 122 and identifier module 104 together 7 A person of ordinary skill in the art. IPR2021-01254 Patent 8,526,767 B2 15 are operable to “recognize, based on at least the first and second outputs, at least one multi-touch gesture.” Although identifier module 104 does identify gestures, it does so without “receiv[ing] directly” from a first and second one-touch state-machine module (contact state machines 120) a first and second output as claimed. Ex. 1005 ¶¶ 16, 21-25, Fig. 1. We agree with Patent Owner that Baltierra’s identifier module 104 “operates directly on the output of the input controller 116.” Prelim. Resp. 18-19 (citing Ex. 1005, Fig. 1 (showing identifier module 104 receiving output from input controller 116, not contact state machines 120)). The Petition also does not explain how identifier module 104, which Petitioner relies on to meet the claimed “gesture-processing logic,” also meets the “multi-touch state-machine module” limitation. Pet. 23-24. Petitioner relies on contact state machines 120 and monitoring state machine 122 to meet the claimed “plurality of state-machine modules,” separate from the claimed “gesture processing logic,” which Petitioner indicates is met by identifier module 104. 
In other words, Petitioner does not identify identifier module 104 as one of the claimed “plurality of state-machine modules,” yet Petitioner inconsistently relies on identifier module 104 to later, in its Petition, be a part of the claimed “multi-touch state-machine module.” Id. at 23-24, 30-31. Such a showing without explanation is fatal to the Petition. Moreover, we agree with Patent Owner that Petitioner fails to explain “how the two separate modules 122 and 104 could be combined to form the ’767 patent’s multi-touch state-machine module, much less why a POSITA would be motivated to or have a reasonable expectation of success in doing so.” Prelim. Resp. 21. Specifically, Petitioner fails to explain at all how Baltierra’s monitoring state machine 122 together with identifier module 104 meet the claimed “multi-touch state-machine module.” Pet. 30-31. Indeed, IPR2021-01254 Patent 8,526,767 B2 16 Petitioner appears to rely on the two separate modules 122 and 104 to meet the claimed function of recognizing multi-touch gestures, while relying only on monitoring state machine 122 to meet the claimed “multi-touch state- machine module.” Id. at 30 (“[m]onitoring state machine 122 is a multi- touch state-machine module”). Independent claims 1 and 12-14, however, require that the multi-touch state-machine module be operable to recognize at least one multi-touch gesture. Moreover, we agree with Patent Owner that Petitioner fails to explain how the two separate modules 122 and 104 could be combined to form the claimed multi-touch state-machine module or why a person having ordinary skill in the art would have been motivated to do so. Prelim. Resp. 21. The Petition is silent in that regard and makes no showing regarding combining the two separate modules 122 and 104. Pet. 30-31. Claims 9-11 depend from claim 1. For claims 9-11, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to independent claims 1 and 12-14. Pet. 34-35. For all of the above reasons, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in showing that claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Baltierra. For the challenge based on Baltierra in combination with Katou, Petitioner contends that “[t]o the extent it is determined that Baltierra does not teach a multi-touch state-machine module operable to recognize at least one multi-touch gesture, a POSITA would have been motivated to combine Baltierra with Katou with a reasonable expectation of success in doing so for reasons similar to those discussed with respect to the contact state machines.” Pet. 31 (citing Ex. 1002 ¶ 105). Petitioner argues that since Baltierra’s monitoring state machine 122 “has access to the gesture IPR2021-01254 Patent 8,526,767 B2 17 information provided by the separate contact state machines [120], it can be programmed similar to the identification module [104] to provide the function of recognizing gestures (e.g., pinch) and mapping it to a corresponding function (e.g., Zoom Out).” Id. 
Petitioner further argues that a person having ordinary skill in the art would have been motivated to integrate “these functions as taught by Katou, which discloses that the module receiving the one-touch gestures is ‘composite gesture decoding means (18) capable of judging composite meaning generated by a combination of at least one or more decoded gesture information (17) and generating composite gesture information (18a).’” Id. (citing Ex. 1006 ¶¶ 11-12, 40-43, Fig. 1). Lastly, Petitioner argues that a “POSITA would have been further motivated to make this combination in order to structure the software with distinct state machine modules because of the ‘separation of concerns’ software engineering principle” and that the combination “requires simple integration of existing functions of the monitoring state machines and the identifier module.” Id. at 32 (citing Ex. 1002 ¶¶ 48-49, 105; Ex. 1036). Patent Owner argues that Petitioner’s suggestion “that Baltierra’s monitoring state machine 122 ‘can be programmed similar to the identification module to provide the function of recognizing gestures,’” is not explained sufficiently. Prelim. Resp. 21 (citing Pet. 31-32). That is so, Patent Owner argues, because “Baltierra already contains a module dedicated to performing the function of recognizing multi-touch gestures: identifier module 104” and Petitioner fails to explain “why a POSITA would be motivated to provide duplicative gesture recognition functionality in monitoring state machine 122” or “how the two multi-touch gesture recognition modules would work together in the proposed combination.” Id. IPR2021-01254 Patent 8,526,767 B2 18 We agree with Patent Owner that Petitioner has failed to explain sufficiently why a person having ordinary skill in the art would have been motivated to make the combination as proposed by Petitioner. The Petition does not make clear whether Petitioner proposes to integrate Baltierra’s identifier module 104 into monitoring state machine 122, or else to maintain Baltierra’s identifier module 104, but also to include the function of identifier module 104 in monitoring state machine 122. If the former, then Petitioner fails to explain how identifier module 104 can still be the claimed “gesture-processing logic” also recited in the claim. In other words, if the identifier module 104 is eliminated and integrated into monitoring state machine 122, then the Petition is inconsistent insofar as it maps the “gesture- processing logic” to identifier logic module 104 separate from the claimed “plurality of state-machine modules.” See Pet. 23-24, 31-32. If, on the other hand, Petitioner proposes to maintain Baltierra’s identifier logic module 104 in the proposed combination, but also to add the functionality of recognizing multi-touch gestures to monitoring state machine 122, then we agree with Patent Owner that Petitioner and Dr. Bederson fail to explain why a person having ordinary skill in the art would have wanted to include the function of recognizing gestures into the monitoring state machine when the identification module already provides that function. Pet. 31; Ex. 1002 ¶ 105. Moreover, to the extent that Petitioner proposes to integrate whatever is in Baltierra’s identifier module 104 into monitoring state machine 122, we determine that Petitioner fails to explain why a person having ordinary skill in the art would have done so. 
In particular, Petitioner argues that “[b]ecause Baltierra’s monitoring state machine has access to the gesture information provided by the separate contact state machines, it can be IPR2021-01254 Patent 8,526,767 B2 19 programmed similar to the identification module to provide the function of recognizing gestures (e.g., a pinch) and mapping it to a corresponding function (e.g., Zoom Out).” Pet. 31. Dr. Bederson testifies the same. Ex. 1002 ¶ 105. Petitioner then explains that Katou teaches “integrat[ing] these functions” and that it would have been obvious to make the combination “because of the ‘separation of concerns’ software engineering principle.” Pet. 31-32 (citing Ex. 1002 ¶¶ 48-49, 105; Ex. 1036; Ex. 1006 ¶¶ 11-12, 40-43, Fig. 1). Petitioner fails to explain how the quoted passage from Katou teaches “integrating these functions.” Id. at 31. And, even if Katou somehow teaches “integrating these functions,” such a teaching would be undermined by Petitioner’s suggestion that integration would have been desirable based on the “separation of concerns” principle. See id. at 32. According to Dr. Bederson, the “separation of concerns” principle “refers to the idea that by separating the components of a technical system and minimizing the coordination between them, each can be developed, tested, and updated separately from each other.” Ex. 1002 ¶ 48 (emphasis added). As Dr. Bederson further testifies, a common application of the “separation of concerns” principle is in the design of software systems, where designers “design software modules to operate independently of each with well- defined Application Programming Interfaces (APIs). This approach makes it possible to change the implementation of one module (without changing its API) without the other modules being aware of those changes.” Id. ¶ 49. Thus, the “separation of concerns” principle, which advocates separating software modules, is in direct conflict with Petitioner’s proposal to “integrate these functions” into one module. IPR2021-01254 Patent 8,526,767 B2 20 Claims 9-11 depend from claim 1 and incorporate all the limitations of claim 1 from which they depend. For claims 9-11, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to independent claim 1. Pet. 34-35. For all of the above reasons, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in showing that claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Baltierra and Katou. E. Asserted Obviousness of Claims 2-8 over Baltierra, Katou, and Warren Petitioner contends claims 2-8 are unpatentable under 35 U.S.C. § 103(a) as obvious over Baltierra, Katou, and Warren. Pet. 35-45. 1. Warren Warren describes a proximity sensor that uses a touch sensor device for producing user interface inputs. Ex. 1007 ¶ 1. The proximity sensor facilitates user interface navigations such as dragging and scrolling. Id. ¶ 11. In one embodiment, the proximity sensor distinguishes between the motions of different object combinations and indicates specific results responsive to the detected object combinations. Id. ¶ 42; Fig. 5. Figure 5 of Warren is illustrative and is reproduced below. IPR2021-01254 Patent 8,526,767 B2 21 Figure 5 shows a state diagram of a proximity sensor device process. Id. ¶ 16. 
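The four-state arrangement described in the following paragraph can be sketched in simplified form as follows. The transition table and the object-combination labels below are hypothetical placeholders chosen for illustration and are not drawn from Warren.

```python
from enum import Enum, auto


class ResultState(Enum):
    IDLE = auto()
    FIRST_RESULT = auto()
    SECOND_RESULT = auto()
    THIRD_RESULT = auto()


# Hypothetical transition table: the next state is determined by the detected
# object-combination motion; the labels are placeholders, not Warren's terms.
TRANSITIONS = {
    (ResultState.IDLE, "first_combination_motion"): ResultState.FIRST_RESULT,
    (ResultState.IDLE, "second_combination_motion"): ResultState.SECOND_RESULT,
    (ResultState.IDLE, "third_combination_motion"): ResultState.THIRD_RESULT,
    (ResultState.FIRST_RESULT, "objects_removed"): ResultState.IDLE,
    (ResultState.SECOND_RESULT, "objects_removed"): ResultState.IDLE,
    (ResultState.THIRD_RESULT, "objects_removed"): ResultState.IDLE,
}


def step(state: ResultState, detected_motion: str) -> ResultState:
    """Advance the state machine; unknown inputs leave the state unchanged."""
    return TRANSITIONS.get((state, detected_motion), state)


# Each non-idle state would drive a different result (e.g., a value adjustment
# such as scrolling), while the idle state provides an idle result.
state = ResultState.IDLE
state = step(state, "second_combination_motion")  # -> ResultState.SECOND_RESULT
state = step(state, "objects_removed")            # -> back to ResultState.IDLE
```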
State diagram 500 includes four states, IDLE state 501, FIRST RESULT state 502, SECOND RESULT state 503, and THIRD RESULT state 504, in which each state corresponds to user interface action performed in response to various motions. Id. ¶ 16. IDLE state 501 provides an idle result whereas FIRST RESULT state 502, SECOND RESULT state 503, and THIRD RESULT state 504 provide results responsive to object motions. Id. Object motions can be different such as pointing or dragging. Id. ¶ 43. Types of results are implemented as value adjustments, which relate to functional parameters. Id. ¶ 45. Transitions between states are determined by detected object combination motion. Id. ¶¶ 46-47. 2. Discussion Each of claims 2-8 depends either directly or indirectly from independent claim 1. For claims 2-8, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to claim 1. Pet. 35-45. Accordingly, for the IPR2021-01254 Patent 8,526,767 B2 22 same reasons discussed above, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in its challenge that claims 2-8 are unpatentable under 35 U.S.C. § 103(a) as obvious over Baltierra, Katou, and Warren. F. Asserted Obviousness of Claims 1 and 9-14 over Westerman alone or over Westerman and Katou Petitioner contends claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Westerman alone or over Westerman and Katou. Pet. 45-59. 1. Westerman Westerman describes a multipoint sensing device for determining a gesture set. Ex. 1008 ¶ 29. The multipoint sensing device includes a gesture module and a multipoint sensing area that receives input arrangement such as an arrangement of fingers or other parts of the hand. Id. A control operation includes a first input device and a second input device. Id. ¶ 33. The input device includes a touch sensing device for providing input events. Id. Input events for both the first input device and the second input device are simultaneously monitored. Id. Figure 22 of Westerman is illustrative and is reproduced below. IPR2021-01254 Patent 8,526,767 B2 23 Figure 22 shows control operation of a multipoint sensing device. Id. ¶ 60. Control operation 410 includes detecting a first touch on a touch surface (block 412) and simultaneously detecting a second touch on the touch surface (block 416). Id. ¶ 152. A first arrangement of contacts is recognized for a first hand (block 414) and a second arrangement of contacts IPR2021-01254 Patent 8,526,767 B2 24 is simultaneously recognized for a second hand (block 418). Id. Gesture sets for both the first and the second arrangement of contacts are simultaneously loaded (blocks 420, 422). Id. Gesture events for both the first touch and the second touch are simultaneously monitored. Id. Actions are simultaneously initiated when both first and second gesture events are performed. Id. Figure 23 of Westerman is also illustrative and is reproduced below. Figure 23 shows another control operation of a multipoint sensing device. Id. ¶ 61. Control operation 440 includes displaying a GUI object (block 442). Id. ¶ 153. Multiple pointers are then detected at different points of the displayed GUI object (block 444). Id. First pointers may be fingers of a first hand and second pointers may be fingers of a second hand. Id. Pointers are then locked to the displayed object (block 445) and the position of the pointers then monitored (block 446). Id. 
“[T]he displayed object can be modified when position of one or more pointers is moved relative to its locked position” (block 448). Id. IPR2021-01254 Patent 8,526,767 B2 25 2. Discussion Claim 1 recites: a multi-touch state-machine module operable to: receive, directly from the first one-touch state-machine module, the first output; receive, directly from the second one-touch state-machine module, the second output; and recognize, based on at least the first and second outputs, at least one multi-touch gesture, the first one-touch state- machine module, the second one-touch state-machine module, and the multi-touch state-machine module being distinct state-machine modules; and output the recognized multi-touch gesture. Ex. 1001, 21:15-26. Independent claims 12-14 include similar language. Id. at 22:32-43, 23:1-12, 24:11-22. Petitioner’s contentions for each independent claim with respect to the above claim language is the same. Pet. 53-57. For the challenge based on Westerman alone, Petitioner contends that “Westerman discloses a control operation in figure 23, comprising a multi- touch state-machine module (i.e., components 444, 445, 446, 448), that detects first and second pointers from first and second one-touch state- machine modules (as described in figure 22) at different points of a displayed object, locks the pointers to a displayed object by ‘pausing,’ and monitors the positions of the pointers relative to their locked positions.” Id. at 54 (citing Ex. 1008 ¶ 153, Figs. 22 and 23). Patent Owner argues that Westerman’s Figures 22 and 23 are directed to different embodiments. Prelim. Resp. 22-23. Patent Owner also argues that Petitioner has not provided a motivation to combine these embodiments IPR2021-01254 Patent 8,526,767 B2 26 and that combining the two embodiments does not “make any sense.” Id. at 23. We agree with Patent Owner that “Westerman lacks any disclosure connecting the separate embodiments of Figure 22 and Figure 23, much less the specific ‘direct’ connection required by the ’767 claims.” Prelim. Resp. 23. As Patent Owner points out, Westerman describes Figure 22 as “show[ing] illustrative control operation, in accordance with one embodiment of the present invention.” Ex. 1008 ¶ 152. Westerman then describes Figure 23 as “show[ing] illustrative control operation 440, in accordance with one embodiment of the present invention.” In describing Figure 23, Westerman does not link what is described in Figure 23 with what is described in Figure 22. Id. ¶ 153. Yet, Westerman paragraph 153 is what Petitioner relies on in support of its assertion that the multi-touch state- machine module (elements 444, 445, 446, and 448 of Figure 23) “detects first and second pointers from first and second one-touch state-machine modules (as described in figure 22).” Pet. 54 (citing Ex. 1008 ¶ 153). Petitioner fails to show a connection between what is described in Figure 22 and what is described in Figure 23 so as to meet the disputed limitation. 
Moreover, Petitioner has failed to show that a person having ordinary skill in the art would have understood that the control operations in Figures 22 and 23 are operable in a multi-touch state-machine module to receive (1) “directly from the first one-touch state-machine module, the first output,” and (2) “directly from the second one-touch state-machine module, the second output.” Even assuming that Westerman Figure 22 shows the control operation for a “first one-touch state-machine module” and a “second one- touch state-machine module” and Figure 23 shows the control operation for “a multi-touch state-machine module,” as Petitioner asserts, Petitioner fails IPR2021-01254 Patent 8,526,767 B2 27 to show that any “output” from the Figure 22 operation is received directly by the Figure 23 operation, as asserted. Pet. 54. Claims 9-11 depend from claim 1. For claims 9-11, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to independent claims 1 and 12-14. Pet. 57-59. For all of the above reasons, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in showing that claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Westerman. For the challenge based on Westerman in combination with Katou, Petitioner contends that “[t]o the extent it is determined that Westerman does not explicitly disclose ‘a multi-touch state-machine module’ that receives outputs from one-touch state machines, a POSITA would have been motivated to combine Westerman with Katou and would have had a reasonable expectation of success in doing so.” Id. at 54-55 (citing Ex. 1002 ¶ 153). Petitioner argues that “Westerman figure 22 discloses separate one-touch contact state machines and figure 23 discloses a multi- touch contact state machine.” Id. at 55. Petitioner then argues that “Katou discloses a combination of a one-touch state machine (gesture decoding means (16)) that outputs decoded gesture information (17) to a multi-touch state-machine module (composite gesture decoding means 18)) and recognizes at least one multi-touch gesture.” Id. (citing Ex. 1006 ¶¶ 40-43, Fig. 1). Petitioner argues that [a] POSITA would have been motivated to connect disclosures of figure 22 and figure 23 based at least on Westerman’s description of pointing to different aspects of a GUI with single- touch gestures, then locking the pointers, and performing a multi- touch gesture (such as sliding pointers together to zoom out), IPR2021-01254 Patent 8,526,767 B2 28 based on Katou’s teaching of a multi-touch state machine taking input from a single-touch state machine. Id. (citing Ex. 1008 ¶¶ 153-154). Patent Owner argues that “the purported single-touch and multi-touch gestures described in the Petition are both aspects of Figure 23, i.e., steps 444 and 448 respectively. Figure 22 has nothing to do with ‘pointing to different aspects of a GUI with single-touch gestures,’ or with GUIs at all.” Prelim. Resp. 24 (citing Ex. 1008 ¶¶ 152-154). Patent Owner further argues that “even if a POSITA would somehow take from Katou the notion of combining a multi-touch state machine with a single-touch state machine, the Petition only explains motivation to combine two aspects of Figure 23” and that the Petition fails to give reasons for why a person having ordinary skill in the art would have combined Figure 22 with Figure 23. Id. 
We determine that Petitioner fails to show sufficiently why a person of ordinary skill in the art would have combined Westerman’s Figures 22 and 23 as suggested. We agree with Patent Owner that Westerman’s Figure 23 describes pointing to different aspects of a GUI with single-touch gestures, then locking the pointers and performing a multi-touch gesture (such as sliding pointers together to zoom out). Id.; Ex. 1008 ¶¶ 153-154. We further agree with Patent Owner that to the extent Katou teaches combining a multi-touch state machine with a single-touch state machine, the Petition only explains motivation to combine two aspects of Westerman’s Figure 23. Pet. 55. Thus, Petitioner’s reasons for why a person having ordinary skill in the art would have combined Westerman’s Figures 22 and 23 fail.

Claims 9-11 depend from claim 1. For claims 9-11, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to independent claim 1. Pet. 57-59. For all of the above reasons, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in showing that claims 1 and 9-14 are unpatentable under 35 U.S.C. § 103(a) as obvious over Westerman and Katou.8

8 Because we find Petitioner has not shown a reasonable likelihood of prevailing on this challenge for the reasons discussed above, we do not reach Patent Owner’s remaining arguments.

G. Asserted Obviousness of Claims 2-8 over Westerman, Katou, and Warren

Petitioner contends claims 2-8 are unpatentable under 35 U.S.C. § 103(a) as obvious over Westerman, Katou, and Warren. Pet. 59-67. Each of claims 2-8 depends either directly or indirectly from independent claim 1. For claims 2-8, Petitioner does not present arguments or evidence that remedy the deficiencies in Petitioner’s contentions identified above with regard to claim 1. Id. 59-67. Accordingly, for the same reasons discussed above, we are not persuaded that Petitioner has established a reasonable likelihood that Petitioner would prevail in its challenge that claims 2-8 are unpatentable under 35 U.S.C. § 103(a) as obvious over Westerman, Katou, and Warren.

III. CONCLUSION

For the foregoing reasons, we determine that Petitioner has not shown a reasonable likelihood that it would prevail in showing that any of the challenged claims of the ’767 patent are unpatentable.

IV. ORDER

Accordingly, it is:

ORDERED that the Petition is denied as to all challenged claims, and no trial is instituted.

PETITIONER:
Ryan Yagura
Nicholas J. Whilt
Caitlin P. Hogan
O’MELVENY & MYERS LLP
ryagura@omm.com
nwhilt@omm.com
chogan@omm.com

PATENT OWNER:
Neil A. Rubin
Philip Wang
RUSS AUGUST & KABAT
nrubin@raklaw.com
pwang@raklaw.com