Trials@uspto.gov Paper 32
Tel: 571-272-7822 Entered: November 25, 2014

UNITED STATES PATENT AND TRADEMARK OFFICE
_______________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_______________

GOOGLE INC.,
Petitioner,

v.

GRANDEYE LTD.,
Patent Owner.
_______________

Case IPR2013-00546
Patent 8,077,176 B2
_______________

Before JAMESON LEE, DAVID C. McKONE, and PATRICK M. BOUCHER, Administrative Patent Judges.

McKONE, Administrative Patent Judge.

FINAL WRITTEN DECISION
35 U.S.C. § 318(a) and 37 C.F.R. § 42.73

I. INTRODUCTION

A. Background

Google Inc. (“Petitioner”) filed a Petition (Paper 1, “Pet.”) to institute an inter partes review of claims 1, 4, 12, 16, 17, and 19–21 of U.S. Patent No. 8,077,176 B2 (Ex. 1002, “the ’176 patent”). Grandeye Ltd. (“Patent Owner”) timely filed a Preliminary Response (Paper 13, “Prelim. Resp.”). Pursuant to 35 U.S.C. § 314, the Patent Trial and Appeal Board instituted trial on February 5, 2014, as to all of the challenged claims of the ’176 patent. Paper 14 (“Dec.”). During this trial, Patent Owner timely filed a Patent Owner Response (Paper 20, “PO Resp.”), and Petitioner timely filed a Reply to the Patent Owner Response (Paper 23, “Reply”). An oral hearing was held on September 5, 2014. Paper 31 (“Tr.”).

We have jurisdiction under 35 U.S.C. § 6(c). This decision is a final written decision under 35 U.S.C. § 318(a) as to the patentability of the challenged claims. Based on the record before us, Petitioner has demonstrated by a preponderance of the evidence that claims 1, 4, 12, 16, 17, and 19–21 are unpatentable.

B. Related Proceedings

Petitioner has filed petitions for Inter Partes Review of U.S. Patent Nos. 6,243,099 B1 (Ex. 1001, “the ’099 patent”) and 7,542,035 B2 (Ex. 1003, “the ’035 patent”), in IPR2013-00547 and IPR2013-00548, respectively. Paper 6, at 3. The ’176 patent is a continuation of the ’035 patent, which is a continuation of the ’099 patent. Ex. 1003, at [63]; Ex. 1002, at [63]. View 360 Solutions LLC (“View 360”), a purported licensee of the ’035, ’099, and ’176 patents, has sued Petitioner for infringement of the ’035, ’099, and ’176 patents in View 360 Solutions LLC v. Google, Inc., Case No. 1:12-cv-1352 (N.D.N.Y.). Pet. 1–2; Paper 6, at 3. Patent Owner has asserted the ’035 and ’099 patents against others in Grandeye Ltd. v. Sentry 360 Security, Inc., Case No. 1:11-cv-02188 (N.D. Ill.).

C. Grounds of Unpatentability

Petitioner relies upon the following prior art reference: Wen-kae Tsao et al., Photo VR: A System of Rendering High Quality Images for Virtual Environments Using Sphere-like Polyhedral Environment Maps, THE SECOND WORKSHOP ON REAL-TIME AND MEDIA SYSTEMS (RAMS’96) 397–403 (July 30–31, 1996) (Ex. 1007, “Photo VR”). We instituted this proceeding based on the asserted ground that claims 1, 4, 12, 16, 17, and 19–21 are anticipated under 35 U.S.C. §§ 102(a) and 102(b) by Photo VR. Dec. 22.

D. The ’176 patent

The ’176 patent “relates generally to a method and corresponding apparatus for viewing images.” Ex. 1002, col. 1, ll. 51–52. For instance, a virtual pictosphere may be created using a conventional three-dimensional graphics system that results from “texture mapping” the visible world onto a sphere. Id. at col. 6, ll. 21–24.
Different viewpoints enable different types of perspective views when rendered with the primitives of a conventional IPR2013-00546 Patent 8,077,176 B2 4 three-dimensional graphics system. For example, a linear perspective view is achieved with a viewpoint at the center of the sphere, while a circular perspective view is achieved with a viewpoint on the surface of the sphere with a view direction towards the center. Id. at col. 6, ll. 24–33. Figures 5 and 6 of the ’176 patent, reproduced below, are illustrative: Figures 5 and 6 show projections of a portion of the visible world onto a plane with a linear perspective view and a circular perspective view, respectively. Id. at col. 5, ll. 64–65. In an illustrative example, the ’176 patent describes the mapping of two fisheye images to adjoining hemispheres to generate spherical image data. Id. at col. 8, l. 19–col. 9, l. 18. A user interactively may move the viewpoint to different positions that include the center of the sphere and to points very near the inside of the sphere, thereby achieving the different perspective views. Id. at col. 9, ll. 11–16. The surface of the sphere also may be rotated to simulate looking around within the sphere. Id. at col. 9, ll. 16–18. Although this illustration is provided with two adjoining hemispheres, the ’176 patent more generally contemplates mapping with IPR2013-00546 Patent 8,077,176 B2 5 respect to polyhedral approximations of spheres described in the ’176 patent as “p-spheres.” Id. at col. 7, ll. 11–23. Claim 1, reproduced below, is illustrative of the claimed subject matter: 1. A method for modeling the visible world, comprising: texture mapping full-surround image data onto a p-surface to generate a model of the visible world substantially equivalent to projecting the image data onto the p-surface from a point of projection; allowing a user to select a direction of view from a view point on the model; and allowing a portion of the model mapped on p-surface based on the view point to be displayed; wherein the p-surface comprises polygons approximating at least a portion of a sphere. II. ANALYSIS A. Claim Construction The Board interprets claims of an unexpired patent using the broadest reasonable construction in light of the specification of the patent in which they appear. See 37 C.F.R. § 42.100(b); Office Patent Trial Practice Guide, 77 Fed. Reg. 48,756, 48,766 (Aug. 14, 2012). Claim terms generally are given their ordinary and customary meaning, as would be understood by one of ordinary skill in the art in the context of the entire disclosure. See In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007). IPR2013-00546 Patent 8,077,176 B2 6 1. Claim Terms Previously Construed In the Petition, Petitioner proposed constructions for the terms “full- surround [image] data,” “p-surface,” and “texture mapping,” as recited in independent claims 1 and 12. Pet. 12–14. In the Decision to Institute (Dec. 6–10), we construed these claim terms and others, as reproduced in the table below: Claim Phrase Claim Construction in the Decision to Institute “full-surround image data” 1 (claims 1, 12) data which samples the points P [defined as “[t]he visible world” (Ex. 1002, col. 6, l. 58)]. This data encodes, explicitly or implicitly, the association of a color value with a given direction from a given point of projection. 
“p-surface” (claims 1, 12) a computer graphics representation of any surface with a well-defined inside and outside, where there exists at least one point x inside (neither intersecting, nor lying outside) the surface which may be connected to every point of the surface with a distinct line segment, no portion of which said line segment lies outside the surface or intersects the surface at a point not an endpoint “texture mapping” (claims 1, 12) applying image data to a surface “texture-mapped p-surface” (claim 12) p-surface onto which image data have been applied “projecting the [full- surround] image data onto the p-surface” (claims 1, 12) generating a new image by moving image pixels along rays from the view point to the p- surface 1 At the hearing, counsel for Patent Owner confirmed that the claims use “full-surround image data,” “full-surround data,” and “image data” synonymously. Tr. 81:16–82:2. IPR2013-00546 Patent 8,077,176 B2 7 During trial, Patent Owner accepted our constructions of “p-surface” and “texture mapping,” disputed our constructions of “full-surround image data” and “projecting the [full-surround] image data on to the p-surface,” and proposed constructions for “view point” and “point of projection.” PO Resp. 12–17. In the Reply, Petitioner accepts our construction of “full- surround [image] data” and “projecting the [full-surround] image data on to the p-surface” (Reply 4–5, 10–11), opposes Patent Owner’s constructions of “view point” and “point of projection” (id. at 6–7 n.5), and does not contest our constructions of “p-surface” and “texture mapping.” 2. “full-surround image data” Independent claims 1 and 12 each recite “texture mapping full- surround image data onto a p-surface” (emphasis added). The specification includes the following description: (4) FULL-SURROUND IMAGE DATA: data which samples the points P. This data encodes, explicitly or implicitly, the association of a color value with a given direction from a given point of projection. It should be mentioned at this point that full-surround image data is useful in many fields of entertainment because, when delivered to many viewers, it enables the construction of an independent viewing system defined below. Ex. 1002, col. 7, ll. 3–10 (emphasis added). In the Decision to Institute, we found the italicized language in the description above to be an express definition of “full-surround image data” and concluded that the term should be construed in accordance with this definition. Dec. 6–7. Patent Owner proposes construing “full-surround image data” to mean: IPR2013-00546 Patent 8,077,176 B2 8 data sampling points P of the visible world encoding, explicitly or implicitly, the association of a color value C for each sampled point in P observed in a given direction [ray V] from a given point of projection (VP), the sampled points sufficient to provide in a standard [3D] computer graphics system a display in which changing the direction of view provides a human viewer the impression of being present at the point of projection from which the color was observed for the sampled point in P. PO Resp. 14 (emphasis and brackets in original; internal footnote omitted). Patent Owner contends that the unitalicized portion of its proposed construction is the same as our construction except that it adds letters and terms from “predicate conditions” described in the specification and from Figures 2, 3, 4A, and 4B of the ’176 patent. Id. 
Specifically, Patent Owner argues that full-surround image data is the encoding of a unique association of one point (P) and color (C) with a given ray (V) from a given viewpoint (VP). Id. According to Patent Owner (id. at 10–11), the claims must be read in light of what it calls three “embodiment-independent predicate conditions” disclosed in the specification: The method and corresponding apparatus according to the present invention are predicated on the following starting, i.e., given, conditions: (1) the set of all rays V from a given point VP, as illustrated in FIG. 1; (2) a set of points P not including VP, each point in P being contained by one and only one ray in V, as illustrated in FIG. 2; and (3) the set of color values C, each color in C being associated with one and only one ray in V, and also thereby associated with the point in P contained by said ray. IPR2013-00546 Patent 8,077,176 B2 9 Ex. 1002, col. 6, ll. 46–56. Patent Owner argues that these conditions apply to “the present invention” rather than to the recited embodiments, that they lead up to the definitions set forth in the patent and, thus, that they are incorporated into the definitions. PO Resp. 11. Petitioner argues that these additions are inconsistent with the express definition of “full-surround image data.” Reply 5. We agree with Patent Owner that the conditions reproduced above define the set of points P as each being associated with a single ray from a given viewpoint VP and that each color in the set C similarly is associated with a single ray. 2 Also, we are not persuaded by Petitioner’s argument that these conditions are inconsistent with the express definition of “full- surround image data.” Accordingly, we modify our construction to reflect that full-surround image data encode the associations of unique color values with given directions. As stated above, Patent Owner further proposes that full-surround image data include “the sampled points sufficient to provide in a standard [3D] computer graphics system a display in which changing the direction of view provides a human viewer the impression of being present at the point of projection from which the color was observed for the sampled point in P.” PO Resp. 14. To that end, Patent Owner argues (PO Resp. 14–15) that the specification provides the following additional definitions: (10) INDEPENDENT VIEWING SYSTEM: an interactive viewing system in which multiple viewers can freely, independently of one another, and independently of the 2 We note that C is not a color; it is a set of colors. Similarly, V is not a ray; it is a set of rays. IPR2013-00546 Patent 8,077,176 B2 10 source of the image data, pan that image data in all directions with the effect that each viewer feels like they are “inside” of that imagery, or present at the location from which the imagery was produced, recorded, or transmitted; and (11) STANDARD COMPUTER GRAPHICS SYSTEM: a computer graphics system which supports linear perspective viewing, including the changing of the focal length or the altering of the view angle, the apparent rotation of viewed objects, and/or the apparent changing of direction of vision, and the texture mapping of image data onto objects within the class of p-surface. Ex. 1002, col. 7, l. 65–col. 8, l. 12. According to Patent Owner, these are additional links in a “definitional chain” that we should incorporate into our construction of full-surround image data. PO Resp. 14–15. 
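For illustration, a minimal sketch, assuming a simple in-memory representation that is not drawn from the ’176 patent or the record, of data satisfying the predicate conditions quoted above: each sampled point of the visible world is reached by one and only one ray from the single point VP, and each such ray carries exactly one color value. The class and field names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Direction = Tuple[float, float, float]  # a ray in the set V, as a unit vector from VP
Color = Tuple[int, int, int]            # a color value in the set C (RGB)

@dataclass
class FullSurroundImageData:
    """Hypothetical container: samples of the points P, each encoded as the
    association of exactly one color value with one ray from the single
    point of projection VP."""
    point_of_projection: Tuple[float, float, float]  # the point VP
    samples: Dict[Direction, Color]                  # one color per ray in V

    def color_along(self, direction: Direction) -> Color:
        """Return the unique color associated with the given ray from VP."""
        return self.samples[direction]

# Example: two rays from a point of projection at the origin, each with one color.
data = FullSurroundImageData(
    point_of_projection=(0.0, 0.0, 0.0),
    samples={(1.0, 0.0, 0.0): (200, 16, 16),   # color observed along +x
             (0.0, 1.0, 0.0): (16, 200, 16)},  # color observed along +y
)
print(data.color_along((1.0, 0.0, 0.0)))  # -> (200, 16, 16)
```

Because each direction appears as a key at most once, such a structure necessarily associates a single color value with a given direction from the given point of projection.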
As noted above, the description of full-surround image data includes that “[i]t should be mentioned at this point that full-surround image data is useful in many fields of entertainment because, when delivered to many viewers, it enables the construction of an independent viewing system defined below.” Ex. 1002, col. 7, ll. 6–10. Patent Owner argues that this language invokes the definition of “independent viewing system” (definition 10). PO Resp. 15 n.3. Patent Owner further argues that an independent viewing system is implemented on a standard computer graphic system, invoking the definition of that term (definition 11). Id. At the hearing, Patent Owner conceded that a standard computer graphics system, the subject matter of definition 11 (Ex. 1002, col. 8, ll. 6–12), is not a requirement of full-surround image data. Tr. 70:23–71:23. Rather, “this was a construction that [Patent Owner] did in a different case.” Id. at 71:17– IPR2013-00546 Patent 8,077,176 B2 11 18. Accordingly, we decline to read the requirement of a standard computer graphics system into “full-surround image data.” Regarding an independent viewing system, the subject matter of definition 10 (Ex. 1002, col. 7, l. 65–col. 8, l. 5), Patent Owner also has conceded that this is not a requirement of “full-surround image data.” Tr. 71:21–73:20. Moreover, we conclude that the patent’s description of full-surround image data enabling the construction of an independent viewing system (Ex. 1002, col. 7, ll. 6–10) is an example of a result that can be achieved using full-surround image data, not an additional restriction on the scope of the term. See In re Am. Acad. of Sci. Tech. Ctr., 367 F.3d 1359, 1369 (Fed. Cir. 2004) (“We have cautioned against reading limitations into a claim from the preferred embodiment described in the specification, even if it is the only embodiment described, absent clear disclaimer in the specification.”) (citations omitted). Accordingly, we are not persuaded by Patent Owner’s argument. In sum, “full-surround image data” is “data which samples the points P. This data encodes, explicitly or implicitly, the association of a single color value with a given direction from a given point of projection.” 3. “projecting the [full-surround] image data onto the p- surface” We construed “projecting the [full-surround] image data onto the p- surface” to mean “generating a new image by moving image pixels along rays from the view point to the p-surface.” Dec. 9–10. Patent Owner argues that the image data recited in the claim term are not just any data, they are full-surround image data. PO Resp. 16. To that end, Patent Owner proposes IPR2013-00546 Patent 8,077,176 B2 12 construing “projecting the [full-surround] image data onto the p-surface” to mean “moving to the p-surface colors C along corresponding rays V from viewpoint VP according to and maintaining the unique association of C, V and VP encoded in the full-surround image data.” Id. at 17. Here, Patent Owner asks us to read into this construction many of the same requirements Patent Owner proposes that we read into the construction of “full-surround image data.” In support, Patent Owner again argues the correspondence of points P, colors C, and rays V with a viewpoint VP. Id. at 16. As explained above, with respect to “full-surround image data,” we accepted some of Patent Owner’s proposals and rejected others. 
We are not persuaded that the requirements of full-surround image data should be repeated expressly in the construction of “projecting the full-surround image data onto the p-surface.” We do, however, clarify our construction to reflect that the image pixels are full-surround image data. Thus, “projecting the [full-surround] image data” means “generating a new image by moving image pixels of the full-surround image data along rays from the view point to the p-surface.” 4. “view point” / “point of projection” “View point” and “point of projection” are defined together: (3) MAGIC POINT, VIEWPOINT, OR POINT OF PROJECTION: Point VP. Please note, no matter how points P are projected, their appearance will remain the same when viewed from point VP. This latter concept may best be understood by referring to FIGS. 4A and 4B. Ex. 1002, col. 6, l. 65–col. 7, l. 2. Patent Owner proposes construing “view point” and “point of projection” to mean “A ‘Point VP’ such that ‘no matter IPR2013-00546 Patent 8,077,176 B2 13 how points P are projected, their appearance will remain the same when viewed from point VP.’” PO Resp. 12. In other words, Patent Owner contends that the entire statement quoted above constitutes a definition of “view point” and “point of projection.” Petitioner contends that no construction is necessary, but if we do construe these terms, they should mean “Point VP.” Reply 6–7 n.5. 3 Petitioner does not explain why the remainder of the language of definition (3), defining magic point, viewpoint, or point of projection, should be omitted from our construction. We understand the language “[p]lease note, no matter how points P are projected, their appearance will remain the same when viewed from point VP” (Ex. 1002, col. 6, ll. 66 – col. 7, l. 1) as expressing a necessary consequence of the interrelationship of the ’176 patent’s embodiment- independent predicate conditions and definitions of “POINTS P,” “A PROJECTION OF P,” and “MAGIC POINT, VIEWPOINT, OR POINT OF PROJECTION.” See Ex. 1002, col. 6, l. 44 – col. 7, l. 2. As such, that language properly is considered definitional. Accordingly, we adopt Patent Owner’s proposed construction of “view point” and “point of projection,” namely “a ‘Point VP’ such that ‘no matter how points P are projected, their appearance will remain the same when viewed from point VP.’” 3 Petitioner argues that Patent Owner’s proposed construction is “belated.” Reply 6–7 n.5. We disagree. Patent Owner is required to set forth its arguments addressing Petitioner’s grounds for unpatentability in its Patent Owner response, which it did. See 37 C.F.R. § 42.120(a). Patent Owner’s Preliminary Response was not mandatory, see 37 C.F.R. § 42.107, and, thus, Patent Owner was not required to set forth its claim construction positions in that filing first. IPR2013-00546 Patent 8,077,176 B2 14 B. Petitioner’s Motion to Exclude Patent Owner supports its Response with an Expert Declaration of James H. Oliver, Ph.D. (Ex. 2011, “Oliver Decl.”). Petitioner moves to exclude ¶¶ 33, 41–44, 50, 55–57, 67–72, Figures 5 and 6, and Appendices B–D of the Oliver Declaration. 4 Petitioner Google Inc.’s Motion to Exclude Evidence Pursuant to 37 C.F.R. § 42.64(c) (Paper 25, “Mot. to Exclude”) 3. Petitioner contends that these portions of the Oliver Declaration rely on the consideration of two software applications that are inadmissible because they lack authentication and relevance. Id. at 1–2. First, Petitioner contends that Dr. 
Oliver’s testimony in ¶¶ 41–44, 55, 57, Figure 5, and Appendix C of Exhibit 2011 relied upon software purportedly corresponding to the system described in Photo VR. Mot. to Exclude 3–4, 6–7. Petitioner contends that the date stamp of the software is in 1999, after the 1996 date of Photo VR and argues that Dr. Oliver did not confirm adequately that this software is the same software discussed in Photo VR. Id. at 6–7. Accordingly, Petitioner argues, this software and the portions of Dr. Oliver’s declaration that rely upon it are irrelevant. Id. at 6. Second, Petitioner contends that Dr. Oliver’s testimony in ¶¶ 33, 50, 56, 67–72, Figure 6, and Appendices B and D of Exhibit 2011 relied upon software source code purportedly corresponding to the source code deposited with the Patent Office along with the application for the ’176 patent. Mot. to Exclude 4, 7–9. Petitioner contends that this source 4 Petitioner also moves to exclude portions of a Supplemental Oliver Declaration (Ex. 1032) “to the extent Patent Owner files and/or seeks to enter Patent Owner’s Response to Google’s Objections.” Paper 25 at 3, n.1. Patent Owner did not file the identified paper. IPR2013-00546 Patent 8,077,176 B2 15 code actually was modified by a third party after the filing date of the ’176 patent. Id. at 7–9. Accordingly, Petitioner argues, this software and the portions of Dr. Oliver’s declaration that rely upon it are irrelevant. Id. at 9. Petitioner further argues that any relevance of these two pieces of software is outweighed by its potential to confuse and mislead. Id. Petitioner also contends that this software and the testimony that relies upon it lack authentication and are hearsay. Id. at 10–12. Patent Owner responds that it does not seek to admit the Photo VR or ’176 patent software itself. Patent Owner’s Opposition to Petitioner’s Motion to Exclude Evidence (Paper 28, “PO Opp. to Mot. to Exclude”) 2. Patent Owner argues that Dr. Oliver considered this software as part of his review of the technology at issue in the case. Id. Patent Owner points out (id. at 6) that the bases for an expert’s opinion need not be admissible “[i]f experts in the particular field would reasonably rely on those kinds of facts or data in forming an opinion on the subject.” FED. R. EVID. 703. As to the ’176 patent software, Patent Owner argues that Dr. Oliver determined that it was functionally identical to that disclosed in the patent application (and consistent with the patent’s disclosure) except that it was modified slightly to work with a current operating system. PO Opp. to Mot. to Exclude 2–3, 6–8. As to the Photo VR software, Patent Owner argues that it corroborated the understanding Dr. Oliver gained from the Photo VR reference itself. Id. at 3. According to Patent Owner, Dr. Oliver had the knowledge and experience necessary to authenticate the software by comparing their functions with what they purport to be. Id. at 6 (citing FED. R. EVID. 901(b)(3) (listing “[a] comparison with an authenticated specimen IPR2013-00546 Patent 8,077,176 B2 16 by an expert witness” as an example of evidence that satisfies the authentication requirement)). We are not persuaded that Dr. Oliver’s testimony should be excluded. Rather, an expert reasonably could rely upon later versions of software to gain a general understanding of the technology at issue in documents that describe earlier versions of such software. Petitioner has not persuaded us that Patent Owner’s purported failure to establish an identity between the software Dr. 
Oliver actually reviewed and the software referenced in Photo VR and the ’176 patent makes such reliance unreasonable in this case. Rather, Petitioner’s arguments go toward the weight we should give Dr. Oliver’s testimony. Cf. Smith v. Ford Motor Co., 215 F.3d 713, 718 (7th Cir. 2000) (“The soundness of the factual underpinnings of the expert’s analysis and the correctness of the expert’s conclusions based on that analysis are factual matters to be determined by the trier of fact.”). C. Anticipation by Photo VR 1. Photo VR Photo VR is prior art to each of the challenged claims under 35 U.S.C. § 102(b) because its publication date of July 30–31, 1996, precedes the earliest effective filing date of the ’176 patent by more than one year. Photo VR is also prior art to those claims under 35 U.S.C. § 102(a) because Patent Owner does not allege an invention date earlier than the publication date of Photo VR. Petitioner contends that Photo VR discloses all limitations of claims 1, 4, 12, 16, 17, and 19–21. Pet. 15–16, 30–58. Photo VR is directed to panoramic view rendering “by generating a sphere-like polyhedral environment map from photo-realistic images and IPR2013-00546 Patent 8,077,176 B2 17 using the generated maps to render the scene by techniques of computer graphics.” Ex. 1007, p. 397, col. 1. 5 Photo VR illustrates its method in the context of rendering a scene in a room by positioning a camera in a “proper position, such as the center of the room,” from which images of the entire view are taken and arranged as “a sphere-like polyhedron consisting of textured trapezoids.” Id. at p. 397, col. 2. Because the center of projection of the pictures moves slightly as the camera is panned, Photo VR assumes that the objects in the scene are far enough away from the camera such that the effects of any movement are negligible. Id. at p. 398, col. 1. “Thus all optic axes of the images can be regarded as intersected at the [center of projection].” Id. The photographic images are registered to polygons of a polyhedron. Id. at p. 398, col. 2. For example, images can be registered to trapezoids or triangles of a sphere-like polyhedron. Id. at p. 399, col. 1. After the images are registered, a texture mapped polyhedron is generated. Id. at p. 398, col. 2. The figure from page 398 of Photo VR, reproduced below, illustrates an example: 5 Citations to Exhibit 1007 are in the form of page number of the article and column number of the page (p. x, col. y). IPR2013-00546 Patent 8,077,176 B2 18 The figure from page 398 illustrates the generation of a “texture mapped sphere-like polyhedron” by ray-casting of original images onto polygons arranged in the space by their registrations. Id. at p. 398, col. 2. Photo VR also describes an “object viewer,” as illustrated in a drawing from page 400, reproduced below: The drawing from page 400 illustrates that the object viewer allows a user to “interactively observe an object from different views in real time.” Id. at p. 400, col. 1. That is, images “A,” “B,” and “C” in the drawing show images taken of the same “object” from different views. Id. IPR2013-00546 Patent 8,077,176 B2 19 2. Anticipation of Claims 1, 4, 12, 16, 17, and 19–21 We have reviewed the evidence presented by Petitioner, including the claim charts in the Petition (Pet. 30–58) and the Declaration of John R. Grindon, D.Sc. (Ex. 1005) and, notwithstanding Patent Owner’s arguments, we are persuaded that Photo VR anticipates claims 1, 4, 12, 16, 17, and 19– 21. For example, Petitioner has shown (Pet. 
15, 31–32) that Photo VR discloses modeling of the visible world using “full-surround image data” in that it acquires images of “the whole view from the camera position.” Ex. 1007, p. 397, col. 2. Likewise, Petitioner has shown that the sphere-like polyhedron disclosed by Photo VR is a “p-surface” (in fact, a “p-sphere”) because it provides a computer-graphics representation of a surface having a well-defined inside and outside, with at least one point inside the surface capable of connection to every point of the surface with a distinct line segment, wherein no portion of the line segment lies outside the surface or intersects the surface at a point that is not an endpoint. Pet. 16, 31–32. Petitioner also has shown (Pet. 16, 32) that Photo VR discloses “texture mapping” of image data onto the p-surface because the mapping it describes applies image data generated by taking images from the camera positioned, e.g., at the center of a room (i.e., from a viewpoint) onto polygons defined on the p-surface. Ex. 1007, p. 397, col. 2. We address Patent Owner’s arguments below. a. Photo VR Discloses Full-Surround Image Data Patent Owner contends that a single image, such as one of the images acquired by the camera disclosed in Photo VR, is not full-surround image data. PO Resp. 19–20. Petitioner does not contend, however, that a single IPR2013-00546 Patent 8,077,176 B2 20 image constitutes full-surround image data, nor did we make a preliminary finding to that effect in our Decision to Institute. See Pet. 16, 32; Dec. 6–7; see also Tr. 87:4–8. Rather, as explained in Section II.A.2, above, full- surround image data is data which samples the points P. Thus, Patent Owner’s argument is not persuasive. Patent Owner further contends that full-surround image data must have a single common view point from which the points P of the visible world are observed. PO Resp. 17–18. By contrast, Patent Owner argues, Photo VR discloses acquiring images from a camera with a moving center of projection. Id. at 18 (citing Ex. 1007, p. 398, col. 1). Patent Owner points out that Photo VR states that “it is almost impossible for some images to be registered without objects on the border being duplicated or lost in adjacent images because the camera was moved during panning.” PO Resp. 18 (quoting Ex. 1007, p. 400, col. 2). Petitioner responds that Photo VR discloses a single common viewpoint from which the points P are observed. Reply 6–8. As Petitioner points out, Photo VR discloses that the camera is positioned in the same place (e.g., the center of the room) for each of the photographs in a set and that, for the acquired images, “all optic axes of the images can be regarded as intersected at the [center of projection].” Id. at 6–7 (citing Ex. 1007, p. 398, col. 1). According to Petitioner, the purported deficiency to which Patent Owner cites simply is an acknowledgment in Photo VR that there are practical challenges to capturing multiple images from a common view point. Reply 8. Patent Owner’s declarant, Dr. Oliver, admits that the objective of Photo VR is to maintain the same center of projection for each of the images. Ex. 2013, 113:4–21. Once the images are captured, IPR2013-00546 Patent 8,077,176 B2 21 Petitioner argues, the generated texture map in Photo VR has a common view point. Reply 8. We agree with Petitioner. Photo VR regards each of the acquired images as having a common center of projection. Ex. 1007, p. 398, col. 1. 
As a result, Photo VR explains, when the images acquired with slightly different centers of projection are registered to the same center of projection, some objects on the borders of the images may be duplicated or lost. Id. at p. 400, col. 2. Thus, although the quality of some of the images might be degraded, all of the acquired data is registered to the same center of projection. This is not meaningfully different from the embodiment described in the specification of the ’176 patent, for example, in which a texture map is “built from two pictures, respectively, taken with a fisheye lens,” together comprising a “pictosphere.” Ex. 1002, col. 8, ll. 59–61. Because the camera must move to take the second picture, there will be a difference, however subtle, between the centers of projection of the acquired images. Patent Owner directs us to U.S. Patent No. 5,903,782 (Ex. 1019, “the ’782 patent”), incorporated by reference into the ’176 patent (see Ex. 1002, col. 9, ll. 63–67), as disclosing a technique for taking multiple photographs from exactly the same view point. Tr. 56:22–57:9. In the embodiment disclosed in the ’782 patent, a first picture is taken, using a fisheye lens, after which the camera is rotated 180 degrees and a second picture is taken. Ex. 1019, col. 3, ll. 31–35. Because the camera moves, the view points of the two images cannot be in exactly the same place. As Patent Owner admits, there will be some “tolerance.” Tr. 57:7–23. Moreover, Patent Owner concedes that full-surround image data is not limited to data acquired using fisheye lenses. Tr. 58:15–22.

Patent Owner also contends that Photo VR’s registered image data are not full-surround image data because the individual images, when mapped to polygons, may overlap one another. PO Resp. 18–19. According to Photo VR, a texture mapped sphere-like polyhedron is generated by ray casting of the original images onto the polygons of the polyhedron according to the registrations of the images, as shown in the picture from page 398 of Photo VR (reproduced above). Ex. 1007, p. 398, col. 2. Prior to Photo VR, where images overlapped, typical ray casting involved using the color from the first image hit by the ray. Id. at p. 399, col. 1. In Photo VR, in contrast, the colors for all images hit by the ray are averaged, using a weighted average to determine the color for that ray. Id. Patent Owner argues that this is not full-surround image data because there is no unique correspondence of color, ray, and viewpoint. PO Resp. 19.

In response, Petitioner argues that an averaged color value is a single unique color value corresponding to a ray and viewpoint. Reply 9. Petitioner points to the testimony of Patent Owner’s declarant, Dr. Oliver, that the averaged color value is a single result approximating two colors. Id. (citing Ex. 2013, 122:17–123:6). We are persuaded by Petitioner that the resultant color disclosed by Photo VR, which is a weighted average of two overlapping colors, is a single color value corresponding to a ray from a viewpoint (here, the center of projection). Ex. 1007, p. 399, col. 1. The data that ultimately is texture mapped to the polyhedron contains averaged values in places where the originally acquired data would have overlapped.6
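For illustration, a minimal sketch of the kind of weighted color averaging described above, in which the colors contributed by every overlapping image hit by a ray are reduced to a single color value for that ray; the particular weights and the function name are hypothetical and are not taken from Photo VR.

```python
from typing import List, Tuple

Color = Tuple[float, float, float]  # RGB components

def weighted_average_color(samples: List[Tuple[Color, float]]) -> Color:
    """Combine the (color, weight) contributions of all images hit by a ray
    into one color value, normalizing by the total weight."""
    total_weight = sum(weight for _, weight in samples)
    r, g, b = (
        sum(color[i] * weight for color, weight in samples) / total_weight
        for i in range(3)
    )
    return (r, g, b)

# Example: a ray hits two overlapping images; the two colors blend into one value.
ray_color = weighted_average_color([((255.0, 0.0, 0.0), 0.75),
                                    ((0.0, 0.0, 255.0), 0.25)])
print(ray_color)  # (191.25, 0.0, 63.75): a single color value for this ray
```

However the weights are chosen, the result is one color value associated with each ray from the center of projection.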
Patent Owner concedes that, except for the fact that the original images were not acquired from the same viewpoint (an argument we reject above), the data actually mapped onto Photo VR’s polyhedron would be considered full-surround image data. Tr. 67:20–68:13. In sum, Petitioner has shown that Photo VR discloses that the data texture mapped to the polyhedron encodes, explicitly or implicitly, the associations of unique color values with given directions from a given point of projection. That is, Petitioner has shown that the data are full-surround image data.

6 In response to our questioning at oral hearing, Patent Owner argued that the color-averaging process itself is part of Photo VR’s texture-mapping procedure and that this texture mapping is not performed on full-surround image data because full-surround image data only result after the color averaging. Tr. 68:14–69:2. This argument was not made previously by Patent Owner. In any case, we are not persuaded by this argument. Photo VR discloses texture mapping as performed with standard “rendering packages, libraries or graphic hardware accelerators.” Ex. 1007, p. 397, col. 2. Such standard rendering procedures are distinct from the color-averaging process in Photo VR’s overall procedure. Thus, even if the color-averaging process of Photo VR is a texture mapping, Photo VR discloses a distinct texture mapping that is performed on full-surround image data. Tr. 88:9–89:17.

b. Photo VR Discloses “projecting the [full-surround] image data onto the p-surface from a point of projection”

Patent Owner contends that Photo VR does not disclose texture mapping full-surround image data to generate a texture map “substantially equivalent to projecting the [full-surround] image data onto the p-surface from a point of projection,” as recited in claims 1 and 12. PO Resp. 20–21. Specifically, Patent Owner argues that the data projected in Photo VR are not full-surround image data. For the reasons given in Section II.C.2.a, we disagree with this argument.

Patent Owner also argues that Photo VR describes specular lighting as important but that Photo VR does not handle specular lighting. PO Resp. 21, n.6. Patent Owner relates this observation to an argument that Photo VR’s method of capturing and projecting linear images introduces distortion, contending that the claims of the ’176 patent require “distortion-free results.” Id. The claims do not recite “distortion-free results.” Moreover, as we explained in Section II.A.2, the term “full-surround image data” does not require “the sampled points sufficient to provide in a standard 3D computer graphics system a display in which changing the direction of view provides a human viewer the impression of being present at the point of projection from which the color was observed for the sampled point in P,” as argued by Patent Owner (PO Resp. 14). Thus, we are not persuaded that the claims require distortion-free results. Accordingly, Petitioner has shown that Photo VR discloses “projecting the [full-surround] image data onto the p-surface from a point of projection.”

c.
Photo VR Discloses Allowing a User to Select a Direction of View from a View Point on the Model Claim 1 recites “allowing a user to select a direction of view from a view point on the model.” Similarly, claim 12 recites “allowing a direction of view from a view point to be selected.” Patent Owner argues that these limitations are not disclosed in Photo VR because the “Photo VR ‘Object IPR2013-00546 Patent 8,077,176 B2 25 Viewer’ applies to [an] object, not [a] p-surface.” PO Resp. 22–23. Patent Owner makes this same argument for dependent claims 19 and 20. Patent Owner also argues that the Object Viewer described in Photo VR is “inapposite” because it “does not appear to have any connection with a texture-mapped p-surface and should be disregarded.” Id. at 21. It is unclear what Patent Owner contends is lacking in Photo VR when compared with the claims. Nevertheless, as Petitioner points out, Photo VR describes several instances of a user selecting a direction of view from a view point on a model. Pet. 15–16, 34, 46; Reply 11–12. For example, Figures 5(1)–5(8) (Ex. 1007, p. 403) illustrate images rendered at various viewing angles. Photo VR describes a user selecting a direction of view using the Object Viewer. Ex. 1007, p. 400, col. 1. The testimony of Patent Owner’s declarant, Dr. Oliver, further implies that a user of the Photo VR system selects a direction of view from a view point on a model. Ex. 2011 ¶ 57 (“user selects a viewpoint near the center of projection”); id (“[Photo VR] suggests navigating through adjacent overlapping spherical environment maps”). We also agree with Petitioner that Photo VR describes the Object Viewer as a component of the same Windows 95-based system that contains the other described components, such as the module that generates texture-mapped polyhedrons. Ex. 1007, p. 400, cols. 1–2. Thus, Patent Owner has not persuaded us that the Object Viewer is “inapposite.” Accordingly, Petitioner has shown that Photo VR discloses “allowing a user to select a direction of view from a view point on the model,” as recited in claim 1 and “allowing a direction of view from a view point to be selected,” as recited in claim 12. IPR2013-00546 Patent 8,077,176 B2 26 d. Claims 1, 4, 12, 16, 17, and 19–21 Are Anticipated by Photo VR In sum, based on our consideration of the evidence presented by Petitioner and the arguments detailed above, we are persuaded that Petitioner has proved by a preponderance of the evidence that claims 1 and 12 are anticipated by Photo VR. Claim 4 depends from claim 1. Claims 16, 17, and 19–21 depend from claim 12. Having reviewed Petitioner’s evidence of unpatentability for these dependent claims, we conclude that Petitioner also has proved by a preponderance of the evidence that claims 4, 12, 16, 17, and 19–21 are anticipated by Photo VR. III. CONCLUSION Petitioner has demonstrated by a preponderance of the evidence that claims 1, 4, 12, 16, 17, and 19–21 are anticipated under 35 U.S.C. §§ 102(a) and 102(b) by Photo VR. IV. ORDER For the reasons given, it is ORDERED that, based on a preponderance of the evidence, claims 1, 4, 12, 16, 17, and 19–21 of U.S. Patent No. 
8,077,176 B2 are held to be unpatentable;

FURTHER ORDERED that Petitioner’s motion to exclude ¶¶ 33, 41–44, 50, 55–57, 67–72, Figures 5 and 6, and Appendices B–D of Exhibit 2011 is denied; and

FURTHER ORDERED that, because this is a final written decision, parties to this proceeding seeking judicial review of our decision must comply with the notice and service requirements of 37 C.F.R. § 90.2.

Petitioner:
Cono Carrano
Ruben Munoz
ccarrano@akingump.com
rmunoz@akingump.com

Patent Owner:
Stephen Chow
Hsuanyeh Chang
Seth Horwitz
Robert Groover
schow@burnslev.com
hchang@burnslev.com
shorwitz6576@gmail.com
groover@technopatents.com