Ex parte Serlie, No. 13/880,858 (P.T.A.B. Mar. 22, 2017)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE

Application No.: 13/880,858
Filing Date: 09/11/2013
First Named Inventor: Iwo Willem Oscar Serlie
Attorney Docket No.: 2010P01079WOUS
Confirmation No.: 6088
Examiner: Molly K. Delaney
Art Unit: 2666
Notification Date: 03/24/2017 (electronic delivery)

BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte IWO WILLEM OSCAR SERLIE

Appeal 2016-005705
Application 13/880,858
Technology Center 2600

Before JASON V. MORGAN, NABEEL U. KHAN, and KAMRAN JIVANI, Administrative Patent Judges.

JIVANI, Administrative Patent Judge.

DECISION ON APPEAL

Appellant¹ seeks our review, under 35 U.S.C. § 134(a), of the Examiner's final decision rejecting claims 1, 3-13, and 15. We have jurisdiction under 35 U.S.C. § 6(b).

We REVERSE.

¹ According to Appellant, the real party in interest is Koninklijke Philips N.V. App. Br. 2.

STATEMENT OF THE CASE

The present invention is a "volume visualization . . . system comprising a feature detector for detecting a feature like a border between body and air (e.g. skin) in an image volume dataset." Abst.
Claim 1 is illustrative:

1. A system for generating a two-dimensional view of an image volume dataset, comprising:
   a distance computing subsystem for computing a distance from a feature detected in the image volume data set to an image element of the image volume dataset;
   a weighting subsystem for weighting an image element value of the image element, based on the distance, to obtain a weighted image element value;
   a view generator for generating the two-dimensional view of the image volume dataset, based on the weighted image element value,
   wherein the two-dimensional view comprises a projection, and a pixel value of the view is based on the weighted image element value.

REJECTIONS²

Claims 1, 3-13, and 15 stand rejected under 35 U.S.C. § 103(a) as unpatentable over Haque (US 2007/0237379; Oct. 11, 2007) and Olstad et al., 3D Transvaginal Ultrasound Imaging for Identification of Endometrial Abnormality, Proc. SPIE Vol. 2432, Medical Imaging 1995: Physics of Medical Imaging, pp. 543-53 (May 8, 1995). Ans. 2-6.

² The Examiner has withdrawn a rejection under 35 U.S.C. § 112. Ans. 6.

ANALYSIS

All claims stand rejected as obvious over Haque and Olstad. Appellant argues the Examiner does not establish a teaching or suggestion of the independent claims' distance computing and weighting; e.g., claim 1's "computing a distance from a feature detected in the image volume data set to an image element of the image volume dataset" and "weighting an image element value of the image element, based on the distance, to obtain a weighted image element value." App. Br. 7. The Examiner finds Haque and Olstad together meet the above recited claim limitations. Final Act. 10-11, 14-17.
Haque teaches a 2-D projection image iMIP(y, z) ("maximum intensity projection image") that contrastingly displays, by variation of brightness, the overlapping tissues (e.g., blood vessels) captured by a 3-D image I(x, y, z) (which is, itself, a stack of 2-D images S1, S2, . . . , Sm). Haque ¶¶ 3-7, 29-30, 43; Figs. 4A-C. The 2-D projection image enhances/brightens targeted tissue (e.g., a targeted blood vessel) by entirely suppressing the contribution of overlying tissue and attenuating the contribution of underlying tissue. Id. at ¶¶ 31-32, 37-38. The intensity for each of the 2-D projection image's pixels (for each y-z position) is calculated as a "weighted sum" of the 3-D image's voxels along the respective x-direction "projection line." Id. at ¶ 32. The voxels have no weight at a depth of x less than a ("x < a"), an initial weight at x = a, and exponentially decreasing weight at depths greater than a. Id.

Olstad teaches an algorithm for defining tissue surfaces within a 3-D image. Olstad, abst. "As the user moves to neighboring 2D-slices [of the 3-D image], the contour from the previous edge detection result is copied and readjusted to the new 2D image[;] . . . allowing] for . . . extraction of a 3D surface representation of the [tissue] boundary." Id. at 546.

The Haque-Olstad combination detects and enhances a tissue surface (hereinafter "targeted surface") within Haque's 3-D image. Final Act. 10-11; Ans. 7. More particularly, the targeted surface is detected in Haque's 3-D image via Olstad's teachings and then enhanced in Haque's 2-D projection image via Haque's teachings. Id. The Examiner reads claim 1's detected feature on the targeted surface. Id. The Examiner reads the image element of claim 1 on Haque's "image surface." Id. In the Haque-Olstad combination, Haque's image surface is located at x = 0; such that, for each pixel of the 2-D projection image, there is a projection line distance a between the image surface and the targeted surface (located at departure point x = a). See Ans. 7-8.
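For orientation only, the depth-dependent weighting the panel attributes to Haque (overlying voxels fully suppressed, the surface voxel given an initial full weight, deeper voxels attenuated exponentially) can be sketched as follows. This is an illustrative reconstruction, not code from the record: the decay rate `decay`, the array shapes, and the use of a maximum over weighted voxels (per the MIP passage Appellant quotes, rather than Haque's "weighted sum") are assumptions.

```python
import numpy as np

def weighted_mip(volume, a, decay=0.1):
    """Illustrative weighted maximum intensity projection along x.

    volume : 3-D array I(x, y, z)
    a      : 2-D array a(y, z), depth of the detected (targeted) surface
             for each x-direction projection line
    decay  : assumed exponential attenuation rate (not taken from Haque)
    """
    nx, ny, nz = volume.shape
    x = np.arange(nx)[:, None, None]            # depth index along each projection line
    depth_past = x - a[None, :, :]              # signed depth relative to a(y, z)
    w = np.where(depth_past < 0, 0.0,           # overlying tissue (x < a): suppressed
                 np.exp(-decay * depth_past))   # at/under surface: weight 1 at x = a,
                                                # decaying with depth thereafter
    return (w * volume).max(axis=0)             # brightest weighted voxel per (y, z)
```

A "weighted sum" variant, as ¶ 32 of Haque is quoted, would simply replace `.max(axis=0)` with `.sum(axis=0)`; either way, the weights depend only on whether x lies above, on, or below the surface depth a(y, z), which is the point the panel draws out below.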
The Examiner finds distance a meets the claimed computed distance between the image element and detected feature because distance a is computed as a function of the given y-z position, namely computed via a function expressed by Haque's paragraph 32 as "a(y, z)." Id.

Appellant argues in part:

   [Haque's weighted sums] depend on relative positioning, not distance, of x with respect to a(y, z), where a(y, z) is a set distance from a departure point of a projection line in the x direction to an image surface (not to a feature detected in the image volume data set), and varies depending on (y, z) of a three-dimensional image data I(x, y, z). See paras. [0031]-[0032], . . . Indeed, the actual distance between x and a(y, z) appears to be irrelevant, since x being greater than, less than or equal to a(y, z) dictates the weighting function Wm(x). . . . Then, a series of images Si (where i = 1, 2, . . . , m) may be captured by performing maximum intensity projection (MIP) in the x direction on a value obtained by multiplying three-dimensional image data I(x, y, z) with the appropriate weighting function Wm(x). There is no teaching of computing the distance from a detected feature to an image element (both in the image volume data set).

App. Br. 8 (emph. omitted).

We agree that the Examiner has not shown a teaching or suggestion of the claimed computed distance between the detected feature (read on Haque's targeted surface of depth x = a, i.e., "departure point") and image element (read on Haque's image surface of depth x = 0). As argued by Appellant (above block quote), though Haque expresses a projection line (x-direction) distance of "a(y, z)" between the image surface and targeted surface, this disclosure merely reflects that the distance a varies throughout the 3-D image.
Stated differently, the "a(y, z)" terminology reflects that a projection line distance a between the image surface and targeted surface will vary as the projection line is translated in the y and z directions. Even assuming the varying distances a(y, z) are known (i.e., that the image surface and departure point depths are known throughout Haque's 3-D image as, respectively, x = 0 and x = a), that is not to say the distances a(y, z) are computed from the respective y-z positions of the projection lines. Even further assuming a disclosed need and/or step for computing the distances a(y, z), no such disclosure is established by the Examiner's finding that: "In Haque [0033], the distance is calculated between the projection point [(i.e., at x = a)] and the image surface, as it is stated that this variable is dependent on the positioning of (y, z) this distance is calculated." Ans. 7-8.

As also argued by Appellant (above block quote), there is no apparent need to compute the projection line distances a(y, z) between the image surface and targeted surface. Rather, there is only an apparent need to know the targeted surface's varying depth such that: the pixels thereover can be given no weight (entirely suppressed); all pixels thereon can be given an initial weight (greatest brightness); and all pixels thereunder can be given exponentially less weight as their depth increases (dimmed with depth). See supra (explaining Haque). Thus, there is only an apparent need to compute respective distances between the targeted surface and pixels thereunder, so as to thereby determine each such pixel's exponential decrease in weight (see id.).

For the foregoing reasons, we reverse the rejection of independent claims 1, 13, and 15. Because the same error persists for the dependent claims, we also reverse the rejection of claims 3-12.

DECISION

We reverse the Examiner's rejection of claims 1, 3-13, and 15.
REVERSED