Virtual Solutions, LLC v. Microsoft Corp.

925 F. Supp. 2d 550, 2013 WL 593764, 2013 U.S. Dist. LEXIS 21731
Court: District Court, S.D. New York
Decided: February 15, 2013
Docket: No. 12 Civ. 1118(SAS)
Status: Published
Cited by: 3 cases


Bluebook
Virtual Solutions, LLC v. Microsoft Corp., 925 F. Supp. 2d 550, 2013 WL 593764, 2013 U.S. Dist. LEXIS 21731 (S.D.N.Y. 2013).

Opinion

OPINION AND ORDER

SHIRA A. SCHEINDLIN, District Judge.

I. INTRODUCTION1

Virtual Solutions, LLC (“Virtual”) brings this action against Microsoft Corporation (“Microsoft”). Virtual claims that Microsoft has infringed claims 1-3, 5, 7, 8-9, and 22 of U.S. Patent No. 6,507,353 (“the '353 Patent”), of which Virtual is the exclusive licensee. Microsoft now moves *554 for summary judgment on the grounds that claims 1 and 8 of the '353 Patent are invalid for indefiniteness. Also before the Court are the parties’ submissions related to the Markman hearing that took place on January 22, 2013. At this hearing, the Court also heard oral argument on the present motion. For the following reasons, the motion is granted.

II. BACKGROUND2

A. The Patent

The '353 patent was filed on December 10, 1999 and was issued by the Patent and Trademark Office (“PTO”) on January 14, 2003.3 It is entitled “Influencing Virtual Actors in an Interactive Environment.”4 The patent claims “[a] method for generating a behavior vector for a virtual actor in an interactive theat[er] by interpreting stimuli from visitors ....”5 The specification states that the object of the patent is “to provide a method for interacting with virtual actors in an interactive environment[,]” and “to simulate ‘real-life’ behaviors[ ] of the virtual actors.”6

The specification teaches that “[c]ombining the security of a controlled environment with the possibility to interact almost naturally with animal kind has always been a dream[,]”7 and that viewing animals in captivity does not live up to this dream, because animals in captivity alter their behaviors.8 The specification further teaches that, prior to the '353 Patent, virtual reality had shown promise in bringing this dream to life, but that the interactivity of virtual reality at the time was limited by its use of scripted scenarios.9

The specification describes a preferred embodiment consisting of a dome-shaped theater into which images are projected for viewing by an audience, and notes that the projection of images could be replaced by holography or any other type of presentation.10 In a preferred embodiment of the main modules described in the patent, these modules are to be implemented via software.11

The '353 patent also describes sensors in the theater area that detect physical information about audience members and “Stimulus Generators” that analyze that information.12 The patent states that a system could feed this sensor data into the “behavioral module” of a “virtual actor,” which is “likely [t]o be [a] virtual animal[ ] or [a] virtual physical actor[ ] which ha[s] *555 behaviors that are easier to simulate than those of humans.” 13

The “behavioral module” of the virtual actor would then calculate the reaction of the actor.14 One component of the “behavioral module” would be the “behavioral model” of a virtual actor, a set of factors specific to a virtual actor that would mediate its response to the data provided to it by the Stimulus Generator.15 For example, the specification describes the age of a virtual actor as a possible factor within that actor’s behavioral model, and states that “[a]n older animal will more likely be passive with respect to the stimuli of the visitors.”16

In sum, the system would collect and analyze physical information from visitors and feed that information to virtual actors, whose behavioral models would mediate a response (or non-response).17 In this way the system would allow the virtual actors to respond to the physical information collected from viewers in real time.18

B. The Parties’ Experts19

Both parties have offered expert testimony in connection with this motion. Microsoft’s expert is Aaron T. Bobick, Ph.D. (“Bobick”). Bobick received bachelor of science degrees in mathematics and computer science from the Massachusetts Institute of Technology (“MIT”) in 1981, and a Ph.D. in cognitive science from MIT in 1987. He has worked in academia since receiving his Ph.D., with stints at MIT and Stanford. Since 2003, he has been employed as a full professor at the Georgia Institute of Technology’s College of Computing, where he was the founding chair of the School of Interactive Computing. Additionally, from 1993 to 1996, he served as the Chief Technology Officer of Cybergear, a company that he founded. In connection with Cybergear, Bobick received several patents for an “interactive exercise apparatus.”

Over the course of his career, Bobick has published twenty-two articles in peer-refereed journals on topics related to machine perception and virtual reality. He has also received nine grants, as principal investigator, on the same topics. Finally, Bobick has provided expert testimony in eight prior cases, mostly in the field of computer vision.

Virtual’s expert is Vyacheslav Zavadsky, Ph.D. (“Zavadsky”). Zavadsky received a Masters in Computer Science from Belarusian State University in 1994, and a Ph.D. in the same field from Belarusian State University in 1998. He is currently employed as the Principal of Zavadsky Technologies, where his duties include patent assessment, software project management, and software development. From 2003 to 2011, he worked at UBM TechInsights, where he was engaged in patent analysis *556 and reverse engineering. During his tenure at UBM, Zavadsky reviewed hundreds of patents.

Zavadsky is the named inventor on twelve issued United States patents, at least five of which pertain to image processing and computer vision. He has ten additional patents pending.

III. LEGAL STANDARD

Indefiniteness is an issue that is amenable to summary judgment.20 Summary judgment is appropriate “if the movant shows that there is no genuine dispute as to any material fact and the movant is entitled to judgment as a matter of law.”21 “‘An issue of fact is genuine if the evidence is such that a reasonable jury could return a verdict for the nonmoving party. A fact is material if it might affect the outcome of the suit under the governing law.’”22 “The moving party bears the burden of establishing the absence of any genuine issue of material fact.”23 “When the burden of proof at trial would fall on the nonmoving party, it ordinarily is sufficient for the movant to point to a lack of evidence ... on an essential element of the nonmovant’s claim.”24 In turn, to defeat a motion for summary judgment, the non-moving party must raise a genuine issue of material fact. To do so, the non-moving party “‘must do more than simply show that there is some metaphysical doubt as to the material facts,’”25 and “‘may not rely on conclusory allegations or unsubstantiated speculation.’”26
