People of ACM - Michael Zyda

May 4, 2021

Why is this an exciting time to be in the computer science games field?

Progress in computing technology in several key areas is pushing games to be the supreme form of modern entertainment, surpassing the revenues of the legacy film industry by a factor of five. Computer graphics lighting and shading, animations generated by machine learning systems creating deep fakes of entertainment stars, and the reinvention of our processor hardware are all leading the games industry toward a bright future. All of this technology comes from computer scientists and computer engineers.

Will you tell us a little about your virtual reality projects with NPS? What is the most surprising way this field has developed in the last few decades?

In 1984, when I arrived at the Naval Postgraduate School, most of the computer graphics world of SIGGRAPH seemed to be working on photorealistic rendering, with a few groups working on computer graphics hardware. The photorealistic rendering people were quoting something like seven days to compute one frame of film. I am not that patient. I felt that my focus ought to be on what could be done in real-time 3D graphics instead, so I purchased a Silicon Graphics IRIS-1400 workstation, which could draw some 500 triangles per frame without a z-buffer for hidden surface elimination. My students and I started experimenting with the SGI machine and its later models, and we began building what we called “3D visual simulation” at the time.

By 1991, we had developed the first version of NPSNET, which became our virtual environment technical ideas testbed. On December 24, 1993, we held a phone meeting with DARPA and the US Army, who asked whether we could work with the University of Pennsylvania and Sarcos Engineering to produce, by February 14, 1994, a demo of a soldier in a fully instrumented bodysuit walking and fighting inside our networked virtual environment, NPSNET. We produced a great demo of the first networked virtual environment with fully instrumented bodysuits that played across the internet.

Our work on NPSNET and a report titled “Modeling and Simulation: Linking Entertainment and Defense” led to a request from the Chief Scientist of the Army that I design a research and operating plan for the USC Institute for Creative Technologies (ICT) as a place where researchers could work directly on virtual environment technologies and applications of interest to the Army. That plan was developed, and USC ICT has been a success since its formation in 1999.

In 1994, a head-mounted display (HMD) cost $6,000 or more for a low-resolution headset; now it’s a few hundred dollars for something of much higher quality. That is the most surprising thing that has happened in the virtual environment field. Hardware technology has zoomed ahead and continues to develop. Additionally, much of the software, graphics, and networking we worked on in virtual reality has become bedrock tech for the games industry.

Some have argued that, despite a great deal of media attention, corporate investments, and hopeful predictions, virtual reality hasn’t really taken off. Do you agree with this assessment? What will it take for virtual reality to become fully integrated into daily life?

This most recent wave of virtual reality had lots of funding for people to produce head-mounted displays. When I was in Nanjing in December 2018, I visited a VR porting company and they had a wall of 75 different HMDs they had to support. What was missing from this wave of VR was funding for content development, games and apps, along with any real focus on creating a standard for user interaction.

The lack of a VR user interaction standard meant that almost every VR app you tried had a completely different and confusing interface. As a result, consumers have held off on acquiring VR for the home, and only the tech geeks have purchased it. Additionally, people don’t want to play games where they cannot see their friends who are in the same room. An HMD is not the way to go. Several companies are experimenting with interesting ways to provide a VR experience without an HMD. One of them, Athanos, has been putting together a tracked, iPad-like tablet paired with a lightweight headband for head tracking, giving an experience like looking through a window into a virtual world, as Ivan Sutherland described in 1965. I am an advisor to Athanos.

Young people, especially teenagers, are some of the biggest consumers of computer games. How can computer games be a gateway to attract young people to the wider world of computer science?

Computer game development is a gateway for young people to enter the field of computer science. When I arrived at USC in 2005 and started designing the Computer Science Games Program, I was concerned that parents of potential applicants might not want to send their children to learn how to build games. That thought turned out to be backward. When I met with the first groups of parents, many of them would take me aside and say, “This is a computer science games program, right? We don’t want our child in a general computer science program as there are no jobs in that direction.” The parents were remembering the internet crash of some five years earlier, and I had to reassure them that it was most definitely a computer science games program. Our program has been a huge success, so we don’t have to worry about whether students will come.

What is an area in your field that hasn’t gotten enough attention, but is poised to have a big impact in the coming years?

Using biosensors to determine the physical and emotional state of a human would enable us to build machine learning-based AI characters that can interact with humans at a level of emotional and physical understanding. Recently, Stanford University founded the Human Perception Laboratory to begin development in this area, with a direct pipeline to productization (game development). I am a Distinguished Collaborator with that laboratory.

Michael Zyda is the Founding Director of the University of Southern California's Computer Science Games Program, and a Professor of Engineering Practice in the USC Department of Computer Science. As part of the Games Program, he instituted the year-long advanced game projects course that forms the core of USC Games. The program has been rated Number 1 by the Princeton Review for 10 of the last 11 years.

From 1993 to 2004, Zyda was a Professor of Computer Science at the Naval Postgraduate School (NPS) in Monterey, California. At NPS, Zyda’s group built the first networked virtual environment with fully instrumented bodysuits that played across the internet.

Zyda has been an ACM member since 1977 and was recently named an ACM Fellow for contributions to game design, game and virtual reality networking, and body tracking. In 1990, with Fred Brooks, Henry Fuchs, and Mary Whitton, he co-founded the ACM SIGGRAPH Symposium on Interactive 3D Graphics (I3D). He is also an IEEE Fellow, an IEEE Virtual Reality Technical Achievement Award winner, a Senior Member of the National Academy of Inventors, and a National Associate of the National Academies.