Video playback is a widely used technique in animal behavior. We created and evaluated a program that applies rules-based, interactive playback of 3-D computer animations in response to real-time, automated data on subject behavior.
Video playback is a widely used technique for the controlled manipulation and presentation of visual signals in animal communication. In particular, parameter-based computer animation offers the opportunity to independently manipulate any number of behavioral, morphological, or spectral characteristics in the context of realistic, moving images of animals on screen. A major limitation of conventional playback, however, is that the visual stimulus cannot interact with the live animal. Borrowing from video-game technology, we have created an automated, interactive system for video playback that controls animations in response to real-time signals from a video tracking system. We demonstrated this method by conducting mate-choice trials on female swordtail fish, Xiphophorus birchmanni. Females were given a simultaneous choice between a courting male conspecific and a courting male heterospecific (X. malinche) on opposite sides of an aquarium. The virtual male stimulus was programmed to track the horizontal position of the female, as courting males do in the wild. Mate-choice trials on wild-caught X. birchmanni females were used to validate the prototype’s ability to effectively generate a realistic visual stimulus.
Previous methods for interactive video playback in animal behavior have relied on a human operator to provide responses to behavioral cues from subjects. With interactive video playback (IVP), we created a program that applies rules-based interactivity in response to real-time, automated data on subject behavior. We briefly outline the steps involved in creating the program below.
The first step was to create digital male exemplars of X. birchmanni and X. malinche, taking an approach similar to previous studies6. We created 3-D meshes modeled on photographs of real X. birchmanni and X. malinche, and the same photographs were then applied as textures to capture the realistic coloration of the fish. A planar map applied to the meshes’ UV coordinates aligned the UV map with the photographic texture. Next, the digital fish mesh had to deform like a real fish. To accomplish this, a virtual skeleton was created for the body and fins and “skinned” to the mesh; skinning enables the mesh to deform when the joints are rotated.
Second, we added motion to the digital fish. We animated six key movements that a male swordtail makes: three represented different swimming speeds, and the other three were remaining still, turning, and a lateral courtship display. Since males can raise or lower their dorsal fin depending on whether male or female receivers are present3, we decoupled the movement of the dorsal fin from that of the lateral courtship display; the fin was keyed so that it could be raised or lowered at any point during a cycle. A total of twenty-four animation cycles were used. Each cycle started and ended with the fish in the same posture so that the cycles could blend together smoothly. All twenty-four animation cycles were created by rotoscoping7,8 the desired motion from overhead video of a live, courting male X. birchmanni.
Third, we enabled interactivity. We used the Biobserve Viewer system to track the position of the snout, body, and tail of the female swordtail and transmit that information in real time to the IVP program; this was done separately for each courting male on each monitor. The male animation followed the subject fish’s position. We modeled following using Reynolds’ “arrive” steering behavior9,10, which allowed the male to pursue the female and decelerate as it approached her.
To calculate the position of the virtual male at each time step, the system was supplied with the current position of the female, from which the program calculated the forces driving the male. First, the target-offset vector was calculated by subtracting the position of the male from the position of the female. Second, the distance from the male to the female was obtained as the magnitude of the target-offset vector. Third, the desired speed of the male was determined by dividing that distance by a constant deceleration value, which caused the male to slow down as it approached the female; the desired velocity is this speed applied along the normalized target-offset vector. Last, the desired acceleration was calculated by subtracting the male’s current velocity from the desired velocity.
Since animations are rendered as discrete frames of video at 60 Hz, calculations were made at each discrete time step, at intervals of 1/60 s (approximately 0.017 seconds). Maximum velocity was set to 10 cm/s for these experiments. If the magnitude of the new velocity exceeded this maximum, the velocity was set to the maximum.
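As a rough illustration, the per-frame “arrive” calculation described above can be sketched as follows. This is a Python translation, not the authors’ C#/XNA code; the deceleration constant, the per-step cap on steering force (a standard detail of Reynolds steering not specified in the text), and the 2-D coordinates are assumptions made for the sketch.

```python
import math

DT = 1.0 / 60.0      # one frame at 60 Hz
MAX_SPEED = 10.0     # cm/s, as in the experiments
DECEL = 0.5          # assumed deceleration constant (value not given in the text)
MAX_FORCE = 2.0      # assumed per-step cap on steering (standard in Reynolds steering)

def arrive_step(male_pos, male_vel, female_pos):
    """One frame of 'arrive' steering: seek the female's position,
    decelerating with distance, with speed clamped to MAX_SPEED."""
    # 1. Target-offset vector: female position minus male position.
    ox, oy = female_pos[0] - male_pos[0], female_pos[1] - male_pos[1]
    # 2. Distance to the female = magnitude of the offset.
    dist = math.hypot(ox, oy)
    if dist < 1e-9:
        return male_pos, male_vel
    # 3. Desired speed = distance / deceleration constant, capped at MAX_SPEED,
    #    so the male slows as it closes on the female.
    speed = min(dist / DECEL, MAX_SPEED)
    dvx, dvy = ox / dist * speed, oy / dist * speed   # desired velocity
    # 4. Steering = desired velocity minus current velocity, capped at MAX_FORCE.
    sx, sy = dvx - male_vel[0], dvy - male_vel[1]
    f = math.hypot(sx, sy)
    if f > MAX_FORCE:
        sx, sy = sx / f * MAX_FORCE, sy / f * MAX_FORCE
    # Apply the steering, re-clamp the speed, and advance one frame.
    vx, vy = male_vel[0] + sx, male_vel[1] + sy
    v = math.hypot(vx, vy)
    if v > MAX_SPEED:
        vx, vy = vx / v * MAX_SPEED, vy / v * MAX_SPEED
    return (male_pos[0] + vx * DT, male_pos[1] + vy * DT), (vx, vy)
```

Calling `arrive_step` once per rendered frame moves the virtual male toward the female at up to 10 cm/s, with the desired speed shrinking linearly as the distance closes.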
For this particular simulation, the interactive male fish raised its dorsal fin 50% of the time, and only during courtship interactions. The lateral courtship display behavior was triggered whenever the male stimulus was within 0.25 body lengths of the female swordtail fish in the Z dimension.
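The trigger rules above amount to a small rules-based check made each frame. A minimal sketch of that logic, assuming a hypothetical body length, state names, and function signature (none of which appear in the original description):

```python
import random

BODY_LENGTH = 5.0                    # cm; assumed body length for the sketch
DISPLAY_RANGE = 0.25 * BODY_LENGTH   # display triggers within 0.25 body lengths in Z

def choose_behavior(male_z, female_z, rng=random.random):
    """Pick the virtual male's behavior for this frame: a lateral courtship
    display when close enough in Z (dorsal fin raised on 50% of displays),
    otherwise continue following the female."""
    if abs(male_z - female_z) <= DISPLAY_RANGE:
        fin_raised = rng() < 0.5     # fin raised 50% of the time, only during courtship
        return "lateral_display", fin_raised
    return "follow", False           # fin stays lowered outside courtship
```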
We were surprised that interactivity abolished the female preference for conspecifics, despite the fact that the non-interactive animations elicited a strong preference and that females spent the majority of their time associating with the interactive stimuli. One possibility is that closely following the female may override visual cues used to evaluate males, such as the sword and dorsal fin. Alternatively, females may be less likely to lose interest in a male that courts interactively, and therefore less likely to sample both individuals (Figure 5).
Nevertheless, our results show that the operating principle of video-game technology (namely, software-driven, rules-based agents responding to user input) can be successfully applied to interactive playback in studies of animal behavior. This type of rules-based interactive playback should prove useful for studies of shoaling and collective movement11,12. In particular, the ability to manipulate the rules that a virtual exemplar uses for shoaling should give us insight into the processes that animals use to make shoaling decisions.
The authors have nothing to disclose.
We are indebted to Stephan Schwartz and Christian Gutzen of Biobserve GmbH for sponsoring this article and for much technical assistance. We thank Olivia Ochoa, Christian Kaufman, and Zachary Cress for assistance with fish care; we are grateful to the Mexican federal government for permission to collect fish. We are indebted to Glen Vigus, Frederic Parke, and the Visualization Lab at Texas A&M. Athena Mason and Ryan Easterling assisted in preparing this publication. Funding was provided by Texas A&M University and NSF IOS-1045226.
| Material Name | Type | Company | Catalogue Number | Comment |
|---|---|---|---|---|
| Maya 8.0 | | | | |
| C# program using Microsoft’s XNA Game Studio 2.0 | | | | |
| BIOBSERVE Viewer 2 | | | | |
| Dell 15” CRT monitor (2) | | | | |
| 20 × 20 × 80 cm Plexiglas testing aquarium | | | | |
| Dell Latitude computer (animation server) | | | | |