My laboratory focuses on the evolution of behavior, specifically female mating preferences for male traits. Since my days as a PhD student, I have used playback of synthetic, computer-generated animations (‘fish TV’) to examine multivariate female preferences for male visual displays in swordtails, small livebearing fish. The preference assay we use is simple: we measure the amount of time the subject spends in proximity to each of two monitors at either end of an 80-liter aquarium.

Until 2007, a researcher would start a computer program that simultaneously presented the animations at either end of the aquarium, and then use stopwatches or a computer event recorder to quantify association time. We were beginning to work on the evolutionary genetics of multivariate preferences, which requires performing large numbers of tests on large numbers of animals. Accordingly, we needed an automated system in which multiple fish could be tested in parallel, with minimal routine intervention by experimenters. Our goal was to conduct on the order of a hundred trials in a reasonable working day.

Over email and by phone, Christian and Stephan designed a system for us whereby the Monitortrainer plug-in for the Viewer tracking software would automatically synchronize playback of animation files with the Viewer’s tracking of the subject’s position. Two parallel Viewer systems each tracked the behavior of two fish in adjacent tanks; a cohort of four fish was therefore tested simultaneously in four separate tanks.

With these experiments, it takes as much time to remove fish from their home tanks, acclimatize them, and return them as it does to run the actual trials. Accordingly, Stephan and Christian’s system included a software video switch, controlled by Monitortrainer, that changed the Viewer’s video input from two cameras in one room to two cameras in another. We could thus shuttle fish back and forth and acclimatize them in one room while automated trials were running in the other. Stephan came all the way to College Station, Texas, to install and test the system for us; as luck would have it, my lab renovations were delayed until the following week, which meant we had to tear down the Viewer and set it up again. Thanks to Stephan’s documentation and continued support, this was not as dismal as it could have been.

Christian and Stephan were indefatigable in working through the inevitable, unanticipated demands of this complex system: they resolved several software problems with the video switches, increased the number of videos that Monitortrainer could load, added a “pause” feature to Monitortrainer switches, and developed a batch export function for converting large numbers of Viewer files to Excel format.
Given the seven-hour time difference between Texas and Germany, the turnaround on support was, and is, remarkable; if we had an issue at 3 p.m. on a Tuesday, it was often solved by 8 a.m. the following morning.

We can now conduct 8 trials in 40 minutes, or 96 trials in an eight-hour day. This has allowed our lab to generate some of the largest sample sizes and most detailed mappings of perceptual space yet obtained in studies of visual behavior. The system now runs smoothly and, after some fiddling with lighting, has proven highly reliable at picking out a small (2 cm) gray fish below the water surface, despite reflections from the adjacent monitor.

The Monitortrainer/Viewer system has allowed us to move beyond our original aims.
We are using neural network processing of Viewer raw data to try to detect signatures of mate preference independent of the basic association time assay; we can also use these data to phenotype individuals in terms of overall activity and space use. The most exciting application of our system is “video games for fish”. Working with Trisha Butkowski of TAMU’s Visualization Science Department, Christian and Stephan developed a plug-in for the Viewer that streams real-time position data to another computer. These position data are analogous to joystick input in a video game: the actions of the “user”, in this case the subject fish, determine the behavior of the stimulus on screen. Trisha’s program allowed the synthetic male fish to track the movements of the female and adjust its courtship behavior accordingly. Automated, algorithm-based interactive playback has the potential to make an important contribution to how we study animal behavior.

Throughout it all, Christian and Stephan have been patient, tireless, flexible, and creative in meeting our unusual and exacting demands.

I highly recommend the Viewer/Monitortrainer system to any labs endeavoring to innovate in the field of visual behavior.


Dr. Gil Rosenthal
Department of Biology
Texas A&M University
3258 TAMU
College Station, TX 77845