Viewer tracks, records and analyzes animal behavior
PermaTrack (Multiple filter regions)
Motivated by single- and multi-arena setups with uneven illumination, Viewer 17 introduces a feature that allows continuous tracking of animals regardless of local changes in illumination: Multiple Filter Regions.
With this feature you can create several filter regions in one setup to handle areas with different lighting conditions, such as the open and closed arms of a plus maze. Define separate tracking filters for the open and closed arms to ensure seamless tracking of constant quality.
Whether you analyse live images in real time or existing video files, Viewer's renowned usability lets you quickly optimize tracking settings and save them to a specific configuration to maintain reproducibility across experiments – no matter how complex or irregularly illuminated your setup is!
The concomitant occurrence of certain parameters – or their temporal relationship – can give you important insights into the temporal organisation of behavioural performance. Viewer 17 therefore introduces an innovative visualization that provides a customizable, comprehensive overview of the temporal organization of parameters. This functionality, Data Replay, arranges the parameters and automatic behavioural annotations of your choice along the temporal axis as soon as your data are acquired – just one click away!
Subject tracking with body axis alignment and angle of view
Our solution not only tracks the center of the animal (as competing products do) but also the nose and the tail, without marking the animal. This allows much more detailed behavioral analysis and the automation of more complex experiments such as novel object recognition and social interaction.
Head wagging, head stretches, tail moves and more
Because the software knows the x/y coordinates of three points (nose, body and tail), it can automatically detect and count behaviors such as head wagging, head stretches and tail moves. Freezing behavior and ambulation are also scored automatically. To make the system score as you would by watching, the parameters for each of these behaviors can be adjusted.
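To illustrate how three tracked points can drive behavior detection, here is a minimal sketch that computes the head angle relative to the body axis from nose, body and tail coordinates. This is a hypothetical example, not Viewer's actual detection algorithm; the function name and threshold logic are assumptions for illustration.

```python
import math

def head_angle(nose, body, tail):
    """Signed angle (degrees) between the body axis (tail -> body)
    and the head direction (body -> nose). 0 means the nose is aligned
    with the body axis; large magnitudes mean the head is turned aside,
    which could feed a head-stretch or head-wagging detector."""
    ax, ay = body[0] - tail[0], body[1] - tail[1]   # body axis vector
    hx, hy = nose[0] - body[0], nose[1] - body[1]   # head vector
    # atan2 of (cross product, dot product) gives the signed angle
    return math.degrees(math.atan2(ax * hy - ay * hx, ax * hx + ay * hy))

# Nose straight ahead of the body axis -> angle ~ 0 degrees
print(head_angle(nose=(8.0, 0.0), body=(5.0, 0.0), tail=(0.0, 0.0)))
# Head turned 90 degrees to the side of the body axis
print(head_angle(nose=(5.0, 3.0), body=(5.0, 0.0), tail=(0.0, 0.0)))
```

A real detector would track this angle over successive frames and count, for example, rapid sign changes as head wagging.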
The zone definition function is very useful if you would like to observe, quantify or compare the behavior of an animal in one or more defined areas of the experimental arena. You can create up to 50 zones of different shapes, or use a grid of columns and rows.
Any zone entry or exit can be used to start or end an experiment. Moreover, zone visits can be used to control external hardware; an unlimited number of optional output channels can be applied.
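The zone entry/exit mechanism can be sketched as follows, using a rectangular zone for simplicity (Viewer also supports other shapes and grids). The class and method names are illustrative assumptions, not Viewer's API.

```python
class Zone:
    """A rectangular zone that reports entry and exit events as the
    tracked position is updated frame by frame."""
    def __init__(self, name, x0, y0, x1, y1):
        self.name = name
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1
        self.inside = False  # was the animal inside on the last frame?

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

    def update(self, x, y):
        """Return 'entry', 'exit' or None for the new tracked position."""
        now_inside = self.contains(x, y)
        event = None
        if now_inside and not self.inside:
            event = "entry"      # could start an experiment or fire hardware
        elif self.inside and not now_inside:
            event = "exit"
        self.inside = now_inside
        return event

arm = Zone("open_arm", 0, 0, 10, 10)
print(arm.update(5, 5))    # entry
print(arm.update(6, 6))    # None (still inside)
print(arm.update(20, 5))   # exit
```

In a real system, the returned events would be wired to the output channels described above.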
Various parameters are calculated and displayed in real time during the experiment: track length, experiment duration, covered area (percentage), velocity, velocity classes, distance to the wall, ambulation, zone crossings and activity/inactivity time shares. For each defined zone the following results are reported: velocity, track length, duration of visits, number of visits, latency to first visit, head wagging, head stretches, tail moves, ambulations, freezing and activity.
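Two of these parameters, track length and mean velocity, follow directly from the sequence of tracked positions. The sketch below shows one plausible way to compute them from positions sampled at a fixed frame rate; it is an illustration, not Viewer's implementation.

```python
import math

def track_stats(positions, fps):
    """Track length (same units as the coordinates) and mean velocity
    (units per second) from a list of (x, y) positions sampled at a
    fixed frame rate `fps`."""
    # Sum the straight-line distances between consecutive positions
    length = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    duration = (len(positions) - 1) / fps  # elapsed time in seconds
    return length, (length / duration if duration else 0.0)

# Three frames at 1 fps: the animal moves 5 units, then stays put
length, velocity = track_stats([(0, 0), (3, 4), (3, 4)], fps=1.0)
print(length, velocity)  # 5.0 units over 2 s -> mean velocity 2.5
```

Per-zone variants of these results would restrict the sum to frames in which the position lies inside the zone.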
Covered area parameter
Besides developing special test-specific plug-ins for customers, BIOBSERVE constantly improves Viewer's standard features according to customers' needs. Watch this movie to see the Covered Area parameter.
Unlimited number of animals in one arena
The Viewer has no restrictions on the number of animals tracked at once: you can observe as many individuals as your CPU capacity can handle. Twenty fruit flies have already been tracked successfully on an Intel QuadCore unit with 2 GB RAM. The image on the right shows the tracking of six tadpoles.
To build interactive experimental environments, additional input and output channels are available. Viewer can synchronize and save additional analog inputs and control devices through output channels. While an animal is tracked, information such as sound, ultrasound, EEG and other physiological data is recorded through the input channels. The output channels allow you to intervene during the experiment, either manually or automatically, in a time- or position-dependent manner. For example, it is possible to apply a stimulus (e.g. a light flash) according to a time script or in response to an event, such as the touching of an object, a zone entry, or a behavioral pattern. As an add-on we offer external trigger boxes that control and import eight analog input channels and any number of digital output channels. With these possibilities you can build fully automated training and learning environments.
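The event-to-stimulus mapping described above can be sketched as a small rule table that fires an output pulse whenever a matching event occurs. The `OutputChannel` interface here is entirely hypothetical; Viewer's actual hardware API and trigger-box protocol differ.

```python
class OutputChannel:
    """Stand-in for a digital output channel (e.g. driving a light flash).
    A real implementation would talk to the trigger-box hardware."""
    def __init__(self, channel):
        self.channel = channel
        self.log = []  # record of pulses, in place of real I/O

    def pulse(self, duration_s):
        self.log.append((self.channel, duration_s))

def run_rules(events, rules, output):
    """Fire the mapped output pulse for every matching event.
    events: list of (time_s, event_name); rules: event_name -> pulse length."""
    for t, event in events:
        if event in rules:
            output.pulse(rules[event])

light = OutputChannel(channel=1)
run_rules(
    events=[(0.5, "zone_entry"), (2.0, "object_touch"), (3.0, "grooming")],
    rules={"zone_entry": 0.1, "object_touch": 0.5},  # event -> pulse (s)
    output=light,
)
print(light.log)  # pulses fired for the two mapped events only
```

Time-script control would work the same way, with scheduled times in place of detected events.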
Easy control of external devices with a graphical script tool
With the Viewer system you can study complex learning and problem-solving strategies. As described in the previous paragraph, several external devices can be connected and controlled. Viewer supports your creativity in designing interactive experimental setups with an easy-to-handle script configuration tool: you don't need to employ IT specialists to program your hardware. Write your own scripts simply by drag and drop. We suggest sketching (especially sophisticated) experiments as flow charts on a sheet of paper first. In the Viewer Script Tool you can then use predefined elements (events, zones, hardware components) and connections (actions, conditions) to draw a flow chart that can be executed instantly. Viewer even checks your chart for logical integrity.
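One kind of logical-integrity check such a tool might perform is verifying that every element of the flow chart is actually reachable from the start node. The sketch below shows this idea on a chart represented as nodes and directed connections; it is an assumption about what "logical integrity" could mean here, not a description of Viewer's actual validation.

```python
def unreachable_nodes(nodes, edges):
    """Return the chart elements that can never be reached from 'start'
    by following the directed connections (actions/conditions)."""
    reachable = {"start"}
    frontier = ["start"]
    while frontier:  # breadth-first walk over the connections
        node = frontier.pop()
        for src, dst in edges:
            if src == node and dst not in reachable:
                reachable.add(dst)
                frontier.append(dst)
    return [n for n in nodes if n not in reachable]

# A chart where one element was drawn but never connected:
print(unreachable_nodes(
    nodes=["start", "zone_entry", "light_on", "feeder_open"],
    edges=[("start", "zone_entry"), ("zone_entry", "light_on")],
))  # the orphaned 'feeder_open' element is flagged
```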
Advanced video control
Especially when analysing large video files from experiments with drastically extended time scales, handling a video file (e.g. searching for certain episodes or focusing on certain content) can cost more time than your lab's busy agenda allows. Viewer 17 therefore introduces advanced video controls that let you handle your video files efficiently – no matter how long your experiments are!
Independent arena control
When monitoring several arenas with one camera, you may want to avoid artefacts caused by starting individual arenas one after another while the overall recording is already running. Viewer 17 therefore introduces a new feature that allows you to start monitoring each arena independently – even if your arenas are recorded by a single camera!
Customizable temporal resolution
Not all episodes of your experimental paradigm require the same temporal resolution when analysing behavioural performance. You might want to focus on details during episodes of particular interest (e.g. operant reactions), while preferring a broader, less detailed overview during other episodes (e.g. spontaneous behaviour during habituation). With Viewer 17 you can therefore define a freely customizable set of summary intervals, even before you start acquisition. During the inspection and analysis of your data, you can switch between your chosen temporal resolutions – increasing the efficiency of your analysis with just a few keystrokes!
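Conceptually, freely customizable summary intervals amount to binning time-stamped measurements into intervals of unequal length. The sketch below illustrates this with one coarse habituation bin followed by finer bins; the function and its parameters are illustrative assumptions, not Viewer's data model.

```python
import bisect

def summarize(samples, boundaries):
    """Sum a measured value per interval, for freely chosen interval edges.
    samples: list of (time_s, value); boundaries: ascending edge times.
    Interval i spans [boundaries[i], boundaries[i+1])."""
    sums = [0.0] * (len(boundaries) - 1)
    for t, value in samples:
        i = bisect.bisect_right(boundaries, t) - 1  # find the interval
        if 0 <= i < len(sums):                      # drop out-of-range samples
            sums[i] += value
    return sums

# One coarse 60 s habituation interval, then fine 10 s intervals:
print(summarize(
    samples=[(5, 1.0), (30, 2.0), (65, 0.5), (72, 0.5)],  # (time, distance)
    boundaries=[0, 60, 70, 80],
))  # -> [3.0, 0.5, 0.5]
```

Switching temporal resolution then just means re-summarizing the same raw samples against a different set of boundaries.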