Introduction
Every researcher who has entered the animal room in the evening has noted a rapid increase in rats’ activity. Wild rats are nocturnal or crepuscular, and most of their activities occur under low-light conditions [1, 2]; laboratory rats retain this propensity [2, 3]. This observation has several consequences. For instance, it implies the necessity of mimicking the natural light-dark cycle in animal facilities and the importance of studying rats’ behavior during their active (dark) phase. As rats exhibit positive and negative affect, expressed as 50-kHz “happy” and 22-kHz “alarm” ultrasonic calls, respectively (see [4] for a recent review), it is also likely that positive affect is more intense during the active (dark) than the resting (light) phase. In this respect, Burgdorf et al. [5] reported that lights-off is a strong signal for the induction of locomotor activity and prosocial behavior, resulting in hedonic 50-kHz ultrasonic vocalizations (USVs).
The rapid decrease of light intensity signaling the rats’ active phase appears to be a perfect natural phenomenon for studying motivation, because as the animals wake up, they begin their social and other activities. Here, we attempted to measure rats’ general activity, social behavior, and USVs using objective, state-of-the-art digital techniques to determine whether these methods can capture and further elucidate the nature of darkness-induced activity in semi-natural conditions.
To this aim, we built 6 custom sound-attenuated boxes equipped with lickometers monitoring head entries into the water supply, infrared (IR) sensitive cameras registering social activities, and microphones recording the animals’ ultrasonic calls. The boxes were also equipped with electric fans supplying fresh air, white lights controlled by an ON-OFF clock, and IR LEDs providing the illumination necessary for video recording in total darkness.
This setup was designed to monitor rats’ activity precisely, automatically, and objectively during the two “twilight” periods, i.e., when the lights were rapidly turned on and off during the daily light/dark cycle. Prompted by the research of Burgdorf et al. [5], we expected a high intensity of prosocial activity and positive affect at the beginning of the dark phase. Studying these phenomena at the end of the dark phase was regarded as a control condition.
Rats’ social life [6] is extremely complex [1, 7, 8]. Thus far, rodent social behaviors and ultrasonic vocalizations have been classified by trained observers who scored them either during the tests or off-line, by watching videotapes or computerized videos and manually analyzing and classifying USVs on a computer screen (see our recent paper, Popik et al. [9], for more details). Such an approach, however, is extremely time-consuming and prone to subjective bias.
Recent advances in computer vision and deep learning-based toolkits have enabled semi-automatic video analysis of rodent social behavior [10]. Specifically, behavioral analyses offered by the open-source DeepLabCut https://github.com/DeepLabCut markerless pose-estimation toolkit [11–13] have vastly facilitated the analytical workflow. This is because DeepLabCut offers the generation of neural networks (models) representing interacting animals’ body parts in time and space. This freely available software allows for semi-automatic tracking of the movements of animals’ body parts. A necessary second step requires post-processing software that classifies social behavior using the numerical data representing animals’ body parts in time and space (i.e., the output provided by DeepLabCut). For this, the Simple Behavioral Analysis (SimBA, https://github.com/sgoldenlab/simba) open-source Python toolkit constitutes another analytical breakthrough [14, 15]. The third digital tool used in this work was DeepSqueak https://github.com/DrCoffey/DeepSqueak, a MATLAB machine-learning package that vastly facilitated the analysis of ultrasonic calls [16].
This work focused on examining whether darkness-induced activity could be detected using a semi-automatic and objective digital workflow, and on determining the precise nature of that activity, expressed through the analysis of social behavior and USV categories. The availability of categorized ultrasonic and behavioral data also allowed for the investigation of their relationships.
Materials and methods
Animals and ethics
Twelve male 7-week-old Sprague Dawley rats (Charles River, Germany) weighing ~300 g upon arrival were group-housed (4 per cage) in standard laboratory cages for 2 weeks of acclimation. During acclimation and throughout the experiment, the animals were maintained under standard, A/C-controlled colony conditions: room temperature 21 ± 2°C, humidity 40–50%, 12-hr light/dark cycle (lights on: 06:00), with ad libitum access to water and lab chow.
The animals were maintained, and the experiments were conducted, in accordance with the European directive on animal welfare (2010/63/EU) and the EQIPD guidelines. All experimental procedures were approved by the II Local Ethics Committee for Animal Experiments at the Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland (ethical allowance LKE: 108/2023).
Apparatus
The experiment was conducted in six identical custom boxes (length x width x height: 50 x 50 x 50 cm) made of black Plexiglas. Each box’s top contained: a) 20 holes (diameter 2.5 cm) allowing for fresh air circulation, b) a hole of 7.5 cm diameter with a 12 V electric fan (Sunon, China) facilitating airflow, c) an infrared (IR) sensitive Axis M1137 Mk II network camera (Axis Communications, Sweden) connected via Ethernet to a Synology NAS station, d) an Avisoft UltraSoundGate CM16/CMPA ultrasonic microphone connected via an UltraSoundGate 416H analog-digital converter to one of two laptop computers running RECORDER USGH software (version 4.2; Avisoft Bioacoustics; Glienicke/Nordbahn, Germany), e) a white-light LED strip providing 150 Lux (measured at the box’s bottom) of ambient white light, controlled by a precise circadian clock, and f) two IR LEDs providing the illumination necessary for video recording in total darkness. The box’s front had a 4 x 5.5 cm (width x height) opening located 7 cm above the floor, holding the water bottle with sipper tube (Med Associates model ENV-250BT), monitored with an IR Head Entry Detector (Med Associates model ENV-254-CB) connected via Med Associates SG-716B SmartCtrl™ Interface Modules and a DIG-705 box to a third laptop computer. The inner surfaces of the boxes’ walls were covered with P80 K20N sandpaper (Norton, Poland) up to a height of 24 cm; the rest of the inner walls was covered with 2-cm thick black sponge (BitMat, Poland) restricting echo and allowing for good-quality USV recordings. The boxes’ bottoms were covered with Forest-Land wood shavings (Multifit [J1230123], Germany), providing a dark, contrasting background. Lab chow was freely available on the boxes’ floors. All 6 boxes were placed in the same animal colony room dedicated to this experiment.
Procedure and experimental design
Two unfamiliar rats (housed previously in different cages) of matched body weight (± 5 g) were placed in a custom box and remained there undisturbed for 6 days, i.e., until the end of the experiment. Water supply head entries were monitored continuously, while video and audio were recorded for 20 minutes daily in the dark-to-light transition sessions (05:50–06:10) and for 20 minutes in the light-to-dark transition sessions (17:50–18:10). The videos (mp4, H.264 encoded, 640 x 480 pixels, variable bitrate, image quality 3) were recorded at 12 frames per second (fps), while audio data were recorded at 250,000 Hz.
As this work should be regarded as exploratory, we did not perform a sample size analysis. Because we did not investigate differences among treatment groups, the experimental protocol involved no randomization or blinding.
Digital workflow
We provide a detailed workflow allowing easy reproduction of the steps leading to the semi-automated analysis of behavioral and USV data. While the DeepLabCut https://github.com/DeepLabCut, SimBA https://github.com/sgoldenlab/simba, and DeepSqueak https://github.com/DrCoffey/DeepSqueak websites offer comprehensive details on the installation and use of the respective toolkits, we describe our workflow as an easy step-by-step guide for less computer-oriented researchers.
DeepLabCut frames’ labeling, training, and network evaluation
We chose 37 random videos (Fig 1) and labeled 582 random images, marking 12 body parts of rat A and 12 body parts of rat B (nose, left eye, right eye, head, back, pelvis, anogenital area, left shoulder, right shoulder, middle, tail middle, and tail end).
[Figure omitted. See PDF.]
Care was taken to achieve as much variety in behavior, posture, individuals, and backgrounds as possible within the dataset, as suggested by Hardin and Schlupp [17]. Labeling was done over several days using DeepLabCut’s “napari” plugin on a desktop PC with Windows 11 equipped with an Nvidia RTX 4090 graphics card, running the GUI version 2.3.5 of DeepLabCut on Python 3.8.16.
To achieve the best possible model, we trained several DeepLabCut “shuffles”, gradually increasing the number of annotated frames, eliminating badly recognized frames, and varying the number of iterations. We finally trained the DeepLabCut model with 200,000 iterations. Other DeepLabCut variables were set at their default values: default_net_type: dlcrnet_ms5, default_augmenter: multi-animal-imgaug, and default_track_method: ellipse. This was done iteratively over several weeks on an Nvidia DGX A100 station running Ubuntu 22 Linux, Python 3.9.15, IPython 8.6.0, and DeepLabCut version 2.3rc2.
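For orientation, the core of this stage can be expressed in a few DeepLabCut API calls. The sketch below is a minimal reconstruction assuming DeepLabCut 2.3.x; the config path, video path, and shuffle handling are illustrative placeholders, not the exact command history used here.

```python
# Minimal sketch of the DeepLabCut calls behind this stage (DeepLabCut 2.3.x);
# the config path, video path, and shuffle number are hypothetical placeholders.
import deeplabcut

config = "/data/rats-twilight/config.yaml"  # hypothetical project config

# Build the multi-animal training dataset from the labeled frames.
deeplabcut.create_multianimaltraining_dataset(config)

# Train and evaluate one "shuffle"; several such variants were compared.
deeplabcut.train_network(config, shuffle=37, maxiters=200000)
deeplabcut.evaluate_network(config, Shuffles=[37], plotting=True)

# Apply the trained network to an experimental video.
deeplabcut.analyze_videos(config, ["/data/videos/boxA_2023-05-05.mp4"],
                          shuffle=37, auto_track=True)
```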
As with the other variants, shuffle #37 (the DeepLabCut neural network) with snapshot index “-4” (170,000 iterations) was followed by network evaluation (Fig 2). Shuffle #37 was chosen as the final model because it displayed the lowest train and test errors, 3.45 and 5.96 pixels, respectively, with a p-cutoff of 0.6. As a check point, the videos analyzed with shuffle #37 were played on a computer screen at reduced speed with the free mpv program https://mpv.io/. We noted that DeepLabCut marked the body parts the way a human observer would, and that incidents of interchanging body part(s) between rats or of “losing” an animal from view were sporadic.
[Figure omitted. See PDF.]
As a result, DeepLabCut analysis provided videos with the “skeletons” and numerical data (CSV files) representing every body part’s position in space and time; all these files were used by the SimBA toolkit in the next steps.
Post-DeepLabCut processing with SimBA
Using shuffle #37, we analyzed 16 SimBA training videos, randomly chosen from the 39 videos of the dark-to-light transition sessions and the 33 videos of the light-to-dark transition sessions. We ran the SimBA ver. 1.87 GUI, loaded the initialization file, and imported the CSVs with INTERPOLATION METHOD: Body Parts Nearest and SMOOTHING: NONE. We skipped the OUTLIER CORRECTION.
Extracting SimBA’s “features” with two custom scripts
In the next step, we processed the input CSVs using Python 3.6 scripts to save a custom set of “features”, i.e., a complex assembly of measures (e.g., distances, movements) used to recognize various relationships between body parts. Since this approach differs from SimBA’s default workflow, it is described in more detail.
Our initial attempts to use the default set of features offered by the standard SimBA workflow failed to provide decent classifiers. SimBA can compute an additional set of features, including, among others, angles and convex hulls. However, even the enlarged set of features did not provide well-performing classifiers, and its analysis often hung the program due to the computer’s memory limits. For these reasons, and prompted by the work of Lapp et al. [18], we decided to create a custom, restricted set of features of interest.
The CSVs located in SimBA’s "outlier corrected movement location" folder were first processed by two custom scripts.
Dr. Simon Nilsson’s polygons script
Script01, kindly provided by Dr. Simon Nilsson and slightly modified, produced the “polygons” features (“polygon_pct_overlap” and “difference_area”). These two features served mainly to detect how close the animals were to each other; see Fig 3. The script also filled in missing body parts using Python’s pandas interpolation, which “guessed” the positions of the missing body parts and filled them in where possible. Such interpolation of body-part positions was necessary because the random forest machine learning approach requires that no cells be empty. Any remaining missing body parts were filled with 0 using another script02.
[Figure omitted. See PDF.]
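To make the “polygons” idea concrete, the sketch below shows how such features can be computed with the shapely library. It is our reconstruction under stated assumptions (hypothetical file path and column names, a subset of body parts), not Dr. Nilsson’s original script01.

```python
import pandas as pd
from shapely.geometry import MultiPoint

PARTS = ["nose", "head", "back", "pelvis", "anogenital"]  # subset, for brevity

# pandas interpolation "guesses" missing body-part positions where possible;
# remaining gaps are filled with 0, as the random forest requires complete cells.
df = pd.read_csv("outlier_corrected/boxA_2023-05-05.csv")  # hypothetical path
df = df.interpolate(method="linear", limit_direction="both").fillna(0)

def hull(row, rat):
    """Convex hull around one rat's tracked points in a single frame."""
    return MultiPoint([(row[f"{rat}_{p}_x"], row[f"{rat}_{p}_y"])
                       for p in PARTS]).convex_hull

def polygon_features(row):
    """Per-frame hull overlap (%) and area difference of the two rats."""
    a, b = hull(row, "A"), hull(row, "B")
    union = a.union(b).area
    overlap = 100 * a.intersection(b).area / union if union else 0.0
    return pd.Series({"polygon_pct_overlap": overlap,
                      "difference_area": abs(a.area - b.area)})

features = df.apply(polygon_features, axis=1)
```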
The custom features script based on Dr. Hannah Lapp’s "Amber" project
The next script03 was based on the work of Lapp et al. [18], fully described at https://github.com/lapphe/AMBER-pipeline/blob/main/AMBER_pose_estimation.py.
We modified it to suit the present experimental conditions, that is, to analyze the relationships of two similar-looking rats rather than of a dam and her offspring. The script calculated several features, including the whole-body, head, and anogenital-area “centroids”, that is, the x, y coordinates indicating the center of a given body part.
The genuine advantage of analyzing centroids (rather than solely individual body-part points) is that even if a given body part was temporarily missing due to occlusion by the other animal, the centroids were almost always detected and could serve for calculating other features.
These derived features included distances between different animals’ body parts, for example, between the heads (likely reflecting nose contacts and sniffing behavior) or between the head and the anogenital region (likely indicative of anogenital sniffing). The detection of other behaviors, like mounting, crawling, nosing, and following, also depended on the distances between the centroids of different animals. The script also calculated distances between body parts of the same animal (useful for determining rearing and self-grooming behaviors).
While the centroids were saved as pairs of x and y coordinates, functionally similar measures were the convex hulls, representing the area of an animal’s whole body (likely useful for detecting rearing behavior) or head (which could change its shape when the animal’s nose was not visible due to sniffing or self-grooming). For the same reason, we programmed the script to determine movements, including 1-second rolling movements (likely useful for detecting following behavior), as well as the angles between specific body parts (likely helpful in identifying rearing behavior). Finally, the script saved the detection probabilities, as these helped to identify behaviors that were less likely to be detected (e.g., during fast movement or when one animal occluded another while mounting or crawling). In total, the script computed 109 custom features.
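As an illustration of the centroid logic, the sketch below computes a head centroid per rat, the inter-animal head distance, and a 1-second rolling movement. The input file and column names are hypothetical; the actual script03 follows the AMBER pipeline cited above.

```python
import numpy as np
import pandas as pd

FPS = 12
HEAD_PARTS = ["nose", "left_eye", "right_eye", "head"]

df = pd.read_csv("features_in_progress/boxA_2023-05-05.csv")  # hypothetical

def head_centroid(rat):
    """Mean x, y of the head-related points of one rat, per frame."""
    xs = df[[f"{rat}_{p}_x" for p in HEAD_PARTS]].mean(axis=1)
    ys = df[[f"{rat}_{p}_y" for p in HEAD_PARTS]].mean(axis=1)
    return xs, ys

ax, ay = head_centroid("A")
bx, by = head_centroid("B")

# Head-to-head distance between animals (a proxy for nose contact and sniffing).
df["head_centroid_distance"] = np.hypot(ax - bx, ay - by)

# 1-second rolling movement of rat A's head centroid (a proxy for following).
df["A_head_movement_1s"] = np.hypot(ax.diff(), ay.diff()).rolling(FPS).sum()
```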
The next script04 allowed for visualization of the features saved in the previous step (Fig 4). As a check point, the videos were inspected in slow motion.
[Figure omitted. See PDF.]
The circles, triangles, and rectangles correspond to the whole-body, head, and anogenital "centroids", respectively. The larger elements represent the actual position; the smaller elements represent the 1-second rolling mean position. Red text indicates that a given feature was active, for instance, that given body parts were closer than the threshold, or in movement.
Merging the polygons with Amber-like features into one set
Unnecessary columns were removed from the CSVs using the next script05, so that only the 109 feature columns were left. Then, the CSVs with Dr. Simon Nilsson’s 2 polygon features and the CSVs with the 109 Amber-like features were merged using another script06, which also adjusted the number of rows in every CSV to the minimal common number. This created the final "features extracted" set of CSVs, which was copied to the "features extracted" folder.
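A minimal sketch of this merging step, assuming hypothetical folder and file names: the two feature sets are truncated to the shorter one and concatenated column-wise.

```python
import pandas as pd

poly = pd.read_csv("polygons/boxA_2023-05-05.csv")   # 2 polygon features
amber = pd.read_csv("amber/boxA_2023-05-05.csv")     # 109 Amber-like features

n = min(len(poly), len(amber))  # adjust every CSV to the minimal row count
merged = pd.concat([amber.iloc[:n].reset_index(drop=True),
                    poly.iloc[:n].reset_index(drop=True)], axis=1)
merged.to_csv("project/csv/features_extracted/boxA_2023-05-05.csv", index=False)
```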
SimBA training
We analyzed the following 11 behavioral categories:
1. adjacent lying: side-by-side contact or “huddling” (see [19, 20]);
2. anogenital sniffing: one rat sniffing the anogenital region of the conspecific;
3. crawling: one rat moving over or under the conspecific;
4. fighting: mostly aggressive grooming, i.e., one rat chasing another and then, while pinning down the conspecific or holding it with the forepaws, licking or chewing its fur, or punching (see [14, 15, 21–23]);
5. following: active movement of the two individuals, one chasing and approaching the other;
6. grooming: also known as allogrooming; cleaning the fur or amicably scratching, licking, and chewing the fur of the conspecific while motionless [22];
7. mounting: climbing or standing on the back of the conspecific;
8. nosing: rats touching each other with their noses while stretching their bodies slightly [24];
9. rearing: one or both animals standing on their hind legs;
10. self-grooming: cleaning the fur or scratching; rapid movements of the head towards the animal’s own body [24]; and
11. sniffing: sniffing or touching the body of the conspecific, except the anogenital region.
The choice of these behaviors was based on a number of ethological observations [1, 7, 8, 14, 15, 21–24] and represents a highly characteristic set of behaviors of same-sex laboratory rats in dyadic encounters. Representative examples of the 11 scored behavioral categories are shown in Fig 5.
[Figure omitted. See PDF.]
Having the custom set of 2 + 109 features, it was possible to train the SimBA models (classifiers). We used 16 randomly chosen videos, some of short duration but rich in a given social behavior. Because we had labeled frames with the behaviors earlier, using the default set of features, the CSV columns with the individual classifiers displaying 1 (behavior present) or 0 (behavior absent), stored in the "targets inserted" training folder, were combined with the CSVs stored in the "features extracted" folder using the next script07.
Following training, the examination of classifiers’ performance using summary precision curves (Fig 6) with the custom script08 revealed decent F1 scores and allowed for setting the initial detection thresholds; see Table 1.
[Figure omitted. See PDF.]
This detection threshold was initially set in SimBA’s INI file to examine the classifier’s performance.
[Figure omitted. See PDF.]
Every classifier was then iteratively assigned its detection threshold (how sure the program must be to classify a given frame as containing a given behavior). These values were tweaked by inspecting the resulting videos, to provide even better detections. Each classifier was assigned a minimal bout length of 0, so that a behavior present in a single frame was counted and was not required to last longer than 1/12 s. This allowed for a precise assessment of behaviors’ duration but not of the number of episodes of a given behavioral category; see Table 2.
[Figure omitted. See PDF.]
At a given time frame, a behavior classified as absent (0) or present (1) was annotated, together with the presence or absence of a USV call type and their co-occurrence.
The classifiers were created using the random forest approach, the entropy criterion, 20% of the training data, and no under- or over-sampling parameters.
SimBA testing
With well-performing classifiers at hand, the testing videos were examined for the specific behaviors. The next script09 was run to store the annotated videos in a custom folder. As a check point, we examined the ensemble predictions in a custom way to observe how the classifiers detected the behaviors (Fig 7).
[Figure omitted. See PDF.]
If the probability of detection was above the detection threshold set in the INI file, its presence was marked in red. Note the decent performance of the classifier even as the rats were destroying their home cage.
At this stage, all 39 annotated videos of the dark-to-light transition sessions and 36 annotated videos of the light-to-dark transition sessions were stored; their number was later reduced due to the lack of some corresponding audio files at the final stage of processing.
Analysis of ultrasonic vocalizations
The analysis of audio files was conducted off-line using DeepSqueak version 3.1.0, the program developed by Coffey et al. [16], running on MATLAB (MathWorks, Inc., Natick, MA), version R2023a. The spectrograms were created using a fast Fourier transform (FFT) with a window size of 0.0032 s, an nfft of 0.0032 s, and 90% overlap. DeepSqueak uses a convolutional neural network for USV detection. The 50- and 22-kHz USVs were detected separately.
The 50-kHz USVs were detected using the multi-detection function with its default detection network “Rat Detector YOLO R1”, frequency cutoff high: 95 kHz, frequency cutoff low: 35 kHz, and score threshold: 0 (Precision = 0.97, Recall = 0.99, F1 = 0.98).
The 22-kHz USVs were detected using DeepSqueak’s default detection network “Long Rat Detector Yolo R1”, frequency cutoff high: 35 kHz, frequency cutoff low: 10 kHz, and score threshold: 0.
The contour thresholds for all audio files were set to 0.215 for entropy and 0.825 for amplitude percentile. The detection files were then processed with the Post Hoc Denoiser to eliminate false positives. The Post Hoc Denoiser is a neural network that distinguishes USVs from common types of background noise and from alarm-call reflections. The noise networks were trained on a total of 216 samples (3 audio files from the present experiment). For training purposes, the “Denoiser” samples were manually labeled as calls or noise (validation accuracy = 81.48%, validation frequency = 10 iterations).
Following the Post Hoc Denoising stage, all USV recordings were manually inspected, and inaccurate detections were corrected. DeepSqueak’s detections, including inaccuracies (false positive and false negative detections), were compared with human detections of all USV calls using simple linear regression across all audio files.
The results of the regression analyses, shown in Fig 8, revealed significant similarities between the human and DeepSqueak detections. The R² values for the 50-kHz and 22-kHz calls were 0.99 and 0.98, respectively (both P values < 0.0001).
[Figure omitted. See PDF.]
Each point represents the number of USVs emitted by a pair of rats in one of the 6 boxes, recorded in the dark-to-light and light-to-dark transition sessions over the 6 days of the experiment (6 x 2 x 6 ≈ 72 audio files). As not all recordings were available, 34 light phase transition and 37 dark phase transition audio files were analyzed, totaling 71 measurements.
Based on their acoustic features, the 50-kHz calls were divided into the following types: short calls (call duration ≤ 12 ms), flat calls (bandwidth ≤ 6 kHz and call duration > 12 ms), and frequency-modulated calls (bandwidth > 6 kHz and call duration > 12 ms) [25]. These calls were not subcategorized into steps, trills, etc. [26].
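The typing rules reduce to two thresholds and can be stated compactly; the function below simply restates the criteria given above (durations in ms, bandwidth in kHz).

```python
def classify_50khz_call(duration_ms: float, bandwidth_khz: float) -> str:
    """Type a 50-kHz call by the duration/bandwidth criteria of [25]."""
    if duration_ms <= 12:
        return "SHORT"
    return "FLAT" if bandwidth_khz <= 6 else "FRQ MODUL."

assert classify_50khz_call(10, 8) == "SHORT"        # short call
assert classify_50khz_call(30, 4) == "FLAT"         # flat call
assert classify_50khz_call(30, 9) == "FRQ MODUL."   # frequency-modulated call
```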
Analysis of associations between behavioral and ultrasonic categories
Beyond the analysis of rats’ activity at the two twilight periods, this work also addressed a more global question: if ultrasonic vocalizations play a substantial role in rats’ social life, how strongly are they associated with particular behaviors and with other call types?
To detect the behavior-USV call type and the USV call type-USV call type associations, i.e., their co-occurrences, the analysis of the social behavior videos (i.e., SimBA’s “machine results” CSVs) and of the USV categories (DeepSqueak’s output) had to be temporally matched as precisely as possible.
As the videos were recorded at 12 frames per second (FPS), one frame of 1/12 s, represented as a single row in the CSV files, constituted a single temporal unit and could contain behavior(s), USV call type(s), both, or none; see Table 2.
The construction of this dataset took several steps. First, the video and audio files stored on different computers had to be identified, and their filenames had to be matched.
In this work, the primary file containing the USV analysis (DeepSqueak output) had columns including the date of observation, box name, exact start and end times relative to the beginning of the recording, the relevant *.wav file names and their paths, as well as the identified category (FLAT, SHORT, FREQUENCY MODULATED, and ALARM). This file contained over 18,000 rows.
These rows first had to be filled with the corresponding SimBA CSV video filenames.
Matching the video and audio filenames
In an ideal technical scenario, as with video recording on a smartphone, the video and a correspondingly high-quality (250 kHz) audio track would be recorded simultaneously and packaged together. This was not the case in the present experimental setup: the video and audio recordings were made by different devices, stored on different computers/NAS stations, and differed in their filenames and in duration by more than several seconds, despite well-synchronized computer clocks.
Thus, the next script10 analyzed the duration of the videos stored in a custom folder and, based on the FPS stored in SimBA’s video_info.csv file, output frame numbers as well as “Begin frame time (s)” and “End frame time (s)” columns to separate CSVs in a new folder. The next script11 added USV-type columns and filled them with zeros. Two helper scripts12,13 matched specific audio and video filenames.
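A sketch of the script10/11 logic, under the assumption that OpenCV is used to read the frame count (the original scripts may have used another method); the file names are illustrative.

```python
import cv2
import pandas as pd

fps = 12  # in the actual workflow, read from SimBA's video_info.csv

cap = cv2.VideoCapture("videos/boxA_2023-05-05.mp4")  # hypothetical path
n_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
cap.release()

frames = pd.DataFrame({"frame": range(n_frames)})
frames["Begin frame time (s)"] = frames["frame"] / fps
frames["End frame time (s)"] = (frames["frame"] + 1) / fps
for call_type in ["ALARM", "FLAT", "FRQ MODUL.", "SHORT"]:
    frames[call_type] = 0  # script11-like step: USV columns filled with zeros
frames.to_csv("frame_times/boxA_2023-05-05.csv", index=False)
```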
Creating an index file listing all audio and video filenames
Following several manual adjustments, including the deletion of duplicated rows, the index file was created with the next script14 and updated with columns indicating the start and end of the USV *.wav audio recordings as well as their duration in seconds.
Determining the duration of audio files
The next script15 read the updated index and SimBA’s video_info.csv files. Knowing the video’s FPS (needed for constructing the N frames) and the precise date and time of the start and end of the audio recordings, it output a novel set of CSV files, each with a number of rows equal to the audio duration x FPS.
For instance, the CSV corresponding to an audio recording of 1,200.005 s had 14,400 rows, while the CSV corresponding to an audio recording of 1,202.168 s (~2 seconds longer) had 14,426 rows (26 more rows). In addition, the script created novel columns (Probability_ALARM, ALARM, Probability_FLAT, FLAT, Probability_FRQ MODUL., FRQ MODUL., Probability_SHORT, SHORT) and filled their cells with zeros. In short, thanks to this step, we knew the precise beginning and end of every audio file expressed as frame numbers.
The next script16 updated the audio CSVs so that if a specific call type was present at a given time of observation, the corresponding frame was marked with “1”. The resulting CSVs thus had "1" in the rows where a call was detected; see Table 2 for an explanation. This step was needed to create USV datasets based on the DeepSqueak output, because SimBA is intended to train classifiers like "sniffing" rather than one-frame indices such as "ALARM USV present". Of note, we believe that the script could be adapted to extend SimBA’s CSVs with any other numeric data (electrophysiological, telemetry, microdialysis, etc.) that need to be synchronized with the video recordings.
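A sketch of the scripts15–16 logic: an audio-frame table with duration × FPS rows is created, and the relevant call-type cells are set to 1 for frames covered by each detection. The DeepSqueak output column names used here are assumptions.

```python
import math
import pandas as pd

FPS = 12
duration_s = 1200.005                 # from the audio header; gives 14,400 rows
n_rows = int(duration_s * FPS)

audio = pd.DataFrame(0, index=range(n_rows),
                     columns=["ALARM", "FLAT", "FRQ MODUL.", "SHORT"])

detections = pd.read_csv("deepsqueak/boxA_2023-05-05.csv")  # hypothetical export
for _, call in detections.iterrows():
    first = int(call["Begin Time (s)"] * FPS)               # assumed column names
    last = min(math.ceil(call["End Time (s)"] * FPS), n_rows)
    audio.loc[first:last - 1, call["Call Type"]] = 1        # mark covered frames
```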
Updating the index file with the duration of individual videos
The next script17 filled out the video information columns in the updated index file. This index file was processed by the next script18, which compared the durations of the audio and video files and indicated whether the start and/or end of the longer file needed to be cut so that a) the video and audio started at exactly the same moment and b) were of exactly the same duration. This script saved the final index CSV file.
Adjusting the number of frames in audio CSVs and in video CSVs
Knowing the adjustments to be made, the CSVs representing the audio data were saved with the next script19. The CSVs representing the videos were optionally processed by two scripts20,21 and then adjusted by script22.
As a check point ensuring that the video and audio lengths matched, the precision of the adjustments was visualized with the next script23, which showed a longest mismatch of 1 second (12 frames); for media lasting 1,200 seconds, this appeared to be an acceptable value.
Saving the final SimBA’s “machine_results” CSVs
The final set of CSVs, with frames containing both SimBA’s detected behaviors and the novel USV information, adjusted to the same duration, was stored in a temporary folder with the next script24.
Adding a human-hearable audio track to video files of the same start time and duration
Since we also aimed to merge the videos with human-hearable audio, the videos’ start and/or end were cut, if necessary, with the next script25, and the videos were stored in a novel temporary folder.
Then, using a custom ffmpeg function, the audio *.wav files were converted to human-hearable *.aac files with the next script26 and stored in a separate folder. The *.aac files’ start and/or end were then cut, if necessary, with the next script27.
As a check point, the shortened videos received superimposed labels indicating the behavioral categories and USV types, thanks to the next script28.
The final videos, merged with the corresponding audio tracks (Fig 9), were created with the next script29, which stored the audible videos in a novel folder.
[Figure omitted. See PDF.]
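The conversion and merging steps can be illustrated with two plain ffmpeg invocations. This is a simplified sketch; the exact ffmpeg options of scripts26 and 29, including how the 250-kHz material was rendered audible, are assumptions.

```python
import subprocess

# script26-like step: convert the recording to AAC at an audible sample rate
# (the target rate is an assumption; AAC does not support 250 kHz directly).
subprocess.run(["ffmpeg", "-y", "-i", "audio/boxA_2023-05-05.wav",
                "-ar", "44100", "-c:a", "aac",
                "aac/boxA_2023-05-05.aac"], check=True)

# script29-like step: mux the audio with the trimmed video without re-encoding.
subprocess.run(["ffmpeg", "-y",
                "-i", "video/boxA_2023-05-05.mp4",
                "-i", "aac/boxA_2023-05-05.aac",
                "-map", "0:v", "-map", "1:a",
                "-c:v", "copy", "-c:a", "copy", "-shortest",
                "merged/boxA_2023-05-05.mp4"], check=True)
```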
Analysis of the number and duration of behaviors and ultrasonic calls over 1-minute epochs and days of observation
The scripts used in the following sections were created in a Python 3.12 environment.
The database of behaviors, USV types, and their co-occurrences (an Excel file), presented in 20 one-minute epochs, was produced by the next script30. While SimBA offers a way to "stitch" the episodes into “bouts”, we were interested in recalculating them as time, because SimBA’s INI file was defined to count every episode of nosing, sniffing, etc., as one frame (see Table 2). The episodes were converted manually into durations (seconds of "nosing", "sniffing", etc., per 1-minute epoch), i.e., divided by FPS = 12.
We decided to use 1-minute "epochs" to simplify data presentation, as the whole observation lasted for 20 minutes. For the dark-to-light and light-to-dark transition sessions, minutes 1–10 represented the dark and light phases, respectively. If a behavior was observed continuously throughout an entire 1-minute epoch, it could be represented by up to 720 frames (60 seconds x 12 frames per second). In practice, only the ’adjacent lying’ behavior was displayed continuously for a full minute.
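In code, the epoch conversion amounts to grouping frame rows by minute and dividing the summed frame counts by FPS; the sketch below assumes hypothetical column names in a SimBA "machine results" CSV.

```python
import pandas as pd

FPS = 12
df = pd.read_csv("machine_results/boxA_2023-05-05.csv")  # one row per frame

df["epoch"] = df.index // (60 * FPS) + 1                 # 1-minute epochs 1..20
behaviors = ["adjacent lying", "nosing", "sniffing"]     # etc.; assumed names
seconds_per_epoch = df.groupby("epoch")[behaviors].sum() / FPS
# A behavior shown continuously for a whole epoch yields 720 frames = 60 s.
```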
In short, the script read the CSVs listed in the index file, assigned behaviors and USV call types to the relevant 1-minute epochs, and saved the results in another Excel file. The next script31 processed that file and saved it as a novel Excel file, assigning actual dates to the experimental days, since video and audio data were not available for all 6 days (Figs 10 and 11).
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
The next script32 read the novel Excel file and output individual small CSVs with the behavior and USV category times in seconds per epoch and per experimental day.
Plotting 3-D graphs
The next script33 read the small CSVs from the previous step and plotted individual 3-D graphs, examples of which are shown in Figs 12 and 13.
[Figure omitted. See PDF.]
Blue and yellowish parts represent the dark and light phases, respectively; minute 11 is time 06:00.
[Figure omitted. See PDF.]
Yellowish and blue parts represent the light and dark phases, respectively; minute 11 is time 18:00.
Statistics
We hypothesized that at the very moment the lights in the animal housing boxes were turned off, the intensity of social behaviors would increase. The behavioral and ultrasonic data could be analyzed in various ways, as they were collected every 1/12 s for 20 minutes in the dark-to-light and light-to-dark transition sessions over 4–6 days. However, the sparsity of the data expressed as individual frames precluded their analysis with repeated-measures ANOVA. Moreover, a detailed analysis of each frame would involve 10 min x 720 = 7,200 frames before and 7,200 frames after the light change.
When inspecting the 3-D figures plotted in 1-minute intervals, it was apparent that there was a clear difference before and after the light change. To further simplify the statistical analyses and graphs, for every box (A-F), the data recorded before and after the light change were averaged. Similarly, since not all data existed for all days, for every box, the data of the early phase (days 1–3) and of the late phase (days 4–6) were averaged. While not ideal, this step allowed for the construction of 2 datasets represented as epochs (before/after light change) and 2 datasets represented as the early (days 1–3) and late (days 4–6) phases of the experiment for each box.
Thus, the data expressed as the duration (s) of the 11 behavioral and 4 ultrasonic categories were first checked for normality using the Anderson-Darling, D’Agostino & Pearson, Shapiro-Wilk, and Kolmogorov-Smirnov tests. As most of them failed to show a normal distribution, the data were rank-transformed [27] and subjected to two-way repeated-measures ANOVA with epoch (before/after light change) and experiment phase (early/late) as within-subject factors; the boxes (A-F) served as subjects. To determine whether the change in lighting conditions affected the intensity of social behavior and of the USV categories, and whether the experiment phase played a role, 15 x 2 = 30 repeated-measures analyses of variance were performed using IBM SPSS ver. 29. The alpha level was set at 0.05.
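The analyses were run in SPSS; for readers working in Python, an equivalent rank-transform followed by a two-way repeated-measures ANOVA could look like the sketch below (using the pingouin package; file and column names are hypothetical).

```python
import pandas as pd
import pingouin as pg
from scipy.stats import rankdata

# One behavioral category; columns: box, epoch (before/after light change),
# phase (early/late days), value (duration in seconds).
df = pd.read_csv("fighting_duration.csv")   # hypothetical aggregated data
df["ranked"] = rankdata(df["value"])        # rank-transform across all cells

aov = pg.rm_anova(data=df, dv="ranked", within=["epoch", "phase"],
                  subject="box", detailed=True)  # boxes A-F as subjects
print(aov[["Source", "F", "p-unc"]])
```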
Analysis of the associations between behaviors and USV call types and between types of USVs
We hypothesized that certain behaviors (like fighting) are associated with certain USV call types (like aggressive 22-kHz calls). The next script34 saved another Excel file, with data aggregated for days 1–3 and 4–6 and for epochs 1–10 and 11–20. The next script35 read this Excel file and created the contact sheets with all co-occurrences. These plots represented the sums of a behavior and a USV type, or of two USV types. For instance, if in a given 1-minute epoch there was 1 "short" USV call and 4 "frequency modulated" USV calls, the sum of their co-occurrences was 5.
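The co-occurrence measure itself is simply additive, as in the hypothetical two-column example below.

```python
import pandas as pd

# One row per 1-minute epoch with per-category counts (hypothetical file).
epochs = pd.read_csv("epoch_counts.csv")
# E.g., 1 SHORT call and 4 FRQ MODUL. calls in an epoch give a co-occurrence of 5.
epochs["SHORT_x_FRQ_MODUL"] = epochs["SHORT"] + epochs["FRQ MODUL."]
```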
Quality system
At the time this study was performed, the Department of Behavioral Neuroscience and Drug Development held an EQIPD [28] (see https://go-eqipd.org/) certificate, no. PL-INS-DBNDD-11-2021-2 (Nov 12th, 2021 to Nov 30th, 2024); see https://paasp.net/new-eqipd-certified-research-unit/ and https://go-eqipd.org/about-eqipd/eqipd-certified-research-groups/. Thus, this study followed the EQIPD guidelines (https://eqipd-toolbox.paasp.net/wiki/EQIPD_Quality_System); for instance, it was carefully designed in advance and documented in detail at every stage of progress.
Results
General activity of rats as measured by continuous monitoring of head entries into the water supply
The daily pattern of the rats’ general activity showed an increased number of head entries in the dark phase compared to the light phase. Two-way ANOVA with the data collapsed by dark/light phase demonstrated a difference between the phases: F(1,5314) = 236; P < 0.001, Fig 14.
[Figure omitted. See PDF.]
Each point represents the mean +/- SEM of 3–6 experimental boxes monitored over 5–7 days of the experiment. This measure is a proxy of the animals’ activity, likely but not necessarily reflecting water drinking, as we observed rats placing their heads into the lickometer while sniffing, or just lying near the water tube. The black horizontal bars represent the dark phase (18:00–06:00). The yellow bar shows the 20-min dark-to-light transition sessions, beginning at 05:50 and ending at 06:10; the central dotted line indicates 06:00, the end of darkness. The blue bar shows the 20-min light-to-dark transition sessions, beginning at 17:50 and ending at 18:10; the central dotted line indicates 18:00, the beginning of darkness. Pairs of rats were monitored between 2023-05-05 and 2023-05-15.
A separate ANOVA carried out on four 10-minute epochs (05:50–06:00, 06:00–06:10, 17:50–18:00, and 18:00–18:10) also demonstrated differences among the groups: F(3,146) = 12.83; P < 0.001, with the data of the 06:00–06:10 epoch differing from those of the 05:50–06:00 epoch, Fig 15.
[Figure omitted. See PDF.]
Symbols: *** P<0.001, 06:00–06:10 vs. 05:50–06:00, Tukey HSD.
Duration of rats’ social behaviors and ultrasonic calls at the beginning of the light phase
In the dark-to-light transition sessions (Fig 16), we observed lights-on-induced increases in anogenital sniffing, adjacent lying, and rearing behavior. In addition, the rats displayed less grooming and rearing, as well as fewer alarm, frequency-modulated, and short calls, on days 4–6 than on days 1–3 (Table 3).
[Figure omitted. See PDF.]
The blue bars indicate the end (05:50–06:00; lights off) of the dark phase and the yellow bars indicate the beginning (06:00–06:10; lights on) of the light phase. The early (days 1–3) and late (days 4–6) phases of the experiment are shown separately. Plots with a shaded background indicate that the early/late days factor (horizontal line) and/or the lights on/off factor (ON * OFF) was significant (P < 0.05). Data are presented as medians +/- interquartile ranges.
[Figure omitted. See PDF.]
Duration of rats’ social behaviors and ultrasonic calls at the end of the light phase
In the light-to-dark transition sessions (Fig 17), we observed lights-off-induced increases in fighting, mounting, crawling, and rearing (but not in anogenital sniffing and adjacent lying, as was observed in the light phase transition sessions). Additionally, there were increases in alarm, flat, and short calls, which were not observed during the light transition sessions. Further, the rats displayed fewer flat calls (a decrease not observed in the light transition sessions), as well as fewer frequency-modulated and short calls (as in the light transition sessions), on days 4–6 than on days 1–3 (Table 4).
[Figure omitted. See PDF.]
The yellow bars indicate the end (17:50–18:00; lights on) of the light phase and the blue bars indicate the beginning (18:00–18:10; lights off) of the dark phase. The early (days 1–3) and late (days 4–6) phases of the experiment are shown separately. Plots with a shaded background indicate that the early/late days factor (horizontal line) and/or the lights on/off factor (ON * OFF) was significant (P < 0.05). Data are presented as medians +/- interquartile ranges.
[Figure omitted. See PDF.]
Associations between ultrasonic calls and social behaviors at the beginning of the light phase in pairs of rats observed at twilight (dark/light phase) over the early (days 1–3) and late (days 4–6) phases of the experiment
The analyses of the associations between ultrasonic calls and social behaviors can be summarized as follows. During the dark-to-light transition sessions, the alarm calls (Fig 18) were associated with flat and frequency-modulated calls. The alarm (Fig 18), flat (Fig 19), frequency-modulated (Fig 20), and short (Fig 21) calls were associated with rearing behavior, mainly at the beginning of the light phase.
[Figure omitted. See PDF.]
As shown, these calls were associated with the flat and frequency-modulated calls during the dark phase, especially on the early days. The alarm calls were also associated with rearing behavior during the dark phase of the late days.
[Figure omitted. See PDF.]
As shown, these calls were associated with the rearing behavior during the dark and light phases of the late days.
[Figure omitted. See PDF.]
As shown, these calls were associated with the rearing behavior during the dark and light phases of the late days.
[Figure omitted. See PDF.]
As shown, these calls were associated with the rearing behavior.
Associations between ultrasonic calls and social behaviors at the beginning of the dark phase in pairs of rats observed at twilight (light/dark phase) over the early (days 1–3) and late (days 4–6) phases of the experiment
The analyses of the associations between ultrasonic calls and social behaviors can be summarized as follows. During the light-to-dark transition sessions, the alarm (Fig 22), flat (Fig 23), and frequency-modulated (Fig 24) calls were associated with other types of calls, but also with sniffing, crawling, fighting, and rearing. Of note, the flat (Fig 23) and short (Fig 25) calls were associated with adjacent lying.
[Figure omitted. See PDF.]
As shown, these calls were associated with the flat and frequency-modulated calls during the dark phase, especially on the early days. The alarm calls were also associated with the crawling, fighting, sniffing, and rearing behaviors during the dark phase of the early days.
[Figure omitted. See PDF.]
These calls were associated mostly with the adjacent lying behavior. We also observed that the flat calls were associated with fighting, rearing, and sniffing behaviors at the beginning of the dark phase on the early days.
[Figure omitted. See PDF.]
These calls were associated with fighting, sniffing, and rearing behaviors at the beginning of the dark phase on the early days. The FM calls were also associated with the alarm calls and with adjacent lying behavior on the early days of the experiment.
[Figure omitted. See PDF.]
These calls were associated mostly with adjacent lying behavior. The short calls were associated with fighting, rearing, and sniffing behaviors at the beginning of the dark phase on the early days of the experiment.
Discussion
Humans display diurnal fluctuations in positive affect. It is unclear when exactly these occur, as reports show the strongest peak in positive affect either in the morning or at night [29, 30]. In wild and laboratory rats, the daily rhythm of activity, measured as the percentage of rats active outside the nest [2], and of positive affect, exemplified by the number of “hedonic” 50-kHz ultrasonic calls, appears to be maximal during the active, lights-off period [5], showing, in addition, early and late activity-cycle peaks. Burgdorf et al. [5] further suggested that the peaks may translate to an ability to coordinate prosocial play behavior right after the start of the wake cycle and to complete prosocial behavior right before its end.
The present work aimed to verify the common observation that laboratory rats intensify their activity when the lights in the animal facility are turned off. As the phenomenon of darkness-induced prosocial activity has an apparent heuristic value, we were curious whether it could be investigated using state-of-the-art digital techniques. Specifically, we examined the nature of the darkness-induced activity to determine whether it affects social life and, if so, which behavioral categories are altered. In addition, we investigated whether darkness-induced activity is associated with ultrasonic vocalizations, the “happy” (50-kHz) and/or “alarm” (22-kHz) calls.
The present study confirms that the animals were more active in the dark phase, as measured by water supply head entries (Fig 14). A more detailed analysis, contrary to our predictions, revealed that the animals did not start to poke into the lickometers right after the lights were turned off. Instead, they significantly increased the water supply head entries right after the lights were turned on (Fig 15). While this phenomenon could be explained by light-induced agitation, it suggests that this simple way of monitoring the animals’ activity is unlikely to reveal darkness-induced activity. The discrepancy between the present results and those of Stryjek et al. [2] could be due to the way the data were collected: while our lickometers monitored every nose poke, with the data then aggregated into 10-minute epochs, Stryjek et al. [2] inspected every fifth photo from those taken once a minute by a video camera.
The analysis of the light-to-dark transition sessions confirms that at the beginning of darkness the rats increased their activity [2], but the increase did not concern the social behavior categories in the expected way; see Fig 17 and the summary Table 5. During the dark phase transition, the social investigatory behaviors, including nosing, following, anogenital sniffing, and sniffing, were not affected. The darkness also did not affect amicable grooming [22] or self-grooming [24]. In contrast, with the lights off, the animals started to display increased fighting, mounting, and crawling behavior, the two latter being associated with and supporting aggression [14, 15, 21–23]. These behaviors were concomitant with several forms of USVs, including alarm, flat, and short calls. Of note, the frequency-modulated “happy” or “hedonic” calls were not affected by the darkness. Together with the increased fighting, the increased 22-kHz alarm calls strongly suggest that when the rats wake up, they start to display aggressive behavior. As expected, we observed no increase in adjacent lying, which, while amicable, is a way for rats to sleep or rest together [19, 20]; logically, this type of behavior would not increase at the beginning of the active phase of the light-dark period. The darkness-induced activity was also reflected in increased rearing behavior, which represents a form of environmental exploration, and in the flat and short calls associated with investigative behavior and general arousal [31].
[Figure omitted. See PDF.]
The two far-right columns show how social life adapts (decreases) on days 4–6 compared to days 1–3 of the experiment. Yellow columns indicate the dark-to-light transition sessions and blue columns indicate the light-to-dark transition sessions.
The analysis of the behaviors recorded in the light-to-dark transition sessions also revealed that the animals adapted to the novel home cages (see the summary Table 5). As stated above, we tested pairs of rats continuously for 6 days in semi-natural conditions that were novel to the animals. The establishment of a social hierarchy takes time [32, 33], and one would predict that some behaviors, especially those related to aggression, would display lower intensity on the later (4–6) than on the early (1–3) days of the experiment. However, in the dark phase transition sessions, we failed to show such adaptation for any behavioral category, though we noted that the 50-kHz USVs were reduced on the late days. This did not concern the 22-kHz alarm calls, associated with aggressive behaviors.
The light phase transitions served as internal control conditions. When the lights were turned on and the animals were supposedly going to sleep, one would expect a decrease in social activities. Indeed, we noted an increase in resting activity (adjacent lying; [19, 20]), consistent with decreased activity at a high light level; see Fig 16 and the summary Table 5. However, as with the dark phase transitions, the exploratory activity (rearing) increased, likely due to the agitation induced by a rapid change in light intensity. Lights-on did not affect any category of USVs. The analysis of the light phase transition sessions also revealed adaptation to the novel home cages: on the later (4–6) days, we observed less grooming and rearing behavior and fewer ultrasonic calls than on the early (1–3) days of the experiment. Of note, the duration of all other behaviors remained unchanged during the dark-to-light transition sessions over the course of the experiment.
The second issue we examined addressed a broader question: if ultrasonic vocalizations play a substantial role in rats’ social life, how strongly are they associated with specific behaviors? For instance, are the 22-kHz "alarm" calls associated with fighting behavior, as is commonly [34, 35] but not universally [36] accepted? Both during the dark-to-light (Fig 18) and the light-to-dark (Fig 22) transition sessions, the alarm calls were indeed associated with fighting, crawling, mounting, and sniffing behavior; see the summary Fig 26. This again supports the notion that the alarm calls communicate aggressive behavior [34, 35]. The 22-kHz calls were also associated with other call types and with rearing behavior, suggesting a complex pattern of ultrasonic communication.
[Figure omitted. See PDF.]
A summary of the associations between Alarm (A), Frequency Modulated (B), Flat (C) and Short (D) ultrasonic calls and social behavior categories presented in Figs 18–25. Blue and yellow lines indicate the light-to-dark and dark-to-light transition sessions’ associations, respectively. The green boxes show other ultrasonic categories associated with the main (red) ultrasonic call type.
The frequency-modulated “happy” or “hedonic” calls were associated with investigatory sniffing and rearing behaviors, as well as with adjacent lying and fighting. This pattern suggests that they co-occur with amicable but also with aggressive behaviors. A similar pattern of associations was noted for the flat and short calls, which were also associated with adjacent lying; see the summary Fig 26. As the non-frequency-modulated flat and short calls were postulated not to convey positive affect [37–41], their presence at times when the rats were resting together is not surprising. Altogether, one may hypothesize that the lights-on stimulus agitates the animals, inducing rearing behavior and ultrasonic vocalizations. This is consistent with the increase in general activity, represented by the water supply head entries (Fig 15). However, the present results do not support the hypothesis that lights-off stimulates prosocial behavior and hedonic calls. This might be due to several reasons. The discrepancy between the present ultrasonic data and the results of Burgdorf et al. [5] could be due to technical reasons, because while we calculated the total time of USVs, Burgdorf et al. [5] measured their number. Alternatively, the animals could have been displaying aggressive behaviors and alarm calls because they were continuously establishing their social hierarchy [33, 42]. However, we did not observe a significant adaptation (decrease) of the fighting behaviors and alarm calls in the light-to-dark transition sessions over the days of the experiment, though it is debatable how long it takes for rats to establish dominance status [43, 44]. As we observed a global decrease in social behavior over the course of the experiment, it is likely that extending the observation period beyond 6 days would further reduce the rats’ social activities. One also cannot exclude that the aggressive behavior is indeed rewarding to the aggressor [45], as not only the alarm but also the flat and short (though not the frequency-modulated) 50-kHz calls were increased. Unfortunately, the present experimental setup did not allow us to distinguish which animal in a pair was calling. This apparent limitation could be overcome by using ultrasonic “camera” devices more sophisticated than ours [46, 47].
The present work provides several novel utilities and observations. An experimental setup involving custom-built, inexpensive boxes allowed for the simultaneous recording of video, audio, and general activity. Using several Python scripts, we were able to merge the ultrasonic information with the videos as precisely as possible, and we believe that a slight modification of the workflow would allow other types of data to be synchronized as well. The videos, supplemented with a human-hearable audio track, allow for direct observation of the rats’ behavior. The DeepSqueak machine learning tool [16], together with other free tools including DeepLabCut (for reviews, see [17, 48]) and the SimBA open-source package [14, 15], allowed for a rapid and objective analysis of the ultrasonic and behavioral data. The DeepSqueak analyses were similar to the results generated by an experienced experimenter and much faster than the human workflow. This laboratory recently communicated a remarkable similarity between rat social behavior annotations made by an experienced researcher and by DeepLabCut [9].
In conclusion, we present a study with an easy-to-replicate, step-by-step workflow allowing objective measurement of rat social behavior and ultrasonic vocalizations. Regardless of the nature of the darkness-induced activity, we speculate that it could be compromised in apathy-like states associated with a depressive-like phenotype. As spontaneous apathy or anergia awaits a valid animal model to be studied in semi-natural conditions, these hypotheses are currently being tested in our laboratory.
Supporting information
S1 File.
https://doi.org/10.1371/journal.pone.0307794.s001
(DOCX)
Acknowledgments
This work could not have been done without the availability of the open-source DeepLabCut https://github.com/DeepLabCut markerless pose-estimation and Simple Behavioral Analysis (SimBA) https://github.com/sgoldenlab/simba toolkits. For these, the authors deeply thank the DeepLabCut Team: Alexander Mathis, Mackenzie Mathis, Tanmay Nath, and Jessy Lauer, and the DeepLabCut Community, as well as the Golden Laboratory: Simon Nilsson and Sam A. Golden. We thank Kevin R. Coffey for DeepSqueak, a deep learning-based system for the detection and analysis of ultrasonic vocalizations. The Cephares https://www.cephares.pl/ project is acknowledged for providing the Nvidia DGX A100 computer station.
References
1. Calhoun JB. The ecology and sociology of the Norway rat. Bethesda: U.S. Department of Health, Education and Welfare, Public Health Service Publication; 1962. 1–288 p.
2. Stryjek R, Mioduszewska B, Spaltabaka-Gędek E, Juszczak GR. Wild Norway Rats Do Not Avoid Predator Scents When Collecting Food in a Familiar Habitat: A Field Study. Scientific Reports. 2018;8(1):9475. pmid:29930280
3. Burn CC. What is it like to be a rat? Rat sensory perception and its implications for experimental design and rat welfare. Applied Animal Behaviour Science. 2008;112(1–2):1–32.
4. Schwarting RKW. Behavioral analysis in laboratory rats: Challenges and usefulness of 50-kHz ultrasonic vocalizations. Neurosci Biobehav Rev. 2023;152:105260. Epub 2023/06/03. pmid:37268181.
5. Burgdorf JS, Vitaterna MH, Olker CJ, Song EJ, Christian EP, Sorensen L, et al. NMDAR activation regulates the daily rhythms of sleep and mood. Sleep. 2019;42(10):zsz135. Epub 2019/09/11. pmid:31504971; PubMed Central PMCID: PMC6783887.
6. Lore R, Flannelly KJ. Rat societies. Scientific American. 1977;5(5):106–16. Epub 1977/05/01. pmid:558650
7. Barnett SA. An analysis of social behaviour in wild rats. Proceedings of the Zoological Society (London). 1958;130:107–52.
8. Whishaw IQ, Kolb B. The behavior of the laboratory rat: a handbook with tests. Oxford University Press; 2005.
9. Popik P, Cyrano E, Piotrowska D, Holuj M, Golebiowska J, Malikowska-Racia N, et al. Effects of ketamine on rat social behavior as analyzed by DeepLabCut and SimBA deep learning algorithms. Front Pharmacol. 2023;14:1329424. Epub 2024/01/25. pmid:38269275; PubMed Central PMCID: PMC10806163.
10. Datta SR, Anderson DJ, Branson K, Perona P, Leifer A. Computational Neuroethology: A Call to Action. Neuron. 2019;104(1):11–24. Epub 2019/10/11. pmid:31600508; PubMed Central PMCID: PMC6981239.
11. Mathis A, Mamidanna P, Cury KM, Abe T, Murthy VN, Mathis MW, et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat Neurosci. 2018;21(9):1281–9. Epub 2018/08/22. pmid:30127430.
12. Nath T, Mathis A, Chen AC, Patel A, Bethge M, Mathis MW. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nature Protocols. 2019;14(7):2152–76. pmid:31227823
13. Lauer J, Zhou M, Ye S, Menegas W, Schneider S, Nath T, et al. Multi-animal pose estimation, identification and tracking with DeepLabCut. Nat Methods. 2022;19(4):496–504. Epub 2022/04/14. pmid:35414125; PubMed Central PMCID: PMC9007739.
14. Nilsson SRO, Goodwin NL, Choong JJ, Hwang S, Wright HR, Norville ZC, et al. Simple Behavioral Analysis (SimBA) – an open source toolkit for computer classification of complex social behaviors in experimental animals. bioRxiv. 2020. https://doi.org/10.1101/2020.04.19.049452
15. Goodwin NL, Choong JJ, Hwang S, Pitts K, Bloom L, Islam A, et al. Simple Behavioral Analysis (SimBA) as a platform for explainable machine learning in behavioral neuroscience. Nature Neuroscience. 2024. Epub 2024/05/23. pmid:38778146.
16. Coffey KR, Marx RG, Neumaier JF. DeepSqueak: a deep learning-based system for detection and analysis of ultrasonic vocalizations. Neuropsychopharmacology. 2019;44(5):859–68. Epub 2019/01/06. pmid:30610191; PubMed Central PMCID: PMC6461910.
17. Hardin A, Schlupp I. Using machine learning and DeepLabCut in animal behavior. acta ethologica. 2022;25(3):125–33.
18. Lapp HE, Salazar MG, Champagne FA. Automated maternal behavior during early life in rodents (AMBER) pipeline. Sci Rep. 2023;13(1):18277. Epub 2023/10/26. pmid:37880307; PubMed Central PMCID: PMC10600172.
19. Morley KC, Arnold JC, McGregor IS. Serotonin (1A) receptor involvement in acute 3,4-methylenedioxymethamphetamine (MDMA) facilitation of social interaction in the rat. Prog Neuropsychopharmacol Biol Psychiatry. 2005;29(5):648–57. Epub 2005/05/24. pmid:15908091.
20. Ando RD, Benko A, Ferrington L, Kirilly E, Kelly PA, Bagdy G. Partial lesion of the serotonergic system by a single dose of MDMA results in behavioural disinhibition and enhances acute MDMA-induced social behaviour on the social interaction test. Neuropharmacology. 2006;50(7):884–96. Epub 2006/02/14. pmid:16472832.
21. Goodwin NL, Nilsson SRO, Golden SA. Rage Against the Machine: Advancing the study of aggression ethology via machine learning. Psychopharmacology (Berl). 2020;237(9):2569–88. Epub 2020/07/11. pmid:32647898; PubMed Central PMCID: PMC7502501.
22. Schweinfurth MK. The social life of Norway rats (Rattus norvegicus). eLife. 2020;9. Epub 2020/04/10. pmid:32271713; PubMed Central PMCID: PMC7145424.
23. Barnett SA. The story of rats: their impact on us, and our impact on them. Crows Nest, NSW, Australia: Allen & Unwin; 2001.
24. Sams-Dodd F. Automation of the social interaction test by a video-tracking system: behavioural effects of repeated phencyclidine treatment. Journal of Neuroscience Methods. 1995;59(2):157–67. Epub 1995/07/01. pmid:8531482.
25. Potasiewicz A, Holuj M, Litwa E, Gzielo K, Socha L, Popik P, et al. Social dysfunction in the neurodevelopmental model of schizophrenia in male and female rats: Behavioural and biochemical studies. Neuropharmacology. 2020;170:108040. Epub 2020/03/14. pmid:32165218.
26. Wright JM, Gourdon JC, Clarke PB. Identification of multiple call categories within the rich repertoire of adult rat 50-kHz ultrasonic vocalizations: effects of amphetamine and social context. Psychopharmacology. 2010;211(1):1–13. pmid:20443111
27. Cardinal R, Aitken MRF. ANOVA for the behavioural sciences researcher. London: Lawrence Erlbaum Associates; 2006.
28. Bespalov A, Bernard R, Gilis A, Gerlach B, Guillen J, Castagne V, et al. Introduction to the EQIPD quality system. eLife. 2021;10:e63294. Epub 2021/05/25. pmid:34028353; PubMed Central PMCID: PMC8184207.
29. Stone AA, Schwartz JE, Schkade D, Schwarz N, Krueger A, Kahneman D. A population approach to the study of emotion: diurnal rhythms of a working day examined with the Day Reconstruction Method. Emotion. 2006;6(1):139–49. pmid:16637757
30. Scott JP, McNaughton LR, Polman RC. Effects of sleep deprivation and exercise on cognitive, motor performance and mood. Physiol Behav. 2006;87(2):396–408. Epub 2006/01/13. pmid:16403541.
31. Brudzynski SM. Biological Functions of Rat Ultrasonic Vocalizations, Arousal Mechanisms, and Call Initiation. Brain Sci. 2021;11(5). Epub 2021/06/03. pmid:34065107; PubMed Central PMCID: PMC8150717.
32. Mowrer OH. Animal studies in the genesis of personality. Transactions of the New York Academy of Sciences. 1940;3:8–11.
33. Wesson D. Sniffing Behavior Communicates Social Hierarchy. Current Biology. 2013;23(7):575–80. pmid:23477727
34. Wohr M, Engelhardt KA, Seffer D, Sungur AO, Schwarting RK. Acoustic Communication in Rats: Effects of Social Experiences on Ultrasonic Vocalizations as Socio-affective Signals. Curr Top Behav Neurosci. 2015.
35. Blanchard RJ, Blanchard DC, Agullana R, Weiss SM. Twenty-two kHz alarm cries to presentation of a predator, by laboratory rats living in visible burrow systems. Physiol Behav. 1991;50(5):967–72. Epub 1991/11/01. pmid:1805287.
36. Bialy M, Bogacki-Rychlik W, Kasarello K, Nikolaev E, Sajdel-Sulkowska EM. Modulation of 22-kHz postejaculatory vocalizations by conditioning to new place: Evidence for expression of a positive emotional state. Behavioral Neuroscience. 2016;130(4):415–21. pmid:27454624
37. Burgdorf J, Kroes RA, Moskal JR, Pfaus JG, Brudzynski SM, Panksepp J. Ultrasonic vocalizations of rats (Rattus norvegicus) during mating, play, and aggression: Behavioral concomitants, relationship to reward, and self-administration of playback. Journal of Comparative Psychology. 2008;122:357–67. pmid:19014259
38. Burgdorf J, Panksepp J. The neurobiology of positive emotions. Neuroscience and Biobehavioral Reviews. 2006;30(2):173–87. pmid:16099508
39. Knutson B, Burgdorf J, Panksepp J. Ultrasonic vocalizations as indices of affective states in rats. Psychol Bull. 2002;128(6):961–77. pmid:12405139
40. Brudzynski SM. Principles of rat communication: quantitative parameters of ultrasonic calls in rats. Behav Genet. 2005;35(1):85–92. pmid:15674535
41. Brudzynski SM. Handbook of mammalian vocalization: an integrative neuroscience approach. Amsterdam: Academic Press; 2010.
42. Blanchard RJ, Flannelly KJ, Blanchard DC. Life-span studies of dominance and aggression in established colonies of laboratory rats. Physiology and Behavior. 1988;43(1):1–7. pmid:3413239
43. Adams N, Boice R. A longitudinal study of dominance in an outdoor colony of domestic rats. Journal of Comparative Psychology. 1983;97(1):24–33.
44. Dixon AK, Kaesermann HP. Ethopharmacology of flight behaviour. In: Olivier BJ, Mos J, Brain PF, editors. Ethopharmacology of agonistic behaviour in animals and humans. Dordrecht: Nijhoff; 1987. p. 46–79.
45. Borchers S, Carl J, Schormair K, Krieger JP, Asker M, Edvardsson CE, et al. An appetite for aggressive behavior? Female rats, too, derive reward from winning aggressive interactions. Transl Psychiatry. 2023;13(1):331. Epub 2023/10/28. pmid:37891191; PubMed Central PMCID: PMC10611704.
46. Matsumoto J, Kanno K, Kato M, Nishimaru H, Setogawa T, Chinzorig C, et al. Acoustic camera system for measuring ultrasound communication in mice. iScience. 2022;25(8):104812. Epub 2022/08/20. pmid:35982786; PubMed Central PMCID: PMC9379670.
47. Sterling ML, Teunisse R, Englitz B. Rodent ultrasonic vocal interaction resolved with millimeter precision using hybrid beamforming. eLife. 2023;12. Epub 2023/07/26. pmid:37493217; PubMed Central PMCID: PMC10522333.
48. Harrison C. Shared science’s time to shine. Lab Anim (NY). 2023;52(8):179–82. Epub 2023/08/01. pmid:37524949.
Citation: Popik P, Cyrano E, Golebiowska J, Malikowska-Racia N, Potasiewicz A, Nikiforuk A (2024) Deep learning algorithms reveal increased social activity in rats at the onset of the dark phase of the light/dark cycle. PLoS ONE 19(11): e0307794. https://doi.org/10.1371/journal.pone.0307794
About the Authors:
Piotr Popik
Roles: Conceptualization, Formal analysis, Methodology, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing
E-mail: [email protected]
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
ORCID: https://orcid.org/0000-0003-0722-1263
Ewelina Cyrano
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
Joanna Golebiowska
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
Natalia Malikowska-Racia
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
Agnieszka Potasiewicz
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
Agnieszka Nikiforuk
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization
Affiliation: Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Kraków, Poland
© 2024 Popik et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Abstract
The rapid decrease of light intensity is a potent stimulus of rats’ activity. The nature of this activity, including the character of social behavior and the composition of concomitant ultrasonic vocalizations (USVs), is unknown. Using deep learning algorithms, this study examined the social life of rat pairs kept in semi-natural conditions and observed during the transitions between the light and dark periods, as well as between the dark and light periods. Over six days, the animals were video- and audio-recorded during transition sessions, each starting 10 minutes before and ending 10 minutes after the light change. The videos were used to train and apply the DeepLabCut neural network, which tracked the animals’ movement in space and time. The DeepLabCut data were then processed with the Simple Behavioral Analysis (SimBA) toolkit to build models of 11 distinct social and non-social behaviors, and the DeepSqueak toolkit was used to examine the USVs. The deep learning algorithms revealed lights-off-induced increases in fighting, mounting, crawling, and rearing behaviors, as well as in 22-kHz alarm calls and in 50-kHz flat and short, but not frequency-modulated, calls. In contrast, the lights-on stimulus increased general activity, adjacent lying (huddling), anogenital sniffing, and rearing behaviors. The animals adapted to the housing conditions, showing decreased ultrasonic calling as well as grooming and rearing behaviors, but not fighting. The present study demonstrates a lights-off-induced increase in aggressive behavior but fails to demonstrate an increase in positive affect as defined by hedonic USVs. We further confirm and extend the utility of deep learning algorithms in analyzing rat social behavior and ultrasonic vocalizations.
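For orientation, the sketch below illustrates the pose-estimation stage of this kind of pipeline using DeepLabCut’s Python API. It is a minimal example under stated assumptions, not the authors’ actual configuration: the project name, experimenter, and video paths are hypothetical, and the multi-animal workflow involves additional identity-tracking steps that are omitted here (see the DeepLabCut documentation).

import deeplabcut

# Create a multi-animal project from the recorded transition-session videos
# (project name, experimenter, and video path below are hypothetical)
config_path = deeplabcut.create_new_project(
    "dark-phase-social",
    "experimenter",
    ["/data/videos/box1_session1.mp4"],
    multianimal=True,  # two interacting rats per box
)

# Extract a subset of frames and label body parts by hand (opens a GUI)
deeplabcut.extract_frames(config_path, mode="automatic", algo="kmeans")
deeplabcut.label_frames(config_path)

# Build the multi-animal training set and train the pose-estimation network
deeplabcut.create_multianimaltraining_dataset(config_path)
deeplabcut.train_network(config_path)

# Apply the trained network; recent DeepLabCut versions add tracklet
# stitching steps for multi-animal identity, which are omitted here.
# The per-frame body-part coordinates it saves are the input that a
# post-processing toolkit such as SimBA consumes for classification.
deeplabcut.analyze_videos(
    config_path, ["/data/videos/box1_session1.mp4"], save_as_csv=True
)

In such a workflow, the resulting coordinate files would be imported into a SimBA project, where behavioral classifiers are built largely through SimBA’s graphical interface, while the audio recordings would be processed separately with DeepSqueak under MATLAB.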