Official implementation of EXPLORE: A novel deep learning-based analysis method for exploration learning in object recognition tests

Wahl-lab/EXPLORE

EXPLORE: A novel deep learning-based analysis method for exploration behaviour in object recognition tests

🎓 About:

Object recognition tests are widely used in neuroscience to assess memory function in rodents. Despite the experimental simplicity of the task, interpreting which behavioural features count as object exploration can be complicated. Analysis of object exploration has therefore traditionally relied on manual scoring, which is time-consuming, limited to a few behaviours, and variable across researchers. To overcome these limitations, we developed EXPLORE, a simple, ready-to-use and open-source pipeline. Compared to costly commercial software, EXPLORE performs the different analysis steps for object recognition tests with higher precision, higher versatility and lower time investment. EXPLORE consists of a convolutional neural network, trained in a supervised manner, that extracts features from images and classifies the behaviour of rodents near a presented object as “exploration” or “no exploration”. EXPLORE achieves human-level accuracy in identifying and scoring exploration behaviour and outperforms commercial software, in particular under complex conditions, e.g., when multiple objects or larger objects to climb on are present. By labeling the respective training data set, users decide for themselves which types of interactions are included or excluded when scoring exploration behaviour. A GUI provides beginning-to-end analysis with an automatic stop-watch function to calculate the duration of specific exploration behaviours, enabling fast and reproducible data analysis for neuroscientists with no expertise in programming or deep learning.

🔨 Install EXPLORE:

  • First install Anaconda (if not already installed): Install now
  • Clone this repository and store the folder EXPLORE-main in a directory of your choice (initially it will be in your Downloads folder)
  • Open a shell or terminal window and change to that directory (the easiest way is to type cd followed by a space, then drag & drop the folder into the window):
cd <your directory>/EXPLORE-main
  • Create and activate your environment:
conda create -n XPL
conda activate XPL
  • Install the packages listed in requirements.txt (this installs everything EXPLORE needs into your new conda environment and can take a few minutes):
conda install -c conda-forge --file requirements.txt
  • Install OpenCV with the following command on macOS:
pip install opencv-python==4.1.1.26

(use pip3 on macOS versions earlier than Big Sur)

  • or install OpenCV with the following command on Windows:
conda install -c conda-forge opencv==4.5.0


 

🔥 Congratulations, you have now successfully installed EXPLORE! Now let's use it... 🔥


 

💡 How to use EXPLORE's deep learning-based exploration analysis:

EXPLORE's deep learning-based exploration analysis is the core tool for investigating object recognition tests. It consists of three parts: 1. Train a network on a few manually scored samples. 2. Predict on all of your experiment videos. 3. Correct the predictions if necessary. The main measures taken are exploration time and exploration frequency at each defined object. ❗Note: two distinct networks have to be trained, one for the acquisition session and one for the testing session.
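The two measures can be sketched as follows. This is a hypothetical illustration, not EXPLORE's internal code: the frame-wise binary labels, the bout definition, and the fps value are all assumptions.

```python
# Hypothetical sketch of the two measures above -- not EXPLORE's internal code.
# Input: one binary label per video frame (1 = "exploration", 0 = "no exploration")
# and the video frame rate; both the label format and fps value are assumptions.

def summarise_exploration(frame_labels, fps):
    """Return (exploration time in seconds, number of exploration bouts)."""
    time_s = sum(frame_labels) / fps  # explored frames divided by frame rate
    # count a new bout whenever the label flips from 0 to 1
    bouts = sum(
        1 for prev, cur in zip([0] + frame_labels[:-1], frame_labels)
        if prev == 0 and cur == 1
    )
    return time_s, bouts

# two bouts (frames 2-4 and 7-8) at 5 fps
labels = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
print(summarise_exploration(labels, fps=5))  # (1.0, 2)
```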

Overview on method:


 

Open a shell or terminal window and change to the scripts directory:

cd <your directory>/EXPLORE-main/scripts

Activate your virtual environment:

conda activate XPL

Training:

To train a network enter the following command:

python main_training.py

(python3 for macOS)


 

➡️ This will now open a GUI (see manual training1, manual training2 and manual scoring for further instructions!)


 

Prediction:

To predict on your experiment videos enter the following command:

python main_prediction.py

(python3 for macOS)


 

➡️ This will now open a GUI (see manual prediction for further instructions!)


 

Correction:

To correct your prediction enter the following command:

python main_correct.py

(python3 for macOS)


 

➡️ This will now open a GUI (see manual correction for further instructions!)


 

| Output files | Type | Description |
| --- | --- | --- |
| Prediction videos | folder | For each selected experiment video, EXPLORE draws colored squares around the objects whenever exploration behaviour was predicted and stores the newly created videos in a folder prediction videos |
| Dataframe | .csv | The predicted exploration times and frequencies at each object will be stored in a dataframe |
| Plots | .png | Training and validation accuracy and loss will be plotted and saved |
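The exploration-time dataframe can then be loaded for downstream statistics. The sketch below computes a discrimination index per animal; the column names (`animal`, `time_novel`, `time_familiar`) are assumptions for illustration only — check the header of the .csv your run actually produces.

```python
# Hypothetical follow-up analysis on the exploration-time .csv output.
# Column names are assumptions, not EXPLORE's actual header.
import csv
import io

def discrimination_index(t_novel, t_familiar):
    """DI = (novel - familiar) / (novel + familiar); +1 means only the novel object was explored."""
    return (t_novel - t_familiar) / (t_novel + t_familiar)

# stand-in for the dataframe EXPLORE writes
example_csv = io.StringIO(
    "animal,time_novel,time_familiar\n"
    "m1,12.0,8.0\n"
    "m2,5.0,5.0\n"
)
for row in csv.DictReader(example_csv):
    di = discrimination_index(float(row["time_novel"]), float(row["time_familiar"]))
    print(row["animal"], round(di, 2))
# m1 0.2
# m2 0.0
```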


 

💡 How to use EXPLORE's manual labeling tool:

Besides the automated analysis, EXPLORE provides a tool for manual scoring. The scores are saved as a .csv file.

Open a shell or terminal window and change to the scripts directory:

cd <your directory>/EXPLORE-main/scripts

Activate your virtual environment:

conda activate XPL

To start manual scoring type the following command:

python main_manual_scoring.py

(python3 for macOS)


 

➡️ This will now open a GUI (refer to the training manual for further instructions!)


 

💡 How to use EXPLORE's quadrant analysis:

With the quadrant analysis you can investigate and quantify movement throughout the experiment arena. Two measures are taken: the time animals spend in each quadrant over a given period (exploration time) and the frequency of transitions from one quadrant to another (exploration frequency).
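The two quadrant measures can be sketched as follows. This is a hypothetical illustration: the per-frame quadrant labels (here "TL", "TR", "BL", "BR") and the fps value are assumptions, not EXPLORE's internal representation.

```python
# Hypothetical sketch of the quadrant measures described above.
# Input: one quadrant label per video frame (label names are assumptions).
from collections import Counter

def quadrant_measures(quadrant_per_frame, fps):
    """Return (time spent per quadrant in seconds, number of quadrant transitions)."""
    counts = Counter(quadrant_per_frame)
    time_per_quadrant = {q: n / fps for q, n in counts.items()}
    # a transition is counted whenever consecutive frames differ in quadrant
    transitions = sum(
        1 for prev, cur in zip(quadrant_per_frame, quadrant_per_frame[1:])
        if prev != cur
    )
    return time_per_quadrant, transitions

frames = ["TL", "TL", "TR", "TR", "TR", "BL"]
print(quadrant_measures(frames, fps=2))
# ({'TL': 1.0, 'TR': 1.5, 'BL': 0.5}, 2)
```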

Overview on method:


 

Open a shell or terminal window and change to the scripts directory:

cd <your directory>/EXPLORE-main/scripts

Activate your virtual environment:

conda activate XPL

Then enter the following command:

python main_quadrant.py

(python3 for macOS)


 

➡️ This will now open a GUI (see manual quadrant for further instructions!)


 

| Output files | Type | Description |
| --- | --- | --- |
| Dataframe | .csv | The predicted exploration times and frequencies for each quadrant will be stored in a dataframe |
| Plots | .png | For each animal (video) the frequency will be plotted and stored |
| Heatmap | .png | An overview of exploration frequency and time per quadrant will be plotted as heatmaps |


 


 

Please refer to our publication for further technical details: https://www.nature.com/articles/s41598-023-31094-w


 

📫 Contact:

[email protected]
[email protected]
