
Using Pre Processors in Prediction (ART classifier) #2335

Open
RoeyBokobza opened this issue Nov 27, 2023 · 2 comments
@RoeyBokobza

As a user, you would expect that adding a pre-processor defence to the estimator's 'preprocessing_defences' list would automatically apply it to the input inside the predict function.

Instead, the user must explicitly apply each of those defences and pass the result to the predict function, which effectively makes the 'preprocessing_defences' attribute redundant.

With the current implementation of 'self._apply_preprocessing', a defence is only applied if its 'forward' method is implemented and the defence instance has been added to the 'preprocessing_operations' list. Here is the 'self._apply_preprocessing' function from the Estimator.py file:
[screenshot: self._apply_preprocessing in Estimator.py]
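In rough outline, the logic it shows looks like this (a simplified sketch written as a free function, not the verbatim ART source; 'preprocessing_operations', 'forward', 'apply_fit' and 'apply_predict' are the names as I understand them):

```python
import numpy as np

# Simplified sketch of the _apply_preprocessing logic (paraphrased, not the
# verbatim ART source): only the items collected into
# estimator.preprocessing_operations are applied, via their forward() method.
def apply_preprocessing(estimator, x: np.ndarray, y, fit: bool):
    if not estimator.preprocessing_operations:
        return x, y

    for preprocess in estimator.preprocessing_operations:
        # apply_fit / apply_predict control whether a step is active
        # during fitting or during prediction, respectively.
        if (fit and preprocess.apply_fit) or (not fit and preprocess.apply_predict):
            x, y = preprocess.forward(x, y)

    return x, y
```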

In the case of postprocessors, everything works as a user would anticipate: all postprocessors in the 'postprocessing_defences' list are automatically applied as part of the estimator's predict function. Here is the corresponding code from the Estimator.py file:
[screenshot: self._apply_postprocessing in Estimator.py]
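The behaviour it shows is roughly the following (again a simplified sketch with assumed names, not the verbatim ART source):

```python
import numpy as np

# Simplified sketch of the _apply_postprocessing logic (paraphrased): every
# defence in estimator.postprocessing_defences is applied automatically to
# the model predictions.
def apply_postprocessing(estimator, preds: np.ndarray, fit: bool):
    post_preds = preds.copy()

    if estimator.postprocessing_defences is not None:
        for postprocess in estimator.postprocessing_defences:
            if (fit and postprocess.apply_fit) or (not fit and postprocess.apply_predict):
                post_preds = postprocess(post_preds)

    return post_preds
```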

For now, a straightforward workaround is to add a similar piece of code inside the 'self._apply_preprocessing' method, as demonstrated in this small example:
[screenshot: modified self._apply_preprocessing with a loop over preprocessing_defences]
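The workaround amounts to something like the sketch below (hypothetical, and only meant to illustrate mirroring the postprocessing loop inside '_apply_preprocessing'):

```python
# Hypothetical workaround sketch: apply the defences listed in
# estimator.preprocessing_defences directly, mirroring the postprocessing
# loop, before the existing normalisation / preprocessing_operations logic.
def apply_preprocessing_with_defences(estimator, x, y, fit: bool):
    if estimator.preprocessing_defences is not None:
        for defence in estimator.preprocessing_defences:
            if (fit and defence.apply_fit) or (not fit and defence.apply_predict):
                x, y = defence(x, y)
    # ... continue with the estimator's existing preprocessing steps here
    return x, y
```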

@beat-buesser
Collaborator

Hi @RoeyBokobza, thank you for your interest in ART! Have you observed that pre-processing steps are not being applied in your experiments?
The code for pre- and post-processing is slightly different because we have to place the pre-processing for normalisation in sequence with the pre-processing defences, and we have therefore renamed the combined list to self.preprocessing_operations, as seen in your first screenshot above.
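In other words, the combined list is assembled roughly like this (a simplified sketch; the exact helper names in ART may differ):

```python
# Simplified sketch (assumed names, not the verbatim ART source): the defences
# come first and the mean/std normalisation (a StandardisationMeanStd
# preprocessor in ART) is appended as a final step, so both end up in
# self.preprocessing_operations and are applied together.
def build_preprocessing_operations(preprocessing_defences, normalisation_step):
    operations = list(preprocessing_defences) if preprocessing_defences else []
    if normalisation_step is not None:
        operations.append(normalisation_step)
    return operations
```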

beat-buesser self-assigned this Nov 30, 2023
@RoeyBokobza
Author


Hey, thank you for responding!
As part of my experiments I wanted the option to activate a different chain of pre-processors on each run. This is why I expected there to be a function to which I could simply pass pre-processor instances and have it do the rest automatically. When I realized that this was not the case, I opened this issue so you could tell me whether I missed something, or whether this need is simply not addressed.
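Concretely, the workflow I had in mind looks something like this (a hypothetical usage sketch; the model, data, and the choice of SpatialSmoothing are just placeholders):

```python
import numpy as np
import torch

from art.defences.preprocessor import SpatialSmoothing
from art.estimators.classification import PyTorchClassifier

# Placeholder model and data, only to make the example self-contained.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x_test = np.random.rand(4, 3, 32, 32).astype(np.float32)

# Expected workflow: hand the pre-processor instances to the estimator and
# have predict() apply them to the input automatically, so that swapping the
# chain of defences between runs only means changing this list.
classifier = PyTorchClassifier(
    model=model,
    loss=torch.nn.CrossEntropyLoss(),
    input_shape=(3, 32, 32),
    nb_classes=10,
    preprocessing_defences=[SpatialSmoothing(window_size=3)],
)

predictions = classifier.predict(x_test)
```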
