ExplainableAI.jl

Version v0.8.0

This release removes the automatic reexport of heatmapping functionality. Users must now manually load VisionHeatmaps.jl and/or TextHeatmaps.jl (see the sketch below).

This reduces the maintenance burden for new heatmapping features and the number of dependencies for users who don't need heatmapping functionality.

  • BREAKING Removed reexport of heatmapping functionality by updating XAIBase dependency to v3.0.0 (#162).
  • Feature Added GradCAM analyzer (#155). Try it with VisionHeatmaps.jl's new heatmap_overlay feature.
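
For example, a heatmapping workflow now loads the visualization package explicitly. The following is a minimal sketch: `model`, `input`, and `img` are assumed to exist, `model` is assumed to be a Flux `Chain` of a convolutional feature extractor followed by a classifier head, and the exact `GradCAM` and `heatmap_overlay` signatures should be checked against their docstrings.

```julia
using ExplainableAI
using VisionHeatmaps                    # must now be loaded explicitly for image heatmaps

expl = analyze(input, Gradient(model))  # `model` and `input` are assumed to exist
heatmap(expl)                           # provided by VisionHeatmaps.jl

# GradCAM, added in this release, splits the model into feature extractor and classifier:
gradcam = GradCAM(model.layers[1], model.layers[2])
heatmap_overlay(analyze(input, gradcam), img)   # overlay on the original image `img`
```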

Version v0.7.0

This release moves the core interface (Explanation, heatmap, analyze) into a separate package called XAIBase.jl. Developers can use the XAIBase.jl interface to quickly implement or prototype new methods without writing boilerplate code (see the sketch at the end of this section).

As announced with version v0.6.2, this is the first release without LRP, which has been moved to a separate package called RelevancePropagation.jl. This separation is enabled by the new common XAIBase.jl interface.

  • BREAKING Move core interface into XAIBase.jl package (#154).
    • Renamed Explanation field neuron_selection to output_selection
    • Added Explanation field heatmap for heatmapping presets
  • BREAKING Move LRP into RelevancePropagation.jl (#157)
  • BREAKING Remove ImageNet preprocessing utilities (#159)
  • Documentation Partially move documentation into the Julia-XAI ecosystem documentation
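
As an illustration of the new interface, a toy analyzer might be implemented as follows. This is only a sketch: the callable-struct signature and the `Explanation` constructor argument order are assumptions, so consult the XAIBase.jl documentation for the authoritative interface.

```julia
using XAIBase

# Toy analyzer returning random attributions, only to show the moving parts.
struct RandomAnalyzer{M} <: AbstractXAIMethod
    model::M
end

function (method::RandomAnalyzer)(input, output_selector::AbstractOutputSelector)
    output = method.model(input)
    output_selection = output_selector(output)
    val = rand(size(input)...)          # stand-in for a real attribution
    return Explanation(val, output, output_selection, :RandomAnalyzer, :attribution, nothing)
end
```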

Version v0.6.3

  • Enhancement Allow Gradient analyzers on non-Flux models (#150); see the sketch after this list
  • Bugfix Fix typo in BATCHDIM_MISSING error (#150)
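
A sketch of the non-Flux use case: any differentiable callable that maps a batched input to class scores should work; the linear model below is purely illustrative.

```julia
using ExplainableAI

W = randn(Float32, 3, 10)          # hypothetical linear "model": 10 features, 3 classes
linear_model(x) = W * x

input = randn(Float32, 10, 1)      # features × batch
expl  = analyze(input, Gradient(linear_model))
```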

Version v0.6.2

This is the first release of ExplainableAI.jl as part of the Julia-XAI organization (#149) and the last minor release that includes LRP before it is moved to its own package.

  • Feature Add Concept Relevance Propagation analyzer CRP (#146, #148)
  • Feature Add option to process heatmaps batch-wise using the keyword argument process_batch=true (#146, #148); see the sketch after this list
  • Bugfix Remove FlatRule on dense layers from EpsilonPlusFlat and EpsilonAlpha2Beta1Flat composite presets (#147)
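
A sketch of batch-wise heatmapping, assuming an `analyzer` has already been constructed and inputs use the WHCN layout:

```julia
using ExplainableAI

batch = rand(Float32, 32, 32, 3, 8)     # WHCN image batch (assumed layout)
expl  = analyze(batch, analyzer)        # `analyzer` is assumed to exist
heatmap(expl; process_batch=true)       # keyword added in this release
```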

Version v0.6.1

This release brings GPU support to all analyzers (see the sketch below).

  • Feature Support LRP on GPUs (#142, #140)
  • Feature Support gradient analyzers on GPUs (#144)
  • Enhancement Make Tullio optional dependency using package extensions (#141)
  • Documentation Document GPU support (#145)
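
A sketch of GPU usage, assuming a CUDA-capable device and that `model` and `input` already exist:

```julia
using CUDA, Flux
using ExplainableAI

gpu_model = gpu(model)                      # move model and data to the GPU
gpu_input = gpu(input)
expl = analyze(gpu_input, LRP(gpu_model))   # gradient analyzers work the same way
```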

Version v0.6.0

This release brings a large refactor of LRP analyzers, supporting nested "dataflow layers" from Flux.jl like Chain and Parallel layers. This enables LRP on more complex model architectures like ResNets.

Since these new features require a breaking release, we have used the occasion to clean up the API. Because the number of changes is large, this changelog is split between changes to LRP analyzers and more general changes to the package.

Changes to LRP analyzers

Breaking changes:

  • BREAKING Remove all unicode characters from user-facing API (#107)
    • EpsilonRule: argument epsilon replaces ϵ
    • GammaRule: argument gamma replaces γ
    • AlphaBetaRule: arguments alpha and beta replace α, β
  • BREAKING Rename LRP analyzer keyword argument is_flat=false to flatten=true (#119)
  • BREAKING Remove check_model, replaced by non-exported check_lrp_compat (#119)
  • BREAKING Replace layerwise_relevances field of Explanation return type by optional named tuple extras. Access layerwise relevances via extras.layerwise_relevances. (#126)
  • BREAKING Remove composite LastNTypeRule (#119)
  • BREAKING Rename composite primitives to avoid confusion with LRP rules (#130)
    • rename *Rule to *Map
    • rename *TypeRule to *TypeMap
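
To illustrate the renaming, a composite that previously used GlobalTypeRule is now written with GlobalTypeMap; the specific rule assignments below are arbitrary examples.

```julia
using Flux
using ExplainableAI

composite = Composite(
    GlobalTypeMap(                    # formerly GlobalTypeRule
        Flux.Conv  => ZPlusRule(),
        Flux.Dense => EpsilonRule(),
    ),
)
analyzer = LRP(model, composite)      # `model` is an assumed Flux Chain
```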

Breaking changes to commonly extended internal functions:

  • BREAKING Internal lrp! rule calls require extra argument layer (#119)
  • BREAKING Pre-allocate modified layers, replacing modify_param! with modify_parameters (#102)
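
For example, a custom Gamma-style rule that previously implemented modify_param! in place now returns modified parameters via modify_parameters; the rule below is hypothetical.

```julia
using ExplainableAI
using Flux: relu

struct MyGammaRule <: AbstractLRPRule end

# Out-of-place parameter modification replaces the old in-place modify_param!:
ExplainableAI.modify_parameters(::MyGammaRule, param) = param + 0.25f0 * relu.(param)
```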

New features and enhancements:

  • Feature Support nested Flux Chains (#119)
  • Feature Support Parallel layers (#135, #138)
  • Feature Support BatchNorm layers (#129, #134)
  • Feature Add GeneralizedGammaRule (#109)
  • Feature Support nested indexing in composite primitive LayerMap (#131)
  • Enhancement Pre-allocate modified layers in LRP analyzer field modified_layers (#119)
  • Enhancement Set LRP output relevance to one (#128)
  • Enhancement lrp! rule calls require extra argument layer, avoiding copies of unmodified layers (#119)
  • Enhancement Performance fixes for LRP rules, reducing number of generated pullback functions (#106, #108)
  • Enhancement Simplify LRP analyzer (#112, #119)
  • Enhancement Simplify LRP model checks (#110, #119)
  • Enhancement Improve type stability of LRP rules

Documentation:

  • Documentation Update documentation, adding pages on model preparation, composites, custom LRP rules, developer documentation and a separate API reference for LRP analyzers (#137, #105)

General changes

Breaking changes:

  • BREAKING Rename Explanation field attribution to val (#136)

Documentation:

  • Documentation Update documentation, adding pages on heatmapping and input augmentations (#137, #105)

Package maintenance:

  • Maintenance Compatibility with Flux.jl v0.14 (#116)
  • Maintenance Drop dependency on LinearAlgebra.jl and PrettyTables.jl (#119)
  • Maintenance Add Aqua.jl tests (#125)

Version v0.5.7

  • Bugfix Fix WSquareRule dispatch on Dense layers
  • Maintenance Fix Vararg deprecation warnings from composites

Version v0.5.6

  • Bugfix Drop Flux v0.12 due to compatibility issues in preactivation (#99)

Version v0.5.5

  • Bugfix Ignore bias in WSquareRule
  • Enhancement Faster FlatRule on Dense layers (#96)
  • Enhancement Faster WSquareRule on Dense layers (#98)
  • Maintenance Update rule tests and references

Version v0.5.4

This release brings bugfixes and usability features:

  • Feature Add pretty printing of LRP analyzers, summarizing how layers and rules are matched up (#89)
  • Feature Add LRP support for ConvTranspose and CrossCor layers
  • Documentation Add equations of LRP rules to docstrings

Bugfixes:

  • Bugfix Fix bug affecting AlphaBetaRule, ZPlusRule and ZBoxRule, where mutating the layer modified Zygote pullbacks (#92)
  • Bugfix Fix bug in FlatRule bias (#92)
  • Bugfix Fix input modification for FlatRule and WSquareRule (#93)

Version v0.5.3

Big feature release that adds LRP composites and presets:

  • Feature Add LRP Composite and composite primitives (#84)
  • Feature Add LRP composite presets (#87)
  • Feature Add LRP ZPlusRule (#88)
  • Enhancement Export union-types of Flux layers for easy definition of LRP composites
  • Documentation Improvements to docstrings and documentation
  • Maintenance Add test/Project.toml with compat entries for test dependencies (#87)

Version v0.5.2

This release temporarily adds ImageNet pre-processing utilities, enabling users to apply XAI methods to pretrained vision models from Metalhead.jl (see the sketch below). Note that this functionality will be deprecated once matching functionality lands in either Metalhead.jl or MLDatasets.jl.

  • Feature Add ImageNet preprocessing utility preprocess_imagenet (#80)
  • Enhancement Change default heatmap color scheme to seismic
  • Enhancement Update README with the JuliaCon 2022 talk and examples on VGG16
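
A sketch of the intended workflow; it assumes that preprocess_imagenet maps an RGB image to a normalized array in the layout Flux expects and that `vgg` is a pretrained Metalhead.jl model.

```julia
using ExplainableAI
using Images                              # provides `load`

img   = load("example.jpg")               # placeholder path to an RGB image
x     = preprocess_imagenet(img)          # preprocessing utility added in this release
batch = reshape(x, size(x)..., 1)         # add a batch dimension
expl  = analyze(batch, LRP(vgg))          # `vgg` is an assumed pretrained model
heatmap(expl)
```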

Version v0.5.1

Small bugfix release addressing a bug in v0.5.0. This is the version of ExplainableAI.jl shown in the JuliaCon 2022 talk.

  • Bugfix Fix bug in FlatRule (#77)

Version v0.5.0

Breaking release that refactors the internals of LRP analyzers and adds several rules.

List of breaking changes:

  • BREAKING Enhancement Introduce compatibility checks for LRP rule and layer combinations using check_compat(rule, layer) (#75)
  • BREAKING Applying GammaRule and ZBoxRule on a layer without weights and biases will now throw an error (#75)
  • BREAKING In-place updating modify_layer!(rule, layer) replaces modify_layer(rule, layer) (#73)
  • BREAKING In-place updating modify_param!(rule, param) replaces modify_params(rule, W, b) (#73)
  • BREAKING Removed named LRP constructors LRPZero, LRPEpsilon, LRPGamma (#75)

Bug fixes:

  • Bugfix Fix bug in ZBoxRule (#77)
  • Bugfix Fix broadcasting for Julia 1.6 (#74)
  • Bugfix Support MLUtils.flatten

Performance improvements:

  • Enhancement Replace LRP gradient computation with VJP using Zygote.pullback (#72)
  • Enhancement Faster GammaRule

Version v0.4.0

Changes:

  • BREAKING Update heatmapping normalizer to use ColorSchemes' get. Breaking because normalize was renamed to ColorSchemes' rangescale. (#57)
  • BREAKING Rename InputAugmentation to NoiseAugmentation. (#65)
  • BREAKING GammaRule and EpsilonRule now use default arguments instead of keyword arguments, removing the need for users to type unicode symbols. (#70)
  • BREAKING Bugfix ZBoxRule now requires parameters low and high instead of computing them from the input. (#69)
  • Feature Add IntegratedGradients analyzer. (#65)
  • Feature Add InterpolationAugmentation wrapper. (#65)
  • Feature Allow any type of Sampleable in NoiseAugmentation. (#65)
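
A rough sketch of the new wrappers; the sample counts and the Normal noise distribution are arbitrary choices, and the constructor signatures shown are assumptions to be checked against the docstrings.

```julia
using ExplainableAI
using Distributions: Normal

smoothgrad = NoiseAugmentation(Gradient(model), 50, Normal(0.0f0, 0.1f0))  # averages noisy gradients
intgrad    = IntegratedGradients(model, 50)                                # interpolates toward a baseline
expl       = analyze(input, smoothgrad)            # `model` and `input` are assumed to exist
```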

Performance improvements:

  • Enhancement Remove use of mapreduce. (#58)
  • Enhancement Load LoopVectorization.jl in tests and benchmarks to speed up Tullio on CPU. (#66)
  • Enhancement Type stability fixes for GammaRule. (#70)