I am using scipy.signal.correlate (SciPy 1.11.1, but I'll test on a newer one soon. EDIT: the same thing happens on SciPy 1.13.0) to convolve an array of zeros and ones with a gaussian. The array is 3D, the gaussian kernel 1D. Most of the correlated data is zeros, so I expect most of the output to be exactly zero, but I get a lot of values that are close to, but not identical to, zero.
This looks like a bug to me, but I'm also curious why this would happen.
It may look benign, but such close-to-zero, low-variance timepoints can produce high-amplitude noisy values in a standard statistical test (a t-test, for example) and obscure results.
Reproducing Code Example
(see the main text describing the problem)
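Since the report does not include the script itself, here is a minimal, hypothetical sketch of the described setup (the array shapes, spike positions, and kernel width are assumptions, not the author's values). It forces method='fft' to show the close-to-zero values and method='direct' for comparison:

```python
import numpy as np
from scipy.signal import correlate
from scipy.signal.windows import gaussian

rng = np.random.default_rng(0)
data = np.zeros((5, 10, 2000))                 # mostly zeros
# sprinkle a few ones ("spikes") into each time course
idx = rng.integers(0, 2000, size=(5, 10, 3))
np.put_along_axis(data, idx, 1.0, axis=-1)

kernel = gaussian(101, std=10)[None, None, :]  # 1-D gaussian, broadcast to 3-D

# FFT path: many samples that should be exactly zero are only close to zero
out_fft = correlate(data, kernel, mode='same', method='fft')
tiny = (np.abs(out_fft) > 0) & (np.abs(out_fft) < 1e-10)
print('fft, close-to-zero samples:', tiny.sum())

# direct path: sums of products of zeros stay exactly zero
out_dir = correlate(data, kernel, mode='same', method='direct')
tiny_dir = (np.abs(out_dir) > 0) & (np.abs(out_dir) < 1e-10)
print('direct, close-to-zero samples:', tiny_dir.sum())
```

With the direct method the count is zero, because every output sample far from a spike is a sum of products of exact zeros; the FFT path instead leaves round-off residue throughout.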
Error message
(no error message)
SciPy/NumPy/Python version and system information
SciPy 1.11.1, NumPy 1.24.4, sys.version_info(major=3, minor=11, micro=4, releaselevel='final', serial=0)
lapack_armpl_info:
NOT AVAILABLE
lapack_mkl_info:
NOT AVAILABLE
openblas_lapack_info:
NOT AVAILABLE
openblas_clapack_info:
NOT AVAILABLE
flame_info:
NOT AVAILABLE
accelerate_info:
NOT AVAILABLE
atlas_3_10_threads_info:
NOT AVAILABLE
atlas_3_10_info:
NOT AVAILABLE
atlas_threads_info:
NOT AVAILABLE
atlas_info:
NOT AVAILABLE
lapack_info:
libraries = ['lapack', 'blas', 'lapack', 'blas']
library_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\lib']
language = f77
blas_armpl_info:
NOT AVAILABLE
blas_mkl_info:
NOT AVAILABLE
blis_info:
NOT AVAILABLE
openblas_info:
NOT AVAILABLE
atlas_3_10_blas_threads_info:
NOT AVAILABLE
atlas_3_10_blas_info:
NOT AVAILABLE
atlas_blas_threads_info:
NOT AVAILABLE
atlas_blas_info:
NOT AVAILABLE
blas_info:
libraries = ['cblas', 'blas', 'cblas', 'blas', 'cblas', 'blas']
library_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\lib']
include_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\include']
language = f77
define_macros = [('HAVE_CBLAS', None)]
blas_opt_info:
define_macros = [('NO_ATLAS_INFO', 1), ('HAVE_CBLAS', None)]
libraries = ['cblas', 'blas', 'cblas', 'blas', 'cblas', 'blas']
library_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\lib']
include_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\include']
language = f77
lapack_opt_info:
libraries = ['lapack', 'blas', 'lapack', 'blas', 'cblas', 'blas', 'cblas', 'blas', 'cblas', 'blas']
library_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\lib']
language = f77
define_macros = [('NO_ATLAS_INFO', 1), ('HAVE_CBLAS', None)]
include_dirs = ['C:/Users/mmagnuski/anaconda3/envs/mne_1.4.2\\Library\\include']
Supported SIMD extensions in this NumPy install:
baseline = SSE,SSE2,SSE3
found = SSSE3,SSE41,POPCNT,SSE42,AVX,F16C,FMA3,AVX2,AVX512F,AVX512CD,AVX512_SKX,AVX512_CLX
not found = AVX512_CNL,AVX512_ICL
mmagnuski added the defect label (a clear bug or issue that prevents SciPy from being installed or used as expected) on May 9, 2024.
lucascolley changed the title from "BUG: scipy.signal.correlate on 3d array gives many numerical close-to-zero errors?" to "BUG: signal.correlate: many numerical close-to-zero errors on 3D array" on May 9, 2024.
The automatic choice of fftconvolve might be the cause of this issue. With scipy.signal.oaconvolve, for example, it happens to a much lower extent: there are still some close-to-zero noisy points, but only near the spikes in the signal.
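One hedged way to probe this hypothesis (the shapes below are assumptions) is to ask scipy.signal.choose_conv_method which path method='auto' would take for these inputs, and to force method='direct' as a workaround, which keeps zero regions exactly zero:

```python
import numpy as np
from scipy.signal import correlate, choose_conv_method
from scipy.signal.windows import gaussian

data = np.zeros((5, 10, 2000))
data[..., 1000] = 1.0                          # one spike per time course
kernel = gaussian(101, std=10)[None, None, :]  # 1-D gaussian, broadcast to 3-D

# which path would method='auto' take for these (assumed) shapes?
print(choose_conv_method(data, kernel))

# forcing the direct method avoids FFT round-off entirely: the kernel
# support is 101 samples, so everything before sample ~950 is exactly zero
out = correlate(data, kernel, mode='same', method='direct')
print(np.count_nonzero(out[..., :900]))
```

If choose_conv_method reports 'fft' for the 3D case but 'direct' for an equivalent 1D slice, that would also explain why the noise disappears when the operation is reduced to a 1D array.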
If you zoom in on samples 1000:1500 (for example) of the output of correlate, you can see the close-to-zero noise. Interestingly, this does not happen if I reduce the operation to a 1D array.