Hello, because of a bug in an exporter, I now have metrics with a few incorrect values, see below. The decrease from 452.997 to 452.996 is incorrect (a rounding error on the producer side) and breaks the counter's monotonicity. Is there a way to either patch these 452.996 values to 452.997, or just delete the incorrect samples without losing the complete series?
Replies: 1 comment
I might have found kind of a solution myself, so I'll leave it here for others. I was not able to modify samples, nor to delete single samples, but at least I could delete a small time interval of the time series (the one containing the incorrectly decreased values), which is good enough.
#!/usr/bin/env bash
set -euo pipefail
# Requires GNU date: 'date' on Linux, 'gdate' on macOS (from coreutils)
DATE=gdate
PROM_BASEURL="http://prometheus.domain.com:9090/api/v1"
# Time frame containing the incorrect samples. Deletion does not seem to be
# very exact/granular, so allow some margin on both sides.
START='2023-08-06T04:11:00Z'
END='2023-08-06T04:33:00Z'
SERIES='opendtu_YieldTotal{name="Need%20to%20escape%20spaces%20here",type="AC"}'
PROM_START="$($DATE +%s -d "$START")"
PROM_END="$($DATE +%s -d "$END")"
URL="$PROM_BASEURL/admin/tsdb/delete_series?match[]=$SERIES&start=$PROM_START&end=$PROM_END"
# Both admin endpoints below require Prometheus to run with --web.enable-admin-api
curl -v -X PUT -g "$URL"
curl -v -X POST "$PROM_BASEURL/admin/tsdb/clean_tombstones"
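The label value in SERIES above has to be percent-encoded by hand. As a convenience, here is a minimal pure-bash sketch of an encoder (the urlencode function is hypothetical, not part of the script above, and assumes ASCII input):

```shell
#!/usr/bin/env bash
# Hypothetical helper: percent-encode a label value so it can be embedded
# in the match[] parameter. Pure bash, no jq or python needed.
urlencode() {
  local s="$1" out='' c i
  for (( i = 0; i < ${#s}; i++ )); do
    c="${s:i:1}"
    case "$c" in
      [a-zA-Z0-9.~_-]) out+="$c" ;;               # unreserved characters pass through
      *) printf -v c '%%%02X' "'$c"; out+="$c" ;; # everything else becomes %XX
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'Need to escape spaces here'   # -> Need%20to%20escape%20spaces%20here
```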
Caveat: there seems to be a discrepancy in the timestamps of the samples returned by /query vs. /query_range, which made it hard for me to figure out the correct start and end timestamps, and this still puzzles me a bit... In the end I deleted a few minutes extra before and after the "wrong" data, which is still better than eternally incorrect metrics.
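When comparing the epoch timestamps the two endpoints return against the deletion window, it helps to convert them back to RFC 3339. A small sketch, assuming GNU date is available as `date` (use `gdate` on macOS, as in the script above):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Convert an epoch timestamp (as returned by the query APIs) back to
# RFC 3339 in UTC, so it can be compared against START/END directly.
DATE=date   # assumption: GNU date; set to gdate on macOS

epoch_to_rfc3339() {
  "$DATE" -u -d "@$1" +%Y-%m-%dT%H:%M:%SZ
}

epoch_to_rfc3339 1691295060   # -> 2023-08-06T04:11:00Z
```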