Empty component_sent_bytes_total but not in component_sent_event_bytes_total #20356

Open
quantumsheep opened this issue Apr 22, 2024 · 3 comments · May be fixed by #20411
Labels
  • domain: observability (Anything related to monitoring/observing Vector)
  • sink: aws_s3 (Anything `aws_s3` sink related)
  • type: bug (A code related bug.)

Comments

@quantumsheep commented Apr 22, 2024

A note for the community

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Problem

component_sent_bytes_total includes one series without any component labels. component_sent_event_bytes_total does not have this problem. Is component_sent_bytes_total deprecated?

I'm using prometheus_exporter and aws_s3 sinks.

component_sent_bytes_total{host="myserver",protocol="https"} 0 1713800645973
component_sent_bytes_total{component_id="metrics_out",component_kind="sink",component_type="prometheus_exporter",host="myserver",protocol="http"} 2135563654 1713800645973
component_sent_bytes_total{component_id="s3_out",component_kind="sink",component_type="aws_s3",host="myserver",protocol="https"} 118736485746 1713800645973

Configuration

api:
  enabled: true
  address: "127.0.0.1:{{ vector_api_port }}"

sources:
  docker_logs_in:
    type: docker_logs
    exclude_containers:
      - cadvisor
      - fluentbit
  vector_metrics_in:
    type: internal_metrics
    scrape_interval_secs: 30

transforms:
  docker_logs_remap:
    inputs:
      - docker_logs_in
    type: remap
    drop_on_abort: true
    source: |-
      msg, err = parse_json(.message)
      if err != null {
        abort
      }

      . = merge!(., msg)

      .hostname = "{{ ansible_fqdn }}"

      del(.message)
      del(.source_type)
      del(.host)
      del(.label)
      del(.ts)
      del(.stream)
      del(.system)
  docker_logs_agw_envoy:
    inputs:
      - docker_logs_remap
    type: filter
    condition: .service == "agw-envoy"
  docker_logs_agw_auth:
    inputs:
      - docker_logs_remap
    type: filter
    condition: .service == "agw-auth"

sinks:
  metrics_out:
    type: prometheus_exporter
    inputs:
      - vector_metrics_in
    address: "{{ private_ip }}:{{ vector_metrics_port }}"
  s3_out:
    type: aws_s3
    inputs:
      - docker_logs_agw_auth
      - docker_logs_agw_envoy
    encoding:
      codec: json
    buffer:
      type: disk
      max_size: 268435488 # ~256MB, the minimum allowed by the sink
    endpoint: "{{ vector_logging_s3_endpoint }}"
    auth:
      access_key_id: "<redacted>"
      secret_access_key: "<redacted>"
    bucket: "{{ vector_logging_s3_bucket }}"
    key_prefix: '{{ vector_logging_s3_key_prefix }}/{{ "{{ service }}" }}/'
    region: "{{ region }}"
    framing:
      method: newline_delimited

Version

0.37.0

Debug Output

No response

Example Data

No response

Additional Context

No response

References

No response

@quantumsheep added the type: bug label on Apr 22, 2024
@quantumsheep changed the title from "Empty vector_component_sent_bytes_total but not in vector_component_sent_events_total" to "Empty component_sent_bytes_total but not in component_sent_event_bytes_total" on Apr 22, 2024
@jszwedko (Member)

Hmm, that's odd. No, component_sent_bytes_total is not deprecated. It looks like some span labels are missing somewhere. Thanks for the report!

@jszwedko added the domain: observability and sink: aws_s3 labels on Apr 22, 2024
@jszwedko (Member)

I bisected this regression down to c2cc94a

@jszwedko (Member)

Config:

sources:
  metrics:
    type: internal_metrics
    namespace: vector
    scrape_interval_secs: 1.0
    tags:
      host_key: host
  demo:
    type: demo_logs
    format: json

sinks:
  sink0:
    type: prometheus_exporter
    inputs:
    - metrics
  s3_out:
    type: aws_s3
    inputs:
      - demo
    endpoint: "http://localhost:4566"
    auth:
      access_key_id: "foo"
      secret_access_key: "bar"
    batch:
      timeout_secs: 1
    bucket: "foo"
    key_prefix: 'bar'
    region: "us-east-1"
    encoding:
      codec: json

Reproduction steps:

  • docker run --rm -it -p 4566:4566 -p 4510-4559:4510-4559 localstack/localstack
  • vector --config vector.yaml
  • curl localhost:9598/metrics | grep component_sent_bytes_total
  • Observe vector_component_sent_bytes_total being published without component labels (see the sample output after this list)
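
A sketch of what that grep shows in the broken case versus what would be expected; values, host, and the counter samples are illustrative, and the real output from the affected setup is in the problem description above:

# observed: one series published with no component labels
vector_component_sent_bytes_total{host="example",protocol="http"} 0
# expected: every series carries component labels, e.g.
vector_component_sent_bytes_total{component_id="s3_out",component_kind="sink",component_type="aws_s3",host="example",protocol="http"} 12345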

It seems like we need to be propagating a span somewhere that we aren't.
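
For context on the span propagation mentioned above, a rough sketch of the general pattern, not Vector's actual code (the component values and subscriber setup are illustrative): telemetry emitted while a component span is entered inherits component_id/component_kind/component_type as metric tags, so a sink future that runs outside its component span produces the unlabeled series seen here.

use tracing::{info_span, Instrument};

// Illustrative stand-in for the code path that emits a BytesSent-style event.
async fn drive_request() {
    // Telemetry emitted here is tagged from the currently entered span; with
    // no span entered, the resulting series has no component labels.
    tracing::info!(bytes = 1024u64, "request sent");
}

#[tokio::main]
async fn main() {
    tracing_subscriber::fmt::init();

    let span = info_span!(
        "sink",
        component_id = "s3_out",
        component_kind = "sink",
        component_type = "aws_s3"
    );

    // .instrument() keeps the span entered across await points. Omitting it
    // on the future that does the sending reproduces the symptom above.
    drive_request().instrument(span).await;
}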
