SparkSubmitOperator does not mark task as success after Spark job completes #39524
Labels
area:providers
kind:bug
pending-response
provider:apache-spark
stale
Apache Airflow Provider(s)
apache-spark
Versions of Apache Airflow Providers
apache-airflow-providers-apache-hive==6.1.2
apache-airflow-providers-apache-spark==4.1.1
apache-airflow-providers-cncf-kubernetes==7.3.0
apache-airflow-providers-common-sql==1.6.0
Apache Airflow version
2.6.3
Operating System
Debian GNU/Linux 11 (bullseye)
Deployment
Official Apache Airflow Helm Chart
Deployment details
Dockerfile FROM image apache/airflow:2.6.3-python3.10
Airflow Helm chart
What happened
We run lots of jobs every day, but once or twice a day a task's Spark job completes while the Airflow task stays in the running state with no new logs, until we manually change its state to success.
For a task showing this wrong behavior, the logs look like this:
What you think should happen instead
Expected behavior: a normal task logs the taskinstance.py line marking the task as success.
How to reproduce
It's totally random; it happens in different DAGs with different tasks.
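Since there is no deterministic repro, here is a minimal sketch of the kind of DAG we run, against the provider versions listed above. The dag_id, connection id, and application path are illustrative placeholders, not taken from our production DAGs:

```python
# Minimal DAG using SparkSubmitOperator
# (apache-airflow-providers-apache-spark==4.1.1, Airflow 2.6.3).
# conn_id and application are placeholders; any spark-submit job
# can be affected since the hang is random across DAGs and tasks.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import (
    SparkSubmitOperator,
)

with DAG(
    dag_id="example_spark_submit",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_job",
        conn_id="spark_default",          # placeholder Spark connection
        application="/opt/spark/jobs/example_job.py",  # placeholder path
        verbose=True,  # log the spark-submit output for debugging
    )
```

After the Spark application finishes, the operator's log tail should normally be followed by the taskinstance.py success line; in the failing cases that line never appears.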
Anything else
No response
Are you willing to submit PR?
Code of Conduct