
Is it possible to run metaflow steps in custom docker containers on local? #1743

Open
AyushUnleashed opened this issue Feb 19, 2024 · 0 comments


What I am trying to do:

I am building a Metaflow pipeline with several steps, and I want one of those steps to run in a custom Docker container. The options I found were the @kubernetes decorator (specifying an image name) or the @batch decorator. Before deploying anything, I want to test this locally.
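For context, a minimal sketch of the kind of flow I mean, assuming the standard @kubernetes decorator API (the step names and image reference are placeholders, not my real code):

```python
from metaflow import FlowSpec, step, kubernetes

class MetaflowTest(FlowSpec):

    @step
    def start(self):
        # Runs locally.
        self.data = "hello"
        self.next(self.heavy_step)

    # Hypothetical step that should run inside the custom image.
    @kubernetes(image="hub.docker.com/custom_container")
    @step
    def heavy_step(self):
        print(self.data)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    MetaflowTest()
```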

Approach I followed:

I set up minikube locally, so that for that particular step Metaflow could spin up the custom container in minikube and run the work there.
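Concretely, the local setup amounts to something like this (a sketch; assumes minikube and kubectl are already installed):

```shell
# Start a local single-node Kubernetes cluster
minikube start

# Confirm kubectl is pointed at the minikube context
kubectl config current-context
```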

What went wrong:

When I run the command:

 python metaflow_test.py run --with kubernetes:image=hub.docker.com/custom_container

I get this error:

Kubernetes error:
The @kubernetes decorator requires --datastore=s3 or --datastore=azure or --datastore=gs at the moment.
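The error suggests that @kubernetes steps need their artifacts in a cloud datastore. One possible workaround (an assumption on my part, not something I have verified) would be to point Metaflow's S3 datastore at an S3-compatible store running locally, such as MinIO, via Metaflow's environment variables:

```shell
# Assumption: MinIO is running locally on :9000 with a bucket named "metaflow"
export METAFLOW_DEFAULT_DATASTORE=s3
export METAFLOW_DATASTORE_SYSROOT_S3=s3://metaflow
export METAFLOW_S3_ENDPOINT_URL=http://localhost:9000
export AWS_ACCESS_KEY_ID=minioadmin       # MinIO default credentials
export AWS_SECRET_ACCESS_KEY=minioadmin
```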

Conclusion:

From this error, my understanding is that I can only spawn containers on remote Kubernetes clusters running in the cloud, not on minikube.

Question:

Is there a way to test this locally, or should I just do the setup in the cloud?
