Releases: Netflix/metaflow

2.9.14

18 Sep 15:23
2b1eab8

Improvements

Fixes merging of log lines

This release fixes an issue with merging broken log lines.

Fix issue with using LD_LIBRARY_PATH with Conda environments

In a Conda environment, it is sometimes necessary to set LD_LIBRARY_PATH so that the Conda environment's libraries are resolved before anything else. Prior to this release, doing so caused issues with the escape hatch.
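
For illustration, a step affected by this kind of setup might look like the sketch below; the lib path and library pin are hypothetical, and @environment is just one of several ways to set the variable.

from metaflow import FlowSpec, conda, environment, step

class NativeLibFlow(FlowSpec):

    # Illustrative path: point LD_LIBRARY_PATH at the Conda environment's
    # lib directory so its libraries resolve first.
    @environment(vars={"LD_LIBRARY_PATH": "/opt/conda/envs/metaflow/lib"})
    @conda(libraries={"pandas": "2.0.3"})  # illustrative pin
    @step
    def start(self):
        import os
        print(os.environ["LD_LIBRARY_PATH"])
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    NativeLibFlow()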

Full Changelog: 2.9.13...2.9.14

2.9.13

29 Aug 23:19
06840c0

Bug fix

Revert annotations changes to fix a regression

The recent annotations feature introduced an issue where project, flow_name, or user annotations were not being populated on Kubernetes. This release reverts those changes.

Full Changelog: 2.9.12...2.9.13

2.9.12

29 Aug 18:25
6fddf5b

Known issues

The annotations feature introduced in this release has an issue where project, flow_name, or user annotations are not populated on Kubernetes. The feature has been reverted in the next release.

Features

Custom annotations for K8S and Argo Workflows

This release enables users to add custom annotations to the Kubernetes resources that Flows create. The annotations can be configured in much the same way as custom labels (a complete example sketch follows the list):

  1. Globally, with an environment variable:
export METAFLOW_KUBERNETES_ANNOTATIONS="first=A,second=B"
  2. At the step level, by passing a dictionary to the Kubernetes decorator:
@kubernetes(annotations={"first": "A", "second": "B"})
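
Putting the step-level form in context, a minimal flow could look like the following sketch (the flow name and annotation values are illustrative):

from metaflow import FlowSpec, kubernetes, step

class AnnotatedFlow(FlowSpec):

    # These annotations are attached to the Kubernetes resources
    # created for this step.
    @kubernetes(annotations={"first": "A", "second": "B"})
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    AnnotatedFlow()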

What's Changed

  • Adds custom annotations via env variables by @tylerpotts in #1442
  • Pass the user-defined executable to environment's executable by @romain-intel in #1454
  • Remove validate_environment from task lifecycle by @savingoyal in #1507
  • Fix/863 - Improve error message in metaflow.S3 class when DATATOOLS_S3ROOT is not configured. by @tfurmston in #1491
  • Fix an issue where 0 was not considered False for extension debug opt… by @romain-intel in #1511
  • Bump version to 2.9.12 by @saikonen in #1514

Full Changelog: 2.9.11...2.9.12

2.9.11

17 Jul 14:03
ec8fd6c

Bug Fix

Fix regression for @batch decorator introduced by v2.9.10

This release reverts a validation fix introduced in 2.9.10, which prevented execution of Metaflow tasks on AWS Batch.

Full Changelog: 2.9.10...2.9.11

2.9.10

13 Jul 16:40
70cbae9

Features

Introduce PagerDuty support for workflows running on Argo Workflows

With this release, Metaflow users can get events on PagerDuty when their workflows succeed or fail on Argo Workflows.
Setting up the notifications is similar to the existing Slack notifications support:

  1. Follow these instructions on PagerDuty to set up an Events API V2 integration for your PagerDuty service.
  2. You should be able to view the required integration key in the Events API V2 dropdown.
  3. To enable notifications on PagerDuty when your Metaflow flow running on Argo Workflows succeeds or fails, deploy it using the --notify-on-error or --notify-on-success flags:
python flow.py argo-workflows create --notify-on-error --notify-on-success --notify-pager-duty-integration-key <pager-duty-integration-key>
  4. You can also set the following environment variable instead of specifying --notify-pager-duty-integration-key on the CLI every time:
METAFLOW_ARGO_WORKFLOWS_CREATE_NOTIFY_PAGER_DUTY_INTEGRATION_KEY=<pager-duty-integration-key>
  5. The next time the flow fails or succeeds, you should receive a new event on PagerDuty under Incidents (Flow failed) or Changes (Flow succeeded).

Full Changelog: 2.9.9...2.9.10

2.9.9

11 Jul 08:24
fd1f8db

Improvements

Fixes a bug in S3 operations affecting @conda with some S3 providers

This release fixes a bug in the @conda bootstrapping process. There was an issue with the ServerSideEncryption support that affected some S3 operations when using S3 providers that do not implement the encryption headers (for example, MinIO).
The affected operations were all those that handle multiple files at once:

get_many / get_all / get_recursive / put_many / info_many

which are used as part of bootstrapping a @conda environment when executing remotely.
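
As a minimal sketch of one affected operation (the bucket and keys are illustrative), downloading several objects at once with the metaflow.S3 client looks like:

from metaflow import S3

# Bucket and keys are illustrative.
with S3(s3root="s3://my-bucket/conda-packages/") as s3:
    for obj in s3.get_many(["pkg-a.tar.bz2", "pkg-b.tar.bz2"]):
        print(obj.key, obj.path)  # key and local download path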

Full Changelog: 2.9.8...2.9.9

2.9.8

07 Jul 17:34
8547074

Improvements

Fixes bug with Argo events parameters

This release fixes an issue with mapping values that contain spaces from the Argo events payload to flow parameters.
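
As an illustrative sketch, a flow that maps an event payload field to a parameter might look like the following; the event name and payload field are hypothetical:

from metaflow import FlowSpec, Parameter, step, trigger

# Event name and payload field are illustrative.
@trigger(event={"name": "data_updated", "parameters": {"message": "message"}})
class EventFlow(FlowSpec):

    message = Parameter("message", default="no message")

    @step
    def start(self):
        print(self.message)  # values with spaces now arrive intact
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    EventFlow()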

Full Changelog: 2.9.7...2.9.8

2.9.7

27 Jun 09:21
e11a7c3

Features

New commands for managing Argo Workflows through the CLI

This release includes new commands for managing workflows on Argo Workflows.
When needed, commands can be authorized by supplying a production token with --authorize.

argo-workflows delete

A deployed workflow can be deleted through the CLI with

python flow.py argo-workflows delete

argo-workflows terminate

A run can be terminated mid-execution through the CLI with

python flow.py argo-workflows terminate RUN_ID

argo-workflows suspend/unsuspend

A run can be suspended temporarily with

python flow.py argo-workflows suspend RUN_ID

Note that a suspended run will show up as failed in the Metaflow UI after a while, because suspending also pauses the heartbeat process. Unsuspending resumes the run, and its status will show as running again. This can be done with

python flow.py argo-workflows unsuspend RUN_ID

Improvements

Faster Job completion checks for Kubernetes

Previously, the status of tasks running on Kubernetes was determined through the pod status, which can take a while to update after the last container finishes. This release changes the status checks to use container statuses directly instead.
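
For intuition only, and not Metaflow's actual implementation, reading container statuses directly with the official Kubernetes Python client could look like this sketch (pod name and namespace are illustrative):

from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a cluster
v1 = client.CoreV1Api()
# Pod name and namespace are illustrative.
pod = v1.read_namespaced_pod(name="my-task-pod", namespace="default")
for status in pod.status.container_statuses or []:
    terminated = status.state.terminated
    if terminated is not None:
        # Available as soon as the container exits, before the
        # pod phase itself is updated.
        print(status.name, terminated.exit_code)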

Full Changelog: 2.9.6...2.9.7

2.9.6

21 Jun 15:59
4fdec0d

Features

AWS Step Functions state machines can now be deleted through the CLI

This release introduces the command step-functions delete for deleting state machines through the CLI.

For a regular flow

python flow.py step-functions delete

For another user's project branch

Comment out the @project decorator from the flow file, as we do not allow using --name with projects:

python project_flow.py step-functions --name project_a.user.saikonen.ProjectFlow delete

For a production or custom branch flow

python project_flow.py --production step-functions delete
# or
python project_flow.py --branch custom step-functions delete

Add --authorize PRODUCTION_TOKEN to the command if you do not have the correct production token locally.

Improvements

Fixes a bug with the S3 server-side encryption feature on some S3-compliant providers

This release fixes an issue with the S3 server-side encryption support, where some S3-compliant providers do not respond with the expected encryption method in the payload. This bug specifically affected regular operation when using MinIO.

Fixes support for --with environment in Airflow

Fixes a bug with the Airflow support for environment variables, where values set with the @environment decorator could get overwritten.
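
As a minimal sketch of the behavior that is now preserved (the variable name and value are illustrative):

from metaflow import FlowSpec, environment, step

class EnvFlow(FlowSpec):

    # Variable name and value are illustrative.
    @environment(vars={"GREETING": "hello"})
    @step
    def start(self):
        import os
        print(os.environ["GREETING"])  # previously could be overwritten on Airflow
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    EnvFlow()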

Full Changelog: 2.9.5...2.9.6

2.9.5

15 Jun 18:25
3d4f85a

Features

Ability to choose the server-side encryption method for S3 uploads

You can now choose which server-side encryption method to use for S3 uploads by setting the environment variable METAFLOW_S3_SERVER_SIDE_ENCRYPTION to an appropriate value, for example aws:kms or AES256.
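
A minimal sketch, assuming the variable is set before Metaflow is imported so the configuration picks it up (the bucket and key are illustrative):

import os

# Set before importing Metaflow; the method value is illustrative
# (aws:kms would also work).
os.environ["METAFLOW_S3_SERVER_SIDE_ENCRYPTION"] = "AES256"

from metaflow import S3

with S3(s3root="s3://my-bucket/demo/") as s3:  # bucket is illustrative
    s3.put("greeting", "hello world")  # uploaded with SSE applied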

Improvements

Fixes double quoting of parameters on Argo Workflows

This release fixes an issue where using parameters on Argo Workflows caused the values to be unnecessarily quoted.

In case you need any assistance or have feedback for us, ping us at chat.metaflow.org or open a GitHub issue.

Full Changelog: 2.9.4...2.9.5