How do I write multiple Argo workflow output artifacts to a single directory?

Problem

I am using Argo Workflows and want to produce 2 separate artifacts. With the output artifacts defined as below, I get the error path '/tmp' already mounted in inputs.artifacts.txt. How can I have the 2 separate artifacts written to a single directory (/tmp in this case)?

outputs:
  artifacts:
  - name: txt
    path: /tmp
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: test.txt.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
  - name: total-file-count
    path: /tmp
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: total-file-count.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0

Solution

path refers to the full path of the artifact to be written to S3 (not just the directory where the file is found).

To write both artifacts to S3, use the full path of each source file. Assuming the file names match the key names, this should work:

outputs:
  artifacts:
  - name: txt
    path: /tmp/test.txt.tgz
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: test.txt.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
  - name: total-file-count
    path: /tmp/total-file-count.tgz
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: total-file-count.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
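
For this to work, the container step must actually create files at those exact paths before the workflow collects them. A minimal sketch of such a step (the image, commands, and file contents here are hypothetical placeholders):

container:
  image: alpine:3
  command: [sh, -c]
  args:
    - |
      # Produce the two files at the paths named in outputs.artifacts
      echo hello > /tmp/test.txt
      tar -czf /tmp/test.txt.tgz -C /tmp test.txt
      find / -type f | wc -l > /tmp/total-file-count
      tar -czf /tmp/total-file-count.tgz -C /tmp total-file-count

Note that Argo tar+gzips output artifacts by default, so files that are already .tgz will be compressed a second time; if that matters, you can set archive: {none: {}} on each artifact to store the files as-is.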