Streaming file uploads to S3 on Node.js with formidable and (knox or aws-sdk)

I am trying to upload a file submitted via a form directly to an Amazon S3 bucket, using aws-sdk or knox. Form handling is done with formidable.

My question is: how do I properly use formidable with aws-sdk (or knox), using the latest stream-handling features of these libraries?

I am aware that variations of this topic have already been asked here, namely:

> How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?
> node application stream file upload directly to amazon s3
> Accessing the raw file stream from a node-formidable file upload (and the very useful accepted answer about overriding form.onPart())

However, I believe the answers are a bit outdated and/or off topic (i.e. CORS support, which I do not wish to use for now, for various reasons) and/or, most importantly, make no reference to the latest features of aws-sdk (see: https://github.com/aws/aws-sdk-js/issues/13#issuecomment-16085442) or knox (notably putStream() or its readableStream.pipe(req) variant, both explained in the docs).
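For reference, the knox docs demonstrate putStream() with a source stream whose length is known in advance (an HTTP response). A minimal CoffeeScript sketch of that pattern is shown below; the credentials, bucket and URL are placeholders:

http = require "http"
knox = require "knox"

client = knox.createClient
  key: "AWS_KEY"        # placeholder credentials
  secret: "AWS_SECRET"
  bucket: "mybucket"    # placeholder bucket

# Pipe a response of known length straight to S3
http.get "http://example.com/doodle.png", (res) ->
  headers =
    "Content-Length": res.headers["content-length"]
    "Content-Type": res.headers["content-type"]
  client.putStream res, "/doodle.png", headers, (err, s3res) ->
    if err then console.log err else console.log s3res.statusCode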

After hours of struggling, I concluded that I need some help (disclaimer: I am quite a newbie with streams).

HTML form:

<form action="/uploadPicture" method="post" enctype="multipart/form-data">
  <input name="picture" type="file" accept="image/*">
  <input type="submit">
</form>

Express bodyParser middleware is configured this way:

app.use(express.bodyParser({defer: true}))

POST request handler:

uploadPicture = (req,res,next) ->
  form = new formidable.IncomingForm()
  form.parse(req)

  form.onPart = (part) ->
    if not part.filename
      # Let formidable handle all non-file parts (fields)
      form.handlePart(part)
    else
      handlePart(part,form.bytesExpected)

  handlePart = (part,fileSize) ->
    # aws-sdk version
    params =
      Bucket: "mybucket"
      Key: part.filename
      ContentLength: fileSize
      Body: part # passing stream object as body parameter

    awsS3client.putObject(params,(err,data) ->
      if err
        console.log err
      else
        console.log data
    )

However, I get the following error:

{ [RequestTimeout: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.]
  message: 'Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.',
  code: 'RequestTimeout',
  name: 'RequestTimeout',
  statusCode: 400,
  retryable: false }

A knox version of the handlePart() function, tailored this way, also fails miserably:

handlePart = (part, fileSize) ->
  headers =
    "Content-Length": fileSize
    "Content-Type": part.mime
  knoxS3client.putStream(part, part.filename, headers, (err, res) ->
    if err
      console.log err
    else
      console.log res
  )

I also get a big res object with a 400 statusCode.

Region is configured to eu-west-1 in both cases.

Additional notes:

node 0.10.12

latest formidable from npm (1.0.14)

latest aws-sdk from npm (1.3.1)

latest knox from npm (0.8.3)

Solution

Well, according to the creator of formidable, streaming directly to Amazon S3 is impossible:

> The S3 API requires you to provide the size of new files when creating them. This information is not available for multipart/form-data files until they have been fully received. This means streaming is impossible.

Indeed, form.bytesExpected refers to the size of the whole form, not the size of the individual file. Since the ContentLength declared above was therefore larger than the number of bytes actually sent, S3 kept waiting for more data until the connection timed out, which would explain the RequestTimeout error.

Therefore, the data must first hit either memory or disk on the server before being uploaded to S3.
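As a workaround, one can let formidable buffer the upload to a temporary file on disk and only push it to S3 afterwards, since at that point the exact size of the individual file is known. A minimal sketch with aws-sdk (the bucket name is a placeholder and error handling is kept to a bare minimum):

fs         = require "fs"
formidable = require "formidable"
AWS        = require "aws-sdk"

s3 = new AWS.S3() # assumes credentials and region are already configured

uploadPicture = (req, res, next) ->
  form = new formidable.IncomingForm()
  # Let formidable write the file to its temporary path on disk first
  form.parse req, (err, fields, files) ->
    return next(err) if err
    file = files.picture
    params =
      Bucket: "mybucket"                    # placeholder bucket
      Key: file.name
      ContentLength: file.size              # real size of the single file
      Body: fs.createReadStream(file.path)
    s3.putObject params, (err, data) ->
      fs.unlink file.path, ->               # remove the temporary file
        return next(err) if err
        res.send(200)

With knox, the equivalent step once the file is on disk would be client.putFile(file.path, "/" + file.name, callback).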
