Process large files from S3

I am trying to fetch a large file (>10 GB, stored as a CSV) from S3 and send it to the client as a CSV attachment in the response. I am doing it with the following procedure:

async getS3Object(params: any) {
    // `s3`, `res`, and `fileId` come from the surrounding request handler.
    s3.getObject(params, function (err, data) {
        if (err) {
            console.log('Error Fetching File');
            return res.status(500).send('Error Fetching File');
        }
        // Buffers the entire object in memory before responding.
        const csv = data.Body.toString('utf-8');
        res.setHeader('Content-disposition', `attachment; filename=${fileId}.csv`);
        res.set('Content-Type', 'text/csv');
        res.status(200).send(csv);
    });
}

This takes painfully long to process the file and send it as a CSV attachment. How can I make this faster?


You're dealing with a huge file; you could break it into chunks with ranged GETs (the `Range` parameter of `getObject`; see the S3 docs on fetching byte ranges), as sketched below. If you need the whole file, you could split that work across workers, though at some point the limit will probably be your connection, and if you need to send the whole file as a single attachment, parallelism won't help much.
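As a rough sketch of the ranged approach, assuming AWS SDK v2, an Express `res` passed in, and a hypothetical 16 MB chunk size (`bucket`, `key`, and the chunk size are placeholders, not from the question):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const CHUNK_SIZE = 16 * 1024 * 1024; // 16 MB per ranged GET (arbitrary choice)

async function sendInChunks(bucket, key, res) {
    // Look up the total object size so we know how many ranges to request.
    const head = await s3.headObject({ Bucket: bucket, Key: key }).promise();
    const size = head.ContentLength;

    for (let start = 0; start < size; start += CHUNK_SIZE) {
        const end = Math.min(start + CHUNK_SIZE, size) - 1; // Range ends are inclusive
        const part = await s3.getObject({
            Bucket: bucket,
            Key: key,
            Range: `bytes=${start}-${end}`,
        }).promise();
        res.write(part.Body); // forward each chunk instead of buffering all 10 GB
    }
    res.end();
}

Fetching ranges sequentially like this keeps memory usage flat; workers could fetch several ranges in parallel, but the chunks would then need reordering before being written to the response.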

A better solution would be to never download the whole file in the first place. You can stream it from S3 directly to the client, or set up a proxy in your server so the bucket/subdir appears to the client to be part of your app.
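A minimal sketch of the streaming approach with AWS SDK v2 and Express (the route path, bucket name, and `fileId` parameter are illustrative assumptions, not from the question):

const express = require('express');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3();
const bucket = 'my-bucket'; // placeholder bucket name

app.get('/files/:fileId', (req, res) => {
    const { fileId } = req.params;
    res.setHeader('Content-disposition', `attachment; filename=${fileId}.csv`);
    res.set('Content-Type', 'text/csv');

    s3.getObject({ Bucket: bucket, Key: `${fileId}.csv` })
        .createReadStream() // never holds more than a small buffer in memory
        .on('error', (err) => {
            console.error('Error fetching file', err);
            res.end(); // headers may already be sent, so just terminate
        })
        .pipe(res); // pipe handles backpressure to the client
});

Because `pipe` applies backpressure, the read from S3 slows down to match the client's connection, and the process never buffers the full 10 GB.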
