Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
399 views
in Technique by (71.8m points)

laravel - Fastest way to get a list of files from S3

My S3 bucket has 900k files (about 600 GB of data) and will keep growing.

I'm working on a Laravel project deployed on a Vapor environment (serverless, on AWS Lambda, with a 15-minute execution limit). Using the default Laravel call Storage::disk('s3')->allFiles(''), it instantly runs out of memory, so given my limits I think I should use raw CLI commands and parse their output.

In this situation, just doing a simple command like this is not enough.

$ aws s3 ls s3://tc-v3-live --recursive --summarize

[.......]
2020-11-13 17:14:14   4325  bla-bla-file.pdf

Total Objects: 969451
   Total Size: 575558891206

What I'm looking for is a way to get only the filenames, without the timestamp or the file size - assuming that would be much faster. The example above takes 7 minutes to run with the default maximum chunk size of 1000.
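There is no `aws s3 ls` flag that suppresses those columns, but the listing output can be post-processed to keep only the key names. A minimal sketch with awk (this trims what is kept, not how long the listing itself takes; the `substr`/`index` form is used so that keys containing spaces survive intact):

```shell
# The awk program drops the date, time, and size columns (fields 1-3) and
# prints everything from the 4th field onward. Against the real bucket you
# would pipe the listing through it:
#   aws s3 ls s3://tc-v3-live --recursive | awk '{print substr($0, index($0, $4))}'
# Demonstrated here on a sample line copied from the question:
printf '2020-11-13 17:14:14       4325 bla-bla-file.pdf\n' \
  | awk '{print substr($0, index($0, $4))}'
# prints: bla-bla-file.pdf
```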

So ideally I'd like to use something like

$ aws s3 ls s3://tc-v3-live --no-timestamp --no-size
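Those flags don't exist in the AWS CLI. The closest real equivalent is the lower-level `s3api` command with a JMESPath `--query`, which returns only the object keys (bucket name taken from the question above):

```shell
# Lists only the key names, one listing API call per 1000 keys.
# Note: the server-side pagination is the same as `aws s3 ls`, so this
# mainly shrinks the output, not the wall-clock time of the listing.
aws s3api list-objects-v2 --bucket tc-v3-live \
  --query 'Contents[].Key' --output text
```

If the listing itself is the bottleneck rather than the output size, Amazon S3 Inventory can deliver a scheduled CSV/Parquet manifest of every object in the bucket, which avoids paginating through 900k keys at request time.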


1 Answer

0 votes
by (71.8m points)
Waiting for an expert to reply.


...