Affects Version/s: 6.2.X EE
Fix Version/s: 6.2.X EE
Note: although this problem happens in both 6.2.x and master, the solution is different because the S3Store implementation has been modified. Therefore, we will use two separate LPS tickets. This one is specific to 6.2.x.
Steps to reproduce:
- Configure portal to use S3Store
- Go to DM section in the control panel
- Upload a big document (>1GB).
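The first step above can be sketched in portal-ext.properties (assuming the standard 6.2 S3Store property names; the key and bucket values are placeholders):

```properties
# Switch the Document Library to the S3 store
dl.store.impl=com.liferay.portlet.documentlibrary.store.S3Store

# Placeholder credentials and bucket, replace with real values
dl.store.s3.access.key=<your-access-key>
dl.store.s3.secret.key=<your-secret-key>
dl.store.s3.bucket.name=<your-bucket>
```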
The jets3t library creates a BasicHttpEntity to read the whole file into memory. This happens because we pass an InputStream instead of a File, so there is no way to know the object size.
Documentation for the jets3t putObject() method says:
Note: It is very important to set the object's Content-Length to match the size of the data input stream when possible, as this can remove the need to read data into memory to determine its size.
This makes the upload quite slow and requires enough memory to hold the whole file.
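As a sketch of what the jets3t documentation recommends (using the jets3t S3Object/RestS3Service API; the class and method names here are illustrative of the approach, not Liferay's actual fix), setting the Content-Length up front avoids buffering the stream in memory:

```java
import java.io.InputStream;

import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Object;

public class S3SizedUpload {

	// Sketch only: uploads a stream whose size is known in advance, so
	// jets3t can send Content-Length instead of reading the whole
	// payload into memory to determine its size
	public static void upload(
			RestS3Service s3Service, String bucketName, String key,
			InputStream inputStream, long size)
		throws Exception {

		S3Object s3Object = new S3Object(key);

		s3Object.setDataInputStream(inputStream);

		// The crucial part: tell jets3t the object size up front
		s3Object.setContentLength(size);

		s3Service.putObject(bucketName, s3Object);
	}
}
```

This is what motivates the first proposed solution below: the Store API would need to carry the size down to this call.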
Possible solutions:
- Pass the size to the Store API (requires changing the API, as there is currently no method that accepts the size).
- Use multipart upload. This feature allows partitioning the file into chunks of a given size and then uploading them from separate threads. Threads run in parallel up to a configurable maximum (jets3t defaults to 2).
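The multipart option can be sketched with jets3t's MultipartUtils helper (a sketch under the assumption that the data is available as a File, which multipart upload needs for repeatable reads; the part size here is an illustrative value, not a Liferay default):

```java
import java.io.File;
import java.util.Collections;

import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Object;
import org.jets3t.service.model.StorageObject;
import org.jets3t.service.utils.MultipartUtils;

public class S3MultipartUpload {

	// Illustrative part size: 100 MB per chunk
	private static final long _PART_SIZE = 100L * 1024 * 1024;

	public static void upload(
			RestS3Service s3Service, String bucketName, File file)
		throws Exception {

		S3Object s3Object = new S3Object(file);

		MultipartUtils multipartUtils = new MultipartUtils(_PART_SIZE);

		// Splits the object into parts of at most _PART_SIZE bytes and
		// uploads them concurrently (jets3t runs up to 2 upload
		// threads by default); no event listener is registered here
		multipartUtils.uploadObjects(
			bucketName, s3Service,
			Collections.<StorageObject>singletonList(s3Object), null);
	}
}
```

Since the part size is fixed and files are large, memory usage is bounded by the number of concurrent upload threads times the part size, rather than by the total file size.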