  PUBLIC - Liferay Portal Community Edition
  LPS-60649

Large files are uploaded inneficiently when using the S3Store - 6.2.x



      Note: although this problem happens both in 6.2.x and master, the solution is different because the S3Store implementation has been modified. Therefore, we'll use two separate LPS issues. This one is specific to 6.2.x.

      Steps to reproduce:

      • Configure portal to use S3Store
      • Go to DM section in the control panel
      • Upload a big document (>1GB).

      Observed behavior

      • The jets3t library creates a BasicHttpEntity that reads the whole file into memory. This happens because we pass an InputStream instead of a File, so there is no way to determine the object size up front.

      Documentation for the jets3t putObject() method says:

      Note: It is very important to set the object's Content-Length to match the size of the data input stream when possible, as this can remove the need to read data into memory to determine its size.

      This makes the upload quite slow and requires enough memory to hold the whole file.
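      The core of the problem can be illustrated with a small, self-contained sketch (the class and method names are illustrative, not from the Liferay or jets3t code): a File can report its size without touching its contents, while a bare InputStream cannot, so the only generic way to learn a stream's length is to drain it into memory first, which is what the BasicHttpEntity fallback ends up doing.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;

public class SizeProbe {

    // A File exposes its size via metadata, so an HTTP client can set
    // Content-Length up front and stream the body without buffering.
    static long sizeOf(File file) {
        return file.length();
    }

    // A bare InputStream carries no length; the only generic way to
    // learn it is to read the entire stream into memory first. This
    // mirrors what happens when jets3t receives a stream without a
    // known Content-Length.
    static long sizeOf(InputStream in) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int read;
        while ((read = in.read(chunk)) != -1) {
            buffer.write(chunk, 0, read); // whole payload held in memory
        }
        return buffer.size();
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[1024 * 1024];
        System.out.println(sizeOf(new ByteArrayInputStream(payload))); // 1048576
    }
}
```

      For a 1 GB document this means roughly 1 GB of heap consumed just to discover a number that the caller already knows, which is why passing the size explicitly (or a File) avoids the problem.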

      Possible solutions:

      • Pass the size to the Store API (requires changing the API, as there is currently no method that accepts a size)
      • Use multipart upload. This feature partitions the file into chunks of a given size, then uploads the chunks from separate threads. Threads run in parallel up to a configured limit (jets3t defaults to 2).
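      The arithmetic behind the multipart option can be sketched as follows (a hypothetical helper, not jets3t's actual API; jets3t's own multipart utilities handle the real splitting and parallel upload): the file is covered by fixed-size parts, with a smaller final part for the remainder, and each part can then be uploaded independently.

```java
import java.util.ArrayList;
import java.util.List;

public class PartPlanner {

    // Returns [offset, length] pairs covering a file of fileSize bytes
    // with parts of at most partSize bytes. Each pair describes one
    // independently uploadable chunk.
    static List<long[]> plan(long fileSize, long partSize) {
        List<long[]> parts = new ArrayList<>();
        for (long offset = 0; offset < fileSize; offset += partSize) {
            parts.add(new long[] { offset, Math.min(partSize, fileSize - offset) });
        }
        return parts;
    }

    public static void main(String[] args) {
        // A 1 GiB file split into 100 MiB parts yields 11 parts:
        // ten full parts and one smaller remainder.
        long gib = 1024L * 1024 * 1024;
        long part = 100L * 1024 * 1024;
        System.out.println(plan(gib, part).size()); // 11
    }
}
```

      With such a plan, a pool of worker threads (two by default in jets3t) can upload parts concurrently, and no single part needs to be held entirely in memory if each one is streamed from a known offset of the file.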





              Participants: Haoliang Wu (Inactive), Daniel Sanz




                  Version Package
                  6.2.X EE