- Type: Bug
- Status: Closed
- Resolution: Fixed
- Affects Version/s: 6.2.X EE, 7.0.0 CE GA1
- Fix Version/s: 6.2.X EE, 7.0.0 CE GA1
- Component/s: Portal Services, Portal Services > Store
- Labels:
- Branch Version/s: 6.2.x
- Backported to Branch: Committed
- Fix Priority: 5
- Git Pull Request:
When all connections are leaked, S3Store stops working because the connection pool keeps every requesting thread waiting. As a result, the Document Library stops working.
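To make the failure mode concrete, here is a minimal standalone Java sketch (not Liferay code) of the same pattern with Apache HttpClient, the library the AWS SDK uses underneath (see the stack traces below); the pool size, timeout and URL are illustrative values only:

import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;

public class ConnectionPoolStarvationSketch {

    public static void main(String[] args) throws Exception {
        // Tiny pool so the starvation shows up after two requests.
        PoolingHttpClientConnectionManager connectionManager =
            new PoolingHttpClientConnectionManager();
        connectionManager.setMaxTotal(2);
        connectionManager.setDefaultMaxPerRoute(2);

        // Fail after 5 seconds instead of waiting forever for a pooled connection.
        RequestConfig requestConfig = RequestConfig.custom()
            .setConnectionRequestTimeout(5000)
            .build();

        CloseableHttpClient httpClient = HttpClients.custom()
            .setConnectionManager(connectionManager)
            .setDefaultRequestConfig(requestConfig)
            .build();

        for (int i = 0; i < 3; i++) {
            // The response is never closed, so its connection is never returned
            // to the pool. The third iteration cannot lease a connection and
            // fails with ConnectionPoolTimeoutException
            // ("Timeout waiting for connection from pool").
            CloseableHttpResponse response = httpClient.execute(
                new HttpGet("http://example.com"));

            System.out.println("Request " + i + ": " + response.getStatusLine());
        }
    }

}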
Steps to reproduce
Master
- Configure the S3 store by specifying valid credentials and a bucket (a configuration sketch follows this list)
- Upload a document from the Documents & Media portlet (from the Control Panel; there is no need to prepare a page)
- Go to the DB and get the fileEntryId for that document. The following SQL query returns it:
select fileEntryId, createdate from DLFileEntry order by createdate desc limit 1;
- From the scripting console, run the following Groovy script. Make sure to change the first line with the fileEntryId you got in the previous step:
long dlFileEntryId = //<put here the file entry Id>;

import com.liferay.document.library.kernel.service.DLFileEntryLocalServiceUtil
import com.liferay.document.library.kernel.model.DLFileEntry;
import com.liferay.portal.kernel.log.Log;
import com.liferay.portal.kernel.log.LogFactoryUtil;

Log _log = LogFactoryUtil.getLog("Test S3Store http connections");

DLFileEntry dlfe = DLFileEntryLocalServiceUtil.getDLFileEntry(dlFileEntryId);

for (int i = 0; i < 220; i++) {
	try {
		_log.error("Getting S3 file " + i + " times");

		DLFileEntryLocalServiceUtil.getFile(dlFileEntryId, dlfe.getVersion(), false, 0);
	}
	catch (Exception e) {
		e.printStackTrace()
	}

	Thread.sleep(10);
}
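For the first step above, a minimal configuration sketch. The dl.store.impl value and the store class name are taken from this ticket's stack traces; the configuration file name and the remaining keys are assumptions about the 7.0 S3Store module and should be checked against the actual S3StoreConfiguration (the same values can also be entered through System Settings in the UI):

# portal-ext.properties: point the Document Library at the S3 store
dl.store.impl=com.liferay.portal.store.s3.S3Store

# osgi/configs/com.liferay.portal.store.s3.configuration.S3StoreConfiguration.cfg
# (file name and keys are assumptions)
accessKey=<your access key>
secretKey=<your secret key>
bucketName=<your bucket>
s3Region=<your region>
httpClientMaxConnections=50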
Observed result
- After 51 iterations (i=50) the script can no longer obtain connections (master configures the S3Store "Http client max connections" parameter to 50)
- The portal is unable to access the Document Library. This is the exception you get:
11:23:18,326 ERROR [http-nio-8080-exec-8] [DLPreviewableProcessor:926] com.liferay.portal.kernel.exception.SystemException: Unable to execute HTTP request: Timeout waiting for connection from pool
com.liferay.portal.kernel.exception.SystemException: Unable to execute HTTP request: Timeout waiting for connection from pool
	at com.liferay.portal.store.s3.S3Store.transform(S3Store.java:672)
	at com.liferay.portal.store.s3.S3Store.getS3Object(S3Store.java:535)
	at com.liferay.portal.store.s3.S3Store.hasFile(S3Store.java:271)
	at com.liferay.document.library.kernel.store.BaseStore.hasFile(BaseStore.java:418)
	at com.liferay.portlet.documentlibrary.store.DLStoreImpl.hasFile(DLStoreImpl.java:354)
	at com.liferay.document.library.kernel.store.DLStoreUtil.hasFile(DLStoreUtil.java:491)
	at com.liferay.document.library.kernel.util.DLPreviewableProcessor.hasThumbnail(DLPreviewableProcessor.java:921)
	at com.liferay.document.library.kernel.util.DLPreviewableProcessor.hasThumbnails(DLPreviewableProcessor.java:933)
	at com.liferay.portlet.documentlibrary.util.ImageProcessorImpl.hasImages(ImageProcessorImpl.java:159)
	at com.liferay.document.library.kernel.util.ImageProcessorUtil.hasImages(ImageProcessorUtil.java:149)
	at com.liferay.portlet.documentlibrary.util.DLImpl.getThumbnailSrc(DLImpl.java:734)
	at com.liferay.document.library.kernel.util.DLUtil.getThumbnailSrc(DLUtil.java:250)
	at org.apache.jsp.document_005flibrary.view_005fentries_jsp._jspService(view_005fentries_jsp:969)
- A thread dump shows that the portal is waiting for a free connection when rendering the Document Library page:
"http-nio-8080-exec-10" daemon prio=10 tid=0x00007f3fb000b000 nid=0x6928 waiting on condition [0x00007f3f670ea000] java.lang.Thread.State: TIMED_WAITING (parking) at sun.misc.Unsafe.park(Native Method) - parking to wait for <0x00000000ec8b48b8> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject) at java.util.concurrent.locks.LockSupport.parkUntil(LockSupport.java:267) at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitUntil(AbstractQueuedSynchronizer.java:2130) at org.apache.http.pool.PoolEntryFuture.await(PoolEntryFuture.java:131) at org.apache.http.pool.AbstractConnPool.getPoolEntryBlocking(AbstractConnPool.java:282) at org.apache.http.pool.AbstractConnPool.access$000(AbstractConnPool.java:64) at org.apache.http.pool.AbstractConnPool$2.getPoolEntry(AbstractConnPool.java:177) at org.apache.http.pool.AbstractConnPool$2.getPoolEntry(AbstractConnPool.java:170) at org.apache.http.pool.PoolEntryFuture.get(PoolEntryFuture.java:102) at org.apache.http.impl.conn.PoolingClientConnectionManager.leaseConnection(PoolingClientConnectionManager.java:208) at org.apache.http.impl.conn.PoolingClientConnectionManager$1.getConnection(PoolingClientConnectionManager.java:195) at sun.reflect.GeneratedMethodAccessor582.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.amazonaws.http.conn.ClientConnectionRequestFactory$Handler.invoke(ClientConnectionRequestFactory.java:70) at com.amazonaws.http.conn.$Proxy741.getConnection(Unknown Source) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:423) at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57) at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:728) at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:489) at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:310) at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3608) at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1135) at com.liferay.portal.store.s3.S3Store.getS3Object(S3Store.java:519) at com.liferay.portal.store.s3.S3Store.hasFile(S3Store.java:271) at com.liferay.document.library.kernel.store.BaseStore.hasFile(BaseStore.java:418) at com.liferay.portlet.documentlibrary.store.DLStoreImpl.hasFile(DLStoreImpl.java:354) at com.liferay.document.library.kernel.store.DLStoreUtil.hasFile(DLStoreUtil.java:491) at com.liferay.document.library.kernel.util.DLPreviewableProcessor.hasThumbnail(DLPreviewableProcessor.java:921) at com.liferay.document.library.kernel.util.DLPreviewableProcessor.hasThumbnails(DLPreviewableProcessor.java:933) at com.liferay.portlet.documentlibrary.util.ImageProcessorImpl.hasImages(ImageProcessorImpl.java:159) at com.liferay.document.library.kernel.util.ImageProcessorUtil.hasImages(ImageProcessorUtil.java:149) at com.liferay.portlet.documentlibrary.util.DLImpl.getThumbnailSrc(DLImpl.java:734) at com.liferay.document.library.kernel.util.DLUtil.getThumbnailSrc(DLUtil.java:250) at org.apache.jsp.document_005flibrary.view_005fentries_jsp._jspService(view_005fentries_jsp:969)
Expected result
- Portal operates normally
- No connection starvation
- No exceptions
6.2.x
The same steps described for master apply here, except that the script to execute is slightly different:
long dlFileEntryId = //<put here the file entry Id>;

import com.liferay.portlet.documentlibrary.service.DLFileEntryLocalServiceUtil
import com.liferay.portlet.documentlibrary.model.DLFileEntry;
import com.liferay.portal.kernel.log.Log;
import com.liferay.portal.kernel.log.LogFactoryUtil;

Log _log = LogFactoryUtil.getLog("Test S3 store http connections");

DLFileEntry dlfe = DLFileEntryLocalServiceUtil.getDLFileEntry(dlFileEntryId);

for (int i = 0; i < 220; i++) {
	try {
		_log.error("Getting S3 file " + i + " times");

		DLFileEntryLocalServiceUtil.getFile(dlfe.getUserId(), dlFileEntryId, dlfe.getVersion(), false, 0);
	}
	catch (Exception e) {
		e.printStackTrace()
	}

	Thread.sleep(10);
}
For ee-6.2.x, the connection limit is set by the HttpClient library's pool max connections (which defaults to 20).
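As an illustration of where that default of 20 comes from: JetS3t reads its settings from a jets3t.properties file on the classpath, and the pool size is governed by its httpclient.max-connections key (the key name is quoted from the JetS3t documentation and should be verified against the bundled JetS3t version):

# jets3t.properties on the portal classpath
httpclient.max-connections=20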
Root cause (valid for master and 6.2.x)
S3Store maintains a file cache so that the file does not need to be downloaded from S3 on every request if its last modification time has not changed. When S3Store.getFile() is called, the S3Object is fetched from the S3 service to compare last modification times.
Whenever an S3Object is fetched from the S3 service, the input stream to that object is opened, even if the content is never downloaded.
Usually, Store callers close that input stream after consuming it, but they can only do so when they retrieve the input stream itself from the Store. When callers retrieve a File object from the Store, it is the Store that has to close the stream.
When managing the file cache, S3Store closes the stream while adding the file to the cache. Subsequent accesses retrieve the file from the cache but never close the stream of the S3Object that was fetched to compare timestamps, leaking one pooled HTTP connection per access.
All these considerations also hold for 6.2.x, even though the JetS3t library is used there to access S3: fetching the S3Object likewise opens the input stream.
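To make the leak and the fix concrete, a hedged Java sketch against the AWS SDK for Java v1 used on master (see the stack traces above); everything except the SDK calls (getObject, getObjectMetadata, getLastModified, close) is illustrative and does not mirror the actual S3Store code:

import java.util.Date;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.S3Object;

public class S3TimestampCheckSketch {

    // Returns the last modification time of an S3 object while releasing the
    // pooled HTTP connection even though the content is never read.
    public static Date getLastModified(
            AmazonS3 amazonS3, String bucketName, String key)
        throws Exception {

        // Leaky pattern (what this ticket describes): getObject() opens the
        // object's content stream; reading only the metadata and returning
        // leaves that stream, and its pooled connection, leased forever.
        //
        // S3Object s3Object = amazonS3.getObject(bucketName, key);
        // return s3Object.getObjectMetadata().getLastModified();

        // Fixed pattern: S3Object implements Closeable, so try-with-resources
        // closes the content stream and hands the connection back to the pool.
        try (S3Object s3Object = amazonS3.getObject(bucketName, key)) {
            return s3Object.getObjectMetadata().getLastModified();
        }
    }

}

If only the timestamp is needed, AmazonS3.getObjectMetadata(bucketName, key) avoids opening the content stream in the first place.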
- relates to: LPE-14869 Document Library stops working when using the S3 Store due to leaked HTTP connections (Closed)