Details

    • Branch Version/s:
      6.1.x
    • Backported to Branch:
      Committed
    • Fix Priority:
      5

      Description

      I'm using these settings in portal-ext.properties:

      image.hook.impl=com.liferay.portal.image.DatabaseHook
      dl.store.impl=com.liferay.portlet.documentlibrary.store.DBStore

      Changing the logo from the portal and storing the new one as a BLOB inside the database produces this exception:

      17:26:30,712 ERROR [ImageProcessorImpl:269] java.io.IOException: org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 1
      java.io.IOException: org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 1
              at org.postgresql.largeobject.BlobInputStream.read(BlobInputStream.java:98)
              at java.io.InputStream.read(InputStream.java:170)
              at java.io.InputStream.read(InputStream.java:101)
              at com.liferay.portal.kernel.util.StreamUtil.transferByteArray(StreamUtil.java:206)
              at com.liferay.portal.kernel.util.StreamUtil.transfer(StreamUtil.java:162)
              at com.liferay.portal.kernel.util.StreamUtil.transfer(StreamUtil.java:129)
              at com.liferay.portal.util.FileImpl.getBytes(FileImpl.java:401)
              at com.liferay.portal.util.FileImpl.getBytes(FileImpl.java:387)
              at com.liferay.portal.util.FileImpl.getBytes(FileImpl.java:381)
              at com.liferay.portal.kernel.util.FileUtil.getBytes(FileUtil.java:153)
              at com.liferay.portlet.documentlibrary.util.ImageProcessorImpl._generateImages(ImageProcessorImpl.java:248)
              at com.liferay.portlet.documentlibrary.util.ImageProcessorImpl.generateImages(ImageProcessorImpl.java:85)
              at com.liferay.portlet.documentlibrary.util.ImageProcessorUtil.generateImages(ImageProcessorUtil.java:38)
              at com.liferay.portlet.documentlibrary.messaging.ImageProcessorMessageListener.doReceive(ImageProcessorMessageListener.java:36)
              at com.liferay.portal.kernel.messaging.BaseMessageListener.receive(BaseMessageListener.java:25)
              at com.liferay.portal.kernel.messaging.InvokerMessageListener.receive(InvokerMessageListener.java:63)
              at com.liferay.portal.kernel.messaging.SerialDestination$1.run(SerialDestination.java:103)
              at com.liferay.portal.kernel.concurrent.ThreadPoolExecutor$WorkerTask._runTask(ThreadPoolExecutor.java:669)
              at com.liferay.portal.kernel.concurrent.ThreadPoolExecutor$WorkerTask.run(ThreadPoolExecutor.java:580)
              at java.lang.Thread.run(Thread.java:679)
      
      1. liferay.2012-08-16.log
        11 kB
        James Hinkey
      2. node2.liferay.2012-08-16.log
        11 kB
        James Hinkey
      3. node2.portal-ext.properties
        0.6 kB
        James Hinkey
      4. portal-ext.properties
        0.6 kB
        James Hinkey
      5. what-is-liferay-portal.odt
        31 kB
        James Hinkey

        Activity

        Stephen Kostas added a comment -

        This also occurs with Postgres 9.0, the currently supported version, with any kind of large file upload (like an image or video).

        James Hinkey added a comment -

        FAILED adding binary document to Documents and Media

        Environment:

        Debian Linux 32 bit, Postgres 9.0, Cluster-enabled, DBStore for dl.store.impl

        Steps to reproduce:

        1. Create a Postgres db named lportal for user/role liferay
        2. Create two Liferay nodes (node1 and node2)
        3. Set in portal-ext.properties for each node:
        cluster.link.enabled=true
        dl.store.impl=com.liferay.portlet.documentlibrary.store.DBStore
        dl.file.max.size=0
        4. Start both nodes
        5. Add the Documents and Media portlet to the welcome page
        6. Add a binary file (e.g. a .odt file)

        Stack trace:

        00:26:56,272 ERROR [liferay/document_library_raw_metadata_processor-1][TikaRawMetadataProcessor:65] Unable to parse
        java.io.IOException: org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 1
        at org.postgresql.largeobject.BlobInputStream.read(BlobInputStream.java:98)
        at java.io.InputStream.read(InputStream.java:170)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at java.io.FilterInputStream.read(FilterInputStream.java:107)
        at org.apache.tika.mime.MimeTypes.readMagicHeader(MimeTypes.java:303)
        at org.apache.tika.mime.MimeTypes.detect(MimeTypes.java:548)
        at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:60)
        at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:126)
        at com.liferay.portal.metadata.TikaRawMetadataProcessor.extractMetadata(TikaRawMetadataProcessor.java:62)
        at com.liferay.portal.metadata.TikaRawMetadataProcessor.extractMetadata(TikaRawMetadataProcessor.java:109)
        at com.liferay.portal.metadata.BaseRawMetadataProcessor.getRawMetadataMap(BaseRawMetadataProcessor.java:71)
        at com.liferay.portal.kernel.metadata.RawMetadataProcessorUtil.getRawMetadataMap(RawMetadataProcessorUtil.java:50)
        at com.liferay.portlet.documentlibrary.util.RawMetadataProcessorImpl.saveMetadata(RawMetadataProcessorImpl.java:140)
        at com.liferay.portlet.documentlibrary.util.RawMetadataProcessorUtil.saveMetadata(RawMetadataProcessorUtil.java:93)
        at com.liferay.portlet.documentlibrary.messaging.RawMetadataProcessorMessageListener.doReceive(RawMetadataProcessorMessageListener.java:36)
        at com.liferay.portal.kernel.messaging.BaseMessageListener.receive(BaseMessageListener.java:25)
        at com.liferay.portal.kernel.messaging.InvokerMessageListener.receive(InvokerMessageListener.java:63)
        at com.liferay.portal.kernel.messaging.SerialDestination$1.run(SerialDestination.java:110)
        at com.liferay.portal.kernel.concurrent.ThreadPoolExecutor$WorkerTask._runTask(ThreadPoolExecutor.java:671)
        at com.liferay.portal.kernel.concurrent.ThreadPoolExecutor$WorkerTask.run(ThreadPoolExecutor.java:582)
        at java.lang.Thread.run(Thread.java:679)

        James Hinkey added a comment -
        • Node1 logfile and properties
        • ODT file used in the test
        James Hinkey added a comment -

        Logfile and properties file for node2. Note: "node2" is prepended to the filenames to distinguish them from node1's files when viewing this ticket.

        James Hinkey added a comment -

        Environment:
        Single-node, Linux, Postgres 9.0, DBStore

        Properties:
        dl.store.impl=com.liferay.portlet.documentlibrary.store.DBStore
        (max dl file size left at default of portal.properties)

        Reproduction steps:
        Same as those from cluster environment except on a single node

        Same ERROR occurs:

        ERROR [liferay/document_library_raw_metadata_processor-1][TikaRawMetadataProcessor:65] Unable to parse
        java.io.IOException: org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 1
        at org.postgresql.largeobject.BlobInputStream.read(BlobInputStream.java:98)

        James Hinkey added a comment -

        An error still occurs when uploading binary files.

        ERROR [http-bio-8080-exec-8][JDBCExceptionReporter:76] Large Objects may not be used in auto-commit mode.

        This error appears both when the upload completes and when the binary files are accessed in Documents and Media.

        Environment:
        Single-node, Linux, Postgres 9.0, DBStore

        Properties:
        dl.store.impl=com.liferay.portlet.documentlibrary.store.DBStore
        (max dl file size left at default of portal.properties)

        Reproduction steps:
        1. Upload a binary file (e.g. a .jpg)
        ... error occurs
        2. Access that binary file from Documents and Media
        ... error occurs
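        For context on both errors above: the PostgreSQL JDBC driver implements BLOB reads through the server's large-object API, and a large-object descriptor is only valid while the transaction that opened it is still in progress. With auto-commit enabled, the transaction ends as soon as the query completes, which either invalidates the descriptor ("invalid large-object descriptor") or is rejected outright ("Large Objects may not be used in auto-commit mode"). A minimal sketch of a transaction-safe read follows; the table and column names are illustrative placeholders, not Liferay's actual DAO code.

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class LargeObjectRead {

    // Reads a BLOB column inside an explicit transaction. PostgreSQL's
    // large-object descriptors are closed when the transaction ends, so
    // auto-commit must be off for the duration of the read and the
    // stream must be fully drained before commit.
    static byte[] readBlob(Connection con, long contentId) throws Exception {
        boolean autoCommit = con.getAutoCommit();
        con.setAutoCommit(false); // open a transaction for the LO read
        try (PreparedStatement ps = con.prepareStatement(
                "select data_ from DLContent where contentId = ?")) {
            ps.setLong(1, contentId);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                try (InputStream in = rs.getBinaryStream(1)) {
                    byte[] bytes = in.readAllBytes(); // drain before commit
                    con.commit();
                    return bytes;
                }
            }
        } finally {
            con.setAutoCommit(autoCommit); // restore the caller's setting
        }
    }
}
```

        The key point is that the commit happens only after the stream is consumed; committing (or leaving auto-commit on) before the read finishes is what produces the descriptor errors seen in the logs.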

        Matthew Lee (Inactive) added a comment -

        Committed on:
        Portal 6.2.x GIT ID: 09b6cb299fc0f132ec79388d3f76483becb8361b.

        Mark Jin added a comment -

        PASSED Manual Testing following James's comments.

        Reproduced on:
        Tomcat 7.0 + MySQL 5. Portal 6.2.x GIT ID: 3243fd0d2082aa54c4e8488ff7d8b4f75ecad9ab.

        Uploading a document causes an error message.

        Fixed on:
        Tomcat 7.0 + MySQL 5. Portal 6.1.x EE GIT ID: 3a5b00674729f3e6a76426d5478e48864f426ba7.
        Tomcat 7.0 + MySQL 5. Portal 6.2.x GIT ID: aae7d23d11dda92f3bc9bcac740f64e09a4bdaef.

        The error message no longer appears.


          People

          • Votes:
            0
          • Watchers:
            5

            Dates

            • Created:
              Updated:
              Resolved:
              Days since last comment:
              2 years, 20 weeks, 1 day ago
