WebDanica / WEBDAN-186

java.lang.OutOfMemoryError during analysis


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Blocker
    • None
    • None
    • Components: ANALYSIS, DATABASE
    • None
    • Sprint: 2017 sprint - webdanica

    Description

      The list of loaded data settings is empty. Is this OK?
      org.apache.phoenix.exception.PhoenixIOException: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
      	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
      	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:695)
      	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:638)
      	at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
      	at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
      	at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
      	at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
      	at org.apache.phoenix.iterate.UngroupedAggregatingResultIterator.next(UngroupedAggregatingResultIterator.java:39)
      	at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:778)
      	at dk.kb.webdanica.core.datamodel.dao.HBasePhoenixSeedsDAO.existsUrl(HBasePhoenixSeedsDAO.java:116)
      	at dk.kb.webdanica.core.datamodel.criteria.CriteriaIngest.process(CriteriaIngest.java:196)
      	at dk.kb.webdanica.core.datamodel.criteria.CriteriaIngest.processFile(CriteriaIngest.java:96)
      	at dk.kb.webdanica.core.interfaces.harvesting.HarvestLog.processCriteriaResults(HarvestLog.java:184)
      	at dk.kb.webdanica.core.datamodel.criteria.CriteriaIngest.ingest(CriteriaIngest.java:50)
      	at dk.kb.webdanica.core.tools.CriteriaIngestTool.main(CriteriaIngestTool.java:71)
      Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
      	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
      	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
      	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:690)
      	... 13 more
      Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create new native thread
      	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:208)
      	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
      	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
      	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
      	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
      	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:821)
      	at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:121)
      	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:108)
      	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      	at java.lang.Thread.run(Thread.java:745)
      Caused by: java.lang.OutOfMemoryError: unable to create new native thread
      	at java.lang.Thread.start0(Native Method)
      	at java.lang.Thread.start(Thread.java:714)
      	at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:950)
      	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1357)
      	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService.submit(ResultBoundedCompletionService.java:142)
      	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.addCallsForCurrentReplica(ScannerCallableWithReplicas.java:273)
      	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:169)
      	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:59)
      	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
      	... 13 more
      ERROR: criteria ingest failed
      
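      Note: "unable to create new native thread" means the JVM could not obtain another OS thread (typically the per-user process limit, ulimit -u, or exhausted native memory), not that the Java heap is full; here it surfaces in Phoenix's client-side parallel scanner pool while HBasePhoenixSeedsDAO.existsUrl runs during criteria ingest. Below is a minimal sketch, not the project's actual fix, of one way to bound client-side thread usage, assuming the standard Phoenix client property phoenix.query.threadPoolSize and placeholder JDBC URL, table and column names that are not taken from this codebase.

      // Sketch only: caps the number of scan threads Phoenix may spawn in this JVM
      // so a busy ingest run cannot exhaust the OS per-user thread limit.
      // The URL "jdbc:phoenix:zkhost:2181" and the SEEDS/URL schema are hypothetical.
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;
      import java.sql.ResultSet;
      import java.util.Properties;

      public class ExistsUrlSketch {
          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              // Standard Phoenix client property; the default is considerably higher than 32.
              props.setProperty("phoenix.query.threadPoolSize", "32");

              try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zkhost:2181", props);
                   PreparedStatement stmt = conn.prepareStatement(
                           "SELECT COUNT(*) FROM SEEDS WHERE URL = ?")) { // hypothetical table/column
                  stmt.setString(1, "http://example.dk/");
                  try (ResultSet rs = stmt.executeQuery()) { // closing the ResultSet releases the HBase scanners
                      boolean exists = rs.next() && rs.getLong(1) > 0;
                      System.out.println("exists = " + exists);
                  }
              }
          }
      }

      Capping the pool trades some scan parallelism for predictable thread usage; the per-user ulimit on the ingest host is the other knob worth checking.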

          People

            Assignee: nicl@kb.dk Nicholas Clarke (Inactive)
            Reporter: svc Søren Vejrup Carlsen (Inactive)
            Watchers: 1
