The following error occurs when inserting data into HBase via BufferedMutator:
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at org.apache.hadoop.hbase.client.AsyncRequestFutureImpl.sendMultiAction(AsyncRequestFutureImpl.java:547)
at org.apache.hadoop.hbase.client.AsyncProcess.submitMultiActions(AsyncProcess.java:337)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:320)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:228)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doFlush(BufferedMutatorImpl.java:303)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.close(BufferedMutatorImpl.java:241)
Here, each request inserts 200-300 bytes of data over a single connection, and synchronous queries are also issued. A thread dump is attached below; the TIMED_WAITING threads are part of the HTable pool. Thread dump
Can you suggest how to handle this exception? Sample code -
try (BufferedMutator mutator = ConnectionFactory.createConnection()
        .getBufferedMutator(TableName.valueOf(tableName))) {
    Put put = new Put(Bytes.toBytes(rowKey));
    map.entrySet().forEach(element -> put.addColumn(CF, element.getKey(), element.getValue()));
    mutator.mutate(put);
} catch (Exception e) {
    e.printStackTrace();
}