hadoop - Trying to load more than 32 hfiles to one family of one region -


I'm importing many files into an HBase table, so I decided to use bulk loading. I managed to prepare the data via a MapReduce job, but when I try to complete the load using the command

  hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles <src> <target_table>

I get the following error:

  ERROR mapreduce.LoadIncrementalHFiles: Trying to load more than 32 hfiles to family d of region with start key
  Exception in thread "main" java.io.IOException: Trying to load more than 32 hfiles to one family of one region
      at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:288)
      at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:842)
      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
      at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:847)

Do you know how to change the maximum number of hfiles per family and per region? And is it possible to change it from the CLI?

Answer

You can configure "hbase.hregion.max.filesize" to a higher value (depending on your input file size), either in hbase-site.xml or as a -D argument; the number of hfiles created will then be reduced. The default value is 10 GB.
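For example, a minimal hbase-site.xml sketch; the 100 GB value is illustrative, pick one that suits your input size:

  <!-- A larger maximum region/hfile size means fewer hfiles are produced
       per region when the job writes its output -->
  <property>
    <name>hbase.hregion.max.filesize</name>
    <value>107374182400</value> <!-- 100 GB, specified in bytes -->
  </property>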

or "hbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily" can be configured in the hbase- site .xml or as -D logic high value (Column can be obtained from HDFS for the maximum number of HF files created for the family). Default value is 32 .
