HBase 2.1.x: batch `get` of data

Author: 马育民 • 2021-11-30 07:27

# Overview

A `get` query can be given multiple `rowkey`s at once, which is equivalent to the SQL:

`select * from ... where id in ('1','2','3')`

### Underlying implementation

In the HBase Java API, every batch operation ultimately calls the `batch` method. The batch `get()` method is declared in the `HTable` class as follows:

```
@Override
public Result[] get(List<Get> gets) throws IOException {
    if (gets.size() == 1) {
        return new Result[]{get(gets.get(0))};
    }
    try {
        Object[] r1 = new Object[gets.size()];
        batch((List<? extends Row>) gets, r1, readRpcTimeoutMs);
        // Translate.
        Result[] results = new Result[r1.length];
        int i = 0;
        for (Object obj : r1) {
            // Batch ensures if there is a failure we get an exception instead
            results[i++] = (Result) obj;
        }
        return results;
    } catch (InterruptedException e) {
        throw (InterruptedIOException) new InterruptedIOException().initCause(e);
    }
}
```

**Explanation:** put multiple `Get` objects into a `List`, pass the list to `get`, and it returns a `Result[]`.

# Example

Query the rows whose `rowkey` is `1001` or `1002`, equivalent to the SQL:

`select * from book where id in ('1001','1002')`

### Insert test data

Clear the table:

```
truncate 'book'
```

Insert the data:

```
put 'book','1001','c1:title','html从入门到放弃'
put 'book','1001','c1:author','lucy'
put 'book','1001','c1:price','097.50'

put 'book','1002','c1:title','css从入门到放弃'
put 'book','1002','c1:author','lili'
put 'book','1002','c1:price','099.90'

put 'book','1003','c1:title','hadoop从入门到放弃'
put 'book','1003','c1:author','韩梅梅'
put 'book','1003','c1:price','190.00'
```

### Key code

Imports:

```
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
```

Method:

```
public List<RowData> getBatch(String tableName, String[] rowkeys) throws IOException {
    TableName tn = TableName.valueOf(tableName);
    // Table encapsulates data operations (get/put/delete) on a single table
    Table table = conn.getTable(tn);
    List<Get> list = new ArrayList<>();
    for (String item : rowkeys) {
        Get get = new Get(Bytes.toBytes(item));
        list.add(get);
    }
    List<RowData> retList = new ArrayList<>();
    // execute the batch get
    Result[] results = table.get(list);
    for (Result result : results) {
        // read the rowkey back from the Result
        String rowkey = Bytes.toString(result.getRow());
        RowData rowData = new RowData(rowkey);
        List<Cell> cells = result.listCells();
        if (cells != null) {
            for (Cell item : cells) {
                String family = Bytes.toString(CellUtil.cloneFamily(item));
                String qualifier = Bytes.toString(CellUtil.cloneQualifier(item));
                String value = Bytes.toString(CellUtil.cloneValue(item));
                CellData cellData = new CellData(family, qualifier, rowkey, item.getTimestamp(), value);
                rowData.addCellData(cellData);
            }
            retList.add(rowData);
        }
    }
    table.close();
    return retList;
}
```

### Test

```
HbaseUtils2_getBatch utils = new HbaseUtils2_getBatch();
utils.connect("hadoop1,hadoop2,hadoop3", "2181");

String[] rowkeys = {"1001", "1002"};
List<RowData> list = utils.getBatch("book", rowkeys);

System.out.println("-------------------------");
for (RowData rowData : list) {
    List<CellData> cellDatas = rowData.getCellDatas();
    for (CellData cellData : cellDatas) {
        System.out.print(cellData.getFamily() + ":" + cellData.getQualifier() + "=" + cellData.getValue() + " | ");
    }
    System.out.println("\n-------------------------");
}

utils.close();
```

Source: http://malaoshi.top/show_1IX2Jspd7byw.html
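Supplementary note: when a requested rowkey does not exist, `table.get(list)` still returns a `Result` in the corresponding array slot, but that `Result` is empty (`isEmpty()` is true and `listCells()` returns null) — which is why the method above guards with `if (cells != null)`. Below is a minimal sketch of detecting missing rows explicitly. It assumes a reachable HBase cluster, the `hbase-client` dependency on the classpath, and an already-open `Connection` (the class and method names here are illustrative, not from the original article):

```java
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class BatchGetSketch {

    // Batch-get several rowkeys and report which ones were not found.
    // `conn` must be an open Connection to a running cluster.
    public static void printBatch(Connection conn, String tableName, String[] rowkeys)
            throws IOException {
        List<Get> gets = new ArrayList<>();
        for (String rk : rowkeys) {
            gets.add(new Get(Bytes.toBytes(rk)));
        }
        // try-with-resources closes the Table even if get() throws
        try (Table table = conn.getTable(TableName.valueOf(tableName))) {
            Result[] results = table.get(gets);
            // results[i] corresponds to gets.get(i), in the same order
            for (int i = 0; i < results.length; i++) {
                if (results[i].isEmpty()) {
                    // Missing rowkeys come back as empty Results, not as nulls
                    System.out.println(rowkeys[i] + ": not found");
                } else {
                    System.out.println(rowkeys[i] + ": " + results[i]);
                }
            }
        }
    }
}
```

Checking `isEmpty()` per slot relies on the fact that the returned array preserves the order of the `Get` list, so each result can be matched back to the rowkey that produced it.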