Package | Description |
---|---|
parquet.column | |
parquet.column.impl | |
parquet.column.page | |
parquet.hadoop | Provides classes to store and use Parquet files in Hadoop map-reduce jobs. |
parquet.io | |
parquet.schema | |
parquet.tools.command | |
parquet.tools.util | |
Modifier and Type | Method and Description |
---|---|
ColumnDescriptor | ColumnReader.getDescriptor() |
ColumnDescriptor | UnknownColumnException.getDescriptor() |
Modifier and Type | Method and Description |
---|---|
int | ColumnDescriptor.compareTo(ColumnDescriptor o) |
ColumnReader | ColumnReadStore.getColumnReader(ColumnDescriptor path) |
ColumnWriter | ColumnWriteStore.getColumnWriter(ColumnDescriptor path) |
ValuesReader | Encoding.getDictionaryBasedValuesReader(ColumnDescriptor descriptor, ValuesType valuesType, Dictionary dictionary). Reads decoded values that require a dictionary. |
ValuesReader | Encoding.getValuesReader(ColumnDescriptor descriptor, ValuesType valuesType). Reads decoded values that don't require a dictionary. |
ValuesWriter | ParquetProperties.getValuesWriter(ColumnDescriptor path, int initialSizePerCol) |
Dictionary | Encoding.initDictionary(ColumnDescriptor descriptor, DictionaryPage dictionaryPage). Initializes a dictionary from a page. |
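The compareTo method above gives descriptors a total order. A plausible reading, since a ColumnDescriptor is identified by its column path, is ordering path components lexicographically; the stand-in below is a sketch under that assumption, not the library's actual implementation.

```java
// Sketch only: assumes ColumnDescriptor.compareTo orders descriptors by their
// column path, component by component. ColumnPathOrder is an invented name.
class ColumnPathOrder {
    static int comparePaths(String[] a, String[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int c = a[i].compareTo(b[i]);
            if (c != 0) return c;       // first differing component decides
        }
        return Integer.compare(a.length, b.length); // a prefix sorts first
    }

    public static void main(String[] args) {
        // "user.address.zip" sorts before "user.name" because "address" < "name"
        System.out.println(comparePaths(
                new String[] {"user", "address", "zip"},
                new String[] {"user", "name"}) < 0);
    }
}
```

A path-based order like this is what makes it possible to keep columns in a sorted collection, as the Set&lt;ColumnDescriptor&gt; accessor further down suggests.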
Constructor and Description |
---|
UnknownColumnException(ColumnDescriptor descriptor) |
Modifier and Type | Method and Description |
---|---|
Set<ColumnDescriptor> | ColumnWriteStoreImpl.getColumnDescriptors() |
Modifier and Type | Method and Description |
---|---|
ColumnReader | ColumnReadStoreImpl.getColumnReader(ColumnDescriptor path) |
ColumnWriter | ColumnWriteStoreImpl.getColumnWriter(ColumnDescriptor path) |
Modifier and Type | Method and Description |
---|---|
PageReader | PageReadStore.getPageReader(ColumnDescriptor descriptor) |
PageWriter | PageWriteStore.getPageWriter(ColumnDescriptor path) |
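Both page-store interfaces hand out a per-column reader or writer keyed by a ColumnDescriptor. As an illustration of that contract only (this is not parquet-mr's internal structure), the lookup can be modeled with a map keyed by the column path:

```java
import java.util.*;

// Illustrative sketch of the PageReadStore/PageWriteStore contract: pages are
// written and read per column, with the column path as the lookup key.
// SketchPageStore is an invented name, not a parquet-mr type.
class SketchPageStore {
    private final Map<List<String>, List<byte[]>> pages = new HashMap<>();

    // analogous to obtaining a PageWriter for a column and writing a page
    void writePage(String[] columnPath, byte[] page) {
        pages.computeIfAbsent(Arrays.asList(columnPath), k -> new ArrayList<>())
             .add(page);
    }

    // analogous to a PageReader handing back that column's pages
    List<byte[]> readPages(String[] columnPath) {
        return pages.getOrDefault(Arrays.asList(columnPath), Collections.emptyList());
    }

    public static void main(String[] args) {
        SketchPageStore store = new SketchPageStore();
        store.writePage(new String[] {"user", "name"}, new byte[] {1, 2});
        System.out.println(store.readPages(new String[] {"user", "name"}).size()); // 1
    }
}
```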
Modifier and Type | Method and Description |
---|---|
void | ParquetFileWriter.startColumn(ColumnDescriptor descriptor, long valueCount, CompressionCodecName compressionCodecName). Starts a column inside a block. |
Constructor and Description |
---|
ParquetFileReader(org.apache.hadoop.conf.Configuration configuration, org.apache.hadoop.fs.Path filePath, List<BlockMetaData> blocks, List<ColumnDescriptor> columns) |
Modifier and Type | Method and Description |
---|---|
ColumnDescriptor | PrimitiveColumnIO.getColumnDescriptor() |
Modifier and Type | Method and Description |
---|---|
ColumnDescriptor | MessageType.getColumnDescription(String[] path) |
List<ColumnDescriptor> | MessageType.getColumns() |
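MessageType.getColumns() flattens the message schema into one descriptor per primitive leaf, each identified by its path from the root. The sketch below models that flattening with invented types (Field and SchemaSketch are not parquet-mr classes) to show how nested groups collapse into column paths:

```java
import java.util.*;

// Hypothetical model of schema flattening: every primitive leaf becomes one
// column, named by its dotted path. Field/SchemaSketch are invented names.
class Field {
    final String name;
    final List<Field> children; // empty list means a primitive leaf
    Field(String name, Field... children) {
        this.name = name;
        this.children = Arrays.asList(children);
    }
}

class SchemaSketch {
    // collects one path per primitive column, in schema order
    static List<String> columnPaths(Field root) {
        List<String> out = new ArrayList<>();
        for (Field child : root.children) collect(child, "", out);
        return out;
    }

    private static void collect(Field f, String prefix, List<String> out) {
        String path = prefix.isEmpty() ? f.name : prefix + "." + f.name;
        if (f.children.isEmpty()) {
            out.add(path);
        } else {
            for (Field c : f.children) collect(c, path, out);
        }
    }

    public static void main(String[] args) {
        // message user { group address { zip; city; } name; }
        Field schema = new Field("user",
                new Field("address", new Field("zip"), new Field("city")),
                new Field("name"));
        System.out.println(SchemaSketch.columnPaths(schema)); // [address.zip, address.city, name]
    }
}
```

In the real API, passing one of those paths (as a String[]) to MessageType.getColumnDescription(String[] path) looks up the corresponding ColumnDescriptor.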
Modifier and Type | Method and Description |
---|---|
static void | DumpCommand.dump(PrettyPrintWriter out, ColumnReadStoreImpl crstore, ColumnDescriptor column, long page, long total, long offset) |
static void | DumpCommand.dump(PrettyPrintWriter out, PageReadStore store, ColumnDescriptor column) |
Modifier and Type | Method and Description |
---|---|
static void | MetadataUtils.showDetails(PrettyPrintWriter out, ColumnDescriptor desc) |
Copyright © 2015. All rights reserved.