| Function or class | Description |
|---|---|
| `ParquetDataset(path_or_paths[, filesystem, …])` | Encapsulates details of reading a complete Parquet dataset, possibly consisting of multiple files and partitions in subdirectories. |
| `ParquetFile(source[, metadata, …])` | Reader interface for a single Parquet file. |
| `ParquetWriter(where, schema[, filesystem, …])` | Class for incrementally building a Parquet file for Arrow tables. |
| `read_table(source[, columns, use_threads, …])` | Read a Table from Parquet format. |
| `read_metadata(where[, memory_map])` | Read FileMetaData from the footer of a single Parquet file. |
| `read_pandas(source[, columns, use_threads, …])` | Read a Table from Parquet format, also reading DataFrame index values if known in the file metadata. |
| `read_schema(where[, memory_map])` | Read the effective Arrow schema from Parquet file metadata. |
| `write_metadata(schema, where[, version, …])` | Write a metadata-only Parquet file from a schema. |
| `write_table(table, where[, row_group_size, …])` | Write a Table to Parquet format. |
| `write_to_dataset(table, root_path[, …])` | Wrapper around `parquet.write_table` for writing a Table to Parquet format by partitions. |
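
A minimal sketch of the single-file round trip with `write_table`, `read_table`, `read_metadata`, and `read_schema`. The file name `example.parquet` and the small in-memory table are hypothetical placeholders, not part of the API:

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Build a small Arrow table in memory (hypothetical example data).
table = pa.table({"id": [1, 2, 3], "label": ["a", "b", "c"]})

# write_table: write the Table to a single Parquet file.
pq.write_table(table, "example.parquet")

# read_table: read it back, optionally selecting a subset of columns.
subset = pq.read_table("example.parquet", columns=["id"])

# read_metadata / read_schema: inspect the file without loading the data.
metadata = pq.read_metadata("example.parquet")  # row groups, sizes, statistics
schema = pq.read_schema("example.parquet")      # effective Arrow schema
print(metadata.num_rows, schema)
```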
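
And a sketch of the incremental and partitioned paths with `ParquetWriter`, `write_to_dataset`, and `ParquetDataset`, again assuming hypothetical paths (`incremental.parquet`, `dataset_root`) and the same example table:

```python
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"id": [1, 2, 3], "label": ["a", "b", "c"]})

# ParquetWriter: build one Parquet file incrementally from several tables
# that share a schema (here the same table is written twice for brevity).
with pq.ParquetWriter("incremental.parquet", table.schema) as writer:
    for chunk in (table, table):
        writer.write_table(chunk)

# write_to_dataset: partition the output into subdirectories by column value.
pq.write_to_dataset(table, root_path="dataset_root", partition_cols=["label"])

# ParquetDataset: read the whole partitioned dataset back as a single Table.
dataset = pq.ParquetDataset("dataset_root")
combined = dataset.read()
```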