Sending data to multiple locations (Scala; updated Sep 25, 2019)
Simple utility package to convert EDF/EDF+ files into Apache Parquet format.
A summative coursework for CSC8101 Engineering for AI
🔄 Convert csv to parquet and explore parquet data structure
A Cap'n Proto compiler plugin to create a Parquet schema from a Cap'n Proto schema
Various Arrow C++ examples
Streamline Amazon RDS PostgreSQL to Parquet conversion via AWS Lambda and GitHub Actions for effortless S3 storage.
Exploratory analysis of the Amazon Product Reviews dataset, comprising various categories spanning over 14 years
Daily consolidated and enriched snapshots of endoflife.date
A toolkit that provides an object-oriented interface for working with Parquet datasets on AWS
🦖 Efficiently evolve your old fixed-length data files into more modern file formats, fully parallelized!
Real-time distributed OLAP datastore written in Go, designed to answer OLAP queries with low latency. In active development
A C++ library for easily writing Parquet files containing columns of (mostly) any type you wish.
Converts between file formats such as CSV and Parquet
PHP implementation for reading and writing Apache Parquet files/streams. NOTICE: Please migrate to https://github.com/codename-hub/php-parquet.
This is a library for working with Apache Arrow and Parquet data.
A simple Java POC that creates Parquet files, built as a Spring Boot project.