This crate includes several examples showing how to use various DataFusion APIs, to help you get started.
Prerequisites:

- Run `git submodule update --init` to initialize the test files.
- `avro_sql.rs`: Build and run a query plan from a SQL statement against a local AVRO file
- `csv_sql.rs`: Build and run a query plan from a SQL statement against a local CSV file
- `custom_datasource.rs`: Run queries against a custom datasource (`TableProvider`)
- `dataframe.rs`: Run a query using a DataFrame against a local parquet file
- `dataframe_in_memory.rs`: Run a query using a DataFrame against data in memory
- `deserialize_to_struct.rs`: Convert query results into Rust structs using serde
- `expr_api.rs`: Use the `Expr` construction and simplification API
- `memtable.rs`: Create and query data in memory using SQL and `RecordBatch`es
- `parquet_sql.rs`: Build and run a query plan from a SQL statement against a local Parquet file
- `parquet_sql_multiple_files.rs`: Build and run a query plan from a SQL statement against multiple local Parquet files
- `query-aws-s3.rs`: Configure `object_store` and run a query against files stored in AWS S3
- `rewrite_expr.rs`: Define and invoke a custom Query Optimizer pass
- `simple_udaf.rs`: Define and invoke a User Defined Aggregate Function (UDAF)
- `simple_udf.rs`: Define and invoke a User Defined (scalar) Function (UDF)
- `flight-client.rs` and `flight-server.rs`: Run DataFusion as a standalone process and execute SQL queries from a client using the Flight protocol