Sql calcite connection properties docs #36939
base: master
Conversation
This commit adds support for escaping and unescaping glob wildcard characters in file path specifications, addressing the issue where files with literal glob metacharacters (*, ?, {, }) in their names cannot be matched.

Changes:
- Added escapeGlobWildcards(String spec) method to escape glob metacharacters by prefixing them with a backslash
- Added unescapeGlobWildcards(String spec) method to remove backslash prefixes from escaped glob characters
- Added comprehensive test cases for both methods, including round-trip testing

These utilities provide the foundation for allowing users to treat glob metacharacters as literals when they appear in actual filenames.

Fixes BEAM-13231

When CDC (Change Data Capture) is enabled with the STORAGE_WRITE_API method, the system was incorrectly using PENDING streams instead of the default stream, causing an IllegalStateException due to checkState validation.

Changes:
- Fixed StorageApiWriteUnshardedRecords to use the default stream when CDC is enabled
- Added a comprehensive test case to prevent regression

CDC requires default streams because PENDING streams don't support the RowMutationInformation functionality needed for upserts and deletes.

Fixes apache#31422
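For illustration, here is a minimal sketch of the escaping behavior described above, assuming the metacharacter set (*, ?, {, }) and the backslash-prefix convention from the commit message. The class and method bodies below are not taken from the actual Beam change; only the method names come from the commit.

```java
// Sketch of escape/unescape utilities for glob metacharacters (*, ?, {, }).
public final class GlobEscapes {

  private static final String GLOB_METACHARACTERS = "*?{}";

  /** Prefixes each glob metacharacter in {@code spec} with a backslash. */
  public static String escapeGlobWildcards(String spec) {
    StringBuilder sb = new StringBuilder(spec.length());
    for (char c : spec.toCharArray()) {
      if (GLOB_METACHARACTERS.indexOf(c) >= 0) {
        sb.append('\\');
      }
      sb.append(c);
    }
    return sb.toString();
  }

  /** Removes the backslash prefix from escaped glob metacharacters. */
  public static String unescapeGlobWildcards(String spec) {
    StringBuilder sb = new StringBuilder(spec.length());
    for (int i = 0; i < spec.length(); i++) {
      char c = spec.charAt(i);
      if (c == '\\'
          && i + 1 < spec.length()
          && GLOB_METACHARACTERS.indexOf(spec.charAt(i + 1)) >= 0) {
        continue; // drop the escape character, keep the literal metacharacter
      }
      sb.append(c);
    }
    return sb.toString();
  }

  public static void main(String[] args) {
    String escaped = escapeGlobWildcards("gs://bucket/report{2024}.csv");
    System.out.println(escaped);                        // gs://bucket/report\{2024\}.csv
    System.out.println(unescapeGlobWildcards(escaped)); // round-trips to the original
  }
}
```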
…AT_LEAST_ONCE

- Updated JavaDoc to reflect that triggeringFrequency applies to FILE_LOADS, STORAGE_WRITE_API, and STORAGE_API_AT_LEAST_ONCE methods
- Fixed validation logic in expand() to require triggeringFrequency for STORAGE_API_AT_LEAST_ONCE when writing unbounded PCollections
- Removed conflicting warning that incorrectly stated STORAGE_API_AT_LEAST_ONCE ignores triggeringFrequency
- Added comprehensive test cases to verify proper validation behavior:
  - Test failure when triggeringFrequency is missing for unbounded collections
  - Test success when triggeringFrequency is provided for unbounded collections
  - Test success for bounded collections without triggeringFrequency requirement
- Updated error messages to include all three supported methods

This ensures consistent behavior across all BigQuery write methods that support triggered writes for unbounded collections.
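For context, a hedged sketch of the kind of write configuration this validation covers; the table spec is a placeholder, and the snippet only assumes the standard BigQueryIO builders (withMethod, withTriggeringFrequency), not anything added by this PR.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

class AtLeastOnceWriteExample {
  // Writes an unbounded PCollection with STORAGE_API_AT_LEAST_ONCE; per the
  // change described above, a triggering frequency is required for unbounded inputs.
  static void write(PCollection<TableRow> unboundedRows) {
    unboundedRows.apply(
        "WriteAtLeastOnce",
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table spec
            .withMethod(BigQueryIO.Write.Method.STORAGE_API_AT_LEAST_ONCE)
            .withTriggeringFrequency(Duration.standardSeconds(5)));
  }
}
```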
- Added Python implementation of slowly updating global window side inputs pattern
- Uses PeriodicSequence instead of GenerateSequence for broader compatibility
- Implements Latest.Globally().without_defaults() as mentioned in issue apache#35934
- Added test coverage for the new pattern
- Updated documentation to include Python example alongside Java

Fixes missing Python side input pattern in documentation. Addresses issue apache#35934 by using Latest.Globally().without_defaults() for non-global windowing scenarios and ensuring Dataflow compatibility.
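Since this commit adds the Python variant of the pattern, here is a rough sketch of the analogous Java form that the documentation pairs it with: a periodic tick, a fetch step, a triggered global window, Latest, and a singleton view. The fetch function is a placeholder, and details may differ from the published docs example.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.GenerateSequence;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.Latest;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.View;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.GlobalWindows;
import org.apache.beam.sdk.transforms.windowing.Repeatedly;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollectionView;
import org.joda.time.Duration;

class SlowlyUpdatingSideInput {
  static PCollectionView<String> build(Pipeline p) {
    return p
        // Emit a tick every five minutes to drive periodic refreshes.
        .apply(GenerateSequence.from(0).withRate(1, Duration.standardMinutes(5)))
        // On each tick, re-read the slowly changing value.
        .apply(ParDo.of(new DoFn<Long, String>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            c.output(fetchCurrentValue());
          }
        }))
        // Re-window into the global window with a repeated trigger so panes keep firing.
        .apply(Window.<String>into(new GlobalWindows())
            .triggering(Repeatedly.forever(AfterProcessingTime.pastFirstElementInPane()))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes())
        // Keep only the most recent value and expose it as a singleton side input.
        .apply(Latest.globally())
        .apply(View.asSingleton());
  }

  private static String fetchCurrentValue() {
    return "value"; // stand-in for an external lookup
  }
}
```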
… in YAML

- Add comprehensive SQL transform examples showing calcite_connection_properties usage
- Create dedicated sql/ directory under yaml/examples/transforms with 5 examples:
  - sql_basic_example.yaml: Basic SQL without special configuration
  - sql_postgresql_functions.yaml: PostgreSQL functions like SPLIT_PART
  - sql_bigquery_functions.yaml: BigQuery syntax and functions
  - sql_mysql_functions.yaml: MySQL date/string functions
  - sql_advanced_configuration.yaml: Multiple configuration options
- Add detailed README.md explaining calcite_connection_properties options
- Update yaml/tests/sql.yaml with calcite_connection_properties test cases
- Update examples/README.md to reference new SQL documentation

This addresses the issue where calcite_connection_properties configuration was 'tricky to get right' by providing clear examples and documentation for different SQL dialects and use cases.

Fixes: SQL options in YAML/xlang pipelines need better documentation
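As background for these examples, a hedged sketch of the kinds of Calcite connection properties the dialect-specific files rely on. The property names (fun, lex) and their values come from Calcite's own documentation, not from this PR, and the exact set accepted by the YAML Sql transform may differ.

```java
import java.util.Map;

/** Illustrative (not exhaustive) Calcite connection properties per SQL dialect. */
final class CalciteDialectProperties {
  // "fun" selects the operator/function library; "lex" controls lexical rules
  // such as identifier quoting and case sensitivity. Values should be checked
  // against the Calcite version bundled with Beam.

  // PostgreSQL-style functions such as SPLIT_PART:
  static final Map<String, String> POSTGRESQL = Map.of("fun", "postgresql");

  // BigQuery-flavoured syntax and functions:
  static final Map<String, String> BIGQUERY = Map.of("fun", "bigquery", "lex", "BIG_QUERY");

  // MySQL date/string functions:
  static final Map<String, String> MYSQL = Map.of("fun", "mysql", "lex", "MYSQL");
}
```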
Summary of Changes

Hello @Arunodoy18, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request significantly enhances Apache Beam by expanding the capabilities and documentation for SQL transforms in YAML pipelines, improving BigQuery I/O functionality, and adding useful file system utilities. The core focus is on providing clearer guidance and more robust support for SQL dialect configurations.
Checks are failing. Will not request review until checks are succeeding. If you'd like to override that behavior, comment `assign set of reviewers`.
- Move options block to correct transform-level location
- Fix indentation issues that caused precommit validation failures
Assigning reviewers: R: @shunping for label python.

Note: If you would like to opt out of this review, comment … Available commands: …

The PR bot will only process comments in the main thread (not review comments).
Please add a meaningful description for your change here
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
- Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
- Update CHANGES.md with noteworthy changes.

See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.