Ecosyste.ms: Issues
An open API service for providing issue and pull request metadata for open source projects.
GitHub / ddotta/parquetize issues and pull requests
#56 - Replace read_delim by read_delim_arrow
Issue - State: open - Opened by ddotta 5 months ago - 1 comment
#55 - add get_parquet_info
Pull Request - State: closed - Opened by nbc 5 months ago - 1 comment
#54 - table_to_parquet() can now convert files with uppercase extensions
Pull Request - State: closed - Opened by ddotta 7 months ago - 2 comments
#53 - Move arrow to Suggests
Pull Request - State: closed - Opened by thisisnic 8 months ago - 4 comments
#52 - Fix error on fedora-clang OS
Issue - State: closed - Opened by ddotta 8 months ago - 2 comments
#51 - rds gzfile cannot open connection
Issue - State: closed - Opened by ChristosMichaliaslis 10 months ago - 5 comments
#50 - Conversions from SAS tables with extension names in uppercase don't work
Issue - State: closed - Opened by ddotta 11 months ago
#49 - Adds argument `read_delim_args` to `csv_to_parquet`
Pull Request - State: closed - Opened by nikostr 12 months ago - 1 comment
#48 - Improves documentation for `csv_to_parquet()` for txt files
Pull Request - State: closed - Opened by ddotta about 1 year ago - 1 comment
#47 - Specify minimal version for haven
Pull Request - State: closed - Opened by ddotta over 1 year ago - 1 comment
#46 - Specify minimal version for haven
Issue - State: closed - Opened by ddotta over 1 year ago
#45 - fix: remove single quotes in SQL statement
Pull Request - State: closed - Opened by leungi over 1 year ago - 2 comments
#44 - Add user_na argument in table_to_parquet function
Pull Request - State: closed - Opened by ddotta over 1 year ago - 2 comments
#43 - test: work on download_extract tests to limit the need to download files
Pull Request - State: closed - Opened by nbc over 1 year ago - 2 comments
#42 - fix: 503 errors in download_extract tests
Pull Request - State: closed - Opened by nbc over 1 year ago - 1 comment
#41 - evol: add an option to check arguments passed by user
Pull Request - State: open - Opened by nbc over 1 year ago - 3 comments
#40 - table_to_parquet: SPSS file is not correctly converted to .parquet when it has user-defined missings
Issue - State: closed - Opened by Schakel17 over 1 year ago - 8 comments
#39 - Compression and inheritParams
Pull Request - State: closed - Opened by ddotta over 1 year ago - 1 comment
#38 - Rely more on `@inheritParams` to simplify documentation of function arguments
Issue - State: closed - Opened by ddotta over 1 year ago
#37 - Group `@importFrom` in a file to facilitate their maintenance
Issue - State: closed - Opened by ddotta over 1 year ago
#36 - Arguments `compression` and `compression_level` are never passed to `write_parquet_at_once`
Issue - State: closed - Opened by ddotta over 1 year ago
#35 - feat: add fst_to_parquet function
Pull Request - State: closed - Opened by ddotta over 1 year ago - 4 comments
#34 - Feature/dbi and refactor
Pull Request - State: closed - Opened by nbc over 1 year ago - 4 comments
#33 - Add a `duckdb_to_parquet` using low level arrow functions
Issue - State: open - Opened by ddotta over 1 year ago - 3 comments
#32 - Feature/dbi to parquet
Pull Request - State: closed - Opened by nbc over 1 year ago - 1 comment
#31 - Feature/deprecate chunk size
Pull Request - State: closed - Opened by nbc over 1 year ago - 1 comment
#30 - Feature/snapshots
Pull Request - State: closed - Opened by nbc over 1 year ago - 1 comment
#29 - Add `fst_to_parquet()` function to convert fst files to parquet format
Issue - State: closed - Opened by ddotta over 1 year ago
#28 - Add in the other functions of parquetize the functionality with chunks proposed in `table_to_parquet()`
Issue - State: open - Opened by ddotta over 1 year ago - 1 comment
#27 - Feature/dbi
Pull Request - State: closed - Opened by nbc over 1 year ago - 14 comments
#26 - Feature/refactor
Pull Request - State: closed - Opened by nbc over 1 year ago - 4 comments
#25 - There are warnings in the snapshots for unit tests to be deleted
Issue - State: closed - Opened by ddotta over 1 year ago
#24 - Update vignette when PR #23 will be merged
Issue - State: closed - Opened by ddotta over 1 year ago
#23 - Feature/chunk by memory
Pull Request - State: closed - Opened by nbc over 1 year ago - 4 comments
#22 - Feature/allow chunk compression
Pull Request - State: closed - Opened by nbc over 1 year ago - 2 comments
#21 - fix: bug in bychunk logic
Pull Request - State: closed - Opened by nbc over 1 year ago - 1 comment
#20 - Added columns selection to `table_to_parquet()` and `csv_to_parquet()` functions
Pull Request - State: closed - Opened by ddotta over 1 year ago - 1 comment
#19 - Add the feature to be able to select only a certain number of columns
Issue - State: closed - Opened by ddotta over 1 year ago
#18 - Variable conversion error due to encoding
Issue - State: closed - Opened by PtiGourou26 over 1 year ago - 1 comment
#17 - Solve problems sent by Brian Ripley (CRAN) for Linux
Issue - State: closed - Opened by ddotta over 1 year ago - 1 comment
#16 - Add metrics
Issue - State: open - Opened by ddotta almost 2 years ago
#15 - Use a callback function in read_by_chunk()?
Issue - State: closed - Opened by ddotta almost 2 years ago
#14 - Add the feature to convert duckdb files
Issue - State: closed - Opened by ddotta almost 2 years ago - 1 comment
#13 - Add the feature to convert sqlite files
Issue - State: closed - Opened by ddotta almost 2 years ago
#12 - Add the feature to convert json files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement
#11 - Add the feature to convert txt files
Issue - State: closed - Opened by ddotta almost 2 years ago - 5 comments - Labels: enhancement
#10 - Add the feature to convert pickle files
Issue - State: closed - Opened by ddotta almost 2 years ago - 1 comment - Labels: enhancement
#9 - Improve code coverage with utilities functions
Issue - State: closed - Opened by ddotta almost 2 years ago
#8 - Check if `path_to_parquet` exists
Issue - State: closed - Opened by py-b almost 2 years ago
#7 - Add function to convert to parquet by partitioning with `write_dataset()`
Issue - State: closed - Opened by ddotta almost 2 years ago
#6 - Add compression argument to csv_to_parquet
Issue - State: closed - Opened by ddotta almost 2 years ago - 1 comment
#5 - Add a function for SPSS files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement
#4 - Add a function for SAS files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement
#3 - Add the feature to convert rds files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement
#2 - Add a function for RData files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement
#1 - Allow csv_to_parquet to process large csv files
Issue - State: closed - Opened by ddotta almost 2 years ago - Labels: enhancement