Column summary (ranges are min – max over the rows below):

id               int64          599M – 3.48B
number           int64          1 – 7.8k
title            string         length 1 – 290
state            string         2 values
comments         list           length 0 – 30
created_at       timestamp[s]   2020-04-14 10:18:02 – 2025-10-05 06:37:50
updated_at       timestamp[s]   2020-04-27 16:04:17 – 2025-10-05 10:32:43
closed_at        timestamp[s]   2020-04-14 12:01:40 – 2025-10-01 13:56:03
body             string         length 0 – 228k
user             string         length 3 – 26
html_url         string         length 46 – 51
pull_request     dict
is_pull_request  bool           2 classes
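The column summary above doubles as a lightweight row contract. As a minimal sketch (the row dict and `check_row` helper are illustrative, not part of any library), a record from the rows below can be validated against those length and value constraints:

```python
# Hypothetical row modeled on the records below.
row = {
    "id": 1208449335,
    "number": 4183,
    "title": "Document librispeech configs",
    "state": "closed",
    "html_url": "https://github.com/huggingface/datasets/pull/4183",
    "is_pull_request": True,
}

def check_row(row):
    """Check a row against the column summary's ranges."""
    assert isinstance(row["id"], int) and isinstance(row["number"], int)
    assert 1 <= len(row["title"]) <= 290          # title: stringlengths 1-290
    assert row["state"] in {"open", "closed"}     # state: 2 values
    assert 46 <= len(row["html_url"]) <= 51       # html_url: stringlengths 46-51
    assert isinstance(row["is_pull_request"], bool)
    return True
```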
#4183 Document librispeech configs (pull request, closed)
id: 1,208,449,335 · user: lhoestq · merged_at: null
created_at: 2022-04-19T14:26:59 · updated_at: 2023-09-24T10:02:24 · closed_at: 2022-04-19T15:15:20
html_url: https://github.com/huggingface/datasets/pull/4183
body: Added an example of how to load one config or the other
comments: [ "I think the main purpose of #4179 was how to be able to load both configs into one, so should we maybe add this part of the code: https://github.com/huggingface/datasets/issues/4179#issuecomment-1102383717 \r\n\r\nto the doc? \r\n\r\nActually @lhoestq would this work given that they have different split names: htt...

#4182 Zenodo.org download is not responding (issue, closed)
id: 1,208,285,235 · user: dkajtoch
created_at: 2022-04-19T12:26:57 · updated_at: 2022-04-20T07:11:05 · closed_at: 2022-04-20T07:11:05
html_url: https://github.com/huggingface/datasets/issues/4182
body: ## Describe the bug Source download_url from zenodo.org does not respond. `_DOWNLOAD_URL = "https://zenodo.org/record/2787612/files/SICK.zip?download=1"` Other datasets also use zenodo.org to store data and they cannot be downloaded as well. It would be better to actually use more reliable way to store original ...
comments: [ "[Off topic but related: Is the uptime of S3 provably better than Zenodo's?]", "Hi @dkajtoch, please note that at HuggingFace we are not hosting this dataset: we are just using a script to download their data file and create a dataset from it.\r\n\r\nIt was the dataset owners decision to host their data at Zenodo...

#4181 Support streaming FLEURS dataset (issue, closed)
id: 1,208,194,805 · user: patrickvonplaten
created_at: 2022-04-19T11:09:56 · updated_at: 2022-07-25T11:44:02 · closed_at: 2022-07-25T11:44:02
html_url: https://github.com/huggingface/datasets/issues/4181
body: ## Dataset viewer issue for '*name of the dataset*' https://huggingface.co/datasets/google/fleurs ``` Status code: 400 Exception: NotImplementedError Message: Extraction protocol for TAR archives like 'https://storage.googleapis.com/xtreme_translations/FLEURS/af_za.tar.gz' is not implemented in str...
comments: [ "Yes, you just have to use `dl_manager.iter_archive` instead of `dl_manager.download_and_extract`.\r\n\r\nThat's because `download_and_extract` doesn't support TAR archives in streaming mode.", "Tried to make it streamable, but I don't think it's really possible. @lhoestq @polinaeterna maybe you guys can check: \...
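The first comment on #4181 points at `dl_manager.iter_archive`, which works in streaming mode because it reads a TAR archive in one sequential pass instead of extracting it to disk. As a rough stdlib illustration of that constraint (not the `datasets` implementation; the archive contents here are made up), Python's `tarfile` stream mode enforces exactly this single-pass access:

```python
import io
import tarfile

# Build a small TAR archive in memory to stand in for a remote .tar.gz shard.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    data = b"audio-bytes"
    info = tarfile.TarInfo(name="af_za/audio/train/0001.wav")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))
buf.seek(0)

# Stream mode ("r|gz") allows only one forward pass with no seeking,
# the same access pattern an iter_archive-style helper works under.
members = []
with tarfile.open(fileobj=buf, mode="r|gz") as tar:
    for member in tar:
        f = tar.extractfile(member)       # must be read before advancing
        members.append((member.name, f.read()))
```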
#4180 Add some iteration method on a dataset column (specific for inference) (issue, closed)
id: 1,208,042,320 · user: Narsil
created_at: 2022-04-19T09:15:45 · updated_at: 2025-06-17T13:08:50 · closed_at: 2025-06-17T13:08:50
html_url: https://github.com/huggingface/datasets/issues/4180
body: **Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Currently, `dataset["audio"]` will load EVERY element in the dataset in RAM, which can be quite big for an audio dataset. Having an iterator (or sequence) type of object, would make inference ...
comments: [ "Thanks for the suggestion ! I agree it would be nice to have something directly in `datasets` to do something as simple as that\r\n\r\ncc @albertvillanova @mariosasko @polinaeterna What do you think if we have something similar to pandas `Series` that wouldn't bring everything in memory when doing `dataset[\"audio...
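The feature requested in #4180 is a column view that fetches rows lazily rather than materializing the whole column in RAM. A minimal sketch of that idea (the `LazyColumn` class and the toy `rows` backing store are hypothetical, not the API that was eventually shipped):

```python
class LazyColumn:
    """Sequence-like view over one column; rows are fetched one at a time."""

    def __init__(self, get_row, length, column):
        self._get_row = get_row      # callable: row index -> row dict
        self._length = length
        self._column = column

    def __len__(self):
        return self._length

    def __getitem__(self, i):
        # Only the requested row is loaded, never the full column.
        return self._get_row(i)[self._column]

    def __iter__(self):
        for i in range(self._length):
            yield self[i]


# Toy backing store standing in for an on-disk dataset.
rows = [{"audio": f"clip_{i}.wav", "label": i % 2} for i in range(3)]
col = LazyColumn(lambda i: rows[i], len(rows), "audio")
```

Iterating `col` then touches one row at a time, which is what makes streaming inference over a large audio column feasible.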
#4179 Dataset librispeech_asr fails to load (issue, closed)
id: 1,208,001,118 · user: albertz
created_at: 2022-04-19T08:45:48 · updated_at: 2022-07-27T16:10:00 · closed_at: 2022-07-27T16:10:00
html_url: https://github.com/huggingface/datasets/issues/4179
body: ## Describe the bug The dataset librispeech_asr (standard Librispeech) fails to load. ## Steps to reproduce the bug ```python datasets.load_dataset("librispeech_asr") ``` ## Expected results It should download and prepare the whole dataset (all subsets). In [the doc](https://huggingface.co/datasets/libris...
comments: [ "@patrickvonplaten Hi! I saw that you prepared this? :)", "Another thing, but maybe this should be a separate issue: As I see from the code, it would try to use up to 16 simultaneous downloads? This is problematic for Librispeech or anything on OpenSLR. On [the homepage](https://www.openslr.org/), it says:\r\n\r\...

#4178 [feat] Add ImageNet dataset (pull request, closed)
id: 1,207,787,073 · user: apsdehal · merged_at: 2022-04-29T21:37...
created_at: 2022-04-19T06:01:35 · updated_at: 2022-04-29T21:43:59 · closed_at: 2022-04-29T21:37:08
html_url: https://github.com/huggingface/datasets/pull/4178
body: To use the dataset download the tar file [imagenet_object_localization_patched2019.tar.gz](https://www.kaggle.com/competitions/imagenet-object-localization-challenge/data?select=imagenet_object_localization_patched2019.tar.gz) from Kaggle and then point the datasets library to it by using: ```py from datasets impo...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Thanks for the comments. I believe I have addressed all of them and also decreased the size of the dummy data file, so it should be ready for a re-review. I also made a change to allow adding synset mapping and valprep script in conf...

#4177 Adding missing subsets to the `SemEval-2018 Task 1` dataset (pull request, open)
id: 1,207,535,920 · user: micahcarroll · merged_at: null
created_at: 2022-04-18T22:59:30 · updated_at: 2022-10-05T10:38:16 · closed_at: null
html_url: https://github.com/huggingface/datasets/pull/4177
body: This dataset for the [1st task of SemEval-2018](https://competitions.codalab.org/competitions/17751) competition was missing all subtasks except for subtask 5. I added another two subtasks (subtask 1 and 2), which are each comprised of 12 additional data subsets: for each language in En, Es, Ar, there are 4 datasets, b...
comments: [ "Datasets are not tracked in this repository anymore. You should move this PR to the [discussions page of this dataset](https://huggingface.co/datasets/sem_eval_2018_task_1/discussions)" ]

#4176 Very slow between two operations (issue, closed)
id: 1,206,515,563 · user: yanan1116
created_at: 2022-04-17T23:52:29 · updated_at: 2022-04-18T00:03:00 · closed_at: 2022-04-18T00:03:00
html_url: https://github.com/huggingface/datasets/issues/4176
body: Hello, in the processing stage, I use two operations. The first one : map + filter, is very fast and it uses the full cores, while the socond step is very slow and did not use full cores. Also, there is a significant lag between them. Am I missing something ? ``` raw_datasets = raw_datasets.map(split_func...
comments: []

#4175 Add WIT Dataset (pull request, closed)
id: 1,205,589,842 · user: thomasw21 · merged_at: null
created_at: 2022-04-15T13:42:32 · updated_at: 2023-09-24T10:02:38 · closed_at: 2022-05-02T14:26:41
html_url: https://github.com/huggingface/datasets/pull/4175
body: closes #2981 #2810 @nateraw @hassiahk I've listed you guys as co-author as you've contributed previously to this dataset
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Hi! Coming in late with some context.\r\n\r\nThere are two versions of the WIT dataset:\r\n1. The original source dataset managed by Wikimedia. It has more information, raw image representations, and each row corresponds to an image ...

#4174 Fix when map function modifies input in-place (pull request, closed)
id: 1,205,575,941 · user: thomasw21 · merged_at: 2022-04-15T14:45...
created_at: 2022-04-15T13:23:15 · updated_at: 2022-04-15T14:52:07 · closed_at: 2022-04-15T14:45:58
html_url: https://github.com/huggingface/datasets/pull/4174
body: When `function` modifies input in-place, the guarantee that columns in `remove_columns` are contained in `input` doesn't hold true anymore. Therefore we need to relax the way we pop elements, by checking whether that column still exists.
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]
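The fix described in #4174 amounts to popping columns defensively rather than assuming they still exist after a user function has mutated its input. A minimal sketch of the pattern (the helper names here are illustrative, not the library's internals):

```python
def apply_and_remove(example, function, remove_columns):
    """Apply `function` to `example`, then drop `remove_columns` from the result."""
    out = function(example)
    # The function may have mutated `example` in place and already removed
    # some keys, so pop with a default instead of assuming the column exists.
    for column in remove_columns:
        out.pop(column, None)
    return out


def mutating_fn(ex):
    # In-place mutation: "text" is consumed and deleted by the function itself.
    ex["text_len"] = len(ex.pop("text"))
    return ex


result = apply_and_remove({"text": "hello", "idx": 0}, mutating_fn, ["text", "idx"])
```

A strict `out.pop(column)` would raise `KeyError` here on `"text"`, which is exactly the failure mode the PR relaxes.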
#4173 Stream private zipped images (pull request, closed)
id: 1,204,657,114 · user: lhoestq · merged_at: 2022-05-05T13:58...
created_at: 2022-04-14T15:15:07 · updated_at: 2022-05-05T14:05:54 · closed_at: 2022-05-05T13:58:35
html_url: https://github.com/huggingface/datasets/pull/4173
body: As mentioned in https://github.com/huggingface/datasets/issues/4139 it's currently not possible to stream private/gated zipped images from the Hub. This is because `Image.decode_example` does not handle authentication. Indeed decoding requires to access and download the file from the private repository. In this P...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "oops looks like some tests are failing sorry, will fix them tomorrow\r\n\r\nEDIT: not today but asap hopefully", "cc @mariosasko this is ready for review, let me know what you think !" ]

#4172 Update assin2 dataset_infos.json (pull request, closed)
id: 1,204,433,160 · user: lhoestq · merged_at: 2022-04-15T14:41...
created_at: 2022-04-14T11:53:06 · updated_at: 2022-04-15T14:47:42 · closed_at: 2022-04-15T14:41:22
html_url: https://github.com/huggingface/datasets/pull/4172
body: Following comments in https://github.com/huggingface/datasets/issues/4003 we found that it was outdated and causing an error when loading the dataset
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4170 to_tf_dataset rewrite (pull request, closed)
id: 1,204,413,620 · user: Rocketknight1 · merged_at: 2022-06-06T14:22...
created_at: 2022-04-14T11:30:58 · updated_at: 2022-06-06T14:31:12 · closed_at: 2022-06-06T14:22:09
html_url: https://github.com/huggingface/datasets/pull/4170
body: This PR rewrites almost all of `to_tf_dataset()`, which makes it kind of hard to list all the changes, but the most critical ones are: - Much better stability and no more dropping unexpected column names (Sorry @NielsRogge) - Doesn't clobber custom transforms on the data (Sorry @NielsRogge again) - Much better han...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "[Magic is now banned](https://www.youtube.com/watch?v=WIn58XoY728#t=36s) by decree of @sgugger. This is honestly much cleaner, and the functionality will make much more sense in `transformers` anyway!", "@gante I renamed the defaul...

#4169 Timit_asr dataset cannot be previewed recently (issue, closed)
id: 1,203,995,869 · user: YingLi001
created_at: 2022-04-14T03:28:31 · updated_at: 2023-02-03T04:54:57 · closed_at: 2022-05-06T16:06:51
html_url: https://github.com/huggingface/datasets/issues/4169
body: ## Dataset viewer issue for '*timit_asr*' **Link:** *https://huggingface.co/datasets/timit_asr* Issue: The timit-asr dataset cannot be previewed recently. Am I the one who added this dataset ? Yes-No No
comments: [ "Thanks for reporting. The bug has already been detected, and we hope to fix it soon.", "TIMIT is now a dataset that requires manual download, see #4145 \r\n\r\nTherefore it might take a bit more time to fix it", "> TIMIT is now a dataset that requires manual download, see #4145\r\n> \r\n> Therefore it might ta...

#4168 Add code examples to API docs (pull request, closed)
id: 1,203,867,540 · user: stevhliu · merged_at: 2022-04-27T18:48...
created_at: 2022-04-13T23:03:38 · updated_at: 2022-04-27T18:53:37 · closed_at: 2022-04-27T18:48:34
html_url: https://github.com/huggingface/datasets/pull/4168
body: This PR adds code examples for functions related to the base Datasets class to highlight usage. Most of the examples use the `rotten_tomatoes` dataset since it is nice and small. Several things I would appreciate feedback on: - Do you think it is clearer to make every code example fully reproducible so when users co...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "> Do you think it is clearer to make every code example fully reproducible so when users copy the code they can actually run it and get an output? This seems quite repetitive - maybe even unnecessary - but it is definitely clearer.\r...

#4167 Avoid rate limit in update hub repositories (pull request, closed)
id: 1,203,761,614 · user: lhoestq · merged_at: 2022-04-13T20:50...
created_at: 2022-04-13T20:32:17 · updated_at: 2022-04-13T20:56:41 · closed_at: 2022-04-13T20:50:32
html_url: https://github.com/huggingface/datasets/pull/4167
body: use http.extraHeader to avoid rate limit
comments: [ "I also set GIT_LFS_SKIP_SMUDGE=1 to speed up git clones", "_The documentation is not available anymore as the PR was closed or merged._" ]

#4166 Fix exact match (pull request, closed)
id: 1,203,758,004 · user: emibaylor · merged_at: 2022-05-03T12:16...
created_at: 2022-04-13T20:28:06 · updated_at: 2022-05-03T12:23:31 · closed_at: 2022-05-03T12:16:27
html_url: https://github.com/huggingface/datasets/pull/4166
body: Clarify docs and add clarifying example to the exact_match metric
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4165 Fix google bleu typos, examples (pull request, closed)
id: 1,203,730,187 · user: emibaylor · merged_at: 2022-05-03T12:16...
created_at: 2022-04-13T19:59:54 · updated_at: 2022-05-03T12:23:52 · closed_at: 2022-05-03T12:16:44
html_url: https://github.com/huggingface/datasets/pull/4165
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4164 Fix duplicate key in multi_news (pull request, closed)
id: 1,203,661,346 · user: lhoestq · merged_at: 2022-04-13T20:58...
created_at: 2022-04-13T18:48:24 · updated_at: 2022-04-13T21:04:16 · closed_at: 2022-04-13T20:58:02
html_url: https://github.com/huggingface/datasets/pull/4164
body: To merge after this job succeeded: https://github.com/huggingface/datasets/runs/6012207928
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]
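The bug behind #4164 is a dataset script yielding the same example key twice, which the library rejects because keys must be unique within a split. A minimal sketch of that kind of uniqueness check (the helper and the toy generator are illustrative, not the library's actual code path):

```python
def check_unique_keys(generator):
    """Re-yield (key, example) pairs, raising on the first duplicate key."""
    seen = set()
    for key, example in generator:
        if key in seen:
            raise ValueError(f"Found duplicate key: {key}")
        seen.add(key)
        yield key, example


def gen():
    # Toy _generate_examples-style generator with a duplicated key.
    yield 0, {"doc": "a"}
    yield 1, {"doc": "b"}
    yield 1, {"doc": "c"}   # duplicate key


try:
    list(check_unique_keys(gen()))
    duplicate_detected = False
except ValueError:
    duplicate_detected = True
```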
#4163 Optional Content Warning for Datasets (issue, open)
id: 1,203,539,268 · user: TristanThrush
created_at: 2022-04-13T16:38:01 · updated_at: 2022-06-09T20:39:02 · closed_at: null
html_url: https://github.com/huggingface/datasets/issues/4163
body: **Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. We now have hate speech datasets on the hub, like this one: https://huggingface.co/datasets/HannahRoseKirk/HatemojiBuild I'm wondering if there is an option to select a content warning messa...
comments: [ "Hi! You can use the `extra_gated_prompt` YAML field in a dataset card for displaying custom messages/warnings that the user must accept before gaining access to the actual dataset. This option also keeps the viewer hidden until the user agrees to terms. ", "Hi @mariosasko, thanks for explaining how to add this f...

#4162 Add Conceptual 12M (pull request, closed)
id: 1,203,421,909 · user: thomasw21 · merged_at: 2022-04-15T08:06...
created_at: 2022-04-13T14:57:23 · updated_at: 2022-04-15T08:13:01 · closed_at: 2022-04-15T08:06:25
html_url: https://github.com/huggingface/datasets/pull/4162
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Looks like your dummy_data.zip file is not in the right location ;)\r\ndatasets/datasets/conceptual_12m/dummy/default/0.0.0/dummy_data.zip\r\n->\r\ndatasets/conceptual_12m/dummy/default/0.0.0/dummy_data.zip" ]

#4161 Add Visual Genome (pull request, closed)
id: 1,203,230,485 · user: thomasw21 · merged_at: 2022-04-21T13:08...
created_at: 2022-04-13T12:25:24 · updated_at: 2022-04-21T15:42:49 · closed_at: 2022-04-21T13:08:52
html_url: https://github.com/huggingface/datasets/pull/4161
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Hum there seems to be some issues with tasks in test:\r\n - some tasks don't fit anything in `tasks.json`. Do I remove them in `task_categories`?\r\n - some tasks should exist, typically `visual-question-answering` (https://github.co...

#4160 RGBA images not showing (issue, closed)
id: 1,202,845,874 · user: cceyda
created_at: 2022-04-13T06:59:23 · updated_at: 2022-06-21T16:43:11 · closed_at: 2022-06-21T16:43:11
html_url: https://github.com/huggingface/datasets/issues/4160
body: ## Dataset viewer issue for ceyda/smithsonian_butterflies_transparent [**Link:** *link to the dataset viewer page*](https://huggingface.co/datasets/ceyda/smithsonian_butterflies_transparent) ![image](https://user-images.githubusercontent.com/15624271/163117683-e91edb28-41bf-43d9-b371-5c62e14f40c9.png) Am I the...
comments: [ "Thanks for reporting. It's a known issue, and we hope to fix it soon.", "Fixed, thanks!" ]

#4159 Add `TruthfulQA` dataset (pull request, closed)
id: 1,202,522,153 · user: jon-tow · merged_at: 2022-06-08T14:43...
created_at: 2022-04-12T23:19:04 · updated_at: 2022-06-08T15:51:33 · closed_at: 2022-06-08T14:43:34
html_url: https://github.com/huggingface/datasets/pull/4159
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Bump. (I'm not sure which reviewer to `@` but, previously, @lhoestq has been very helpful 🤗 )" ]

#4158 Add AUC ROC Metric (pull request, closed)
id: 1,202,376,843 · user: emibaylor · merged_at: 2022-04-26T19:35...
created_at: 2022-04-12T20:53:28 · updated_at: 2022-04-26T19:41:50 · closed_at: 2022-04-26T19:35:22
html_url: https://github.com/huggingface/datasets/pull/4158
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4157 Fix formatting in BLEU metric card (pull request, closed)
id: 1,202,239,622 · user: mariosasko · merged_at: 2022-04-13T14:16...
created_at: 2022-04-12T18:29:51 · updated_at: 2022-04-13T14:30:25 · closed_at: 2022-04-13T14:16:34
html_url: https://github.com/huggingface/datasets/pull/4157
body: Fix #4148
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4156 Adding STSb-TR dataset (pull request, closed)
id: 1,202,220,531 · user: figenfikri · merged_at: null
created_at: 2022-04-12T18:10:05 · updated_at: 2022-10-03T09:36:25 · closed_at: 2022-10-03T09:36:25
html_url: https://github.com/huggingface/datasets/pull/4156
body: Semantic Textual Similarity benchmark Turkish (STSb-TR) dataset introduced in our paper [Semantic Similarity Based Evaluation for Abstractive News Summarization](https://aclanthology.org/2021.gem-1.3.pdf) added.
comments: [ "Thanks for your contribution, @figenfikri.\r\n\r\nWe are removing the dataset scripts from this GitHub repo and moving them to the Hugging Face Hub: https://huggingface.co/datasets\r\n\r\nWe would suggest you create this dataset there. Please, feel free to tell us if you need some help." ]

#4155 Make HANS dataset streamable (pull request, closed)
id: 1,202,183,608 · user: mariosasko · merged_at: 2022-04-13T11:57...
created_at: 2022-04-12T17:34:13 · updated_at: 2022-04-13T12:03:46 · closed_at: 2022-04-13T11:57:35
html_url: https://github.com/huggingface/datasets/pull/4155
body: Fix #4133
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4154 Generate tasks.json taxonomy from `huggingface_hub` (pull request, closed)
id: 1,202,145,721 · user: julien-c · merged_at: 2022-04-14T10:26...
created_at: 2022-04-12T17:12:46 · updated_at: 2022-04-14T10:32:32 · closed_at: 2022-04-14T10:26:13
html_url: https://github.com/huggingface/datasets/pull/4154
body: null
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Ok recomputed the json file, this should be ready to review now! @lhoestq ", "Note: the generated JSON from `hf/hub-docs` can be found in the output of a GitHub Action run on that repo, for instance in https://github.com/huggingfac...

#4153 Adding Text-based NP Enrichment (TNE) dataset (pull request, closed)
id: 1,202,040,506 · user: yanaiela · merged_at: 2022-05-03T14:05...
created_at: 2022-04-12T15:47:03 · updated_at: 2022-05-03T14:05:48 · closed_at: 2022-05-03T14:05:48
html_url: https://github.com/huggingface/datasets/pull/4153
body: Added the [TNE](https://github.com/yanaiela/TNE) dataset to the library
comments: [ "Hey @lhoestq, can you please have a look? 🙏", "Great, thanks again @lhoestq! I think we're good to go now", "Done" ]

#4152 ArrayND error in pyarrow 5 (issue, closed)
id: 1,202,034,115 · user: lhoestq
created_at: 2022-04-12T15:41:40 · updated_at: 2022-05-04T09:29:46 · closed_at: 2022-05-04T09:29:46
html_url: https://github.com/huggingface/datasets/issues/4152
body: As found in https://github.com/huggingface/datasets/pull/3903, The ArrayND features fail on pyarrow 5: ```python import pyarrow as pa from datasets import Array2D from datasets.table import cast_array_to_feature arr = pa.array([[[0]]]) feature_type = Array2D(shape=(1, 1), dtype="int64") cast_array_to_feature(a...
comments: [ "Where do we bump the required pyarrow version? Any inputs on how I fix this issue? ", "We need to bump it in `setup.py` as well as update some CI job to use pyarrow 6 instead of 5 in `.circleci/config.yaml` and `.github/workflows/benchmarks.yaml`" ]

#4151 Add missing label for emotion description (pull request, closed)
id: 1,201,837,999 · user: lijiazheng99 · merged_at: 2022-04-12T13:58...
created_at: 2022-04-12T13:17:37 · updated_at: 2022-04-12T13:58:50 · closed_at: 2022-04-12T13:58:50
html_url: https://github.com/huggingface/datasets/pull/4151
body: null
comments: []

#4150 Inconsistent splits generation for datasets without loading script (packaged dataset puts everything into a single split) (issue, closed)
id: 1,201,689,730 · user: polinaeterna
created_at: 2022-04-12T11:15:55 · updated_at: 2022-04-28T21:02:44 · closed_at: 2022-04-28T21:02:44
html_url: https://github.com/huggingface/datasets/issues/4150
body: ## Describe the bug Splits for dataset loaders without scripts are prepared inconsistently. I think it might be confusing for users. ## Steps to reproduce the bug * If you load a packaged datasets from Hub, it infers splits from directory structure / filenames (check out the data [here](https://huggingface.co/data...
comments: []
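Issue #4150 is about inferring splits from directory structure and filenames rather than dumping everything into a single train split. As a simplified sketch in the spirit of that split detection (the patterns and helper are illustrative, not the library's actual pattern tables), filenames containing a split keyword can be bucketed by split:

```python
import fnmatch

# Simplified split patterns: a filename containing a split keyword maps
# to that split; anything unmatched would fall back to "train".
SPLIT_PATTERNS = {
    "train": ["*train*"],
    "test": ["*test*"],
    "validation": ["*valid*", "*dev*"],
}

def infer_splits(filenames):
    """Group filenames into splits by matching against SPLIT_PATTERNS."""
    splits = {}
    for name in filenames:
        for split, patterns in SPLIT_PATTERNS.items():
            if any(fnmatch.fnmatch(name, p) for p in patterns):
                splits.setdefault(split, []).append(name)
                break
    return splits


splits = infer_splits(["data/train-00000.csv", "data/test-00000.csv"])
```

The inconsistency the issue reports is that this filename-based inference ran in some code paths but not others, e.g. when `data_dir` was passed; PR #4144 below aligned them.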
#4149 load_dataset for winoground returning decoding error (issue, closed)
id: 1,201,389,221 · user: odellus
created_at: 2022-04-12T08:16:16 · updated_at: 2022-05-04T23:40:38 · closed_at: 2022-05-04T23:40:38
html_url: https://github.com/huggingface/datasets/issues/4149
body: ## Describe the bug I am trying to use datasets to load winoground and I'm getting a JSON decoding error. ## Steps to reproduce the bug ```python from datasets import load_dataset token = 'hf_XXXXX' # my HF access token datasets = load_dataset('facebook/winoground', use_auth_token=token) ``` ## Expected res...
comments: [ "I thought I had fixed it with this after some helpful hints from @severo\r\n```python\r\nimport datasets \r\ntoken = 'hf_XXXXX'\r\ndataset = datasets.load_dataset(\r\n 'facebook/winoground', \r\n name='facebook--winoground', \r\n split='train', \r\n streaming=True,\r\n use_auth_token=token,\r\n)\r\n...

#4148 fix confusing bleu metric example (issue, closed)
id: 1,201,169,242 · user: aizawa-naoki
created_at: 2022-04-12T06:18:26 · updated_at: 2022-04-13T14:16:34 · closed_at: 2022-04-13T14:16:34
html_url: https://github.com/huggingface/datasets/issues/4148
body: **Is your feature request related to a problem? Please describe.** I would like to see the example in "Metric Card for BLEU" changed. The 0th element in the predictions list is not closed in square brackets, and the 1st list is missing a comma. The BLEU score is calculated correctly, but it is difficult to understa...
comments: []
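The confusion behind #4148 is the nesting of BLEU-style inputs: predictions is a list of token lists, while references is one level deeper, a list of reference *sets*, each a list of token lists. A toy sketch of the expected shapes (the `unigram_precision` helper is a deliberately simplified stand-in, not the BLEU metric itself):

```python
# predictions: one token list per example.
predictions = [
    ["hello", "there", "general", "kenobi"],
]
# references: per example, a LIST of reference token lists
# (several acceptable references per prediction).
references = [
    [["hello", "there", "general", "kenobi"],
     ["hello", "there", "!"]],
]

def unigram_precision(predictions, references):
    """Toy stand-in for BLEU: fraction of predicted tokens found in any reference."""
    matched = total = 0
    for pred, refs in zip(predictions, references):
        ref_tokens = {tok for ref in refs for tok in ref}
        matched += sum(tok in ref_tokens for tok in pred)
        total += len(pred)
    return matched / total
```

Dropping one bracket level in `references` (a list of token lists instead of a list of lists of token lists) is exactly the mistake the malformed metric-card example invited.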
#4147 Adjust path to datasets tutorial in How-To (pull request, closed)
id: 1,200,756,008 · user: NimaBoscarino · merged_at: 2022-04-12T08:26...
created_at: 2022-04-12T01:20:34 · updated_at: 2022-04-12T08:32:24 · closed_at: 2022-04-12T08:26:02
html_url: https://github.com/huggingface/datasets/pull/4147
body: The link in the How-To overview page to the Datasets tutorials is currently broken. This is just a small adjustment to make it match the format used in https://github.com/huggingface/datasets/blob/master/docs/source/tutorial.md. (Edit to add: The link in the PR deployment (https://moon-ci-docs.huggingface.co/docs/da...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._" ]

#4146 SAMSum dataset viewer not working (issue, closed)
id: 1,200,215,789 · user: aakashnegi10
created_at: 2022-04-11T16:22:57 · updated_at: 2022-04-29T16:26:09 · closed_at: 2022-04-29T16:26:09
html_url: https://github.com/huggingface/datasets/issues/4146
body: ## Dataset viewer issue for '*name of the dataset*' **Link:** *link to the dataset viewer page* *short description of the issue* Am I the one who added this dataset ? Yes-No
comments: [ "https://huggingface.co/datasets/samsum\r\n\r\n```\r\nStatus code: 400\r\nException: ValueError\r\nMessage: Cannot seek streaming HTTP file\r\n```", "Currently, only the datasets that can be streamed support the dataset viewer. Maybe @lhoestq @albertvillanova or @mariosasko could give more details abo...

#4145 Redirect TIMIT download from LDC (pull request, closed)
id: 1,200,209,781 · user: lhoestq · merged_at: 2022-04-13T15:33...
created_at: 2022-04-11T16:17:55 · updated_at: 2022-04-13T15:39:31 · closed_at: 2022-04-13T15:33:04
html_url: https://github.com/huggingface/datasets/pull/4145
body: LDC data is protected under US copyright laws and under various legal agreements between the Linguistic Data Consortium/the University of Pennsylvania and data providers which prohibit redistribution of that data by anyone other than LDC. Similarly, LDC's membership agreements, non-member user agreement and various cor...
comments: [ "CI is failing because some tags are outdated, but they're fixed in #4067 ", "_The documentation is not available anymore as the PR was closed or merged._", "We may do a release pretty soon (today ?), let me know if it's fine to include it in the new release", "Fine to include this change!" ]

#4144 Fix splits in local packaged modules, local datasets without script and hub datasets without script (pull request, closed)
id: 1,200,016,983 · user: polinaeterna · merged_at: 2022-04-28T21:02...
created_at: 2022-04-11T13:57:33 · updated_at: 2022-04-29T09:12:14 · closed_at: 2022-04-28T21:02:45
html_url: https://github.com/huggingface/datasets/pull/4144
body: fixes #4150 I suggest to infer splits structure from files when `data_dir` is passed with `get_patterns_locally`, analogous to what's done in `LocalDatasetModuleFactoryWithoutScript` with `self.path`, instead of generating files with `data_dir/**` patterns and putting them all into a single default (train) split. ...
comments: [ "_The documentation is not available anymore as the PR was closed or merged._", "Thanks !\r\nI'm in favor of this change, even though it's a breaking change:\r\n\r\nif you had a dataset\r\n```\r\ndata/\r\n train.csv\r\n test.csv\r\n```\r\n\r\nthen running this code would now return both train and test splits:\r...

#4143 Unable to download `Wikepedia` 20220301.en version (issue, closed)
id: 1,199,937,961 · user: beyondguo
created_at: 2022-04-11T13:00:14 · updated_at: 2022-08-17T00:37:55 · closed_at: 2022-04-21T17:04:14
html_url: https://github.com/huggingface/datasets/issues/4143
body: ## Describe the bug Unable to download `Wikepedia` dataset, 20220301.en version ## Steps to reproduce the bug ```python !pip install apache_beam mwparserfromhell dataset_wikipedia = load_dataset("wikipedia", "20220301.en") ``` ## Actual results ``` ValueError: BuilderConfig 20220301.en not found. Avail...
comments: [ "Hi! We've recently updated the Wikipedia script, so these changes are only available on master and can be fetched as follows:\r\n```python\r\ndataset_wikipedia = load_dataset(\"wikipedia\", \"20220301.en\", revision=\"master\")\r\n```", "Hi, how can I load the previous \"20200501.en\" version of wikipedia which ...

#4142 Add ObjectFolder 2.0 dataset (issue, open)
id: 1,199,794,750 · user: osanseviero
created_at: 2022-04-11T10:57:51 · updated_at: 2022-10-05T10:30:49 · closed_at: null
html_url: https://github.com/huggingface/datasets/issues/4142
body: ## Adding a Dataset - **Name:** ObjectFolder 2.0 - **Description:** ObjectFolder 2.0 is a dataset of 1,000 objects in the form of implicit representations. It contains 1,000 Object Files each containing the complete multisensory profile for an object instance. - **Paper:** [*link to the dataset paper if available*](...
comments: [ "Datasets are not tracked in this repository anymore." ]

#4141 Why is the dataset not visible under the dataset preview section? (issue, closed)
id: 1,199,610,885 · user: Nid989
created_at: 2022-04-11T08:36:42 · updated_at: 2022-04-11T18:55:32 · closed_at: 2022-04-11T17:09:49
html_url: https://github.com/huggingface/datasets/issues/4141
body: ## Dataset viewer issue for '*name of the dataset*' **Link:** *link to the dataset viewer page* *short description of the issue* Am I the one who added this dataset ? Yes-No
comments: []

#4140 Error loading arxiv data set (issue, closed)
id: 1,199,492,356 · user: yjqiu
created_at: 2022-04-11T07:06:34 · updated_at: 2022-04-12T16:24:08 · closed_at: 2022-04-12T16:24:08
html_url: https://github.com/huggingface/datasets/issues/4140
body: ## Describe the bug A clear and concise description of what the bug is. I met the error below when loading arxiv dataset via `nlp.load_dataset('scientific_papers', 'arxiv',)`. ``` Traceback (most recent call last): File "scripts/summarization.py", line 354, in <module> main(args) File "scripts/summari...
comments: [ "Hi! I think this error may be related to using an older version of the library. I was able to load the dataset without any issues using the latest version of `datasets`. Can you upgrade to the latest version of `datasets` and try again? :)", "Hi! As @stevhliu suggested, to fix the issue, update the lib to the n...

#4139 Dataset viewer issue for Winoground (closed)
id: 1,199,443,822
comments: [ "related (same dataset): https://github.com/huggingface/datasets/issues/4149. But the issue is different. Looking at it", "I thought this issue was related to the error I was seeing, but upon consideration I'd think the dataset viewer would return a 500 (unable to create the split like me) or a 404 (unable to loa...
2022-04-11T06:11:41
2022-06-21T16:43:58
2022-06-21T16:43:58
## Dataset viewer issue for 'Winoground' **Link:** [*link to the dataset viewer page*](https://huggingface.co/datasets/facebook/winoground/viewer/facebook--winoground/train) *short description of the issue* Getting 401, message='Unauthorized' The dataset is subject to authorization, but I can access the files f...
alcinos
https://github.com/huggingface/datasets/issues/4139
null
false
1,199,291,730
4,138
Incorrect Russian filenames encoding after extraction by datasets.DownloadManager.download_and_extract()
closed
[ "To reproduce:\r\n\r\n```python\r\n>>> import datasets\r\n>>> datasets.get_dataset_split_names('MalakhovIlya/RuREBus', config_name='raw_txt')\r\nTraceback (most recent call last):\r\n File \"/home/slesage/hf/datasets-preview-backend/.venv/lib/python3.9/site-packages/datasets/inspect.py\", line 280, in get_dataset_...
2022-04-11T02:07:13
2022-04-19T03:15:46
2022-04-16T15:46:29
## Dataset viewer issue for 'MalakhovIlya/RuREBus' **Link:** https://huggingface.co/datasets/MalakhovIlya/RuREBus **Description** Using os.walk(topdown=False) in DatasetBuilder causes following error: Status code: 400 Exception: TypeError Message: xwalk() got an unexpected keyword argument 'topdow...
iluvvatar
https://github.com/huggingface/datasets/issues/4138
null
false
1,199,000,453
4,137
Add single dataset citations for TweetEval
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "The `test_dataset_cards` method is failing with the error:\r\n\r\n```\r\nif error_messages:\r\n> raise ValueError(\"\\n\".join(error_messages))\r\nE ValueError: The following issues have been found in the dataset ...
2022-04-10T11:51:54
2022-04-12T07:57:22
2022-04-12T07:51:15
This PR adds single data citations as per request of the original creators of the TweetEval dataset. This is a recent email from the creator: > Could I ask you a favor? Would you be able to add at the end of the README the citations of the single datasets as well? You can just copy our readme maybe? https://githu...
gchhablani
https://github.com/huggingface/datasets/pull/4137
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4137", "html_url": "https://github.com/huggingface/datasets/pull/4137", "diff_url": "https://github.com/huggingface/datasets/pull/4137.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4137.patch", "merged_at": "2022-04-12T07:51...
true
1,198,307,610
4,135
Support streaming xtreme dataset for PAN-X config
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-09T06:19:48
2022-05-06T08:39:40
2022-04-11T06:54:14
Support streaming xtreme dataset for PAN-X config.
albertvillanova
https://github.com/huggingface/datasets/pull/4135
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4135", "html_url": "https://github.com/huggingface/datasets/pull/4135", "diff_url": "https://github.com/huggingface/datasets/pull/4135.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4135.patch", "merged_at": "2022-04-11T06:54...
true
1,197,937,146
4,134
ELI5 supporting documents
open
[ "Hi ! Please post your question on the [forum](https://discuss.huggingface.co/), more people will be able to help you there ;)" ]
2022-04-08T23:36:27
2022-04-13T13:52:46
null
If I am using dense search to create supporting documents for ELI5, how much time will it take? I read somewhere that it takes about 18 hrs.
saurabh-0077
https://github.com/huggingface/datasets/issues/4134
null
false
1,197,830,623
4,133
HANS dataset preview broken
closed
[ "The dataset cannot be loaded, be it in normal or streaming mode.\r\n\r\n```python\r\n>>> import datasets\r\n>>> dataset=datasets.load_dataset(\"hans\", split=\"train\", streaming=True)\r\n>>> next(iter(dataset))\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\n File \"/home/sles...
2022-04-08T21:06:15
2022-04-13T11:57:34
2022-04-13T11:57:34
## Dataset viewer issue for '*hans*' **Link:** [https://huggingface.co/datasets/hans](https://huggingface.co/datasets/hans) HANS dataset preview is broken with error 400 Am I the one who added this dataset ? No
pietrolesci
https://github.com/huggingface/datasets/issues/4133
null
false
1,197,661,720
4,132
Support streaming xtreme dataset for PAWS-X config
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-08T18:25:32
2022-05-06T08:39:42
2022-04-08T21:02:44
Support streaming xtreme dataset for PAWS-X config.
albertvillanova
https://github.com/huggingface/datasets/pull/4132
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4132", "html_url": "https://github.com/huggingface/datasets/pull/4132", "diff_url": "https://github.com/huggingface/datasets/pull/4132.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4132.patch", "merged_at": "2022-04-08T21:02...
true
1,197,472,249
4,131
Support streaming xtreme dataset for udpos config
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-08T15:30:49
2022-05-06T08:39:46
2022-04-08T16:28:07
Support streaming xtreme dataset for udpos config.
albertvillanova
https://github.com/huggingface/datasets/pull/4131
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4131", "html_url": "https://github.com/huggingface/datasets/pull/4131", "diff_url": "https://github.com/huggingface/datasets/pull/4131.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4131.patch", "merged_at": "2022-04-08T16:28...
true
1,197,456,857
4,130
Add SBU Captions Photo Dataset
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-08T15:17:39
2022-04-12T10:47:31
2022-04-12T10:41:29
null
thomasw21
https://github.com/huggingface/datasets/pull/4130
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4130", "html_url": "https://github.com/huggingface/datasets/pull/4130", "diff_url": "https://github.com/huggingface/datasets/pull/4130.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4130.patch", "merged_at": "2022-04-12T10:41...
true
1,197,376,796
4,129
dataset metadata for reproducibility
open
[ "+1 on this idea. This could be powerful for helping better track datasets used for model training and help with automatic model card creation. \r\n\r\nOne possible way of doing this would be to store some/most/all the arguments passed to `load_dataset` if a hub id is passed. i.e. store the Hub ID, configuration, e...
2022-04-08T14:17:28
2023-09-29T09:23:56
null
When pulling a dataset from the hub, it would be useful to have some metadata about the specific dataset and version that is used. The metadata could then be passed to the `Trainer` which could then be saved to a model card. This is useful for people who run many experiments on different versions (commits/branches) of ...
nbroad1881
https://github.com/huggingface/datasets/issues/4129
null
false
1,197,326,311
4,128
More robust `cast_to_python_objects` in `TypedSequence`
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-08T13:33:35
2022-04-13T14:07:41
2022-04-13T14:01:16
Adds a fallback to run an expensive version of `cast_to_python_objects` which exhaustively checks entire lists to avoid the `ArrowInvalid: Could not convert` error in `TypedSequence`. Currently, this error can happen in situations where only some images are decoded in `map`, in which case `cast_to_python_objects` fails...
mariosasko
https://github.com/huggingface/datasets/pull/4128
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4128", "html_url": "https://github.com/huggingface/datasets/pull/4128", "diff_url": "https://github.com/huggingface/datasets/pull/4128.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4128.patch", "merged_at": "2022-04-13T14:01...
true
1,197,297,756
4,127
Add configs with processed data in medical_dialog dataset
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-08T13:08:16
2022-05-06T08:39:50
2022-04-08T16:20:51
There exist processed data files that do not require parsing the raw data files (which can take a long time). Fix #4122.
albertvillanova
https://github.com/huggingface/datasets/pull/4127
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4127", "html_url": "https://github.com/huggingface/datasets/pull/4127", "diff_url": "https://github.com/huggingface/datasets/pull/4127.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4127.patch", "merged_at": "2022-04-08T16:20...
true
1,196,665,194
4,126
dataset viewer issue for common_voice
closed
[ "Yes, it's a known issue, and we expect to fix it soon.", "Fixed.\r\n\r\n<img width=\"1393\" alt=\"Capture d’écran 2022-04-25 à 15 42 05\" src=\"https://user-images.githubusercontent.com/1676121/165101176-d729d85b-efff-45a8-bad1-b69223edba5f.png\">\r\n" ]
2022-04-07T23:34:28
2022-04-25T13:42:17
2022-04-25T13:42:16
## Dataset viewer issue for 'common_voice' **Link:** https://huggingface.co/datasets/common_voice Server Error Status code: 400 Exception: TypeError Message: __init__() got an unexpected keyword argument 'audio_column' Am I the one who added this dataset ? No
laphang
https://github.com/huggingface/datasets/issues/4126
null
false
1,196,633,936
4,125
BIG-bench
closed
[ "> It looks like the CI is failing on windows because our windows CI is unable to clone the bigbench repository (maybe it has to do with filenames that are longer than 256 characters, which windows don't like). Could the smaller installation of bigbench via pip solve this issue ?\r\n> Otherwise we can see how to re...
2022-04-07T22:33:30
2022-06-08T17:57:48
2022-06-08T17:32:32
This PR adds all BIG-bench json tasks to huggingface/datasets.
andersjohanandreassen
https://github.com/huggingface/datasets/pull/4125
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4125", "html_url": "https://github.com/huggingface/datasets/pull/4125", "diff_url": "https://github.com/huggingface/datasets/pull/4125.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4125.patch", "merged_at": "2022-06-08T17:32...
true
1,196,469,842
4,124
Image decoding often fails when transforming Image datasets
closed
[ "A quick hack I have found is that we can call the image first before running the transforms and it makes sure the image is decoded before being passed on.\r\n\r\nFor this I just needed to add `example['img'] = example['img']` to the top of my `generate_flipped_data` function, defined above, so that image decode in...
2022-04-07T19:17:25
2022-04-13T14:01:16
2022-04-13T14:01:16
## Describe the bug When transforming/modifying images in an image dataset using the `map` function the PIL images often fail to decode in time for the image transforms, causing errors. Using a debugger it is easy to see what the problem is, the Image decode invocation does not take place and the resulting image pa...
RafayAK
https://github.com/huggingface/datasets/issues/4124
null
false
1,196,367,512
4,123
Building C4 takes forever
closed
[ "Hi @StellaAthena, thanks for reporting.\r\n\r\nPlease note, that our `datasets` library performs several operations in order to load a dataset, among them:\r\n- it downloads all the required files: for C4 \"en\", 378.69 GB of JSON GZIPped files\r\n- it parses their content to generate the dataset\r\n- it caches th...
2022-04-07T17:41:30
2023-06-26T22:01:29
2023-06-26T22:01:29
## Describe the bug C4-en is a 300 GB dataset. However, when I try to download it through the hub it takes over _six hours_ to generate the train/test split from the downloaded files. This is an absurd amount of time and an unnecessary waste of resources. ## Steps to reproduce the bug ```python c4 = datasets.load...
StellaAthena
https://github.com/huggingface/datasets/issues/4123
null
false
1,196,095,072
4,122
medical_dialog zh has very slow _generate_examples
closed
[ "Hi @nbroad1881, thanks for reporting.\r\n\r\nLet me have a look to try to improve its performance. ", "Thanks @nbroad1881 for reporting! I don't recall it taking so long. I will also have a look at this. \r\n@albertvillanova please let me know if I am doing something unnecessary or time consuming.", "Hi @nbro...
2022-04-07T14:00:51
2022-04-08T16:20:51
2022-04-08T16:20:51
## Describe the bug After downloading the files from Google Drive, `load_dataset("medical_dialog", "zh", data_dir="./")` takes an unreasonable amount of time. Generating the train/test split for 33% of the dataset takes over 4.5 hours. ## Steps to reproduce the bug The easiest way I've found to download files from...
nbroad1881
https://github.com/huggingface/datasets/issues/4122
null
false
1,196,000,018
4,121
datasets.load_metric cannot load a local metric
closed
[ "Hello, could you tell me how this issue can be fixed? I'm coming across the same issue." ]
2022-04-07T12:48:56
2023-01-18T14:30:46
2022-04-07T13:53:27
## Describe the bug No matter how hard I try to tell load_metric that I want to load a local metric file, it still continues to fetch things on the Internet. And unfortunately it says 'ConnectionError: Couldn't reach'. However I can download this file without a ConnectionError and tell load_metric its local directory. A...
SadGare
https://github.com/huggingface/datasets/issues/4121
null
false
1,195,887,430
4,120
Representing dictionaries (json) objects as features
open
[]
2022-04-07T11:07:41
2022-04-07T11:07:41
null
In the process of adding a new dataset to the hub, I stumbled upon the inability to represent dictionaries that contain different key names, unknown in advance (and may differ between samples), original asked in the [forum](https://discuss.huggingface.co/t/representing-nested-dictionary-with-different-keys/16442). F...
yanaiela
https://github.com/huggingface/datasets/issues/4120
null
false
1,195,641,298
4,119
Hotfix failing CI tests on Windows
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-07T07:38:46
2022-04-07T09:47:24
2022-04-07T07:57:13
This PR makes a hotfix for our CI Windows tests: https://app.circleci.com/pipelines/github/huggingface/datasets/11092/workflows/9cfdb1dd-0fec-4fe0-8122-5f533192ebdc/jobs/67414 Fix #4118 I guess this issue is related to this PR: - huggingface/huggingface_hub#815
albertvillanova
https://github.com/huggingface/datasets/pull/4119
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4119", "html_url": "https://github.com/huggingface/datasets/pull/4119", "diff_url": "https://github.com/huggingface/datasets/pull/4119.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4119.patch", "merged_at": "2022-04-07T07:57...
true
1,195,638,944
4,118
Failing CI tests on Windows
closed
[]
2022-04-07T07:36:25
2022-04-07T07:57:13
2022-04-07T07:57:13
## Describe the bug Our CI Windows tests are failing from yesterday: https://app.circleci.com/pipelines/github/huggingface/datasets/11092/workflows/9cfdb1dd-0fec-4fe0-8122-5f533192ebdc/jobs/67414
albertvillanova
https://github.com/huggingface/datasets/issues/4118
null
false
1,195,552,406
4,117
AttributeError: module 'huggingface_hub' has no attribute 'hf_api'
closed
[ "Hi @arymbe, thanks for reporting.\r\n\r\nUnfortunately, I'm not able to reproduce your problem.\r\n\r\nCould you please write the complete stack trace? That way we will be able to see which package originates the exception.", "Hello, thank you for your fast replied. this is the complete error that I got\r\n\r\n-...
2022-04-07T05:52:36
2024-05-07T09:24:35
2022-04-19T15:36:35
## Describe the bug Could you help me, please? I got the following error. AttributeError: module 'huggingface_hub' has no attribute 'hf_api' ## Steps to reproduce the bug when I imported the datasets # Sample code to reproduce the bug from datasets import list_datasets, load_dataset, list_metrics, load_metr...
arymbe
https://github.com/huggingface/datasets/issues/4117
null
false
1,194,926,459
4,116
Pretty print dataset info files
closed
[ "maybe just do it from now on no? (i.e. not for existing `dataset_infos.json` files)", "_The documentation is not available anymore as the PR was closed or merged._", "> maybe just do it from now on no? (i.e. not for existing dataset_infos.json files)\r\n\r\nYes, or do this only for datasets created with `push_...
2022-04-06T17:40:48
2022-04-08T11:28:01
2022-04-08T11:21:53
Adds indentation to the `dataset_infos.json` file when saving for nicer diffs. (suggested by @julien-c) This PR also updates the info files of the GH datasets. Note that this change adds more than **10 MB** to the repo size (the total file size before the change: 29.672298 MB, after: 41.666475 MB), so I'm not sur...
mariosasko
https://github.com/huggingface/datasets/pull/4116
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4116", "html_url": "https://github.com/huggingface/datasets/pull/4116", "diff_url": "https://github.com/huggingface/datasets/pull/4116.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4116.patch", "merged_at": "2022-04-08T11:21...
true
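The change in PR #4116 above boils down to dumping the info JSON with indentation instead of a compact single-line dump. A minimal sketch of the idea (the dict keys and `indent=2` value here are invented for illustration, not copied from the PR):

```python
import json

# toy stand-in for a dataset_infos.json payload
info = {"my_config": {"description": "toy example", "download_size": 123}}

# an indent level replaces the previous compact dump, giving one key per
# line and therefore much more readable git diffs, at some cost in file size
pretty = json.dumps(info, indent=2, sort_keys=True)
print(pretty)
```

This is also why the PR notes a ~12 MB growth in repo size: whitespace per key adds up across hundreds of info files.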
1,194,907,555
4,115
ImageFolder add option to ignore some folders like '.ipynb_checkpoints'
closed
[ "Maybe it would be nice to ignore private dirs like this one (ones starting with `.`) by default. \r\n\r\nCC @mariosasko ", "Maybe we can add a `ignore_hidden_files` flag to the builder configs of our packaged loaders (to be consistent across all of them), wdyt @lhoestq @albertvillanova? ", "I think they should...
2022-04-06T17:29:43
2022-06-01T13:04:16
2022-06-01T13:04:16
**Is your feature request related to a problem? Please describe.** I sometimes like to peek at the dataset images from jupyterlab. thus '.ipynb_checkpoints' folder appears where my dataset is and (just realized) leads to accidental duplicate image additions. I think this is an easy enough thing to miss especially if t...
cceyda
https://github.com/huggingface/datasets/issues/4115
null
false
1,194,855,345
4,114
Allow downloading just some columns of a dataset
open
[ "In the general case you can’t always reduce the quantity of data to download, since you can’t parse CSV or JSON data without downloading the whole files right ? ^^ However we could explore this case-by-case I guess", "Actually for csv pandas has `usecols` which allows loading a subset of columns in a more effici...
2022-04-06T16:38:46
2025-02-17T15:10:56
null
**Is your feature request related to a problem? Please describe.** Some people are interested in doing label analysis of a CV dataset without downloading all the images. Downloading the whole dataset does not always make sense for this kind of use case **Describe the solution you'd like** Be able to just download...
osanseviero
https://github.com/huggingface/datasets/issues/4114
null
false
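The pandas `usecols` behaviour mentioned in the comments of issue #4114 — parsing only the requested CSV columns instead of materializing the whole table — can be illustrated with a small in-memory example (the column names are made up for illustration):

```python
import io
import pandas as pd

csv_data = io.StringIO(
    "label,path,width\n"
    "cat,img1.jpg,640\n"
    "dog,img2.jpg,480\n"
)

# usecols tells the parser to keep only these columns; the other
# fields are skipped during parsing rather than loaded and dropped
df = pd.read_csv(csv_data, usecols=["label"])
print(list(df.columns))  # -> ['label']
```

As the thread notes, this saves parsing work and memory but not bandwidth: the raw CSV bytes still have to be fetched in full.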
1,194,843,532
4,113
Multiprocessing with FileLock fails in python 3.9
closed
[ "Closing this one because it must be used this way actually:\r\n```python\r\ndef main():\r\n with FileLock(\"tmp.lock\"):\r\n with Pool(2) as pool:\r\n pool.map(run, range(2))\r\n\r\nif __name__ == \"__main__\":\r\n main()\r\n```" ]
2022-04-06T16:27:09
2022-11-28T11:49:14
2022-11-28T11:49:14
On python 3.9, this code hangs: ```python from multiprocessing import Pool from filelock import FileLock def run(i): print(f"got the lock in multi process [{i}]") with FileLock("tmp.lock"): with Pool(2) as pool: pool.map(run, range(2)) ``` This is because the subprocesses try to ac...
lhoestq
https://github.com/huggingface/datasets/issues/4113
null
false
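The closing comment of issue #4113 resolves the hang by moving the pool creation into a `main()` function behind an `if __name__ == "__main__":` guard, with the lock taken once in the parent. A self-contained sketch of that pattern, using a stdlib `multiprocessing.Lock` as a stand-in for `filelock.FileLock("tmp.lock")` so no third-party package is needed:

```python
from multiprocessing import Lock, Pool

def run(i):
    # workers do their job without re-acquiring the parent's lock;
    # each child contending for the same lock is what hung on Python 3.9
    return i * 2

def main():
    lock = Lock()  # stand-in for filelock.FileLock("tmp.lock")
    with lock:
        with Pool(2) as pool:
            return pool.map(run, range(4))

if __name__ == "__main__":
    print(main())
```

The key difference from the reported snippet is structural: the lock and pool live in the parent's `main()`, so child processes never try to take the lock themselves.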
1,194,752,765
4,112
ImageFolder with Grayscale images dataset
closed
[ "Hi! Replacing:\r\n```python\r\ntransformed_dataset = dataset.with_transform(transforms)\r\ntransformed_dataset.set_format(type=\"torch\", device=\"cuda\")\r\n```\r\n\r\nwith:\r\n```python\r\ndef transform_func(examples):\r\n examples[\"image\"] = [transforms(img).to(\"cuda\") for img in examples[\"image\"]]\r\n...
2022-04-06T15:10:00
2022-04-22T10:21:53
2022-04-22T10:21:52
Hi, I'm facing a problem with a grayscale images dataset I have uploaded [here](https://huggingface.co/datasets/ChainYo/rvl-cdip) (RVL-CDIP) I'm getting an error while I want to use images for training a model with PyTorch DataLoader. Here is the full traceback: ```bash AttributeError: Caught AttributeError in D...
chainyo
https://github.com/huggingface/datasets/issues/4112
null
false
1,194,660,699
4,111
Update security policy
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-06T13:59:51
2022-04-07T09:46:30
2022-04-07T09:40:27
null
albertvillanova
https://github.com/huggingface/datasets/pull/4111
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4111", "html_url": "https://github.com/huggingface/datasets/pull/4111", "diff_url": "https://github.com/huggingface/datasets/pull/4111.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4111.patch", "merged_at": "2022-04-07T09:40...
true
1,194,581,375
4,110
Matthews Correlation Metric Card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-06T12:59:35
2022-05-03T13:43:17
2022-05-03T13:36:13
null
emibaylor
https://github.com/huggingface/datasets/pull/4110
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4110", "html_url": "https://github.com/huggingface/datasets/pull/4110", "diff_url": "https://github.com/huggingface/datasets/pull/4110.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4110.patch", "merged_at": "2022-05-03T13:36...
true
1,194,579,257
4,109
Add Spearmanr Metric Card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "changes made! @lhoestq let me know what you think ", "The CI fail is unrelated to this PR and fixed on master, feel free to merge :)" ]
2022-04-06T12:57:53
2022-05-03T16:50:26
2022-05-03T16:43:37
null
emibaylor
https://github.com/huggingface/datasets/pull/4109
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4109", "html_url": "https://github.com/huggingface/datasets/pull/4109", "diff_url": "https://github.com/huggingface/datasets/pull/4109.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4109.patch", "merged_at": "2022-05-03T16:43...
true
1,194,578,584
4,108
Perplexity Speedup
closed
[ "WRT the high values, can you add some unit tests with some [string, model] pairs and their resulting perplexity code, and @TristanThrush can run the same pairs through his version of the code?", "_The documentation is not available anymore as the PR was closed or merged._", "I thought that the perplexity metri...
2022-04-06T12:57:21
2022-04-20T13:00:54
2022-04-20T12:54:42
This PR makes necessary changes to perplexity such that: - it runs much faster (via batching) - it throws an error when input is empty, or when input is one word without <BOS> token - it adds the option to add a <BOS> token Issues: - The values returned are extremely high, and I'm worried they aren't correct. Ev...
emibaylor
https://github.com/huggingface/datasets/pull/4108
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4108", "html_url": "https://github.com/huggingface/datasets/pull/4108", "diff_url": "https://github.com/huggingface/datasets/pull/4108.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4108.patch", "merged_at": "2022-04-20T12:54...
true
1,194,484,885
4,107
Unable to view the dataset and loading the same dataset throws the error - ArrowInvalid: Exceeded maximum rows
closed
[ "Thanks for reporting. I'm looking at it", " It's not related to the dataset viewer in itself. I can replicate the error with:\r\n\r\n```\r\n>>> import datasets as ds\r\n>>> d = ds.load_dataset('Pavithree/explainLikeImFive')\r\nUsing custom data configuration Pavithree--explainLikeImFive-b68b6d8112cd8a51\r\nDown...
2022-04-06T11:37:15
2022-04-08T07:13:07
2022-04-06T14:39:55
## Dataset viewer issue - ArrowInvalid: Exceeded maximum rows **Link:** *https://huggingface.co/datasets/Pavithree/explainLikeImFive* *This is a subset of the original eli5 dataset https://huggingface.co/datasets/vblagoje/lfqa. I just filtered the data samples that belong to one particular subreddit thread. How...
Pavithree
https://github.com/huggingface/datasets/issues/4107
null
false
1,194,393,892
4,106
Support huggingface_hub 0.5
closed
[ "Looks like GH actions is not able to resolve `huggingface_hub` 0.5.0, I'm investivating", "_The documentation is not available anymore as the PR was closed or merged._", "I'm glad to see changes in `huggingface_hub` are simplifying code here.", "seems to supersede #4102, feel free to close mine :)", "maybe...
2022-04-06T10:15:25
2022-04-08T10:28:43
2022-04-08T10:22:23
Following https://github.com/huggingface/datasets/issues/4105 `huggingface_hub` deprecated some parameters in `HfApi` in 0.5. This PR updates all the calls to HfApi to remove all the deprecations, <s>and I set the `hugginface_hub` requirement to `>=0.5.0`</s> cc @adrinjalali @LysandreJik
lhoestq
https://github.com/huggingface/datasets/pull/4106
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4106", "html_url": "https://github.com/huggingface/datasets/pull/4106", "diff_url": "https://github.com/huggingface/datasets/pull/4106.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4106.patch", "merged_at": "2022-04-08T10:22...
true
1,194,297,119
4,105
push to hub fails with huggingface-hub 0.5.0
closed
[ "Hi ! Indeed there was a breaking change in `huggingface_hub` 0.5.0 in `HfApi.create_repo`, which is called here in `datasets` by passing the org name in both the `repo_id` and the `organization` arguments:\r\n\r\nhttps://github.com/huggingface/datasets/blob/2230f7f7d7fbaf102cff356f5a8f3bd1561bea43/src/datasets/arr...
2022-04-06T08:59:57
2022-04-13T14:30:47
2022-04-13T14:30:47
## Describe the bug `ds.push_to_hub` is failing when updating a dataset in the form "org_id/repo_id" ## Steps to reproduce the bug ```python from datasets import load_dataset ds = load_dataset("rubrix/news_test") ds.push_to_hub("<your-user>/news_test", token="<your-token>") ``` ## Expected results The data...
frascuchon
https://github.com/huggingface/datasets/issues/4105
null
false
1,194,072,966
4,104
Add time series data - stock market
open
[ "Can I use instructions present in below link for time series dataset as well? \r\nhttps://github.com/huggingface/datasets/blob/master/ADD_NEW_DATASET.md ", "cc'ing @kashif and @NielsRogge for visibility!", "@INF800 happy to add this dataset! I will try to set a PR by the end of the day... if you can kindly poi...
2022-04-06T05:46:58
2024-07-21T16:54:30
null
## Adding a Time Series Dataset - **Name:** 2min ticker data for stock market - **Description:** 8 stocks' data collected for 1 month after the Ukraine-Russia war. 4 NSE stocks and 4 NASDAQ stocks. Along with technical indicators (additional features) as shown in below image - **Data:** Collected by myself from investing...
rozeappletree
https://github.com/huggingface/datasets/issues/4104
null
false
1,193,987,104
4,103
Add the `GSM8K` dataset
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "The CI is failing because it's outdated, but the task tags are updated on `master`, merging :)" ]
2022-04-06T04:07:52
2022-04-12T15:38:28
2022-04-12T10:21:16
null
jon-tow
https://github.com/huggingface/datasets/pull/4103
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4103", "html_url": "https://github.com/huggingface/datasets/pull/4103", "diff_url": "https://github.com/huggingface/datasets/pull/4103.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4103.patch", "merged_at": "2022-04-12T10:21...
true
1,193,616,722
4,102
[hub] Fix `api.create_repo` call?
closed
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/datasets/pr_4102). All of your documentation changes will be reflected on that endpoint.", "Closing in favor of https://github.com/huggingface/datasets/pull/4106" ]
2022-04-05T19:21:52
2023-09-24T10:01:14
2022-04-12T08:41:46
null
julien-c
https://github.com/huggingface/datasets/pull/4102
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4102", "html_url": "https://github.com/huggingface/datasets/pull/4102", "diff_url": "https://github.com/huggingface/datasets/pull/4102.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4102.patch", "merged_at": null }
true
1,193,399,204
4,101
How can I download only the train and test split for full numbers using load_dataset()?
open
[ "Hi! Can you please specify the full name of the dataset? IIRC `full_numbers` is one of the configs of the `svhn` dataset, and its generation is slow due to data being stored in binary Matlab files. Even if you specify a specific split, `datasets` downloads all of them, but we plan to fix that soon and only downloa...
2022-04-05T16:00:15
2022-04-06T13:09:01
null
How can I download only the train and test split for full numbers using load_dataset()? I do not need the extra split, and it will take 40 mins just to download in Colab. I have very little time on hand. Please help.
Nakkhatra
https://github.com/huggingface/datasets/issues/4101
null
false
1,193,393,959
4,100
Improve RedCaps dataset card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "I find this preprocessing a bit too specific to add it as a method to `datasets` as it's only useful in the context of CV (and we support multiple modalities). However, I agree it would be great to move this code to another lib to av...
2022-04-05T15:57:14
2022-04-13T14:08:54
2022-04-13T14:02:26
This PR modifies the RedCaps card to: * fix the formatting of the Point of Contact fields on the Hub * speed up the image fetching logic (aligns it with the [img2dataset](https://github.com/rom1504/img2dataset) tool) and make it more robust (return None if **any** exception is thrown)
mariosasko
https://github.com/huggingface/datasets/pull/4100
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4100", "html_url": "https://github.com/huggingface/datasets/pull/4100", "diff_url": "https://github.com/huggingface/datasets/pull/4100.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4100.patch", "merged_at": "2022-04-13T14:02...
true
1,193,253,768
4,099
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5 in position 213: ordinal not in range(128)
closed
[ "Hi @andreybond, thanks for reporting.\r\n\r\nUnfortunately, I'm not able to able to reproduce your issue:\r\n```python\r\nIn [4]: from datasets import load_dataset\r\n ...: datasets = load_dataset(\"nielsr/XFUN\", \"xfun.ja\")\r\n\r\nIn [5]: datasets\r\nOut[5]: \r\nDatasetDict({\r\n train: Dataset({\r\n ...
2022-04-05T14:42:38
2022-04-06T06:37:44
2022-04-06T06:35:54
## Describe the bug Error "UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5 in position 213: ordinal not in range(128)" is thrown when downloading dataset. ## Steps to reproduce the bug ```python from datasets import load_dataset datasets = load_dataset("nielsr/XFUN", "xfun.ja") ``` ## Expected resu...
andreybond
https://github.com/huggingface/datasets/issues/4099
null
false
1,193,245,522
4,098
Proposing WikiSplit metric card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "A quick Github tip ;) To avoid running N times the CI, you can push all the changes at once: go to Files Changed tab, and on each suggestion there's a \"add to commit batch\" and then you can do one commit for all the suggestions you...
2022-04-05T14:36:34
2022-10-11T09:10:21
2022-04-05T15:42:28
Pinging @lhoestq to ensure that my distinction between the dataset and the metric are clear :sweat_smile:
sashavor
https://github.com/huggingface/datasets/pull/4098
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4098", "html_url": "https://github.com/huggingface/datasets/pull/4098", "diff_url": "https://github.com/huggingface/datasets/pull/4098.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4098.patch", "merged_at": "2022-04-05T15:42...
true
1,193,205,751
4,097
Updating FrugalScore metric card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-05T14:09:24
2022-04-05T15:07:35
2022-04-05T15:01:46
removing duplicate paragraph
sashavor
https://github.com/huggingface/datasets/pull/4097
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4097", "html_url": "https://github.com/huggingface/datasets/pull/4097", "diff_url": "https://github.com/huggingface/datasets/pull/4097.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4097.patch", "merged_at": "2022-04-05T15:01...
true
1,193,165,229
4,096
Add support for streaming Zarr stores for hosted datasets
closed
[ "Hi @jacobbieker, thanks for your request and study of possible alternatives.\r\n\r\nWe are very interested in finding a way to make `datasets` useful to you.\r\n\r\nLooking at the Zarr docs, I saw that among its storage alternatives, there is the ZIP file format: https://zarr.readthedocs.io/en/stable/api/storage.h...
2022-04-05T13:38:32
2023-12-07T09:01:49
2022-04-21T08:12:58
**Is your feature request related to a problem? Please describe.** Lots of geospatial data is stored in the Zarr format. This format works well for n-dimensional data and coordinates, and can have good compression. Unfortunately, HF datasets doesn't support streaming in data in Zarr format as far as I can tell. Zarr s...
jacobbieker
https://github.com/huggingface/datasets/issues/4096
null
false
1,192,573,353
4,095
fix typo in rename_column error message
closed
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/datasets/pr_4095). All of your documentation changes will be reflected on that endpoint." ]
2022-04-05T03:55:56
2022-04-05T08:54:46
2022-04-05T08:45:53
I feel bad submitting such a tiny change as a PR but it confused me today 😄
hunterlang
https://github.com/huggingface/datasets/pull/4095
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4095", "html_url": "https://github.com/huggingface/datasets/pull/4095", "diff_url": "https://github.com/huggingface/datasets/pull/4095.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4095.patch", "merged_at": "2022-04-05T08:45...
true
1,192,534,414
4,094
Helo Mayfrends
closed
[]
2022-04-05T02:42:57
2022-04-05T07:16:42
2022-04-05T07:16:42
## Adding a Dataset - **Name:** *name of the dataset* - **Description:** *short description of the dataset (or link to social media or blog post)* - **Paper:** *link to the dataset paper if available* - **Data:** *link to the Github repository or current dataset location* - **Motivation:** *what are some good reas...
Budigming
https://github.com/huggingface/datasets/issues/4094
null
false
1,192,523,161
4,093
elena-soare/crawled-ecommerce: missing dataset
closed
[ "It's a bug! Thanks for reporting, I'm looking at it.", "By the way, the error on our part is due to the huge size of every row (~90MB). The dataset viewer does not support such big dataset rows for the moment.\r\nAnyway, we're working to give a hint about this in the dataset viewer.", "Fixed. See https://huggi...
2022-04-05T02:25:19
2022-04-12T09:34:53
2022-04-12T09:34:53
elena-soare/crawled-ecommerce **Link:** *link to the dataset viewer page* *short description of the issue* Am I the one who added this dataset ? Yes-No
seevaratnam
https://github.com/huggingface/datasets/issues/4093
null
false
1,192,499,903
4,092
Fix dataset `amazon_us_reviews` metadata - 4/4/2022
closed
[ "_The documentation is not available anymore as the PR was closed or merged._", "cc: @albertvillanova just FYI" ]
2022-04-05T01:39:45
2022-04-08T12:35:41
2022-04-08T12:29:31
Fixes #4048 by running `dataset-cli test` to reprocess data and regenerate metadata. Additionally I've updated the README to include up-to-date counts for the subsets.
trentonstrong
https://github.com/huggingface/datasets/pull/4092
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4092", "html_url": "https://github.com/huggingface/datasets/pull/4092", "diff_url": "https://github.com/huggingface/datasets/pull/4092.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4092.patch", "merged_at": "2022-04-08T12:29...
true
1,192,023,855
4,091
Build a Dataset One Example at a Time Without Loading All Data Into Memory
closed
[ "Hi! Yes, the problem with `add_item` is that it keeps examples in memory, so you are left with these options:\r\n* writing a dataset loading script in which you iterate over `custom_example_dict_streamer` and yield the examples (in `_generate examples`)\r\n* storing the data in a JSON/CSV/Parquet/TXT file and usin...
2022-04-04T16:19:24
2022-04-20T14:31:00
2022-04-20T14:31:00
**Is your feature request related to a problem? Please describe.** I have a very large dataset stored on disk in a custom format. I have some custom code that reads one data example at a time and yields it in the form of a dictionary. I want to construct a `Dataset` with all examples, and then save it to disk. I la...
aravind-tonita
https://github.com/huggingface/datasets/issues/4091
null
false
1,191,956,734
4,090
Avoid writing empty license files
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-04T15:23:37
2022-04-07T12:46:45
2022-04-07T12:40:43
This PR avoids the creation of empty `LICENSE` files.
albertvillanova
https://github.com/huggingface/datasets/pull/4090
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4090", "html_url": "https://github.com/huggingface/datasets/pull/4090", "diff_url": "https://github.com/huggingface/datasets/pull/4090.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4090.patch", "merged_at": "2022-04-07T12:40...
true
1,191,915,196
4,089
Create metric card for Frugal Score
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-04T14:53:49
2022-04-05T14:14:46
2022-04-05T14:06:50
Proposing metric card for Frugal Score. @albertvillanova or @lhoestq -- there are certain aspects that I'm not 100% sure on (such as how exactly the distillation between BertScore and FrugalScore is done) -- so if you find that something isn't clear, please let me know!
sashavor
https://github.com/huggingface/datasets/pull/4089
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4089", "html_url": "https://github.com/huggingface/datasets/pull/4089", "diff_url": "https://github.com/huggingface/datasets/pull/4089.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4089.patch", "merged_at": "2022-04-05T14:06...
true
1,191,901,172
4,088
Remove unused legacy Beam utils
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-04T14:43:51
2022-04-05T15:23:27
2022-04-05T15:17:41
This PR removes unused legacy custom `WriteToParquet`, once official Apache Beam includes the patch since version 2.22.0: - Patch PR: https://github.com/apache/beam/pull/11699 - Issue: https://issues.apache.org/jira/browse/BEAM-10022 In relation with: - #204
albertvillanova
https://github.com/huggingface/datasets/pull/4088
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4088", "html_url": "https://github.com/huggingface/datasets/pull/4088", "diff_url": "https://github.com/huggingface/datasets/pull/4088.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4088.patch", "merged_at": "2022-04-05T15:17...
true
1,191,819,805
4,087
Fix BeamWriter output Parquet file
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-04T13:46:50
2022-04-05T15:00:40
2022-04-05T14:54:48
Since now, the `BeamWriter` saved a Parquet file with a simplified schema, where each field value was serialized to JSON. That resulted in Parquet files larger than Arrow files. This PR: - writes Parquet file preserving original schema and without serialization, thus avoiding serialization overhead and resulting in...
albertvillanova
https://github.com/huggingface/datasets/pull/4087
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4087", "html_url": "https://github.com/huggingface/datasets/pull/4087", "diff_url": "https://github.com/huggingface/datasets/pull/4087.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4087.patch", "merged_at": "2022-04-05T14:54...
true
1,191,373,374
4,086
Dataset viewer issue for McGill-NLP/feedbackQA
closed
[ "Hi @cslizc, thanks for reporting.\r\n\r\nI have just forced the refresh of the corresponding cache and the preview is working now.", "thank you so much" ]
2022-04-04T07:27:20
2022-04-04T22:29:53
2022-04-04T08:01:45
## Dataset viewer issue for '*McGill-NLP/feedbackQA*' **Link:** *[link to the dataset viewer page](https://huggingface.co/datasets/McGill-NLP/feedbackQA)* *short description of the issue* The dataset can be loaded correctly with `load_dataset` but the preview doesn't work. Error message: ``` Status code: 4...
cslizc
https://github.com/huggingface/datasets/issues/4086
null
false
1,190,621,345
4,085
datasets.set_progress_bar_enabled(False) not working in datasets v2
closed
[ "Now, I can't find any reference to set_progress_bar_enabled in the code.\r\n\r\nI think it have been deleted", "Hi @virilo,\r\n\r\nPlease note that since `datasets` version 2.0.0, we have aligned with `transformers` the management of the progress bar (among other things):\r\n- #3897\r\n\r\nNow, you should update...
2022-04-02T12:40:10
2022-09-17T02:18:03
2022-04-04T06:44:34
## Describe the bug datasets.set_progress_bar_enabled(False) not working in datasets v2 ## Steps to reproduce the bug ```python datasets.set_progress_bar_enabled(False) ``` ## Expected results datasets not using any progress bar ## Actual results AttributeError: module 'datasets' has no attribute 'se...
virilo
https://github.com/huggingface/datasets/issues/4085
null
false
1,190,060,415
4,084
Errors in `Train with Datasets` Tensorflow code section on Huggingface.co
closed
[ "Hi @blackhat-coder, thanks for reporting.\r\n\r\nPlease note that the `transformers` library updated their data collators API last year (version 4.10.0):\r\n- huggingface/transformers#13105\r\n\r\nnow requiring to pass `return_tensors` argument at Data Collator instantiation.\r\n\r\nAnd therefore, we also updated ...
2022-04-01T17:02:47
2022-04-04T07:24:37
2022-04-04T07:21:31
## Describe the bug Hi ### Error 1 Running the Tensforlow code on [Huggingface](https://huggingface.co/docs/datasets/use_dataset) gives a TypeError: __init__() got an unexpected keyword argument 'return_tensors' ### Error 2 `DataCollatorWithPadding` isn't imported ## Steps to reproduce the bug ```python impo...
blackhat-coder
https://github.com/huggingface/datasets/issues/4084
null
false
1,190,025,878
4,083
Add SacreBLEU Metric Card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-01T16:24:56
2022-04-12T20:45:00
2022-04-12T20:38:40
null
emibaylor
https://github.com/huggingface/datasets/pull/4083
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4083", "html_url": "https://github.com/huggingface/datasets/pull/4083", "diff_url": "https://github.com/huggingface/datasets/pull/4083.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4083.patch", "merged_at": "2022-04-12T20:38...
true
1,189,965,845
4,082
Add chrF(++) Metric Card
closed
[ "_The documentation is not available anymore as the PR was closed or merged._" ]
2022-04-01T15:32:12
2022-04-12T20:43:55
2022-04-12T20:38:06
null
emibaylor
https://github.com/huggingface/datasets/pull/4082
{ "url": "https://api.github.com/repos/huggingface/datasets/pulls/4082", "html_url": "https://github.com/huggingface/datasets/pull/4082", "diff_url": "https://github.com/huggingface/datasets/pull/4082.diff", "patch_url": "https://github.com/huggingface/datasets/pull/4082.patch", "merged_at": "2022-04-12T20:38...
true