Ecosyste.ms: Issues
An open API service for providing issue and pull request metadata for open source projects.
GitHub / brightmart/roberta_zh issues and pull requests
#100 - update: test
Pull Request -
State: open - Opened by Cpeidong 4 months ago
- 1 comment
#99 - The cloud-drive link for RoBERTa_zh_Large_PyTorch has expired; could you provide a new one?
Issue -
State: open - Opened by zzzengzhe 10 months ago
- 3 comments
#98 - Minor model-loading question, seeking an answer
Issue -
State: open - Opened by cooper12121 over 1 year ago
#97 - Download and model-loading issues
Issue -
State: open - Opened by Lj4040 over 1 year ago
#96 - 333add
Pull Request -
State: closed - Opened by luoshao23 about 2 years ago
#95 - Loss curve
Issue -
State: open - Opened by wanglaiqi over 2 years ago
#94 - In TensorBoard visualization of model outputs, train's masked_lm_loss and masked_lm_accuracy are empty, and the eval plot has only one point
Issue -
State: open - Opened by hhanyyan almost 3 years ago
#93 - Error when using roberta_zh's tokenizer for a Chinese NER task
Issue -
State: open - Opened by Honma-Rika over 3 years ago
- 2 comments
#92 - Update README.md
Pull Request -
State: closed - Opened by Yang-Jianzhang over 3 years ago
#91 - Huggingface
Issue -
State: open - Opened by archersama over 3 years ago
#90 - How do I train on GPU?
Issue -
State: open - Opened by Tian14267 over 3 years ago
#89 - When loading the roberta model in PyTorch via BERT's loading method, should special tokens be created the BERT way or the RoBERTa way?
Issue -
State: open - Opened by ludfeer over 3 years ago
#88 - The vocab under the resource folder does not match the code
Issue -
State: closed - Opened by bigheary over 3 years ago
#87 - Unrelated parameters in the config
Issue -
State: open - Opened by zheyuye over 3 years ago
#86 - Data loss during preprocessing
Issue -
State: open - Opened by puzzledTao almost 4 years ago
- 1 comment
#85 - What are the pretrained-language-model that is obviously better than BERT and RoBERTa?
Issue -
State: open - Opened by guotong1988 almost 4 years ago
#84 - On the prediction labels for Chinese whole-word masking in MLM
Issue -
State: open - Opened by Rango94 almost 4 years ago
- 7 comments
#83 - Is the pretrained model it depends on the same as the official BERT release?
Issue -
State: open - Opened by charlesfufu almost 4 years ago
#82 - Could the corpus be released so other models can be compared?
Issue -
State: open - Opened by lamp-lyz about 4 years ago
#81 - CMRC example
Issue -
State: closed - Opened by chuzhifeng about 4 years ago
#80 - NaN probability sometimes when inference on GPU
Issue -
State: open - Opened by Jiayuforfreeo about 4 years ago
#79 - Post-training on the pytorch model
Issue -
State: open - Opened by daniellibin about 4 years ago
- 2 comments
#78 - GPT vs BERT, under same computation and data resource, which one is better for downstream tasks like GLUE?
Issue -
State: open - Opened by guotong1988 about 4 years ago
#77 - XLNet can't consistently beat RoBERTa, can it?
Issue -
State: closed - Opened by guotong1988 about 4 years ago
- 1 comment
#76 - How is the whole-word masking ratio set?
Issue -
State: open - Opened by surimj about 4 years ago
- 1 comment
#75 - About Chinese encoding
Issue -
State: open - Opened by Leputa over 4 years ago
#74 - Pretraining issue: ValueError: Please provide a TPU Name to connect to.
Issue -
State: closed - Opened by xuehui0725 over 4 years ago
- 2 comments
#73 - Pretraining data question
Issue -
State: open - Opened by ruleGreen over 4 years ago
- 3 comments
#72 - Pretraining data construction is incorrect
Pull Request -
State: open - Opened by hy-struggle over 4 years ago
#71 - Update create_pretraining_data.py
Pull Request -
State: open - Opened by guotong1988 over 4 years ago
#70 - Building pretrain data from my own corpus raises KeyError: '##cry'
Issue -
State: open - Opened by ccoocode over 4 years ago
- 1 comment
#69 - Request: a Chinese pretrained Longformer model
Issue -
State: open - Opened by xjx0524 over 4 years ago
#68 - Missing merge.txt and vocab.json
Issue -
State: open - Opened by lshowway over 4 years ago
- 3 comments
#67 - Checksum does not match; is this a TensorFlow version issue? Mine is 1.15.0
Issue -
State: open - Opened by 545314690 over 4 years ago
#66 - Implementation of the dynamic masking logic
Issue -
State: open - Opened by humdingers over 4 years ago
- 1 comment
#65 - Some parameters cannot be restored
Issue -
State: open - Opened by WBY1993 over 4 years ago
#64 - What does re.findall('##[\u4E00-\u9FA5]') do in the data processing?
Issue -
State: open - Opened by xiaojinglu over 4 years ago
- 2 comments
#63 - Hi, the Google Cloud roberta download seems to be unavailable; could you re-post the link?
Issue -
State: open - Opened by currywu123 over 4 years ago
#62 - Where can I find the English roberta pretrained model?
Issue -
State: open - Opened by WenxiongLiao over 4 years ago
- 1 comment
#61 - How should we cite your released models in a paper?
Issue -
State: open - Opened by CCNUdhj almost 5 years ago
#60 - Could you share the following weights?
Issue -
State: closed - Opened by rxc205 almost 5 years ago
- 1 comment
#59 - TensorFlow version
Issue -
State: open - Opened by aflyhat almost 5 years ago
- 1 comment
#58 - Hello, how do I do multi-GPU parallel training during pretraining?
Issue -
State: open - Opened by chenchengshuai almost 5 years ago
#57 - Error when using the 12-layer pretrained roberta_zh_l12 model as a bert-as-service-style sentence embedding extractor
Issue -
State: closed - Opened by guoraikkonen almost 5 years ago
- 2 comments
#56 - Hello, fine-tuning your released model on my corpus works fine. I ran two experiments: pretraining from scratch on my own corpus, and continuing pretraining from your model on my corpus. Both give the same very low accuracy. For continued pretraining I added the init_checkpoint parameter as you described, and the logs confirm it was loaded. Why are the results so poor?
Issue -
State: closed - Opened by chenchengshuai almost 5 years ago
- 3 comments
#55 - The cloud-drive link for the PyTorch version has expired
Issue -
State: open - Opened by cqlyiyeshu almost 5 years ago
- 2 comments
#54 - Using roberta-large to predict masked positions gives unreadable results
Issue -
State: closed - Opened by yayaQAQ about 5 years ago
- 1 comment
#53 - About the pretrained model
Issue -
State: closed - Opened by Foehnc about 5 years ago
- 2 comments
#52 - What's the difference between Roberta_l24_zh_base and RoBERTa-zh-Large? Also, is there a Keras usage example? Thanks!
Issue -
State: open - Opened by lx-rookie about 5 years ago
- 3 comments
#51 - The RoBERTa-zh-Large Baidu net-disk file was removed
Issue -
State: open - Opened by jh-deng about 5 years ago
- 6 comments
#50 - After creating multiple tfrecord files, how should input_file be set for training?
Issue -
State: closed - Opened by ahzz1207 about 5 years ago
#49 - About continuing pretraining from your model
Issue -
State: open - Opened by zhezhaoa about 5 years ago
- 2 comments
#48 - Why does fine-tuning the 24-layer model reach only 0.56 acc on a binary sentiment task, while the 12-layer model reaches 0.94?
Issue -
State: open - Opened by hkzhao123 about 5 years ago
- 3 comments
#47 - Which model do the reported metrics refer to?
Issue -
State: open - Opened by oyjxer about 5 years ago
- 2 comments
#46 - About the pytorch version
Issue -
State: open - Opened by zjjhuihui about 5 years ago
#45 - How much does 24 hours of training on a Cloud TPU v3-256 cost in USD or RMB?
Issue -
State: closed - Opened by Robets2020 about 5 years ago
- 3 comments
#44 - About multi-GPU training
Issue -
State: open - Opened by gaozhanfire about 5 years ago
- 7 comments
#43 - OOM when running run_classifier
Issue -
State: open - Opened by zhengchang231 about 5 years ago
- 5 comments
#42 - Format of the roberta pretraining corpus
Issue -
State: closed - Opened by zhengwsh about 5 years ago
- 1 comment
#41 - A question about data preprocessing
Issue -
State: open - Opened by ouwenjie03 about 5 years ago
- 3 comments
#40 - With the sentence loss removed, have you tried SpanBERT's Span Boundary Objective loss, or DOC-SENTENCES? And byte-level BPE?
Issue -
State: closed - Opened by sliderSun about 5 years ago
- 2 comments
#39 - Use Roberta in pytorch transformers
Issue -
State: open - Opened by Vimos about 5 years ago
- 5 comments
#38 - Could the MLM head be included in all released weights?
Issue -
State: closed - Opened by bojone about 5 years ago
- 3 comments
#37 - Hi, which ERNIE does the benchmark in the docs refer to? ERNIE 1.0, 2.0, or THU's?
Issue -
State: closed - Opened by ArrogantL about 5 years ago
- 1 comment
#36 - Does dynamic masking cause a data leakage problem?
Issue -
State: closed - Opened by nickchiang5121 about 5 years ago
- 1 comment
#35 - [CLS],[SEP]
Issue -
State: closed - Opened by HongyanJiao about 5 years ago
- 5 comments
#34 - cuda out of memory
Issue -
State: closed - Opened by HongyanJiao about 5 years ago
- 1 comment
#33 - Hello, could you provide the pretraining-stage code?
Issue -
State: closed - Opened by bytekongfrombupt about 5 years ago
- 3 comments
#32 - What is the best CPU inference acceleration solution for BERT now?
Issue -
State: closed - Opened by guotong1988 about 5 years ago
- 1 comment
#31 - Could you describe the general strategy for processing the pretraining data?
Issue -
State: closed - Opened by fengzuo97 about 5 years ago
- 3 comments
#30 - Apart from the preprocessing stage, does your code differ from the BERT code anywhere else?
Issue -
State: closed - Opened by yiyele about 5 years ago
- 2 comments
#29 - English pretrained weights for roberta
Issue -
State: closed - Opened by c0derm4n about 5 years ago
- 1 comment
#28 - Hello, the RoBERTa_zh_L12 download page won't open
Issue -
State: closed - Opened by haosiqing about 5 years ago
- 1 comment
#27 - Hello, is there another way to download the RoBERTa_zh_L12 pytorch model? The link you provided won't open
Issue -
State: closed - Opened by hvuehu about 5 years ago
- 1 comment
#26 - Hello, is the vocab of your released Chinese roBERTa model the same as that of the official BERT Chinese model? Thanks
Issue -
State: closed - Opened by neilx4 about 5 years ago
- 1 comment
#25 - About the pretrained embeddings
Issue -
State: closed - Opened by beamind about 5 years ago
- 4 comments
#24 - Hello, when using the released roberta_large as a language model to measure sentence perplexity, each character's probability is very small, which seems unreasonable; under the same conditions, BERT gives fairly large per-character probabilities. What could be the reason?
Issue -
State: closed - Opened by Jethu1 about 5 years ago
- 6 comments
#23 - Comparison with RoBERTa-wwm-ext, Chinese
Issue -
State: closed - Opened by jxst539246 about 5 years ago
- 3 comments
#22 - When will the 6-layer roberta model be released?
Issue -
State: closed - Opened by Jethu1 about 5 years ago
- 10 comments
#21 - About BPE
Issue -
State: closed - Opened by ArthurRizar about 5 years ago
- 2 comments
#20 - Error loading the pytorch model
Issue -
State: closed - Opened by Benjamin-zhangjb about 5 years ago
- 10 comments
#19 - Tried converting; loading the model with tf.train.list_variables(init_checkpoint) raises an error
Issue -
State: closed - Opened by wang001 about 5 years ago
- 5 comments
#18 - loss library
Issue -
State: closed - Opened by Pooky-Z about 5 years ago
- 1 comment
#17 - loss
Issue -
State: closed - Opened by Pooky-Z about 5 years ago
#16 - Why does roberta_large score much lower than roberta_middle on CMRC2018?
Issue -
State: closed - Opened by ewrfcas about 5 years ago
- 19 comments
#15 - Can't the model be loaded with tf.train.load_variable?
Issue -
State: closed - Opened by RingoTC about 5 years ago
- 1 comment
#14 - Downloading RoBERTa-zh-Large keeps showing a "page not found" error
Issue -
State: closed - Opened by haosiqing about 5 years ago
- 6 comments
#13 - Could you add an xlnet comparison on these two datasets?
Issue -
State: closed - Opened by RyanHuangNLP about 5 years ago
- 1 comment
#12 - When will the 12-layer version be released? The page says today.
Issue -
State: closed - Opened by YingZiqiang about 5 years ago
- 1 comment
#11 - Is there a mirror link inside China? Downloading through a VPN is too slow, thanks!
Issue -
State: closed - Opened by TPF2017 about 5 years ago
- 3 comments
#10 - [question] Pretrain longer
Issue -
State: closed - Opened by lsq357 about 5 years ago
- 2 comments
#9 - How do I convert it to PyTorch format?
Issue -
State: closed - Opened by Dongfeng-He about 5 years ago
- 21 comments
#8 - The released models lack the masked-language prediction head parameters; could you include them too? Many thanks
Issue -
State: closed - Opened by zuofeng1997 about 5 years ago
- 6 comments
#7 - Is roberta's tokenizer the same as BERT's Chinese tokenizer?
Issue -
State: closed - Opened by zuofeng1997 about 5 years ago
#6 - Can the original BERT code load this model?
Issue -
State: closed - Opened by renmada about 5 years ago
- 1 comment
#5 - Download fails; opening the link shows the following
Issue -
State: closed - Opened by luoyexuge about 5 years ago
- 1 comment
#4 - Download failed
Issue -
State: closed - Opened by currenttime about 5 years ago
- 1 comment
#3 - Is the 24-layer base version (roberta_l24_zh_base) meant to be used with bert?
Issue -
State: closed - Opened by qblyy about 5 years ago
- 2 comments