[[The 26th Workshop>026]]

* FinMegatron: Large Financial Domain Language Models [#a4d37073]

** Authors [#v8857fac]
Xianchao Wu (NVIDIA)

** Abstract [#m168fb41]
General-domain pretrained large-scale language models, such as BERT and GPT-3, have achieved state-of-the-art results across numerous NLP classification and generation applications. These pretraining technologies are also in demand in vertical domains such as finance, where downstream applications include financial event extraction from news, summarization, and causal inference. In this paper, we propose large-scale pretrained models for the financial domain in English and Japanese. The original datasets come from professional financial news. We empirically study the factors of sub-word vocabulary, model size, and pretraining corpus, and their impact on downstream financial NLP applications such as named entity recognition, relation extraction, causal inference, and question answering. We will release our models and code for open use.
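
As an illustration of the sub-word vocabulary factor studied in the abstract, the sketch below trains a domain-specific BPE vocabulary on financial news text with SentencePiece. This is a minimal sketch only: the paper does not name its tokenizer pipeline, and the file names fin_news.txt and fin_sp are hypothetical placeholders, not the authors' data or code.

 # Minimal sketch, assuming SentencePiece as the sub-word tool (an
 # assumption; the paper does not specify its tokenizer). File names
 # are hypothetical placeholders.
 import sentencepiece as spm
 
 # Train a 32k BPE vocabulary on a plain-text financial news corpus
 # (one sentence per line).
 spm.SentencePieceTrainer.train(
     input="fin_news.txt",   # hypothetical corpus file
     model_prefix="fin_sp",  # writes fin_sp.model and fin_sp.vocab
     vocab_size=32000,
     model_type="bpe",
 )
 
 # Tokenize a sample sentence with the learned financial vocabulary.
 sp = spm.SentencePieceProcessor(model_file="fin_sp.model")
 print(sp.encode("The Nikkei 225 fell 1.2% on weak earnings.", out_type=str))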

** Keywords [#l1cbcbb0]
financial language models, BERT, GPT

** Paper [#sc67817c]

//(To be made public on or after March 3)
&ref(04_SIG-FIN-26.pdf);
