ALUÍSIO, S. et al. An account of the challenge of tagging a reference corpus for Brazilian Portuguese. In: MAMEDE, N. J. et al. (Ed.). Computational Processing of the Portuguese Language. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003.
ARTSTEIN, R. Inter-annotator agreement. In: Handbook of Linguistic Annotation. [S.l.]: Springer, 2017. p. 297–313.
AZIZ, W.; SPECIA, L. Fully automatic compilation of a Portuguese-English parallel corpus for statistical machine translation. In: STIL 2011. Cuiabá, MT: [s.n.], 2011.
BICK, E. The Parsing System “Palavras”: Automatic Grammatical Analysis of Portuguese in a Constraint Grammar Framework. Aarhus: University of Aarhus, 2000.
BOJANOWSKI, P. et al. Enriching word vectors with subword information. arXiv preprint arXiv:1607.04606, 2016.
BROWN, T. et al. Language models are few-shot learners. In: LAROCHELLE, H. et al. (Ed.). Advances in Neural Information Processing Systems. Curran Associates, Inc., 2020. v. 33, p. 1877–1901.
DEMSZKY, D. et al. GoEmotions: A dataset of fine-grained emotions. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020. p. 4040–4054.
DEVLIN, J. et al. BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Minneapolis, Minnesota: Association for Computational Linguistics, 2019. p. 4171–4186.
JOSHI, M. et al. SpanBERT: Improving pre-training by representing and predicting spans. Transactions of the Association for Computational Linguistics, v. 8, p. 64–77, 2020. ISSN 2307-387X.
JURAFSKY, D.; MARTIN, J. H. Speech and Language Processing. [S.l.: s.n.], 2021. Draft of December 29, 2021.
KILGARRIFF, A. I don’t believe in word senses. Computers and the Humanities, Springer, v. 31, n. 2, p. 91–113, 1997. ISSN 0010-4817.
LIU, Y. et al. RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692, 2019.
MCSHANE, M.; NIRENBURG, S. Linguistics for the Age of AI. Cambridge, MA: The MIT Press, 2021. ISBN 9780262363136.
MIKOLOV, T. et al. Efficient estimation of word representations in vector space. In: Proceedings of the Workshop at ICLR. 2013.
OLIVEIRA, H. G. et al. As wordnets do português. Oslo Studies in Language, v. 7, n. 1, p. 397–424, 2015.
PAPINENI, K. et al. Bleu: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics. Philadelphia, Pennsylvania, USA: Association for Computational Linguistics, 2002. p. 311–318.
PENNINGTON, J.; SOCHER, R.; MANNING, C. GloVe: Global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar: Association for Computational Linguistics, 2014. p. 1532–1543.
SANTOS, D. et al. Relações semânticas em português: comparando o TeP, o MWN.PT, o Port4NooJ e o PAPEL. Lisboa: Associação Portuguesa de Linguística, 2010. p. 681–700.
SOUZA, F.; NOGUEIRA, R.; LOTUFO, R. BERTimbau: Pretrained BERT Models for Brazilian Portuguese. In: CERRI, R.; PRATI, R. C. (Ed.). Intelligent Systems. Cham: Springer International Publishing, 2020. p. 403–417. ISBN 978-3-030-61377-8.