SBBD

Paper Registration

Steps: (1) Select Book, (2) Select Paper, (3) Fill in paper information, (4) Congratulations

Fill in your paper information

English Information

Authors
1. Laura Camargos (laura.camargos@usp.br)
2. Leonardo Moraes (leo.mauro.desenv@gmail.com)
3. Cristina Aguiar (cdac@icmc.usp.br)

References
1. Aurpa, T. T., Rifat, R. K., Ahmed, M. S., Anwar, M. M., and Ali, A. B. M. S. (2022). Reading comprehension based question answering system in Bangla language with transformer-based learning. Heliyon, 8(10):e11052.
2. Chebbi, I., Boulila, W., and Farah, I. R. (2015). Big data: Concepts, challenges and applications. In Computational Collective Intelligence, Lecture Notes in Computer Science, volume 9330, pages 638–647.
3. Chen, D., Fisch, A., Weston, J., and Bordes, A. (2017). Reading Wikipedia to answer open-domain questions. CoRR, abs/1704.00051.
4. Clark, K., Luong, M., Le, Q. V., and Manning, C. D. (2020). ELECTRA: Pre-training text encoders as discriminators rather than generators. CoRR, abs/2003.10555.
5. Devlin, J., Chang, M., Lee, K., and Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. CoRR, abs/1810.04805.
6. Hirschman, L. and Gaizauskas, R. (2001). Natural language question answering: The view from here. Natural Language Engineering, 7(4):275–300.
7. Jardim, P. C., Moraes, L. M. P., and Aguiar, C. D. (2023). QASports: A question answering dataset about sports. In Proceedings of the Brazilian Symposium on Databases: Dataset Showcase Workshop, pages 1–12.
8. Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. (2019). RoBERTa: A robustly optimized BERT pretraining approach. CoRR, abs/1907.11692.
9. Mishra, A. and Jain, S. K. (2016). A survey on question answering systems with classification. Journal of King Saud University - Computer and Information Sciences, 28(3):345–361.
10. Moraes, L. M. P., Jardim, P., and Aguiar, C. D. (2023). Design principles and a software reference architecture for big data question answering systems. In Proceedings of the 25th International Conference on Enterprise Information Systems, pages 57–67.
11. Rajpurkar, P., Zhang, J., Lopyrev, K., and Liang, P. (2016). SQuAD: 100,000+ questions for machine comprehension of text. CoRR, abs/1606.05250.
12. Sanh, V., Debut, L., Chaumond, J., and Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR, abs/1910.01108.
13. Wang, W., Wei, F., Dong, L., Bao, H., Yang, N., and Zhou, M. (2020). MiniLM: Deep self-attention distillation for task-agnostic compression of pre-trained transformers. In Advances in Neural Information Processing Systems, volume 33, pages 5776–5788.