SBBD

Paper Registration


Fill in your paper information

English Information


Authors
# Name
1 Carlos Abel Córdova Sáenz (carlosab1802@gmail.com)
2 Karin Becker (karin.becker@inf.ufrgs.br)


Reference
# Reference
1 Abnar, S. and Zuidema, W. (2020). Quantifying attention flow in transformers. In Proc. of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4190–4197.
2 ALDayel, A. and Magdy, W. (2021). Stance detection on social media: State of the art and trends. Information Processing & Management, 58(4):102597.
3 Chefer, H., Gur, S., and Wolf, L. (2021). Transformer interpretability beyond attention visualization. In Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 782–791.
4 Clark, J. H., Garrette, D., Turc, I., and Wieting, J. (2021). Canine: Pre-training an efficient tokenization-free encoder for language representation. arXiv preprint arXiv:2103.06874.
5 Devlin, J., Chang, M., Lee, K., and Toutanova, K. (2019). BERT: pre-training of deep bidirectional transformers for language understanding. In Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), pages 4171–4186.
6 Ebeling, R., Régis, C. C., Nobre, J. C., and Becker, K. (2022). Analysis of the influence of political polarization in the vaccination stance: the Brazilian COVID-19 scenario. In Proc. of the 15th Intl. Conference on Web and Social Media (ICWSM). To appear.
7 Ebeling, R., Sáenz, C. C., Nobre, J. C., and Becker, K. (2020). Quarenteners vs. cloroquiners: a framework to analyze the effect of political polarization on social distance stances. In Proc. of the VIII Symposium on Knowledge Discovery, Mining and Learning, pages 89–96. SBC.
8 Giorgioni, S., Politi, M., Salman, S., Basili, R., and Croce, D. (2020). UNITOR @ SardiStance2020: Combining transformer-based architectures and transfer learning for robust stance detection. In Proc. of the Seventh Evaluation Campaign of Natural Language Processing and Speech Tools for Italian. Final Workshop (EVALITA 2020), volume 2765 of CEUR Workshop Proceedings. CEUR-WS.org.
9 Jain, S. and Wallace, B. C. (2019). Attention is not Explanation. In Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1, pages 3543–3556.
10 Kawintiranon, K. and Singh, L. (2021). Knowledge enhanced masked language model for stance detection. In Proc. of the 2021 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4725–4735.
11 Kokalj, E., Škrlj, B., Lavrač, N., Pollak, S., and Robnik-Šikonja, M. (2021). BERT meets Shapley: Extending SHAP explanations to transformer-based classifiers. In Proc. of the EACL Hackashop on News Media Content Analysis and Automated Report Generation, pages 16–21.
12 Molnar, C. (2019). Interpretable Machine Learning. https://christophm.github.io/interpretable-ml-book/.
13 Rogers, A., Kovaleva, O., and Rumshisky, A. (2020). A primer in BERTology: What we know about how BERT works. Transactions of the Association for Computational Linguistics, 8:842–866.
14 Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: Pretrained BERT models for Brazilian Portuguese. In Cerri, R. and Prati, R. C., editors, Intelligent Systems, pages 403–417, Cham. Springer International Publishing.
15 Vig, J. (2019). A multiscale visualization of attention in the transformer model. In Proc. of the 57th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, pages 37–42.
16 Wiegreffe, S. and Pinter, Y. (2019). Attention is not not explanation. In Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th International Joint Conf. on Natural Language Processing (EMNLP-IJCNLP), pages 11–20.