Author: Gonçalves, José Luiz Vila Real
Date submitted: 2022-03-29
Date available: 2022-03-29
Date issued: 2020
Citation: GONÇALVES, J. L. V. R. Looking for relevance into the eyes: in search of interpretive resemblance in translation through gazing data. Cadernos de Tradução, Florianópolis, v. 40, n. 2, p. 17-44, set-dez, 2020. Disponível em: <https://www.scielo.br/j/ct/a/kfrS4hGjqZHvCRxFrS68fYF/?lang=en>. Acesso em: 25 ago. 2021.
ISSN: 2175-7968
URI: http://www.repositorio.ufop.br/jspui/handle/123456789/14763
Abstract: This paper is situated within the translation process research branch of Descriptive Translation Studies and explores eye-tracking, key-logging and retrospective protocol experimental data (originally from Fonseca, 2016), building on the Relevance Theory framework and its key concept of interpretive resemblance in translation (cf. Gutt, 2000; Alves, 1995). It aims to identify and discuss instances of optimal interpretive resemblance and the expected balance between processing effort and cognitive effects in translating. The data were collected at the Laboratory for Experimentation in Translation (LETRA-UFMG), Brazil. For this exploratory study, the subject chosen was a professional translator who translated a short science popularization text from English into Portuguese with no external support and no time pressure to accomplish the task. A Tobii T-60 eye tracker, Tobii Studio and Translog II, together with written retrospective protocols, were the main methodological tools for data collection and analysis. After the discussion and some qualitative and quantitative analyses, hypotheses were raised that pose challenging questions for future research on translation processes and, eventually, on cognitive studies and translation competence and expertise.
Language: en-US
Access: open (aberto)
Keywords: Translation process research; Relevance Theory; Empirical experimental methods
Keywords (pt): Teoria da Relevância; Métodos empírico-experimentais
Title: Looking for relevance into the eyes: in search of interpretive resemblance in translation through gazing data.
Alternative title (pt): Em busca de relevância no olhar: à procura de semelhança interpretativa através de dados de rastreamento ocular.
Type: Article published in a journal
License: This work is licensed under a Creative Commons CC BY license: https://creativecommons.org/licence. Source: the article's PDF.
DOI: https://doi.org/10.5007/2175-7968.2020v40nesp2p17