12/30/21 · Research

Big data analysis makes it possible to challenge the Eurocentric view of cultural exchanges

Researchers from the UOC have opened the door to the possibility of rewriting Translation History
Big Translation History is a conceptual framework and a new methodology for decentralizing translation history, and cultural and literary history
UOC researchers study the circulation of literary translations between 1898 and 1945 (photo: Aaron Burden / unsplash.com)

The use of digital tools and data science is still rare in Translation History. However, two researchers from the Global Literary Studies Research Lab (GlobaLS) have shown how big data analysis and machine learning can help to contextualize or even rewrite it. GlobaLS is a research group affiliated with the Internet Interdisciplinary Institute (IN3) and the Faculty of Arts and Humanities at the Universitat Oberta de Catalunya (UOC).

The researchers are Diana Roig-Sanz, leader of the GlobaLS group, ICREA research professor and recipient of a European Research Council Starting Grant, and Laura Fólica, a postdoctoral researcher in the group. In their article, published in Translation Spaces, they propose Big Translation History as a conceptual framework and methodology for analysing big data and metadata, which can be used to challenge Eurocentric accounts of cultural exchange and the circulation of books and to reveal little-known or previously hidden patterns.

Their research includes a case study examining the circulation of literary translations and the agents involved – writers, translators, publishers – in the Spanish-speaking world between 1898 (the end of Spanish rule in Cuba) and 1945 (the end of the Second World War), using metadata from the catalogue of Spain's National Library.

Digitization and big data to reinterpret the past

"Big Translation History is a great leap forward, as it challenges previous research on the circulation of translations by examining metadata and the ever-growing number of digitized texts available," said Roig-Sanz and Fólica. "By comparing data sets and translation flows on a scale that could never be undertaken by a researcher working on their own, we are helping to write a new translation history that is more inclusive, able to challenge the misnomers of 'central' and 'peripheral', and open to a gender perspective on the role of female translators."

The analysis proposed by this new conceptual tool rests on three core principles: research carried out on a large scale in both time and geographical scope; the use of massive amounts of data – both big data and little data, often varied in nature and unstructured; and the use of computational techniques such as machine learning, data mining and artificial intelligence.

Challenging translation historiography

In their case study of the circulation of translations between 1898 and 1945, Roig-Sanz and Fólica drew on the 28 million volumes in the catalogue of Spain's National Library. Despite methodological challenges, including the need to compare results with metadata from other Ibero-American library catalogues, the study provides a new view of the major Spanish-language publishing centres, the most commonly translated languages, and the Hispanic and foreign authors with the most works.
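The kind of aggregation behind these findings can be illustrated with a short, purely hypothetical sketch: it filters catalogue records to the 1898–1945 period and ranks source languages and publishing centres by number of translations. The record fields and values below are invented for illustration and do not reflect the actual structure of the BNE metadata.

```python
from collections import Counter

# Hypothetical catalogue records; field names and values are invented
# for illustration and do not reflect the real BNE metadata schema.
records = [
    {"title": "A", "source_language": "French", "place": "Madrid", "year": 1905},
    {"title": "B", "source_language": "English", "place": "Barcelona", "year": 1922},
    {"title": "C", "source_language": "French", "place": "Barcelona", "year": 1931},
    {"title": "D", "source_language": "German", "place": "Buenos Aires", "year": 1940},
    {"title": "E", "source_language": "French", "place": "Madrid", "year": 1950},  # outside the period
]

# Keep only translations published within the study period (1898-1945).
in_period = [r for r in records if 1898 <= r["year"] <= 1945]

# Rank source languages and publishing centres by number of translations.
languages = Counter(r["source_language"] for r in in_period)
places = Counter(r["place"] for r in in_period)

print(languages.most_common(2))  # [('French', 2), ('English', 1)]
print(places.most_common(2))     # [('Barcelona', 2), ('Madrid', 1)]
```

The real study works at a vastly larger scale and with messier data, but the logic is the same: normalize metadata fields, filter by period, and aggregate translation flows by language, place and agent.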

"One of the main objectives of this new approach is to challenge the national historiographies of translation," they said. "This can prove very valuable not just for researchers working on periods of history when geographical borders were altered and significant changes took place, but also for research on smaller, less translated literatures."

Related article

Roig-Sanz, D.; Fólica, L. Big translation history: Data science applied to translated literature in the Spanish-speaking world, 1898–1945. Translation Spaces, vol. 10, no. 2, 2021, pp. 231–259. https://doi.org/10.1075/ts.21012.roi

This research by the UOC supports Sustainable Development Goal (SDG) 4 (Quality Education)

UOC R&I

The UOC's research and innovation (R&I) helps overcome pressing challenges faced by global societies in the 21st century by studying interactions between technology and the human and social sciences, with a specific focus on the network society, e-learning and e-health.

Over 500 researchers and 52 research groups work across the University's seven faculties and two research centres: the Internet Interdisciplinary Institute (IN3) and the eHealth Center (eHC).

The University also fosters online learning innovation through its eLearning Innovation Center (eLinC), and promotes entrepreneurship and knowledge transfer within the UOC community via the Hubbik platform.

The United Nations' 2030 Agenda for Sustainable Development and open knowledge serve as strategic pillars for the UOC's teaching, research and innovation. More information: research.uoc.edu #UOC25years
