
Digital Humanities
& Natural Language Processing

A multidisciplinary project combining the latest practices in Digital Humanities with
Artificial Intelligence techniques.

More specifically, it consists of a deep-learning neural network trained to recognize semantic and morpho-syntactic connections between texts written in ancient Greek between the 5th century BC and the 6th century AD.


MANIFESTO

The 21st century arrived accompanied by exponential growth in the ability to produce, collect, process, and store information, driven by technological advancements in computing, artificial intelligence (AI), and digital communication. However, the advent of the "knowledge society," as Drucker named it more than 30 years ago, has revealed even more complex paradoxes today. For example, the persistent digital divide in regions like EMEA, where equitable access to these technologies remains limited, hinders their potential to be used as educational or cultural resources. Likewise, the impact of automation and AI on the labor market has deepened the skills crisis, without ensuring adequate retraining of the workforce in emerging technological skills.

Digital transformation has enabled individuals to generate and utilize data on a global scale, but this does not necessarily translate into the creation of knowledge. Today, the proliferation of AI-driven tools has allowed unprecedented access to information. However, knowledge creation still requires reflection, critical thinking, and a deep understanding that transcends mere data processing. Although generative AI can synthesize information and produce coherent texts, semantic understanding of meaning remains a fundamentally human quality.

The humanities, meanwhile, have not been left out of this technological revolution. The rise of digital humanities has enabled remote, real-time access to vast libraries, medieval corpora, and digitized ancient sources, providing academics and researchers with new tools for analyzing, curating, and comparing sources. These technologies have facilitated global collaboration in research projects, where AI models play a significant role in organizing and analyzing large volumes of textual and visual data.

The field of Digital Humanities continues to evolve, and its definition keeps expanding in response to new technological possibilities. A key characteristic of this field is its interdisciplinary approach, which not only leverages emerging technologies such as Machine Learning to advance humanistic research but also questions the ethical, social, and philosophical implications of these technologies. The relationship between the humanities and the digital is bidirectional: the humanities can help us better understand the nature and impact of technology, while digital tools expand the scope of humanities research.

It is well known that the pursuit of "artificial intelligence" dates back to ancient Greek myths and humanity’s aspiration to create non-human intelligence. However, it was in 1956 that AI took computational form during the Dartmouth College summer workshop. Today, AI has evolved, making much of what was imagined back then a reality, with systems capable of operating autonomously, generating content, synthesizing natural language, and performing complex tasks. In this sense, AI—considered by some as an oxymoron and by others as the next step in human evolution—remains a subject of controversy in politics, business, and academia. Recent advances in multimodal models have raised profound questions about the role of machines in society, along with ethical dilemmas regarding their control and development.

Far from apocalyptic predictions or utopian visions, there exists a global research community that, after enduring the "AI winters," has achieved continuous advancements in areas such as robotics, computer vision, AI applied to healthcare, and AI agents in commerce. Supported by academic and corporate laboratories, this community continues to progress through open collaboration and knowledge exchange across digital platforms. In this context, the concept of Human-Centered Artificial Intelligence gains relevance, aiming to develop AI systems aligned with human values, prioritizing ethics, transparency, and social well-being in their design and application. This is where LUMERA finds its inspiration and sources, and from which it hopes to contribute to the philosophical research of ancient Greek texts by applying natural language processing through AI with a focus on Digital Humanities.

SCOPE

LUMERA takes an interdisciplinary approach to Digital Humanities, comprising not only the outcome of the philosophical investigation itself and the associated natural language research tool, but also a critique of the artificial intelligence status quo.

DIGITAL HUMANITIES

Humanities Computing
Association for Computers and the Humanities
Philosophy of the Digital

ARTIFICIAL INTELLIGENCE

AI & Machine Learning
Neural Networks & Deep Learning
Recurrent & Transformer NNs

APPLICATION

Intertext analysis and language processing build up from shorter, attainable goals toward a wider set of corpora: initial stages focus on proven correspondences between authors, then move on to the exploration of new potential links within the neural network's domain of expertise.
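As a minimal illustration of what "proven correspondences between authors" means computationally, the sketch below scores the lexical overlap of two passages with cosine similarity over raw word counts. This is only a toy baseline, not LUMERA's method: the project's neural network would learn far richer semantic and morpho-syntactic signals, and the sample fragments here are chosen purely for illustration.

```python
from collections import Counter
import math

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between simple word-count vectors."""
    a, b = Counter(text_a.split()), Counter(text_b.split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Two short ancient Greek fragments (illustrative only):
# the opening of the Iliad and a shorter quotation of it.
hom = "μηνιν αειδε θεα πηληιαδεω αχιληος"
par = "μηνιν αειδε θεα"
print(round(cosine_similarity(hom, par), 3))  # → 0.775
```

A high score flags a candidate intertextual link for closer philological inspection; the neural phases of the project would replace this surface measure with learned representations.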

PHASING

To make project management and the handling of technical issues more tractable, LUMERA has been divided into three phases: infrastructure and repository setup, an experimental trial of different natural language processing algorithms, and the processing of the corpora themselves using the trained neural network.


SETUP

Infrastructure
Python Framework
POS Tagging 
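To make the POS-tagging step concrete, here is a deliberately tiny lexicon-based tagger. The three-word lexicon is invented for illustration; a production setup would instead use a statistical tagger trained on an annotated ancient Greek treebank.

```python
# Toy lexicon mapping ancient Greek tokens to POS tags
# (illustrative entries only, not a real resource).
LEXICON = {
    "μηνιν": "NOUN",   # wrath (accusative)
    "αειδε": "VERB",   # sing (imperative)
    "θεα":   "NOUN",   # goddess
}

def pos_tag(tokens):
    """Tag each token from the lexicon; unknown words get 'X'."""
    return [(tok, LEXICON.get(tok, "X")) for tok in tokens]

print(pos_tag(["μηνιν", "αειδε", "θεα"]))
# → [('μηνιν', 'NOUN'), ('αειδε', 'VERB'), ('θεα', 'NOUN')]
```

The 'X' fallback matters in practice: ancient Greek's rich inflection guarantees out-of-lexicon forms, which is precisely why trained taggers are needed.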

ANALYSIS

Text Parsing & Stemming
Tokenization
NLP Techniques
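The tokenization and stemming steps listed above can be sketched in a few lines. The tokenizer splits on non-letter characters; the stemmer strips a handful of common nominal and verbal endings. Both are simplifications: the suffix list is an invented sample, and real Greek stemming must handle accents, contract verbs, and irregular paradigms.

```python
import re

def tokenize(text: str):
    """Split on any run of non-word characters (Unicode-aware)."""
    return [t for t in re.split(r"[^\w]+", text) if t]

# Sample of common Greek endings, purely illustrative.
SUFFIXES = ("ος", "ον", "ου", "ων", "ειν", "εις")

def stem(token: str) -> str:
    """Strip the longest matching suffix, keeping a minimal stem."""
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if token.endswith(suf) and len(token) > len(suf) + 2:
            return token[: -len(suf)]
    return token

print([stem(t) for t in tokenize("λογος λογον λογου")])
# → ['λογ', 'λογ', 'λογ']
```

Collapsing inflected forms of λόγος to a shared stem is what lets downstream frequency and similarity measures treat them as one lexical item.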

PROCESSING

TensorFlow
Supervised Learning
Corpus Processing 
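The processing phase relies on TensorFlow, but the supervised-learning loop it performs can be shown dependency-free. The sketch below trains a perceptron on toy feature vectors (the features, values, and class labels are all invented for illustration, e.g. stylometric measures distinguishing two authors):

```python
# Minimal supervised-learning loop: a perceptron in plain Python.
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Iterate over labeled samples, nudging weights on each error."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy normalized features (e.g. mean word length, particle ratio);
# labels 0/1 could stand for two candidate authors.
X = [[0.2, 0.2], [0.3, 0.3], [0.8, 0.6], [0.7, 0.5]]
y = [0, 0, 1, 1]
w, b = train_perceptron(X, y)
print(predict(w, b, [0.75, 0.55]))  # → 1
```

A TensorFlow model in the real pipeline follows the same shape: labeled corpus samples in, iterative weight updates against a loss, predictions on unseen texts out.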

ABOUT

LUMERA was conceived, and is continuously updated, through close collaboration between scholars and a technical support team.

CONTACT

LEGAL NOTICE | PRIVACY POLICY | COOKIES POLICY | ACCESSIBILITY


© Copyright Lumera - All Rights Reserved