Session 11

11. Bridging the Gap between Computational (Structural) and Hermeneutic Approaches in Discourse Analysis

The Internet, with its vast amount of digital material, is an important source of information for discourse analysis. Even though methodologies of interpretative, hermeneutic research, such as Grounded Theory, provide strong concepts for the meaningful reduction of social reality in qualitative research, the substantial amount of digital resources, with its wide range of variation, calls for new methodological approaches to address this complexity in discourse analysis. While most qualitative methods of data collection are limited to analyzing only a small portion of the information on the Internet, digital methods, with their structural approaches to textual and visual expressions, provide only limited interpretations of meaning and are therefore unable to analyze large-scale materials in depth. Bridging the gap between the structural patterns detected with digital methods and the interpretative, hermeneutic processes of meaning-making thus becomes a crucial endeavor for methodologies that seek to combine both approaches in the analysis of digital discourse materials.

In discourse-analytical approaches, humanities scholars are often confronted with the challenge of working in complex research fields encompassing several textual phenomena, theories, or methodological approaches. A variety of styles and research questions, epistemological specifics, and different levels of practice and experience matter, as do mutual assistance, reflection, and understanding between human and artificial actors. Ethical questions arise about how to control or avoid biased research processes. With the integration of IT tools and infrastructures into knowledge production in the humanities, ontological changes take place; new social, ethical, and political constellations are consolidated, disturbed, or created. These complexities of human-centered research collide with and challenge AI-oriented work, which is often limited to phenomena identifiable on the text surface. The computational methods used in qualitative data analysis tools range from topic modeling, co-occurrence analysis, and sentiment analysis to visualizations of quantified discourses and patterns; they offer insights into structural conjunctures but leave open what these actually mean.

What is often missing in this mutual and iterative research process of quantitative analysis (computational structural analysis of the text surface) and qualitative analysis (human analysis of the fine textures of the dossier) is a closer look at how the gap between structural analysis and the hermeneutic process of meaning-making and understanding is closed. In what ways do these modes of knowledge production affect interpretative epistemologies? How can this discrepancy be adequately recognized and addressed? The panel invites contributions that address the challenge of bridging the gap between computational and hermeneutic analyses by developing integrative methodological approaches to discourse analysis. It is particularly interested in how the analytical modes are implemented in research practice, how the interplay of structural and hermeneutic analysis is organized in a meaningful way, and what this means for the quality and effectiveness of the research process.
By focusing on this gap through a range of concrete examples from practical experience, the panel seeks to address the epistemological challenges that arise from the use of AI-assisted methods for questions in the humanities, and how these challenges can be dealt with.
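To make the contrast concrete, the following is a minimal, purely illustrative sketch (not part of the panel text) of one of the computational methods named above, topic modeling, using Python and scikit-learn on a hypothetical toy corpus; the documents and all parameter choices are assumptions for illustration only.

# Minimal sketch, assuming scikit-learn is available: surface-level structural
# analysis via topic modeling on a small, hypothetical corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

documents = [
    "climate protest city march policy",
    "policy debate parliament climate law",
    "museum exhibition art digital archive",
    "archive digitization art heritage museum",
]

# Count word frequencies: the "text surface" the computational step works on.
vectorizer = CountVectorizer()
doc_term_matrix = vectorizer.fit_transform(documents)

# Fit a two-topic model; the number of topics is an analyst's assumption.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term_matrix)

# Print the top words per topic. These word clusters are structural patterns;
# deciding what they mean remains a hermeneutic, interpretative task.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")

The word clusters such a model returns are structural conjunctures in the sense of the abstract; what they mean for a given discourse still has to be established interpretatively.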