Entropy, Vol. 25, Pages 243: Informativeness across Interpreting Types: Implications for Language Shifts under Cognitive Load (Entropy)


28 January 2023 13:45:54


Previous quantitative studies of interpreting types have focused on various features of linguistic form in outputs; however, none has examined their informativeness. Entropy, as a measure of the average information content and of the uniformity of the probability distribution of language units, has been applied in quantitative linguistic research on different types of language texts. In the present study, entropy and repeat rate were used to investigate differences in the overall informativeness and concentration of output texts between simultaneous interpreting and consecutive interpreting. We aimed to identify the frequency distribution patterns of words and word categories in the two types of interpreting texts. Analyses with linear mixed-effects models showed that entropy and repeat rate can distinguish the informativeness of consecutive and simultaneous interpreting outputs: consecutive interpreting outputs show a higher word entropy value and a lower word repeat rate than simultaneous interpreting outputs. We propose that consecutive interpreting is a cognitive process that reaches an equilibrium between production economy for interpreters and comprehension sufficiency for listeners, especially when input speeches are more complex. Our findings also shed light on the selection of interpreting types in applied scenarios. The current research is the first of its kind to examine informativeness across interpreting types, demonstrating a dynamic adaptation of language users to extreme cognitive load.
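The abstract does not show how the two measures are computed; as a minimal sketch, word entropy is conventionally the Shannon entropy of the word frequency distribution, and repeat rate is the sum of squared relative frequencies (a concentration index), so the two move in opposite directions as text becomes more uniform. The tokenization and example sentence below are illustrative assumptions, not taken from the paper.

```python
import math
from collections import Counter

def word_entropy(tokens):
    """Shannon entropy (in bits) of the word frequency distribution:
    H = -sum(p_i * log2(p_i)) over word types i."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def repeat_rate(tokens):
    """Repeat rate: sum of squared relative frequencies.
    Higher values indicate a more concentrated (less informative) text."""
    counts = Counter(tokens)
    n = len(tokens)
    return sum((c / n) ** 2 for c in counts.values())

# Illustrative input, not data from the study.
tokens = "the interpreter renders the speech the audience hears".split()
print(round(word_entropy(tokens), 3))  # higher -> more uniform, more informative
print(round(repeat_rate(tokens), 3))   # higher -> more repetition, more concentrated
```

A perfectly uniform distribution over `k` word types maximizes entropy at `log2(k)` and minimizes the repeat rate at `1/k`, which is why the study's finding of higher entropy with lower repeat rate for consecutive interpreting is internally consistent.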

Category: Informatics, Physics


