%0 Electronic Book Component Part
%A Mohseni, Mahdi
%A Redies, Christoph
%A Gast, Volker
%D 2022
%G English
%@ 1099-4300
%~ Universitätsbibliothek "Georgius Agricola"
%T Approximate entropy in canonical and non-canonical fiction
%J Entropy
%V 24
%P 1-16
%U https://doi.org/10.3390/e24020278
%X Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction (‘classics’) and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and the distribution of part-of-speech-tags in windows of texts. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that the Approximate Entropy values can better differentiate canonical from non-canonical texts compared with Shannon Entropy, which is not true for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more ‘demanding’ and ‘richer’. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.
%K Approximate Entropy; Shannon Entropy; fictional texts; non-fictional texts; canonical texts; non-canonical texts; POS-tags; text classification
%Z https://katalog.ub.tu-freiberg.de/Record/0-1801680620
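
Note: the abstract describes computing Approximate Entropy over series derived from sentence-length values. The following is a minimal illustrative sketch of the standard ApEn definition in Python, not the authors' implementation; the window length m and the tolerance r (0.2 times the standard deviation) are conventional defaults assumed here, not parameters taken from the paper.

    import math

    def approximate_entropy(series, m=2, r=None):
        """Approximate Entropy (ApEn) of a 1-D numeric series.

        m: embedding (window) length; r: tolerance, here defaulting to
        0.2 * standard deviation of the series (a common convention).
        """
        n = len(series)
        if r is None:
            mean = sum(series) / n
            r = 0.2 * (sum((x - mean) ** 2 for x in series) / n) ** 0.5

        def phi(length):
            # All overlapping windows of the given length.
            windows = [series[i:i + length] for i in range(n - length + 1)]
            count = len(windows)
            total = 0.0
            for w1 in windows:
                # Fraction of windows within tolerance r (Chebyshev distance),
                # including the window compared with itself.
                similar = sum(
                    1 for w2 in windows
                    if max(abs(a - b) for a, b in zip(w1, w2)) <= r
                )
                total += math.log(similar / count)
            return total / count

        # ApEn(m, r) = Phi_m - Phi_{m+1}
        return phi(m) - phi(m + 1)

    # Hypothetical usage on a short series of sentence lengths (in tokens):
    sentence_lengths = [12, 31, 7, 22, 18, 9, 27, 14, 5, 33, 20, 11]
    print(approximate_entropy(sentence_lengths, m=2))

Lower ApEn values indicate more repetitive, predictable sequences; higher values indicate less predictable sequential structure, which is the property the abstract links to canonical fiction.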