
Mapping precedence into containment: linear ordering in a bidimensional space

Vender M.; Compostella A.; Delfitto D.
2023-01-01

Abstract

There is an almost unanimous theoretical consensus according to which human languages are externalized as linear sequences of atomic units which are encoded according to specific hierarchical conditions. The nature of the interplay between the cognitive development of these hierarchical representations and their linearization on the string is however still not clear. In this paper, we aim to address this issue, exploring the relationship between precedence and containment by capitalizing on the results of a new experimental paradigm that has already provided interesting insights (Vender et al. 2019, 2020). More specifically, we report the results of two modified Simon Tasks in which the sequence of stimuli is determined by the rules of the Fibonacci grammar (Fib) or of its modifications Skip and Bif. All three grammars share the same transitional regularities, but they crucially differ in their structure: only Fib is characterized by the presence of so-called k-points, which provide, from a purely computational perspective, a potential bridge to full hierarchical reconstruction. We tested 64 adults’ implicit learning skills, assessing learning of the statistical regularities in Fib, Skip and Bif, while also exploring the presence of hierarchical learning, in terms of the ability to predict k-points. Results provide evidence not only for the presence of statistically-based sequential learning, but also for hierarchical learning in Fib. We argue that the relations of precedence and containment are not antagonistic ways of processing a temporally ordered sequence of symbols; rather, they are strictly interdependent implementations of an abstract mathematical relation of linear ordering within a bidimensional computational space. We propose that the construction of this bidimensional space is primarily determined by labeling requirements, with the labeling algorithm emerging as the solution to the problem of mapping precedence into containment.
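For illustration, the sketch below (in Python, our own addition, since the record contains no code) shows how a string of the Fibonacci grammar (Fib) can be generated, taking Fib to be the standard Fibonacci Lindenmayer system with the rewrite rules 0 -> 1 and 1 -> 01. The Skip and Bif modifications and the identification of k-points are specific to the paper's experimental design and are not reproduced here; the function name fib_string is a hypothetical label.

# Minimal sketch: generating a Fib string, assuming the standard
# Fibonacci L-system rules (0 -> 1, 1 -> 01). The Skip and Bif variants
# and the detection of k-points are paper-specific and not modelled here.
def fib_string(generations: int, axiom: str = "0") -> str:
    rules = {"0": "1", "1": "01"}
    s = axiom
    for _ in range(generations):
        # Rewrite every symbol in parallel, as in a Lindenmayer system.
        s = "".join(rules[symbol] for symbol in s)
    return s

if __name__ == "__main__":
    for g in range(6):
        print(g, fib_string(g))
    # 0 0
    # 1 1
    # 2 01
    # 3 101
    # 4 01101
    # 5 10101101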
Keywords: implicit learning, statistical learning, hierarchical learning, precedence vs. containment, Lindenmayer systems, chunking and labeling processes
File:
Vender et al. (2023) Mapping precedence into linear order.pdf
Type: Publisher's version (open access)
License: Public domain
Format: Adobe PDF, 667.05 kB

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1102846