In previous work, several parameterized families of categorial grammar classes were shown to be learnable in the technical sense of identifiability in the limit from positive data. These classes are defined in terms of bounds on parameters of the grammars, which intuitively correspond to restrictions on linguistic aspects, such as the amount of lexical ambiguity.
The time complexity of learning these classes has also been studied. It was shown that, for most of these classes, selecting a grammar from the class that is consistent with given data is NP-hard. In this paper, existing complexity results are sharpened by demonstrating hardness for classes of the W-hierarchy. Additionally, parameters are defined that admit fixed-parameter tractability (FPT) results; roughly, this means that the corresponding problems become tractable once these parameters are fixed.
We also define the new family G_{k-sum-val}, which is natural both from the viewpoint of Parameterized Complexity, a flourishing area of Complexity Theory, and from that of Descriptional Complexity, a sub-area of Formal Language Theory. We prove its learnability, analyze its relation to other classes from the literature, and prove a hierarchy theorem.
This approach is then generalized to a parameterized family defined in terms of a bound on the descriptional complexity expressed as a Hölder norm. We show that both the hierarchy result and the property of finite elasticity (and thus learnability) are preserved under this generalization.
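As a rough illustration (the formal definitions are given in the paper; the notation val(a) for the number of types the lexicon assigns to a symbol a is our assumption here), the sum-valuation bound can be read as an l_1-norm bound on the vector of valuations, and the Hölder norm then yields its l_p generalization:

```latex
% Sketch only -- val(a) and this reading of the bound are our
% assumptions; the paper's formal definitions may differ.
% The sum-valuation bound as an l_1-norm bound:
\sum_{a \in \Sigma} \mathrm{val}(a) \le k
% The generalization: a Hoelder (l_p) norm bound, which recovers
% the sum-valuation case at p = 1:
\Bigl( \sum_{a \in \Sigma} \mathrm{val}(a)^p \Bigr)^{1/p} \le k
```

Under this reading, G_{k-sum-val} corresponds to the p = 1 instance of the generalized family.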