Syntactic Networks-Kernel Memory Approach

CHF 164.35
In stock
SKU
6ODAT0EBLEK
Stock: 1 available
Delivery between Thu, 26.02.2026 and Fri, 27.02.2026

Details

This book proposes a novel connectionist approach to the challenging topic of language modeling within the context of kernel memory and the artificial mind system, both proposed previously by the author in the first volume of the series, Artificial Mind System - Kernel Memory Approach (Studies in Computational Intelligence, Vol. 1). The present volume focuses on how syntactic structures of language are modeled in terms of composite connectionist architectures, each embracing both a nonsymbolic and a symbolic part. These two parts are developed via inter-module processes within the artificial mind system and eventually integrated under the unified framework of kernel memory. The data representation of the networks embodied under the kernel memory principle is essentially local, unlike conventional artificial neural network models such as the pervasive multilayer perceptron-based networks. With this locality principle, kernel memory inherently exhibits many attractive features: topologically unconstrained network formation; straightforward network growing, shrinking, and reconfiguration; no need for arduous iterative parameter tuning; construction of transparent, hierarchical data structures; and multimodal and temporal data processing via the network representation. Exploiting these multifaceted properties of kernel memory, interwoven with the notion of inter-module processing within the artificial mind system, provides coherent accounts of concept formation and of how various linguistic phenomena, viz. word compounding, morphology, and multiword constructions, are modeled. The description is then extended to more intricate network models of a context-dependent lexical network and syntactic-oriented processing, the latter being the central theme of the present study, and further to models representing a hybrid of nonverbal and verbal thinking, as well as the semantic and pragmatic aspects of sentential meaning.
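The locality principle described above (each stored pattern gets its own kernel unit, so the network can grow or shrink without iterative retraining) can be sketched generically. The following is a minimal illustration only; the class, method, and parameter names are assumptions for exposition and do not reproduce the book's actual kernel memory model:

```python
import math

class KernelMemory:
    """Generic sketch of a locally-representing kernel network.

    Each stored pattern becomes its own kernel unit, so the network
    grows by one-shot storage and shrinks by unit removal, with no
    iterative parameter tuning. Illustrative only.
    """

    def __init__(self, sigma=1.0, novelty_threshold=0.5):
        self.sigma = sigma                        # kernel width
        self.novelty_threshold = novelty_threshold
        self.units = []                           # list of (template, label)

    def _kernel(self, x, t):
        # Gaussian kernel: similarity between input x and template t
        d2 = sum((a - b) ** 2 for a, b in zip(x, t))
        return math.exp(-d2 / (2 * self.sigma ** 2))

    def present(self, x, label=None):
        """Present a pattern; recognise it locally or grow a new unit."""
        if self.units:
            best = max(self.units, key=lambda u: self._kernel(x, u[0]))
            if self._kernel(x, best[0]) >= self.novelty_threshold:
                return best[1]          # recognised: return stored label
        self.units.append((list(x), label))   # grow: one-shot storage
        return label

    def forget(self, label):
        """Shrink: remove every unit carrying the given label."""
        self.units = [u for u in self.units if u[1] != label]
```

A nearby input (e.g. `[0.1, 0.0]` after storing `[0.0, 0.0]` as "a") activates the existing unit and is recognised without any retraining, while a distant input simply adds a new unit.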
The book is intended for general readers engaged in various areas of study in cognitive science, computer science, engineering, linguistics, philosophy, psycholinguistics, and psychology.



  • Focuses on providing a framework to model a composite network system
  • Proposes a novel connectionist approach to the challenging topic of language modeling
  • Presents a conceptual framework of kernel memory, rather than pursuing each individual topic one by one

Contents
  • Review of the Two Existing Artificial Neural Network Models: Multilayer Perceptron and Probabilistic Neural Networks
  • Beyond the Original PNN Model: Kernel Memory for Modeling Various Neural Pattern Processing Mechanisms
  • Modules within the Artificial Mind System and Their Interactions Relevant to Language Pattern Processing
  • Concept Formation


Further Information

  • General information
    • GTIN 09783031573118
    • Genre Technology Encyclopedias
    • Edition 2024
    • Reading motive Understanding
    • Number of pages 148
    • Publisher Springer Nature Switzerland
    • Dimensions H241mm x W160mm x D14mm
    • Year 2024
    • EAN 9783031573118
    • Format Hardcover
    • ISBN 3031573110
    • Publication date 22.05.2024
    • Title Syntactic Networks-Kernel Memory Approach
    • Author Tetsuya Hoya
    • Subtitle Studies in Computational Intelligence 1157
    • Weight 395g
    • Language English
