
Thèse : Scaling Intelligence: A Formal and Practical Framework for Computationally Unbounded AI

28 November 2025
9:00 am
CRBS auditorium

PhD defense: Ali Khudiyev

Title: Scaling Intelligence: A Formal and Practical Framework for Computationally Unbounded AI

Date and time: 28 November 2025 at 9:00 am

Venue: CRBS Auditorium (1 rue Eugène Boeckel, 67000 Strasbourg)

The defense will be given in English.

Jury:

Reviewers

  • Cédric WEMMERT, UNISTRA
  • Thomas GUYET, INRIA Lyon
  • Ulviyya ABDULKARIMOVA, UFAZ

Thesis Directors

  • Anne Jeannin-Girardon, UNISTRA
  • Latafat Gardashova, ASOIU

Examiners

  • Angelo STEFFENEL, University of Reims Champagne-Ardenne
  • Frédéric BERTRAND, CNAM Paris

Abstract:

Fixed architectures in artificial intelligence (AI) systems are inherently limited by computational bounds, as established by complexity theory: certain problems demand resources that scale exponentially with input size. This dissertation introduces Computationally Unbounded AI (CUAI), a paradigm that enables AI to autonomously scale its computational capacity to meet escalating task demands, inspired by the biological principles of modularity and plasticity. Motivated by the need to overcome the brittleness of static models in dynamic, open-ended environments, CUAI uses self-directed mechanisms to expand architectures and learning processes without human intervention, paving the way toward autonomous AI systems. The primary contributions are twofold. First, the Column Extension Framework scales artificial neural networks (ANNs) through graph rewriting: subgraphs of the ANN's architecture are selected and replaced with duplicates of the original network, increasing depth and breadth to ensure autonomous growth; the approach is validated on the MNIST-1D dataset. Second, the Capacity-Aware Learning (CAL) Framework integrates boosting with selective prediction: base learners (e.g., a multilayer perceptron auto-encoder) are trained to reconstruct inputs with a rejection option, then identical models are added on reweighted data that prioritizes difficult samples, achieving progressive task coverage. Inspired by the cortical columns studied in neuroscience, CAL ensures capacity-aware adaptation, outperforming single models in reconstruction accuracy and coverage.
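The CAL control loop sketched in the abstract — train a learner with a rejection option, then add identical learners on data reweighted toward the rejected samples — can be illustrated in a few lines. This is a minimal sketch, not the thesis implementation: the weighted-medoid "learner" (a stand-in for the auto-encoder), the two-cluster toy data, the acceptance threshold, and the reweighting constants are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated clusters. A single prototype learner can
# only reconstruct one cluster well, so added learners raise coverage.
X = np.vstack([rng.normal(0.0, 0.3, size=(60, 4)),
               rng.normal(3.0, 0.3, size=(40, 4))])
n = len(X)

def fit_medoid(X, w):
    """Stand-in 'auto-encoder': reconstruct every input as the weighted medoid."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    return X[np.argmin(D @ w)]       # data point minimizing weighted total error

weights = np.ones(n)                 # uniform weights to start
covered = np.zeros(n, dtype=bool)    # which samples some learner has accepted
threshold = 1.5                      # accept if reconstruction error is below this
history = []

for _ in range(3):                   # add one identical learner per round
    proto = fit_medoid(X, weights)
    err = np.linalg.norm(X - proto, axis=1)
    covered |= err < threshold       # selective prediction: reject the rest
    history.append(covered.mean())
    weights = np.where(covered, 0.01, 1.0)  # reweight toward hard samples

print(history)                       # coverage grows as learners are added
```

On this toy data the first medoid lands in the larger cluster, so reweighting pulls the next round's medoid into the other cluster and coverage rises round by round, which is the "progressive task coverage" behavior the abstract attributes to CAL.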
