1st International Workshop on Programmable and Learning Architectures for Intelligent Networks (PLAIN)
The introduction of virtualization, softwarization, and programmability has reshaped networks into dynamic, software-defined infrastructures capable of supporting emerging AI workloads. In this evolving paradigm, the network is no longer a passive conduit, but an intelligent and programmable substrate that actively contributes to distributed AI.
As inference and training increasingly shift toward the edge, future systems must provide not only bandwidth and reliability, but also real-time adaptability, in-network processing, and cross-layer integration across heterogeneous wireless, optical, and core domains. Beyond connectivity, networks are expected to become co-executors of AI workflows, enabling functions such as model aggregation, split inference, and intent-driven resource allocation.
PLAIN (Programmable and Learning Architectures for Intelligent Networks) provides a focused venue to discuss how network architectures must evolve to support AI-native operations. The workshop emphasizes hardware/software co-design, in which data-plane programmability (e.g., P4, eBPF, SmartNICs), AI algorithms, and orchestration strategies are developed jointly. Novel use cases, ranging from federated learning and autonomous systems to real-time inference and digital twins, highlight the need for scalable, resilient, and sustainable AI-native networks.
Topics:
- AI-driven design of network protocols and architectures
- Programmable and reconfigurable infrastructures for AI workloads
- Federated and edge AI enhanced by network awareness
- Orchestration of AI pipelines across heterogeneous domains
- Telemetry-driven network optimization and closed-loop control
- High-performance delivery and inference of AI models
- Experimental testbeds and platforms for AI-native networks
- Energy-efficient AI deployment at the edge and core
- AI-enabled automation in 6G RAN, core, and transport
- Digital twins for network monitoring, optimization, and control
- Resilience, security, and survivability of AI-native networks
- AI-driven orchestration of backhaul, midhaul, and fronthaul
- Open and disaggregated network architectures for AI integration
- In-network aggregation and distributed learning mechanisms
Paper Submission Guidelines
Papers submitted to ICIN 2026 Workshops will be assessed based on originality, technical soundness, clarity, and interest to a wide audience. All submissions must be written in English and must use the standard IEEE two-column conference template, available for download from the IEEE website: https://www.ieee.org/conferences/publishing/templates.html
Workshop papers can be up to 8 pages for full papers and up to 5 pages for short papers, including tables, figures, and references.
Only PDF files will be accepted for the review process, and all submissions must be made electronically through EDAS at: https://edas.info/N34510
Important Deadlines:
- Workshop paper submission deadline: January 4, 2026
- Acceptance notification: January 18, 2026
- Camera-ready paper submission: January 25, 2026
Contact:
- Emilio Paolini — General Co-Chair, Scuola Superiore Sant’Anna, emilio.paolini@santannapisa.it
- Memedhe Ibrahimi — General Co-Chair, Politecnico di Milano, memedhe.ibrahimi@polimi.it
- Gianluca Davoli — TPC Chair, Università di Bologna, gianluca.davoli@unibo.it