Digital Twin & Edge AI for Industrial IoT

1st Workshop on Digital Twin & Edge AI for IIoT (in conjunction with ACM MobiCom 2022)
17 October 2022


Industry 4.0, which primarily focuses on cyber-physical systems and automation, has occupied the spotlight in advanced manufacturing over the last decade. However, there is still a long way to go before machines reach human-level cognition and become fully autonomous. Recognizing the importance of human factors, Industry 5.0 has been proposed to promote close collaboration between humans and machines, leveraging human creativity, expertise, and superior cognitive abilities to work alongside collaborative robots (cobots).

Compared to traditional robots, cobots are far more programmable and situationally aware. They are able to sense their surroundings (using computer vision and sensors) and learn how to collaborate safely with human workers (using AI) instead of having to stop whenever a human is nearby. Digital Twin is another key enabling technology for Industry 5.0. A digital twin is an exact digital representation of an existing physical object (a car, cobot, building, plant process, or even a human). When coupled with AI, it allows engineers to gain valuable insights into the operation of those physical entities and to take action in real time, including root cause analysis, detailed monitoring, defect detection, and so on.

Modeling a digital twin requires an extensive network of cameras (for computer vision) and sensors (temperature, humidity, noise, vibration, air pressure, etc.), which constantly generates a massive amount of data. To handle this much data in real time, mobile/multi-access edge computing (MEC) has emerged to bring computing power close to the data source, right on the factory floor. With Edge AI, complex and data-hungry AI models can now be trained at the edge, eliminating the network bottleneck of transferring data to the cloud.

Key Dates
Paper Submission Deadline: 30th September, 2022 (extended from 19th September, 2022)
Notification to Authors: 5th October, 2022
Camera-ready Version Deadline: 14th October, 2022
Workshop Date: 17th October, 2022

Call for Papers

This workshop is broad in scope. It will bring together researchers, decision makers, and practitioners from various industries to promote the latest advances in Industry 5.0 with a particular focus on Digital Twin and Edge AI. Topics of interest include, but are not limited to:

  • Novel applications of Digital Twin and Edge AI for advanced manufacturing in Industry 5.0
  • Digital Twin modeling and architecture
  • New edge and embedded AI architectures
  • Edge AI computer vision tasks
  • Maximizing cobots’ situational awareness when working with human workers
  • Lightweight and compact AI models to run on the edge
  • Other advanced communication technologies that complement MEC connectivity, such as 5G/6G, 5G New Radio, and private 5G
  • 5G/6G-assisted indoor tracking/tracing/positioning/localization of IIoT assets
  • Environmental heterogeneity: multi-vendor integration, co-existence of different protocols, legacy equipment and processes, lack of universal standards and frameworks
Submission Instructions:

We are seeking original, unpublished research papers addressing the above topics. Each submission is a single PDF file limited to 6 pages (two-column, 10-point, following the ACM conference proceedings format), including figures, references, etc. Papers must include author names and affiliations for single-blind peer review by the technical program committee. Authors are expected to present their papers at the workshop. Papers accepted for presentation will be published in the MobiCom Workshop Proceedings and made available in the ACM Digital Library.

Submission site:

Important Dates:

Paper Submission Deadline: 30th September, 2022 (extended from 19th September, 2022)
Notification to Authors: 5th October, 2022
Camera-ready Version Deadline: 14th October, 2022
Workshop Date: 17th October, 2022

Workshop Co-Chairs
Technical Program Committee Co-Chairs
Publicity Co-Chairs
  • Jianxin Li, Deakin University
  • Phu Lai, La Trobe University
  • Yueyue Dai, Huazhong University of Science and Technology