Seems to have been running since 1 January 2023 already, but I hadn't come across it until now, nor seen any results. @j.walker, are you familiar with this project?
Some very ambitious and wide-ranging goals, judging from https://sharespace.eu/about/
The vision of SHARESPACE is the creation of future Social Hybrid Spaces (SHS) shared by humans and avatars engaged in embodied collaborative tasks, where social sensorimotor primitives are transparently captured through mobile connected innovative sensors, and then reconstructed using novel extended reality (XR) technology. Our ambition is to create a hybrid, multimodal-multisensory integrated platform which adapts to individual users and enables them to interact in an embodied shared space by learning, identifying, and reconstructing the core sensorimotor primitives of social interactions.
Objective 1: Definition of embodied social interactions in XR. Conceptual, modelling, and experimental approaches rooted in (i) characterising a successful social interaction in SHS, (ii) identifying neuroscientific principles underlying the reconstruction of social sensorimotor primitives in SHS, and (iii) formalising core primitives.
Objective 2: A novel, fully mobile, unobtrusive, fine-grained body-kinematic capture system with on-body visual-inertial sensors and processing. Developing a mobile, reliable, and intuitive system, incorporating (i) self-calibration of ego-centric visual-inertial tracking, (ii) online motion prediction, segmentation, and primitive encoding, and (iii) hand skeleton tracking (from the XR-glasses camera sensors).
Objective 3: Cognitive architectures for virtual avatars. Populating SHS with humans and avatars by exploiting a library of primitives and metrics for interaction, through (i) AI-based architectures to drive avatars with high efficiency, reliability, and different levels of autonomy, (ii) data-driven models to generate synthetic datasets to train avatars, and (iii) the multi-user, multimodal SHARESPACE communication platform.
Objective 4: Virtual avatar animation. Rendering of remote participants onto virtual avatars in a highly personalised way by (i) respecting animation constraints, (ii) synchronising animations, (iii) mapping avatars into their environment, (iv) incorporating culture, gender, age, and body types, (v) real-time rendering through learned physics, and (vi) generating high-quality facial expressions from audio (speech) and eye-tracking data.
Objective 5: Spatial mobile eXtended Reality display. Visualising virtual avatars in (i) optical see-through multifocal XR-glasses with (ii) integrated eye-tracking and (iii) a forward-oriented wide-angle camera for 6DoF scene localisation and hand detection.
Objective 6: Socio-cultural constraints and ethics on social XR interactions. Evaluating the influence of (i) demographic, (ii) psychological, and (iii) sociocultural factors in SHS experiences, to optimise interaction, reduce exclusion, preserve individuality, and increase social intelligence, as well as mitigate possible negative effects of SHS. In line with Commission objectives, we will also examine the ethical and legal impacts of our innovation.
Objective 7: Real-world validation of our embodied social hybrid space. Implementing and testing our integrated solutions for amplifying sensorimotor communication in SHS in three different but complementary real-world scenarios: Health (Social Low Back Pain Exergame), Sport (Family Peloton Cycling), and Art (Shared Creativity).
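Not affiliated with the project, but a few of these objectives map onto fairly standard building blocks. The "visual-inertial tracking" in Objective 2 usually means fusing fast-but-drifting IMU data with slower absolute pose estimates from a camera. Here's a deliberately tiny 1-DoF complementary-filter sketch in Python; all names, rates, and numbers are my own illustration, nothing from SHARESPACE, and real systems estimate the full 6-DoF state with an EKF or sliding-window optimiser:

```python
# Toy complementary filter: fuse a drifting gyro rate with occasional
# absolute heading fixes from a visual tracker (hypothetical data, 1-DoF only).

DT = 0.01          # IMU sample period in seconds (100 Hz, assumed)
ALPHA = 0.98       # trust placed in the gyro between visual fixes

def fuse_heading(gyro_rates, visual_fixes, dt=DT, alpha=ALPHA):
    """gyro_rates: angular velocity per sample (rad/s).
    visual_fixes: dict mapping sample index -> absolute heading (rad),
    e.g. from marker- or feature-based camera tracking."""
    heading = 0.0
    estimates = []
    for i, rate in enumerate(gyro_rates):
        heading += rate * dt              # dead-reckon from the gyro
        if i in visual_fixes:             # occasional absolute correction
            heading = alpha * heading + (1.0 - alpha) * visual_fixes[i]
        estimates.append(heading)
    return estimates

# Example: constant 0.1 rad/s turn, with a visual fix every 50 samples.
gyro = [0.1] * 200
fixes = {i: 0.1 * i * DT for i in range(0, 200, 50)}
print(fuse_heading(gyro, fixes)[-1])
```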
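Objective 4's "facial expressions out of audio" is presumably a learned model, but the basic signal-to-animation-parameter idea can be shown with a much cruder mapping: short-time loudness driving a "jaw open" blendshape weight. Again a purely illustrative sketch under my own assumptions (frame size, gain, blendshape name), not the project's method:

```python
# Minimal sketch of speech-driven facial animation: map the short-time
# energy (RMS envelope) of an audio signal to a jaw-open blendshape weight.
import numpy as np

def jaw_open_curve(samples, sample_rate=16_000, frame_ms=20, gain=4.0):
    """Return one blendshape weight in [0, 1] per audio frame."""
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))        # loudness per frame
    return np.clip(gain * rms / (rms.max() + 1e-9), 0.0, 1.0)

# Fake 1-second "speech" burst: 200 ms of noise in otherwise silent audio.
audio = np.zeros(16_000)
audio[4_000:7_200] = np.random.default_rng(0).normal(0, 0.3, 3_200)
print(jaw_open_curve(audio).round(2))
```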
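And the "6DoF localisation" from the glasses' forward camera in Objective 5 is, at its core, the classic perspective-n-point problem. A minimal OpenCV sketch with synthetic landmarks, intrinsics, and pose (toy data I made up, not SHARESPACE code):

```python
# Camera-based 6-DoF localisation sketch: given known 3-D points in the
# scene and their pixel positions in the head-worn camera image, solvePnP
# recovers the camera rotation and translation.
import numpy as np
import cv2

# Known 3-D landmarks in the room, in metres (non-coplanar on purpose).
object_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                       [1, 1, 0], [0, 0, 1], [1, 0, 1]], dtype=np.float64)

K = np.array([[800.0, 0.0, 320.0],      # toy pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                       # assume an undistorted image

# Ground-truth camera pose, used only to synthesise the 2-D observations.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.3, -0.1, 4.0])
image_pts, _ = cv2.projectPoints(object_pts, rvec_true, tvec_true, K, dist)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
print(ok, tvec.ravel())                  # should be close to tvec_true
```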