🚀 𝗘𝗠𝗜𝗟 >> 𝗙𝗔𝗦𝗧 𝗙𝗢𝗥𝗪𝗔𝗥𝗗 - 𝗲𝗱𝗶𝘁𝗶𝗼𝗻 𝟯 𝗶𝘀 𝗹𝗶𝘃𝗲! Catch the latest edition of FAST FORWARD, our rapid-fire update on all things EMIL. In under 15 minutes, dive into the breakthroughs shaping the future of immersive tech across Europe. ✨ From VR-empowered smart garments and immersive exhibitions to advanced haptic devices for teleoperation and interactive exergames that are transforming therapy and elder care, this third edition showcases innovation and impact in VR, AR, XR and beyond. Stay informed on research highlights, award nominations and prototypes. Ready to experience what’s next in immersive technology? 🎬 𝗪𝗮𝘁𝗰𝗵 𝗻𝗼𝘄 >> https://lnkd.in/eZXXaRnt #EMIL #FastForward #Innovation #ImmersiveTech #EUfunding #XR Aalto University | Filmakademie Baden-Wuerttemberg | Animationsinstitut | University of Bath | Universitat Pompeu Fabra | Vrije Universiteit Amsterdam (VU Amsterdam) | Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS) | Atlas V | Pixelcloud GmbH & Co. KG | Polycular | Metropolia University of Applied Sciences | Moverse | Broomx | CREW Brussels | SenseGlove | Design Academy Eindhoven
EMIL >> FAST FORWARD 11/24
Transcript
Hey, I'm Tim, representing Aalto University's Lighthouse project: smart garments for enhancing virtual experiences with thermal feedback. Over the past few weeks, we've been refining the design of these garments to make sure they give more consistent haptic feedback. We've also set up a user study where we're testing how different heat and vibration patterns can affect people's emotions. We've created both static and moving heat patterns and are measuring physiological signals such as sweat levels, heart rate and eye activity. The study is now officially underway and we're excited to see the results.

Hi, I am Alexander Kreische from Filmakademie Baden-Württemberg. During the last two months, our EMIL Lighthouse project, the VR experience FATE OF THE MINOTAUR, has reached some important milestones. We successfully completed filming for scale option 1, a non-interactive stereoscopic 180° film, over two action-packed days in a virtual production process at one of our studios. Meanwhile, we're nearing the finish line with scale options 2 and 3: our larger, interactive experiences. Option 2 offers both single-player and multi-player modes, while option 3 adds free-roaming capabilities in an 8-by-8-meter space. We also hosted a fun demo session at Europa-Park with participants from the XR-savvy team at MACK One, who provided us with valuable feedback from their experienced perspective. And last but not least, we're excited to announce that FATE OF THE MINOTAUR has been nominated for Best Installation at the XRC nextReality.Contest, as well as for the AUREA Award in January 2025. So, fingers crossed and Minotaur horns up for some good luck. Thanks for tuning in!

Hi there, my name is Christof Lutteroth. I'm the lead investigator of EMIL at Bath and it's my pleasure to tell you about our Lighthouse project.
We are detecting emotions in real time in virtual reality. While people are using virtual reality, we're using physiological sensors to measure things like heart rate, pupil size and skin conductance. And we are combining all these measures to estimate emotions in real time, removing all the noise created by headsets, movements and other factors. This has recently been highlighted as an Excellent Innovation by the European Commission's Innovation Radar. If you would like to try it out yourself, or even use it for your own products, please contact me: we are currently preparing evaluation licenses and would be very happy to give you one, if you'd like. Thank you.

Hi, Paul here from the Full-Body Interaction Lab at the UPF node of EMIL. In September, our Lighthouse project, the AR Magic Lantern, made its public debut at an immersive art exhibit in Barcelona. The exhibition was hosted by the Espronceda Institute of Art and Culture and built by Niall Hill, Rafa Roeder, and our own Fran Macia. It places visitors in the middle of a speculative-fiction scenario in which a climate crisis has caused severe drought. As a participant, you join a group of weather scientists in a last-ditch attempt to save humanity. You use the AR Magic Lantern to reveal clues in the surrounding landscape, and then use the clues to solve puzzles on the physical weather control station, resulting in a true mixed-reality experience that visitors enjoyed as participants and observers. We're honored that the team incorporated the AR Magic Lantern into their work. If you have an idea for how it could be used in your project, we are interested, so please get in touch. Thanks!

Hello, I am Melvyn Roerdink from the VU team in Amsterdam with an update about project CAPARE.
This time, a publication alert linked to progress in augmented-reality technology. PLOS ONE accepted our paper on the day that Meta demonstrated Orion, highlighting its large augmented-reality field of view. This is important, as we showed that a larger field of view improves interaction with nearby holographic objects. In project CAPARE, we integrated augmented-reality cues into our augmented-reality exercises. These cues assist people with Parkinson's disease in completing their exercises at home. The increased augmented-reality field of view now enables users to interact with nearby augmented-reality cues on the floor much as they would interact with real-world cues: that is, without the excessive downward head orientation seen with earlier generations of AR glasses.

Hi, I'm Tony Donegan. I'm a Clinical Research Physiotherapist at IDIBAPS in Barcelona, and I want to give you an update on our project XR-Pain. Our overall project aim is to develop and test a therapeutic VR application for people with chronic low back pain that they can use independently at home. We are approximately halfway through the clinical trial, with over 50 patients recruited. We're also exploring the further development of an AI-driven virtual physiotherapist using natural language processing. The virtual physiotherapist will assess the patient's verbal responses to questioning and combine this information with their movement data to generate responses that give tailored encouragement and advice. Finally, we're collaborating with a team from Aalto University to develop a haptic feedback system, worn around the low back, to help improve the sensation of virtual embodiment; we're in the preliminary testing stages.

Hello! I'm Adrien from Atlas V; we're building the Shaun the Sheep immersive exhibition FSTP project.
Since the last FAST FORWARD, we've been preparing the project for two major milestones: animation and the playtests, which will be taking place at the end of the month. To prepare for animation, we've blocked out the farm environment inside a simulated virtual room in Unreal Engine. This is where our intro scene, which mirrors the iconic opening from the show, will take place. We've also iterated on the design of our main interactive scene, a crowd-controlled obstacle course through the English countryside, and we've built it so that major elements like speed, difficulty, camera placement and even the main movement mechanics can be adjusted based on the results of our playtests. Once this phase is over, we'll have all we need to start animating and integrating assets. Thank you!

Enter the amusement park full of wonders and fossil-industry-themed attractions! How about a gas-pipeline roller coaster, a free-fall oil-rig tower or a big wheel with shovels as cabins? All framed by the aesthetic of marvelous antique oil springs! We built and bought a lot of 3D assets to create this inviting scenery in which our game will take place. This unique setting underlines the satirical aspect of our game and focuses on the fun of turning that evil amusement park into a source of renewable energy. In addition to the blaster, we implemented two more game mechanics. The so-called "solar drops", which drop upon the destruction of enemies, can be collected as a quick energy recharge for teammates. There is also a revive mechanic, allowing for more multiplayer fun, as friends have to rush to the aid of knocked-out players to revive them. Alongside this, we are working on the intro sequence and the hub world in which the levels will be selected, so a full gameplay experience can be achieved soon.

Hi everyone!
It's time for another update on coopXscape, our collaborative XR escape game for team development! This time we're excited to introduce Mission 4: The Stationary! In this mission, our explorers arrive at an abandoned space station. The challenge? Rearrange the station's cubic modules to gather the resources needed to continue the journey. In our current prototype, we test the layout and mechanics using placeholder assets. Our goal is to craft a fun, strategic puzzle that truly challenges the team's problem-solving skills. Our design team is busy crafting the space station's look; let's take a first peek at the exterior and interior designs. There's much more in the works! If you want to learn more, reach out and stay tuned for more updates.

We have implemented an LLM-based digital assistant using speech-to-text and text-to-speech features, and researched the best service provider for it. We are also developing a machine-learning model to ensure user safety during exercises. For this purpose, we have developed a standalone application for training the AI. Our team has started research on Azure Kinect and is working on an application which records motion-capture data. The purpose is to compare Azure Kinect's data with Quest 3 body-tracker data. We added a video overview page where one can watch the patient's performance video and the original mock-up exercise side by side. Our third mini-game has progressed and now has a music-mixing feature and engaging visuals. We added more wrist exercises to our mock-up library, which are currently in the process of being cleaned and polished.

Our two-month journey began with efforts dedicated to controlling the diversity of our single-shot generation model.
Our implementations, along with some preliminary results, were submitted as a poster paper to the SIGGRAPH Asia 2024 conference and have been accepted for publication. The conference will be held in Tokyo on the 3rd of December. Next stop: our initial approach to 3D body-motion restyling, where we retrained and integrated a state-of-the-art method into our real-time capturing software. The experiment provided valuable feedback for our next steps in this direction. Final stop: our brand-new open-source repository, where we share our implementations and tools with the 3D animation community. Our goal is to add more material to this repository by the end of the project.

Hi! My name is Ignasi and I am one of the co-founders of Broomx. Our project's name is "Immersive Exergames for Rehabilitation Care in Elderly Populations" and it is designed to revolutionise rehabilitation and physical exercise in elderly care. The purpose is to create innovative exergames tailored for individuals aged 65 and above; these exercises will be projected throughout the room. We aim to address four crucial aspects of recovery: neurorehabilitation, cognitive stimulation, physical exercise, and emotional well-being. For this project, six different exergames will be developed and tested in real senior-care institutions. After evaluation by industry experts, we will launch a series of marketing initiatives to promote their implementation in senior-care facilities.

Hi! We are CREW and here are some updates on our project Tools For Timon, as we continue the development of several of our tools for the Embodied XR Performance Framework. Chorus lets us easily repeat the animations we want, including live motion capture.
It has been tested in situations with performers who were able to interact with their previous avatars, helping to create mystery for the immersants. SPIN, another tool in development, turns a VR headset into a broadcasting hub for peripherals like trackers and controllers, relaying them into the multiplayer environment. We are also developing CIRCA, a tool that lets us record and replay performances for interactive archives. We have been testing and implementing these tools during multiple residencies and performances! We will have upcoming location-based experiences at the KIKK Festival in Namur, and soon after we'll be at XR Hub in Munich!

Hi, I'm Anne from SenseGlove, and I'm excited to share our progress on Rembrandt, a new haptic input device designed for teleoperation. With our product vision in place, we have finalised the programme of requirements and concept design for Project Rembrandt. These documents define the scope of the product, detailing core and auxiliary functionalities, sensing and control systems, production aspects, and integrated safety features. Currently, we are developing one-finger prototypes, focusing on optimising key hardware components such as linkage systems, motor-hub pulleys and the force-feedback sensor. At the same time, we're refining the controller for these prototypes. Additionally, we have begun our initial steps in shaping our branding strategy and setting up mood boards for our product aesthetics. Stay tuned as we move forward with integrating a functioning prototype of Project Rembrandt with an end-effector.

Hello, I'm Professor Ian Biscoe, founder of the Trans Realities Lab at Design Academy Eindhoven and creator of the XR-IT program supported by EMIL. My job is to investigate and define the user requirements for XR-IT and build tools for its creative output.
Currently I'm working on adding support for OptiTrack, UltraGrid and Meta Quest in the user interface of XR-IT. Currently I'm stringing together the web of different services we have to support within the system. I'm currently implementing a monitoring service and investigating how we can make the configuration of XR-IT easier on your machine. I am currently working on enhancing the project's cybersecurity. I'm currently working on implementing third-party software products, like OptiTrack and UltraGrid, in the XR-IT backend system.