PROMETHEUS 2.0 by INBIODROID, currently in development, is an advanced telepresence avatar robot: a humanoid capable of replicating a user's movements in real time from anywhere in the world. This second version represents a significant leap in telepresence technology.
PROMETHEUS 2.0 is expected to be controlled through an exoskeleton suit with haptic gloves, a VR viewer with a speedometer, and an omnidirectional belt, providing faster and more accurate response times.
The robot's design includes better leg balance, advanced algorithms for movement skills, sensors on the feet, and arms capable of replicating human movements. It also features two different hands for various tasks and stereoscopic vision.
By 2024, INBIODROID aims for PROMETHEUS 2.0 to perform acrobatics and reach peak performance. After successful human-interaction tests, it is expected to become a commercially available product.
INBIODROID's mission with this robot is to transcend physical barriers in industries, potentially revolutionizing sectors like healthcare, exploration, and aerospace with its immersive telepresence capabilities.
Crafting growth and digital strategies for tech startups. For the latest in tech innovation and tips to boost your startup's visibility, follow my updates, Jerry Louis-Jeune, and visit my agency JLJ Digital for our services.
#Robotics #BigData #MachineLearning #ArtificialIntelligence #ML #Innovation #jljinsights
Prometheus is one of the most advanced bilateral telepresence avatar robots. It is so advanced that we entered the ANA Avatar XPRIZE competition and not only qualified but passed the quarterfinals and semifinals; we are currently among the top 20 finalists worldwide out of roughly 150 teams. But we're not done yet.

We are proud to announce the prototype Avatar 2.0 system, which we call Prometheus 2, and that's where you come in. Our goal is to make the Prometheus prototype far more advanced and precise, with new skill sets: Prometheus 2 will be developed to complete advanced tasks with multi-axis mobility on a new bipedal system, and with improved haptics, manipulation, and interaction spanning domains such as connectivity, exploration, and skill transfer. Our vision for Prometheus 2 is a humanoid adaptation that lets the user's senses receive precise stimuli from the environment while teleoperating the avatar in hazardous areas without putting humans at risk, such as natural disasters, biological research, and space exploration. This is why we need to keep improving and make Prometheus 2 a reality.

We have a very talented team of researchers, engineers, and scientists working across a wide variety of fields, from robotics to computer vision. We are breaking barriers: we want to extend human capabilities by allowing people to touch, feel, see, and experience without being physically present, inside or outside our world. Despite the great news of being among the finalists, the development of Prometheus 2 is at a critical stage, and we need your help to make it a reality. If you are interested, support Prometheus 2 and INBIODROID so that together we can continue perfecting the Avatar 2.0 system and present an improved telepresence avatar robot to the world from Mexico. Thank you for watching our video.
Robot teleoperation might seem new, but it's a long-standing concept. It allows humans to remotely control robots in real-time, extending human capabilities in hazardous or distant environments.
The technology has evolved from early prototypes to advanced, AI-assisted systems. Industries like healthcare, manufacturing, and space exploration have seen significant benefits in efficiency and safety.
Yet, challenges remain, including network latency and the need for skilled operators. As teleoperation merges with #AI and VR, we’re on the brink of innovations that could redefine human-robot collaboration.
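Network latency is worth making concrete. A minimal sketch (not any particular teleoperation stack) models the network as a FIFO delay buffer between operator commands and robot execution, showing how the robot ends up tracking a stale command:

```python
from collections import deque

def simulate_delayed_tracking(latency_steps, n_steps=20):
    """Simulate an operator ramp command passing through a network
    delay modeled as a FIFO buffer; return the final tracking error."""
    buffer = deque([0.0] * latency_steps)  # commands already "in flight"
    robot_pos = 0.0
    for t in range(n_steps):
        operator_cmd = float(t)       # operator target ramps 1 unit/step
        buffer.append(operator_cmd)   # command enters the network
        robot_pos = buffer.popleft()  # robot executes the oldest command
    return operator_cmd - robot_pos   # steady-state lag, in position units

# With a ramp input, the robot trails the operator by exactly
# `latency_steps` units once the buffer is full.
```

This is why sub-100 ms round trips matter so much for fine manipulation: the tracking error grows directly with the delay, and predictive or shared-control schemes exist largely to compensate for it.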
#RobotTeleoperation #AI #Innovation #TechTrends #Automation #RemoteControl
🤖 𝐎𝐩𝐞𝐧-𝐓𝐞𝐥𝐞𝐕𝐢𝐬𝐢𝐨𝐧 𝐁𝐫𝐢𝐧𝐠𝐬 𝐑𝐨𝐛𝐨𝐭 𝐂𝐨𝐧𝐭𝐫𝐨𝐥 𝐭𝐨 𝐑𝐞𝐚𝐥𝐢𝐭𝐲
Imagine controlling a robot from thousands of miles away, just like in the movie '𝐀𝐯𝐚𝐭𝐚𝐫'! Researchers at UC San Diego and MIT have made this sci-fi dream a reality with Open-TeleVision, an open-source teleoperation system.
𝐓𝐡𝐞 𝐃𝐞𝐭𝐚𝐢𝐥𝐬:
- 𝐔𝐧𝐢𝐯𝐞𝐫𝐬𝐚𝐥 𝐀𝐜𝐜𝐞𝐬𝐬: Open-TeleVision can be accessed from any device with a web browser, including VR headsets, enabling robot teleoperation from anywhere in the world.
- 𝐑𝐞𝐚𝐥-𝐓𝐢𝐦𝐞 𝐕𝐢𝐝𝐞𝐨 𝐒𝐭𝐫𝐞𝐚𝐦𝐢𝐧𝐠: The system provides real-time stereo video streaming for depth perception, allowing for the fine manipulation of challenging objects.
- 𝐈𝐧𝐭𝐮𝐢𝐭𝐢𝐯𝐞 𝐂𝐨𝐧𝐭𝐫𝐨𝐥: An active neck with inverse kinematics enables intuitive control of the robot's head movements, mirroring the operator's actions precisely.
- 𝐎𝐩𝐞𝐧 𝐒𝐨𝐮𝐫𝐜𝐞: The entire system is fully open-sourced, allowing other researchers to access and build upon the source code.
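The "active neck" idea boils down to mapping the operator's VR head orientation onto the robot's pan/tilt neck joints. Here is a minimal, illustrative sketch of that mapping (the function name and joint limits are assumptions, not Open-TeleVision's actual API):

```python
import math

def quat_to_neck_commands(w, x, y, z):
    """Convert a unit head-orientation quaternion from a VR headset
    into yaw/pitch targets (radians) for a 2-DoF robot neck.
    Roll is discarded because a pan/tilt neck cannot reproduce it."""
    # Standard quaternion -> Euler extraction for yaw and pitch.
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    sinp = 2.0 * (w * y - z * x)
    pitch = math.asin(max(-1.0, min(1.0, sinp)))  # clamp for numerical safety
    # Clamp to plausible joint limits (illustrative values).
    limit = math.radians(60)
    return (max(-limit, min(limit, yaw)),
            max(-limit, min(limit, pitch)))
```

In a real system this target would feed an inverse-kinematics solver and joint controller at the VR frame rate, so the robot's camera view stays locked to the operator's gaze.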
𝐖𝐡𝐲 𝐈𝐭 𝐌𝐚𝐭𝐭𝐞𝐫𝐬:
Technological developments like Open-TeleVision bridge the gap between current AI limitations and fully autonomous humanoid robots. This innovation enables more immediate practical applications while paving the way for a future where robots become fully independent.
𝐃𝐨𝐧'𝐭 𝐌𝐢𝐬𝐬 𝐓𝐡𝐢𝐬! Watch the full video to see Open-TeleVision in action and witness firsthand how this revolutionary technology is transforming the way we control robots from afar. Trust us, you won't want to miss a second of it!
#AI #Robotics #Teleoperation #OpenSource #Innovation #Technology #FutureTech #Research
Video Credits: Xuxin Cheng
#TechyTuesday: Japan's Giant Robot Revolution 🚂🤖
Japan's latest innovation is a giant robot with coke-bottle eyes, reminiscent of Wall-E! This advanced robot uses cutting-edge AI and sensors for precision tasks, controlled remotely via VR. While it's currently making waves in railway maintenance, the technology has the potential to revolutionize many industries by handling tasks in hazardous environments with unmatched efficiency.
Japan's push for such futuristic robots highlights its commitment to blending technology with practical solutions. The possibilities are exciting—this technology could revolutionize not just railways, but any industry requiring precision in dangerous environments. Enhanced safety, efficiency, and reliability all in one. How cool is that?
#TechyTuesday #RobotTech #AI #ITMBusinessSchool #ESMEducation #NN2024
Exciting times ahead! 🎉
Honored to be a part of this recent video published by the EU Parliament 🇪🇺 on the evolving role of Robotics and Artificial Intelligence in reshaping industries and the future of human life. I’m thrilled to represent our department's cutting-edge research and development, especially during this exciting period of innovation.
Our team has been working on some fascinating projects, focusing on:
🌍 AI-driven automation to streamline business processes.
🤖 Advanced robotics for real-world applications.
📊 Data-driven decision-making to enhance operational efficiency.
⚠️ Safety in Human-Robot collaboration and interaction
This 5-minute video dives into some of the incredible breakthroughs happening in the field—and you can catch me for a brief moment! 🎥
Check it out and let me know your thoughts on how AI and robotics are impacting the future of work.
https://lnkd.in/eTvWCgyj #AI #Robotics #Innovation #XR #AugmentedReality #AR #VR #Safety #HumanFactors #Ergonomics #Robot #ResearchAndDevelopment #EU #FutureOfWork #EUParliament #Automation #TechTransformation
🚀 Teleoperation Breakthrough: Control Robots from thousands of miles away with Open-TeleVision 🚀
Researchers at UC San Diego and MIT, led by Xuxin Cheng & Jialong Li, released Open-TeleVision, an open-source teleoperation system that lets users control robots from virtually anywhere. Traditional teleoperation systems often suffer from limited perception and require physical proximity; Open-TeleVision instead uses VR to give the operator a live, stereoscopic view from the robot's perspective.
How It Works🤔❓
📷Active Stereo RGB Camera: Mounted on the robot, it tracks and replicates the operator's head movements for a synchronized, stereoscopic view.
⚙️Data Transmission: Visual data is streamed to the operator's VR headset, creating a lifelike visual feedback loop.
🦾Motion Capture and Replication: The operator's arm & hand movements are captured and translated into precise robotic actions.
🎓Learning Capabilities: The system records high-quality data from successful & unsuccessful attempts, useful for imitation learning.
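The last step is the interesting one for learning: each teleoperation session becomes a demonstration dataset. A minimal, illustrative schema (names are my own, not Open-TeleVision's) might pair observations with actions per episode and flatten them into training pairs for behavior cloning:

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    """One teleoperation demonstration: paired observations and actions,
    plus a success flag so failed attempts can still be filtered or mined."""
    task: str
    observations: list = field(default_factory=list)  # e.g. camera frames, poses
    actions: list = field(default_factory=list)       # e.g. joint targets
    success: bool = False

    def record(self, obs, action):
        """Append one synchronized (observation, action) step."""
        self.observations.append(obs)
        self.actions.append(action)

def to_training_pairs(episodes, successful_only=True):
    """Flatten episodes into (observation, action) pairs for
    behavior-cloning-style imitation learning."""
    pairs = []
    for ep in episodes:
        if successful_only and not ep.success:
            continue
        pairs.extend(zip(ep.observations, ep.actions))
    return pairs
```

Keeping the unsuccessful episodes around is a deliberate choice: some imitation and offline-RL methods can learn from failures too, so discarding them at recording time would throw data away.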
This opens up possibilities in healthcare (think surgical robots) and beyond. The ability to provide high-quality data for imitation learning also paves the way for more intelligent and autonomous robotic systems. Awesome stuff Xuxin Cheng, Shiqi Yang, Ge Yang, Xiaolong Wang
👉 Follow Robotics & Business page for more updates on the latest in #robotics and #automation! 👍 Like and reshare this if you find this development exciting too!
[video: via Xuxin Cheng YT]
#Robotics #Automation #Teleoperation #Innovation #VR #AI #ImitationLearning
Tesla Optimus Update: A Peek Inside Their Data Collection Farm
The new Optimus video highlights their impressive human data collection pipeline, arguably their biggest lead.
Here's what makes it work:
- Top-tier robot hands: Dexterous, tactile, and robust - Optimus boasts some of the best 5-finger hands globally.
- Low-latency teleoperation: Seamless VR control with minimal lag between human motion and robot action - a huge feat!
- Scalable operations: Multiple robots, 24/7 human oversight, and on-site maintenance keep the data flowing, going well beyond what academic research labs can sustain.
- Strategic task selection: Moving beyond demos, Optimus focuses on environments and tasks for real-world use (factories, homes). But the question remains: what 1,000 tasks maximize skill transfer and generalizability?
Optimus is leading the way, but teleoperation alone won't solve humanoid robotics. Stay tuned for more on why scaling is crucial! #Tesla #Optimus #HumanoidRobotics #AI
🤖 AI-driven devices, such as robots, are revolutionising the art and design landscape, particularly in enhancing visitor interactivity and engagement at museums and galleries.
🎨 From AI robotic art critics to on-hand automated assistants guiding visitors through the premises, and even AI-equipped installations and augmented reality (AR) tours offering immersive experiences and fun facts, optics-integrated tech is truly elevating the arts scene.
💡 Not only does this technology deepen our understanding of the art world, but it also provides venue owners with invaluable insights into visitor habits, behaviours, and preferences, shaping future endeavours and the trajectory of the art world.
🔍 Discover how AI is making its mark and extending its influence across various domains in the expansive arts and design sphere: https://bit.ly/3xBJDUj #OpticalComponents #Optics #KnightOptical #ArtTech #MuseumTech #GalleryTech #AI #Robotics #InteractiveArt #ArtInnovation #CreativeTech