Lately I have been using the AgileX Limo robot to perform real-time autonomous navigation and mapping. In the video I recorded below, you can see a demonstration of the robot scanning and exploring our Robot Lab in real time! 😁
Simultaneous Localization and Mapping, also known as SLAM, is a technique that enables a robot or vehicle to construct a map of its environment in real time while simultaneously determining its own position within that environment.
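To make the mapping half of SLAM a little more concrete, here is a minimal Python sketch (purely illustrative, not the software running on the Limo): it assumes the robot's pose is already provided by the localization half and simply marks cells along each lidar ray as free and the cell at the ray's end as occupied. The grid size, resolution, and scan format are my own assumptions.

```python
import numpy as np

# Illustrative assumptions: a 10 m x 10 m grid at 0.05 m resolution,
# values in [0, 1] where 0.5 = unknown, 0.0 = free, 1.0 = occupied.
RESOLUTION = 0.05
GRID_SIZE = int(10.0 / RESOLUTION)
grid = np.full((GRID_SIZE, GRID_SIZE), 0.5)

def world_to_cell(x, y):
    """Convert world coordinates (metres) to grid indices."""
    return int(y / RESOLUTION), int(x / RESOLUTION)

def update_grid(grid, pose, ranges, angles, max_range=5.0):
    """Mark cells along each lidar ray as free and the hit cell as occupied.

    pose: (x, y, theta) of the robot, assumed to come from the SLAM
    localization estimate (not computed here).
    """
    x0, y0, theta = pose
    for r, a in zip(ranges, angles):
        if not np.isfinite(r) or r > max_range:
            continue
        # Endpoint of the ray in world coordinates.
        xh = x0 + r * np.cos(theta + a)
        yh = y0 + r * np.sin(theta + a)
        # Sample points along the ray and mark them free.
        steps = int(r / RESOLUTION)
        for s in range(steps):
            xi = x0 + (s / max(steps, 1)) * (xh - x0)
            yi = y0 + (s / max(steps, 1)) * (yh - y0)
            row, col = world_to_cell(xi, yi)
            if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
                grid[row, col] = 0.0
        # Mark the cell hit by the ray as occupied.
        row, col = world_to_cell(xh, yh)
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row, col] = 1.0
    return grid

# Example: one fake scan taken from the grid centre, facing +x.
angles = np.linspace(-np.pi / 2, np.pi / 2, 90)
ranges = np.full_like(angles, 3.0)          # pretend every ray hits at 3 m
update_grid(grid, (5.0, 5.0, 0.0), ranges, angles)
print("occupied cells:", int((grid == 1.0).sum()))
```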
Day 5 - Autonomous navigation
Concepts covered:
- Global and Local Costmaps:
Both are important tools in the ROS 2 ecosystem, as they produce a map of how "costly" it is for a robot to pass through an area. Higher costs (darker areas) indicate a wall or a higher chance of collision.
The global costmap covers the whole known map and is used to plan the robot's overall route from the start point to the goal while avoiding areas with obstacles. In contrast, the local costmap covers a smaller window around the robot and is updated in real time so that sudden obstacles can be avoided; a small sketch after this list illustrates how obstacle cells are inflated into a cost field.
- Global and Local Paths:
These are used for the robot's navigation through an area. The global path is the planned trajectory from the start point to the goal that avoids known obstacles, whereas the local path covers the stretch immediately ahead of the robot and adjusts the trajectory to avoid suddenly appearing obstacles.
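To illustrate the "cost" idea from the list above, here is a small Python sketch (an illustration, not Nav2's actual implementation): it turns an obstacle grid into an inflated costmap in which obstacle cells carry the maximum cost and the cost decays with distance to the nearest obstacle, which is roughly what a costmap inflation layer does. The resolution, radius, and decay constant are assumptions.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def inflate_costmap(obstacles, resolution=0.05, inflation_radius=0.5, max_cost=254):
    """Build a simple inflated costmap from a boolean obstacle grid.

    obstacles: 2D bool array, True where a cell is occupied.
    Cost is max_cost on obstacles and decays exponentially with the
    distance to the nearest obstacle, dropping to 0 beyond the
    inflation radius. The constants are illustrative.
    """
    # Distance (in metres) from every free cell to the nearest obstacle.
    distance = distance_transform_edt(~obstacles) * resolution
    cost = np.where(
        obstacles,
        max_cost,
        (max_cost - 1) * np.exp(-3.0 * distance / inflation_radius),
    )
    cost[distance > inflation_radius] = 0
    return cost.astype(np.uint8)

# Toy 4 m x 4 m map with a single wall segment.
grid = np.zeros((80, 80), dtype=bool)
grid[40, 20:60] = True                      # a wall across the middle
costmap = inflate_costmap(grid)
print("max cost:", costmap.max(), "free cells:", int((costmap == 0).sum()))
```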
In the following video, you can see in the RViz tool how the scene changes as the global costmap, local costmap, global path (green), and local path (red) are added...
https://lnkd.in/d3UFVxxP
The main goal of modelling a Multi-Agent System (MAS) is finding its optimal joint policy, which maximizes the collective payoffs of all agents. This can be a complex problem, especially for real-world systems such as heterogeneous traffic, where the aim is to ensure safe movement or to optimize autonomous driving.
Our updated work here: https://lnkd.in/eu4iYbu3 contains a multi-agent simulator with minimal requirements and an elegant design, along with baselines and novel multi-step reinforcement learning models, thoroughly evaluated on many relevant criteria, as shown in the video below. Paper published here: https://lnkd.in/ena2vDaE
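As a toy illustration of the joint-policy idea above (deliberately nothing like the simulator or the learning models in the linked work), the sketch below enumerates the joint actions of two agents in a tiny matrix game and picks the joint action that maximizes the collective payoff. Real multi-agent systems replace this brute-force search with learning, because the joint action space grows exponentially with the number of agents; the payoff values here are made up.

```python
from itertools import product

# Toy two-agent game: each agent chooses "yield" or "go" at an intersection.
# Payoffs are illustrative: both going collides, both yielding wastes time.
ACTIONS = ["yield", "go"]
PAYOFF = {
    ("yield", "yield"): (1, 1),
    ("yield", "go"):    (2, 3),
    ("go", "yield"):    (3, 2),
    ("go", "go"):       (-10, -10),   # collision
}

def best_joint_action(payoff):
    """Return the joint action maximizing the sum of payoffs (collective payoff)."""
    return max(product(ACTIONS, repeat=2), key=lambda joint: sum(payoff[joint]))

joint = best_joint_action(PAYOFF)
print("optimal joint action:", joint, "collective payoff:", sum(PAYOFF[joint]))
```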
It's done! It works! It's tested!
It's such a feeling of pride, accomplishment and satisfaction to be able to publish this video, the culmination of about 6 months of work building a fully autonomous fixed-wing/VTOL/UAV/UAS follow capability on the ArduPilot platform.
Here's the video of my recent successful test flight.
As I've said before, "autonomous" does not mean AI. This follow capability is telemetry-driven, but adding AI as the source is now possible (if not easy). I'll be doing that next.
https://lnkd.in/ghqEzM_h
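For a flavour of what "telemetry-driven" following can look like in code, here is a rough pymavlink sketch. It is purely illustrative and far simpler than the actual ArduPilot follow capability shown in the video: it reads the target's position from one telemetry stream and commands the follower to a guided-mode point behind it. The connection strings, follow distance, and update rate are assumptions.

```python
import math, time
from pymavlink import mavutil

# Illustrative connection strings; real setups differ.
target_link = mavutil.mavlink_connection("udpin:0.0.0.0:14550")   # target's telemetry
follower = mavutil.mavlink_connection("udpin:0.0.0.0:14551")      # follower autopilot
follower.wait_heartbeat()

FOLLOW_DISTANCE_M = 50.0     # assumed trailing distance
EARTH_RADIUS_M = 6371000.0

def offset_behind(lat, lon, heading_deg, distance_m):
    """Compute a lat/lon point distance_m behind the target along its heading."""
    back = math.radians(heading_deg + 180.0)
    dlat = (distance_m * math.cos(back)) / EARTH_RADIUS_M
    dlon = (distance_m * math.sin(back)) / (EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

while True:
    msg = target_link.recv_match(type="GLOBAL_POSITION_INT", blocking=True, timeout=5)
    if msg is None:
        continue
    lat, lon = msg.lat / 1e7, msg.lon / 1e7
    heading = msg.hdg / 100.0      # hdg is in centidegrees (unknown-value case ignored for brevity)
    goal_lat, goal_lon = offset_behind(lat, lon, heading, FOLLOW_DISTANCE_M)

    # Send a guided-mode global position target (position fields only).
    follower.mav.set_position_target_global_int_send(
        0,                                          # time_boot_ms (ignored)
        follower.target_system, follower.target_component,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
        0b0000111111111000,                         # use position fields only
        int(goal_lat * 1e7), int(goal_lon * 1e7),
        msg.relative_alt / 1000.0,                  # match target's relative altitude (m)
        0, 0, 0, 0, 0, 0, 0, 0)
    time.sleep(1.0)
```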
I successfully developed a multi-robot navigation system that performs obstacle and collision avoidance. The system also addresses latency issues, enabling multiple robots to follow their shortest paths without any collisions.
My project, Omni World, serves as a versatile and user-friendly tool for visualizing simulations and data coordinates, specifically designed to support the autonomous navigation of mobile robots.
Previously, I enhanced Omni World to simulate both static and dynamic environments, further integrating it with computer vision techniques for obstacle detection and pathfinding. These advancements enabled the system to accurately guide robots through complex environments, showcasing its potential for real-world applications in autonomous navigation.
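As a hedged sketch of one common way to achieve this kind of multi-robot collision avoidance (an illustration of the general technique, not how Omni World implements it), the snippet below plans shortest grid paths for the robots one at a time and records each planned path in a time-indexed reservation table, so lower-priority robots avoid cells already claimed at that time step (prioritized planning). Only vertex conflicts are checked, to keep the example short.

```python
from collections import deque

FREE, WALL = ".", "#"

def plan_with_reservations(grid, start, goal, reservations, max_t=200):
    """BFS over (cell, time); avoid cells reserved by higher-priority robots."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while queue:
        (r, c), t, path = queue.popleft()
        if (r, c) == goal:
            return path
        if t >= max_t:
            continue
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)):  # moves + wait
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols) or grid[nr][nc] == WALL:
                continue
            # Vertex conflict: another robot already occupies this cell at t+1.
            # (Edge/swap conflicts are ignored in this simplified sketch.)
            if ((nr, nc), t + 1) in reservations:
                continue
            if ((nr, nc), t + 1) not in seen:
                seen.add(((nr, nc), t + 1))
                queue.append(((nr, nc), t + 1, path + [(nr, nc)]))
    return None

def plan_all(grid, tasks):
    """Plan robots in priority order, reserving each planned path in time."""
    reservations, paths = set(), []
    for start, goal in tasks:
        path = plan_with_reservations(grid, start, goal, reservations)
        if path is None:
            raise RuntimeError(f"no collision-free path found for {start} -> {goal}")
        for t, cell in enumerate(path):
            reservations.add((cell, t))
        paths.append(path)
    return paths

# Toy 5x5 map: two robots crossing the same corridor.
grid = ["....." for _ in range(5)]
paths = plan_all(grid, [((0, 0), (0, 4)), ((0, 4), (0, 0))])
for i, p in enumerate(paths):
    print(f"robot {i}: {p}")
```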
#TechForAutonomy #MultiRobotNavigation #CollisionAvoidance #AI_In_Robotics #AI #ArtificialIntelligence
🚀 We completed a series of key flight tests as we prepare for our upcoming trials!
During these tests, we demonstrated some of Vyom's core capabilities:
1️⃣ Proprietary AI Navigation: Our system adapts to terrain in real time, even without GPS, ensuring precision and reliability.
2️⃣ Advanced Obstacle Detection & Avoidance: Enhanced safety features for seamless and autonomous navigation.
These tests have set the foundation for our next field trials, which are just around the corner. We're pushing the limits even further and can't wait to share what’s coming next!
👏 A huge thanks to our brilliant engineering team for their dedication and hard work, and to our trusted development partners for their collaboration!
👀 Want to shape the future of autonomous robotics? We’re expanding our pilot program. Get in touch with us for more details!
#Robotics #DroneInnovation #TechStartup #AutonomousSystems #Innovation #Engineering #FutureOfFlight
🚀 Exploring ROS2 Navigation and SLAM with TurtleBot3 in Simulation! 🐢
In the world of robotics, navigation and mapping are critical for autonomous systems. I've been diving deep into the ROS2 Navigation Stack (Nav2) and using SLAM (Simultaneous Localization and Mapping) to create maps in real time with the simulated TurtleBot3.
🔍 Nav2 is the go-to solution for autonomous navigation in ROS2, allowing robots to move from point A to point B while avoiding obstacles and, together with SLAM, building accurate maps.
Key steps I followed:
🛠️ Set up the TurtleBot3 in Gazebo simulation.
🗺️ Integrated SLAM to build dynamic maps of the environment.
🤖 Configured Nav2 to handle path planning and obstacle avoidance.
🎯 Simulated autonomous navigation with real-time mapping and localization.
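For anyone who wants to reproduce a similar setup, here is a rough sketch of how these steps can be wired together in one ROS 2 Python launch file. It assumes the standard turtlebot3_gazebo, slam_toolbox, and nav2_bringup packages are installed; the included launch-file names and arguments can differ between ROS 2 distributions, so treat this as a starting point rather than the exact configuration used here.

```python
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource

def generate_launch_description():
    # Assumed standard packages; launch-file names may differ between distros.
    tb3_gazebo = get_package_share_directory("turtlebot3_gazebo")
    slam = get_package_share_directory("slam_toolbox")
    nav2 = get_package_share_directory("nav2_bringup")

    return LaunchDescription([
        # 1. TurtleBot3 world in Gazebo.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(tb3_gazebo, "launch", "turtlebot3_world.launch.py"))),
        # 2. Online SLAM to build the map while driving.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(slam, "launch", "online_async_launch.py")),
            launch_arguments={"use_sim_time": "true"}.items()),
        # 3. Nav2 for path planning and obstacle avoidance.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(nav2, "launch", "navigation_launch.py")),
            launch_arguments={"use_sim_time": "true"}.items()),
    ])
```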
These tools are crucial for anyone working on autonomous driving systems, indoor robots, or any other applications requiring precise navigation and environmental awareness.
In this tutorial I used the available TurtleBot3 packages together with the Nav2 navigation packages, and used AMCL for particle-filter localization.
Using waypoints, the robot is simulated navigating fully autonomously around the arena of the map, as sketched below.
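The waypoint run can be scripted with Nav2's simple commander API; the snippet below is an illustrative sketch, with placeholder map-frame coordinates rather than the actual arena waypoints, and it assumes localization (AMCL) is already running and providing a pose.

```python
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator

def make_pose(navigator, x, y):
    """Build a map-frame goal pose (orientation left as identity for brevity)."""
    pose = PoseStamped()
    pose.header.frame_id = "map"
    pose.header.stamp = navigator.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose

def main():
    rclpy.init()
    navigator = BasicNavigator()
    # Waits until Nav2 (including localization) is active.
    navigator.waitUntilNav2Active()

    # Illustrative waypoints around the arena; real coordinates depend on the map.
    waypoints = [make_pose(navigator, x, y)
                 for x, y in [(1.5, 0.0), (1.5, 1.5), (-1.5, 1.5), (-1.5, -1.5)]]

    navigator.followWaypoints(waypoints)
    i = 0
    while not navigator.isTaskComplete():
        i += 1
        feedback = navigator.getFeedback()
        if feedback and i % 10 == 0:
            print(f"heading to waypoint {feedback.current_waypoint}")

    print("result:", navigator.getResult())
    navigator.lifecycleShutdown()

if __name__ == "__main__":
    main()
```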
If you’re interested in robotics and ROS2, feel free to connect—let's discuss and innovate together! 🌐
Day (10/100)
#ROS2 #Nav2 #SLAM #TurtleBot3 #Robotics #AutonomousNavigation #RoboticsInnovation #Simulation #Gazebo #RobotProgramming
Developed an autonomous robot capable of following a predefined path using real-time line detection. Designed and implemented efficient algorithms for smooth and precise navigation.
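Since the post mentions real-time line detection, here is a hedged OpenCV sketch of the usual approach (an illustration of the technique, not the exact algorithm used in this project): threshold the bottom band of each camera frame, find the line's centroid, and turn the centroid's horizontal offset into a proportional steering command. The camera index, threshold, and gain are assumptions.

```python
import cv2

KP = 0.005          # proportional steering gain (assumed)
BASE_SPEED = 0.2    # forward speed placeholder (assumed units)

def line_steering(frame):
    """Return a steering command from the line's offset in the lower image band."""
    h, w = frame.shape[:2]
    roi = frame[int(0.8 * h):, :]                      # look near the robot's wheels
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    # Dark line on a light floor: invert-threshold so the line becomes white.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                                    # line lost
    cx = m["m10"] / m["m00"]                           # centroid x of the line
    error = cx - w / 2.0                               # offset from image centre
    return -KP * error                                 # steer back toward the line

cap = cv2.VideoCapture(0)                              # assumed camera index
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    steering = line_steering(frame)
    if steering is None:
        print("line lost: stopping")
    else:
        print(f"speed={BASE_SPEED:.2f} steering={steering:+.3f}")
```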
💡💻🛫 Simulation plays a critical role in the development of safe, secure, and dependable autonomous flight.
Simulation ensures our autonomous systems are thoroughly tested to operate safely in dynamic environments. By conducting extensive testing and validation in simulation, we can refine key autonomous features and build confidence before moving into real-world trials.
⬇️ Swipe through to understand how this process works, step-by-step from start to finish.
#FleetOperations | #AutonomousAviation | #FlightSimulation