https://lnkd.in/gazqJGJE
Excited to share my latest YouTube tutorial on creating a URDF robot arm from Fusion 360! Learn the essentials in this beginner-friendly guide and bring your robotics projects to life. Check it out for a step-by-step walkthrough and start building your own robotic masterpiece today. #Robotics #Fusion360 #URDF #YouTubeTutorial
Bridging Language and Action with AI-Powered Robotics!
I recently integrated the powerful LLaMA language model with a robot control system to bring language-based commands to life! 🦾✨ By combining LLaMA's natural language processing capabilities with a JSON configuration file, we're able to make the robot understand and execute commands like "move forward" or "turn left" with precision.
The JSON file provides specific action parameters, allowing the robot to perform the task accurately.
Chat history maintains context so that commands are relevant and conversational.
💡 This setup is more than just movement—it's about making the human-robot interaction feel natural and engaging. Imagine a robot that not only moves forward but also explains what that command means in real life!
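A minimal sketch of the setup described above, assuming a hypothetical JSON action schema (`linear_x`, `angular_z`, `duration_s` are illustrative parameter names, not the project's actual configuration); the LLaMA call itself is omitted and replaced by a direct command lookup:

```python
import json

# Hypothetical action configuration: each spoken command maps to
# motion parameters the robot can execute (illustrative schema).
ACTIONS_JSON = """
{
  "move forward": {"linear_x": 0.5, "angular_z": 0.0, "duration_s": 2.0},
  "turn left":    {"linear_x": 0.0, "angular_z": 0.8, "duration_s": 1.5}
}
"""

actions = json.loads(ACTIONS_JSON)

chat_history = []  # keeps context so follow-up commands stay conversational

def handle_command(text: str):
    """Look up a command's action parameters and build a conversational reply."""
    chat_history.append({"role": "user", "content": text})
    params = actions.get(text.lower().strip())
    if params is None:
        reply = f"Sorry, I don't know how to '{text}'."
    else:
        reply = (f"Moving with linear {params['linear_x']} m/s and "
                 f"angular {params['angular_z']} rad/s "
                 f"for {params['duration_s']} s.")
    chat_history.append({"role": "assistant", "content": reply})
    return params, reply

params, reply = handle_command("move forward")
print(reply)
```

In the real system the language model would map free-form phrasing onto these command keys and generate the explanation itself; the JSON file keeps the actual motion parameters deterministic.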
Exciting times in AI and robotics! 🚀🤖 #AI #Robotics #MachineLearning #NaturalLanguageProcessing #LLM #RobotControl #Innovation #Technology Ninad Madhab, Jigar Halani, Dustin Franklin, Avinash Chakravarthi, Tanna TechBiz LLP
Excited to share my latest blog post on integrating NVIDIA Isaac Sim with Jetson AGX Orin and the LLaMA language model! In this project, I implemented a natural language-based waypoint navigation system, allowing a robot to interpret and respond to verbal commands for autonomous movement. This fusion of simulation and AI is a step forward in advancing human-robot interaction. Dive into the blog to see how this technology is shaping the future of robotics! #AI #Robotics #IsaacSim #Jetson #NLP #ROS2 Ninad Madhab, Jigar Halani, Dustin Franklin, Avinash Chakravarthi, Tanna TechBiz LLP https://lnkd.in/gbfkHSGm
Save the date: October 9th in #Bangalore. This event offers a fantastic opportunity for students, faculty, and researchers to explore NVIDIA’s cutting-edge #AI and #robotics technologies.
Immerse yourself in Generative AI and learn how to build intelligent robots with #Jetson, #IsaacSim, and #Metropolis Microservices.
Event Details:
🗓 Date: 9th October 2024
📍 Venue: NVIDIA Bangalore
Secure your spot by filling out the form here:
🔗 [Form]: https://lnkd.in/gsVS8zTZ
Hi everyone,
Thank you for the tremendous response to the #NVIDIA Robotics Event on 27th September! Your enthusiasm has led to an exciting announcement - an additional event tailored for #universities.
Mark your calendars for the upcoming session on 9th October in #Bangalore. This event offers a unique chance for students, faculty, and researchers to explore NVIDIA’s cutting-edge #AI and #robotics technologies.
Delve into the world of Generative AI and master the art of crafting intelligent robots with #Jetson, #IsaacSim, and #Metropolis Microservices.
Event Details:
🗓 Date: 9th October 2024
📍 Location: NVIDIA Bangalore
To secure your spot, please fill out the form here:
🔗 [Form]: https://lnkd.in/gsVS8zTZ Kabilan Kb, Sunil Patel, Jigar Halani, Mandar Rikame
Carter can move via Audio to Action, where the robot listens to audio commands and takes action using a large language model (LLaMA 3) on a Jetson AGX Orin, with JSON and Nav2 for seamless navigation. The robot listens to instructions like "Move to the pallet area" or "Go to the charging station," processes them through the LLaMA model, and translates them into structured JSON data for execution. It also provides conversational feedback, such as, "I’m starting to navigate to the pallet area. Please wait!" Using the Nav2 stack, the robot autonomously moves to the specified location. This system has also been simulated in NVIDIA Isaac Sim, showcasing how AI-powered conversation and advanced navigation can enhance the interactivity of the Carter robot, making it more intuitive and responsive to users. NVIDIA Robotics, Ninad Madhab, Jigar Halani, Dustin Franklin, Avinash Chakravarthi, Shreyans Dhankhar, Tanna TechBiz LLP
Event Success: NVIDIA Robotics Day - Bangalore 🤖
I'm thrilled to announce that the "Build Your Gen-AI Robot in a Day with NVIDIA!" event on 27th September was a huge success! As the event organizer, alongside our amazing event coordinator Ninad Madhab, we showcased the future of AI and robotics in an unforgettable way.
Attendees got hands-on experience with cutting-edge NVIDIA technologies, including live demos of Generative AI models like LLMs, VLMs, and VLA models running directly on robots. It was an exciting deep dive into intelligent robot creation using the Jetson platform, Isaac Sim, and Metropolis Microservices.
A massive thank you to all the developers, researchers, and robotics enthusiasts who joined us and made this day remarkable. The energy and enthusiasm in the room were incredible, and it's inspiring to see how we're pushing the boundaries of AI and robotics together.
This event marks just the beginning of many more exciting innovations and collaborations. Stay tuned for what's next as we continue building the future of robotics.
Jigar Halani, Dustin Franklin, Avinash Chakravarthi, Tanna TechBiz LLP #NVIDIA #Robotics #GenerativeAI #LLM #VLM #VLAModel #Jetson #IsaacSim #MetropolisMicroservices #Innovation #AICommunity #BangaloreEvents #OrganizingSuccess #EventCoordinator #FutureOfRobotics
🚀 Exciting advancements in human-robot interaction! 🤖✨
We're leveraging a Vision-Language Model (VLM) to create an intelligent robot, NVIDIA Carter, that can seamlessly understand and respond to voice commands. Whether it’s monitoring warehouse activity or assisting with tasks, Carter processes audio and visual data in real time, enhancing operational efficiency and safety.
This technology opens new horizons for intelligent automation, making robots not just tools but collaborative partners in various industries.
Curious to learn more about the methodology and potential applications? Check out the full article on Medium!
#Robotics #AI #VLM #Automation #Innovation #MachineLearning Ninad Madhab, Jigar Halani, Dustin Franklin, Avinash Chakravarthi, NVIDIA Robotics, Tanna TechBiz LLP
Carter, the robot, can see and understand its environment using a Vision-Language Model (VLM) deployed on a Jetson AGX. Through its cameras, Carter visually processes its surroundings and explains what's happening in real time, with NVIDIA Isaac Sim used for simulation.
Users can easily communicate with Carter through a Web UI, either by typing or using voice commands like "Hey Carter, what's going on?". The VLM interprets the visual input and responds to these natural language queries, offering immediate situational insights. The system is deployed on Jetson AGX, enabling Carter to perform in real-world environments, improving warehouse automation and human-robot collaboration. This integration of AI, simulation, and robotics brings a natural and efficient way for humans to interact with robots.
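The query loop described above can be sketched minimally. This is an assumption-laden illustration: `vlm_describe` stands in for the actual VLM running on the Jetson AGX, and the frame is represented as a plain dictionary rather than a camera image:

```python
# Minimal sketch: a Web UI forwards a typed or voice-transcribed
# question, and the VLM (stubbed here) describes the latest camera frame.

def vlm_describe(frame: dict, question: str) -> str:
    """Stand-in for the Vision-Language Model call; a real deployment
    would pass the camera image to a VLM running on the Jetson AGX."""
    detected = frame.get("objects", [])
    if "what's going on" in question.lower():
        return "I can see: " + ", ".join(detected) + "."
    return "I'm not sure; try asking what's going on."

def handle_query(camera_frame: dict, question: str) -> str:
    # Entry point the Web UI would call for each user query.
    return vlm_describe(camera_frame, question)

frame = {"objects": ["a forklift", "two pallets", "a worker"]}
print(handle_query(frame, "Hey Carter, what's going on?"))
```

In the real system the VLM handles arbitrary phrasing and grounds its answer in the image itself; the point of the sketch is the request/response shape between the Web UI and the model.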
For more information, read the full article on Medium.
https://lnkd.in/greMXi-g Ninad Madhab, Jigar Halani, Dustin Franklin, Avinash Chakravarthi, Mandar Rikame, Tanna TechBiz LLP
Catch up with the Jetson AI Lab Research Group tomorrow! 🚀 I’ll be presenting the Isaac Sim2Real VLM workflow. Big thanks to Dustin Franklin and the amazing team!
Join us on September 17th at 9 AM PST to dive into AI in robotics! 🤖
On tomorrow's agenda of the Jetson AI Lab Research Group, genAI in robotics continues to take off into reality 🚀🤖
* JAX support (Johnny Núñez Cano)
* Isaac Sim2Real VLM workflow (Kabilan Kb)
* LeRobot on Orin Nano (Chitoku Yato)
* nvidia/Nemotron-Mini-4B-Instruct 🤗
✅ Date: Tuesday, September 17th, at 9am PST
✅ Invite: https://lnkd.in/giGHBebQ
Anyone is welcome to join in the discussions, so if edge computing and physical AI are of interest to you, please feel free to drop by and catch up with us on the latest.
https://lnkd.in/gEC__afB