Running a safety helmet detection system based on the Ultralytics YOLOv8 model, optimized with TensorRT and deployed on the NVIDIA Jetson AGX Orin 64GB Developer Kit. 🚀
With just a few commands, you can convert a custom PyTorch model to NVIDIA TensorRT format and run inference.
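The "few commands" flow can be sketched as a two-step export-then-predict pipeline. The snippet below is a minimal illustration assuming the Ultralytics `yolo` CLI; the file names (`helmet_best.pt`, `site.jpg`) and the FP16/GPU flags are placeholders, not taken from the post.

```python
# Hypothetical sketch: assemble the typical Ultralytics CLI invocations for
# converting a custom checkpoint to a TensorRT engine and running inference.
# File names and flags are illustrative assumptions.
def yolo_cmd(task: str, **kwargs: str) -> str:
    """Build an Ultralytics-style `yolo <task> key=value ...` command string."""
    return " ".join(["yolo", task] + [f"{k}={v}" for k, v in kwargs.items()])

# Step 1: convert the PyTorch checkpoint to a TensorRT engine (FP16 on GPU 0).
export_cmd = yolo_cmd("export", model="helmet_best.pt", format="engine",
                      device="0", half="True")
# Step 2: run inference with the exported engine file.
predict_cmd = yolo_cmd("predict", model="helmet_best.engine", source="site.jpg")

print(export_cmd)
print(predict_cmd)
```

On a Jetson, the export step runs TensorRT's builder on-device, so the resulting `.engine` file is tuned for that exact GPU.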
Read more on NVIDIA Jetson AI Lab: https://lnkd.in/dZ-5wbMF Thanks to Lakshantha Dissanayake and Dustin Franklin!
The amazing video is courtesy of Akin Victor!
NVIDIA Omniverse is a cutting-edge development platform for virtual world simulation, combining real-time physically-based rendering, physics simulation, and generative AI technologies.
In Omniverse, robots can learn and refine their skills, significantly reducing the sim-to-real gap and enhancing the transfer of learned behaviors.
Building robots with generative physical AI involves three key components:
- NVIDIA AI Supercomputers: For model training.
- NVIDIA Jetson Orin and Jetson Thor: To run the models.
- NVIDIA Omniverse: For simulation and skill refinement.
#OpenUSD #COMPUTEX2024 #Robotics #omniverse #nvidia #generativeai
#GTC2024 😎 moments review: Join Roboflow and explore how to seamlessly deploy the YOLO-World zero-shot object detection model on the #reComputer Industrial edge device powered by the NVIDIA Jetson Orin Nano!
The pipeline's open-vocabulary capability lets it recognize objects beyond predefined categories, making it more efficient and adaptable for real-world applications. Detection vocabularies can be adjusted dynamically to meet diverse needs without compromising performance.
👉 To get hands-on with YOLO-World on the Jetson Orin Nano 8GB, check out our reComputer Industrial J3011: https://lnkd.in/gcfq6g8D
🏂 YOLO-World GitHub repo: https://lnkd.in/gvASzQUX #nvidia #jetson #computervision #edgeai #zeroshot #objectdetection #yoloworld
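The open-vocabulary idea above can be illustrated without the real model: class-name prompts are embedded into the same space as image region features, and a region takes the label of the most similar prompt, so swapping the vocabulary list changes what gets detected with no retraining. This is a toy conceptual sketch; the 3-D embeddings and class names are invented, and a real model like YOLO-World learns them from data.

```python
import math

# Toy text-encoder outputs (made up) mapping class names to embeddings.
TEXT_EMBED = {
    "helmet":   [0.9, 0.1, 0.0],
    "person":   [0.1, 0.9, 0.1],
    "forklift": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def classify(region_feature, vocabulary):
    """Label a region feature with the closest prompt in the vocabulary.
    Changing `vocabulary` re-targets detection on the fly -- no retraining."""
    return max(vocabulary, key=lambda name: cosine(region_feature, TEXT_EMBED[name]))

feature = [0.8, 0.2, 0.1]                       # one detected region's feature
print(classify(feature, ["helmet", "person"]))   # -> helmet
print(classify(feature, ["person", "forklift"])) # -> person
```

Note how the same region feature is labeled differently once "helmet" is removed from the vocabulary, which is the dynamic-vocabulary behavior the post describes.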
Mechatronics engineering student | 2+ Years of Industry/startup Experience | Open to Engineering Internships in Robotics, Automation, and Industrial Systems
We can all agree: simulations can be fun when they go wrong!
A first interaction with NVIDIA's Isaac Sim, where I coordinated a handoff between an NVIDIA JetBot and a Franka Emika robot 🤖🤝
Accessing the remote RTX-powered system running the Omniverse App was a breeze using the Omniverse Streaming Client.
This is just the beginning of a much larger project.
🌟 Project #goals:
1️⃣ Programmatically tap into Isaac Sim using the Omniverse Kit interface.
2️⃣ Navigate robots with task logic in a simulation loop.
3️⃣ Develop scalable modules for repeated experiments.
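Goal 2 — task logic inside a simulation loop — can be sketched without any simulator at all. Below is a minimal, framework-free state machine for the JetBot-to-Franka handoff; in Isaac Sim the loop body would also advance the physics world (e.g. via a world step call), but here a plain tick counter stands in for it, and the stage names and 3-steps-per-stage rule are invented for illustration.

```python
# Hypothetical handoff stages for a JetBot -> Franka task (illustrative names).
HANDOFF_STAGES = ["jetbot_drive_to_table", "franka_pick", "franka_handoff", "done"]

class HandoffTask:
    """Tiny task-logic state machine stepped once per simulation tick."""
    def __init__(self):
        self.stage = 0
        self.ticks = 0   # sim steps spent in the current stage

    def step(self):
        """One tick of task logic; advance a stage every 3 sim steps (toy rule)."""
        if HANDOFF_STAGES[self.stage] == "done":
            return "done"
        self.ticks += 1
        if self.ticks >= 3:
            self.stage += 1
            self.ticks = 0
        return HANDOFF_STAGES[self.stage]

task = HandoffTask()
trace = [task.step() for _ in range(12)]   # the simulation loop
print(trace[-1])   # -> done
```

Keeping the task logic in a class like this is also what makes goal 3 (reusable modules for repeated experiments) easy: each experiment just constructs a fresh task and runs the same loop.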
Introduction to Robotic Simulations in Isaac Sim by NVIDIA, #recommended #TechInnovation #Robotics #NVIDIA #IsaacSim #Omniverse #RoboticSimulations #FutureOfTech #Automation #AI #MachineLearning
Nvidia made some huge robotics announcements at their GTC 2024 conference. The headliner was GR00T, a novel foundation model designed to power the next wave of advanced humanoid robots. The goal is for GR00T to allow humanoid robots to understand natural language, mimic human movements with dexterity, and operate intelligently in the real world. However, many details around GR00T's inner workings and training data remain vague. To run GR00T and other robotics AI workloads, Nvidia also unveiled their new Jetson Thor computing platform featuring their latest Blackwell GPU architecture.
#GR00T #HumanoidRobots #ArtificialGeneralIntelligence #RobotOperatingSystem #ROS #OpenSource #OpenRobotics #JetsonThor #BlackwellGPU #Robotics #AI #FoundationModels #OSRA #NVIDIA #GTC2024 #FutureOfRobotics
Training and developing intelligent robots in simulation to be successfully deployed in the real world is challenging.
The robot’s interactions and senses in the virtual world should be indistinguishable from reality.
With NVIDIA Isaac Sim (https://lnkd.in/dfCJ5C_K), developers and researchers around the world are able to train and optimize AI robots for a breadth of tasks.
In this demo, we showcase three incredible and very different robots developed in simulation and proven in the real world:
1) Obelix, an Autonomous Mobile Robot (AMR) by Fraunhofer IML
2) A collaborative robot ("cobot") for industrial automation at Festo, and
3) ANYmal, a robot dog trained by ETH Zurich and Swiss-Mile.
https://lnkd.in/dKrk663m #robotics #ros #AI
We've amassed one of the world’s largest labeled datasets of images of recyclables with our AI platform, which we use to train our image recognition models that run on NVIDIA A100 Tensor Core GPUs.
Learn more about how #AI is powering #recycling efficiency: https://lnkd.in/djJEErXw NVIDIA Robotics #NVIDIARobotics
At #NVIDIA GTC, Jensen Huang announced Project GR00T, a general-purpose foundation model for humanoid robots.
As part of the initiative, the company also unveiled a new computer, Jetson Thor, for humanoid robots based on the NVIDIA Thor system-on-a-chip (SoC). It also announced significant upgrades to the NVIDIA Isaac™ robotics platform, including generative AI foundation models and tools for simulation and AI workflow infrastructure.
Robots powered by GR00T, which stands for Generalist Robot 00 Technology, will be designed to understand natural language and emulate movements by observing human actions — quickly learning coordination, dexterity, and other skills to navigate, adapt, and interact with the real world.
Jetson Thor was created as a new computing platform capable of performing complex tasks and interacting safely and naturally with people and machines. It has a modular architecture optimized for performance, power and size.
The SoC includes a next-generation GPU based on the NVIDIA Blackwell architecture with a transformer engine delivering 800 teraflops of 8-bit floating point AI performance to run multimodal generative AI models like GR00T.
😎Curious about our hottest #reComputer J4012 edge device? Join us as we unbox the AI computer, power it up, and explore its incredible capabilities in computer vision and large language models for diverse applications, such as robotics, automation, video analytics, and Generative AI. Watch the video on YouTube: https://lnkd.in/gXfiMpVP
What you'll discover:
✅ reComputer part list
✅ Teardown of the reComputer with a specifications breakdown of the J401 carrier board
✅ Plug in to complete the initial setup
✅ Configurations for installed packages and runtime environment
✅ One-line command deployment of Ultralytics YOLOv8 and Meta Llama 3 on the reComputer
Ready to kickstart your Edge AI project? Check out the reComputer J4012, powered by the NVIDIA Jetson Orin NX 16GB, for infinite possibilities! https://lnkd.in/g7J7w7Ts
Explore more LLM and CV models supported on the Jetson Examples GitHub: https://lnkd.in/gPcgtSuq #nvidia #jetson #edgeai #computervision #videoanalytics #llm #generativeai #robotics #autonomous
Today, we are taking a peek behind the curtain at Plus One Robotics' PickOne AI-powered vision software and the role NVIDIA Robotics plays in warehouse automation.
NVIDIA GPUs power the vision system, providing the brain behind the eyes that guides the muscle 💪 of the robot arm. Together, these functions allow a growing neural network to make the robots more sophisticated over time. So, what does this mean for warehouse automation?
☑ Adaptive induction and depalletization solutions that can run 24x7x365 🕚
☑ Increased throughput with “on-the-fly” human in the loop support to account for the infinite variability in parcels 📦
☑ Continual adaptation of the neural network as variability is processed 🧠
☑ Less heavy lifting for your warehouse staff leading to less turnover 🏋♂️
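The "on-the-fly" human-in-the-loop support above boils down to confidence routing: picks the vision model is sure about run autonomously, while low-confidence parcels are escalated to a remote operator. This is a hedged sketch of that idea only; the 0.8 threshold and the parcel data are made up, not Plus One Robotics' actual logic.

```python
# Illustrative confidence threshold -- an assumption, not a vendor value.
CONFIDENCE_THRESHOLD = 0.8

def route(parcels):
    """Split (name, confidence) picks into autonomous vs. human-review queues."""
    autonomous, review = [], []
    for name, confidence in parcels:
        (autonomous if confidence >= CONFIDENCE_THRESHOLD else review).append(name)
    return autonomous, review

auto, review = route([("box_a", 0.95), ("mailer_b", 0.55), ("box_c", 0.88)])
print(auto)    # -> ['box_a', 'box_c']
print(review)  # -> ['mailer_b']
```

The escalated cases are also exactly the labeled examples that feed the "continual adaptation" bullet: each human correction becomes new training data for the neural network.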
▶ See how it works
#CES2024 #NVIDIA #ROBOTICS #logistics #supplychain #warehouseautomation #bettertogether #robotsworkpeoplerule
Gazebo or Isaac Sim
Robotics simulation tools are undergoing a massive shift, with NVIDIA's latest push to build a community around Isaac Sim and many companies migrating to it. The choice remains unclear for many novice users. Here are some things I've learned working with both:
* Isaac Sim offers significantly better photorealism, which is key for data collection and model training.
* Isaac Sim offers massive parallelization capabilities over multiple GPUs, which makes it advantageous for RL pipelines, where simulation scale is important.
* Isaac Sim's physics still struggles to model certain types of contacts stably, but it is under active development.
* Gazebo has been the traditional choice due to its straightforward ROS integration. However, the latest versions of Isaac Sim offer bridges to ROS 2. I expect this interface to improve as many robotics companies are actively using it.
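Why parallelization matters for RL can be shown with a toy batched interface: stepping N environments in one call is the pattern that GPU-parallel simulators scale to thousands of instances. This pure-Python sketch only illustrates the interface; the 1-D "move to goal" dynamics and the numbers are invented, and real Isaac Sim RL workflows step GPU-resident environments instead.

```python
class VectorEnv:
    """Toy batch of identical 1-D 'move to goal' environments."""
    def __init__(self, n, goal=5):
        self.pos = [0] * n       # one scalar position per environment
        self.goal = goal

    def step(self, actions):
        """Apply one action per env in a single call; return (position, done) pairs."""
        self.pos = [p + a for p, a in zip(self.pos, actions)]
        return [(p, p >= self.goal) for p in self.pos]

envs = VectorEnv(n=4)
for _ in range(5):                         # 5 batched steps instead of 20 single ones
    obs = envs.step([1, 1, 1, 1])
print(obs)   # -> [(5, True), (5, True), (5, True), (5, True)]
```

An RL pipeline collects N transitions per step this way, so sample throughput grows with the batch size rather than with wall-clock simulation time.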
In conclusion, Gazebo has had a good run and has been the default choice for many years. But Isaac Sim is definitely gaining momentum with a fast-growing community.
Would love to hear other perspectives from folks who have used either or both.
#isaacsim #robotics #simulation #gazebo #ros #roboticsengineering
Embedded Computer Vision Engineer @Ultralytics | YOLOv8 | NVIDIA Jetson | Raspberry Pi | Edge TPU | ex-Seeed Studio
Thank you for the share, Shakhizat Nurgaliyev! Hope it helps the NVIDIA Jetson community!