Great opportunity in IIT Madras
We are hiring! We're looking for Post Docs and Research Scientists with experience in Optical Communication. Please contact me.
Do check out some nice mobile robots you can build and get more hands-on. You will be surprised by how much real hardware enriches the experience.
ACEBOTT helps students learn coding and circuits with DIY STEM electronics kits for at-home and in-school education.
The Acebott multi-function smart car kit can transform its mecanum wheels into tank treads, letting students experience different vehicle configurations. #RobotKit #RoboticsKit #Robotics #Coding #Programming #Robotsmartcar #Building #Tech #educator #stem #stemeducation #educationalkit
Do check out this gem. I love it.
Instrumentation Technician | PLC Programmer | HMI | WinCC | TIA PORTAL | Step7 | Siemens PLC | Rockwell PLC | Systems Analysis and Development | C++ | Python | Electrical Technician
💡
Do check out this gem from Anbu Kumar
How do you use an open-source web server for IoT applications on low-cost edge devices like the Arduino Nano? A web server in IoT is crucial for enabling communication between devices and users over the internet. It acts as a central point for data storage, processing, and sharing, allowing IoT devices to send data to and receive commands from remote systems. Web servers enable real-time monitoring and control of devices, improving automation and decision-making. They also facilitate scalability by managing multiple devices and integrating with cloud services for enhanced functionality. #IoT #WebServers #IoTApplications #SmartTechnology #CloudComputing #RealTimeMonitoring #Automation #IoTSolutions #ConnectedDevices #TechInnovation https://lnkd.in/ghu7TUvQ
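The central pattern described above (devices push readings to the server; users read state and queue commands) can be sketched with Python's standard-library `http.server`. This is only an illustration of the web server's role in an IoT setup, not the Arduino implementation from the linked video, and the `/data/<id>` and `/command/<id>` routes are assumed names:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory state: latest reading per device, plus pending commands.
STATE = {"readings": {}, "commands": {}}

class IoTHandler(BaseHTTPRequestHandler):
    def _send_json(self, payload, status=200):
        body = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Devices POST readings to /data/<id>; users POST commands to /command/<id>.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        if self.path.startswith("/data/"):
            STATE["readings"][self.path.split("/", 2)[2]] = payload
            self._send_json({"ok": True})
        elif self.path.startswith("/command/"):
            STATE["commands"][self.path.split("/", 2)[2]] = payload
            self._send_json({"ok": True})
        else:
            self._send_json({"error": "unknown path"}, 404)

    def do_GET(self):
        # Users read the latest reading; devices poll for (and consume) commands.
        if self.path.startswith("/data/"):
            self._send_json(STATE["readings"].get(self.path.split("/", 2)[2], {}))
        elif self.path.startswith("/command/"):
            self._send_json(STATE["commands"].pop(self.path.split("/", 2)[2], {}))
        else:
            self._send_json({"error": "unknown path"}, 404)

    def log_message(self, *args):
        pass  # silence per-request console logging

def start_server(port=0):
    """Start the server on a background thread; return (server, bound port)."""
    server = HTTPServer(("127.0.0.1", port), IoTHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

On an actual Arduino Nano-class device the same request/response pattern applies, just with a much smaller embedded HTTP library in place of `http.server`.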
I had a very interesting conversation with a Medical Device consultant today in the CAPA arena. For those who have been in the field and attended a CAPA review board meeting, it can be traumatic. The consultant had some great tools to track and understand CAPAs, as well as to get schooled on RCA. We were looking at the wisdom of automating the CAPA process, and I promised to give it a shot with a teaser demo. If you dig deep down, there are many common processes in the manufacturing of medical devices. Many of the parameters deal with time, temperature, and pressure. My proposal for automation is to create generic simulation templates which not only help the RCA process but help initial process characterization as well. Here is a draft. Let me know your candid thoughts. This will go a long way in improving present-day practices. I am following a modular approach, automating each module; this is the first. Georg Digel Don Tran
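As a rough illustration of what such a generic simulation template might look like, here is a minimal Python sketch. The `ProcessTemplate` class, its parameter names, and the tolerance logic are all hypothetical, not the consultant's tool or the demo mentioned above; the idea is just that one template parameterized by time, temperature, and pressure can serve both RCA and initial process characterization:

```python
import random
from dataclasses import dataclass

@dataclass
class ProcessTemplate:
    """Generic template for a time/temperature/pressure process step."""
    name: str
    duration_s: int               # total step time, sampled once per second
    temp_setpoint_c: float
    temp_tol_c: float             # allowed +/- deviation before flagging
    pressure_setpoint_kpa: float
    pressure_tol_kpa: float

    def simulate(self, seed=0, noise_c=0.5, noise_kpa=2.0):
        """Simulate one run; return (samples, excursions).

        samples    -- list of (t, temperature, pressure) tuples
        excursions -- out-of-tolerance events, the raw material for RCA
        """
        rng = random.Random(seed)
        samples, excursions = [], []
        for t in range(self.duration_s):
            temp = self.temp_setpoint_c + rng.gauss(0, noise_c)
            pres = self.pressure_setpoint_kpa + rng.gauss(0, noise_kpa)
            samples.append((t, temp, pres))
            if abs(temp - self.temp_setpoint_c) > self.temp_tol_c:
                excursions.append((t, "temperature", temp))
            if abs(pres - self.pressure_setpoint_kpa) > self.pressure_tol_kpa:
                excursions.append((t, "pressure", pres))
        return samples, excursions
```

In a modular build-out, each manufacturing step (sealing, sterilization, molding, and so on) would get its own instance of a template like this, with the excursion log feeding the CAPA/RCA workflow.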
Do check out.
Robust Multi-Camera Solution with Long Cable Reach – Optimized for Robotics. In many machine vision applications, multiple cameras must be deployed simultaneously, often positioned at a distance from the processor. At Arducam, we've made some successful strides in this area. In the realm of multi-camera solutions, we've previously demonstrated 8-camera projects, running eight cameras on platforms like the Raspberry Pi Compute Module 4 (https://lnkd.in/gzz__au4) and the Jetson Xavier NX (https://lnkd.in/gpJUBQqW). The common thread across these projects is the ability to run 8 Arducam cameras simultaneously, regardless of platform. Additionally, we introduced two multi-camera extension solutions for scenarios where the cameras need to be positioned some distance away from the processor: one via HDMI and another via LAN (https://lnkd.in/grsqRAks). While these setups have proven successful, they've primarily been lab projects suited for experienced makers. However, through ongoing discussions with users, including organizational customers, it became clear that many are seeking a solution that's easier to deploy and offers more stability. That's what led us to design a more robust, ready-to-deploy solution. Read the full story here: https://bit.ly/3NtjmLY And see the demo here: https://lnkd.in/gvwUG5gq
Do check out. Nice application for industrial inspection.
Have you heard about PUP1000? Discover Viska Systems’ advanced Part Verification Solution, powered by state-of-the-art Cognex Corporation Edge Learning technology. #PUP1000 guarantees precise part identification, even with subtle variations, preventing any mix-up of components in your processes. With a cycle time of less than 0.5 seconds, the system supports high-throughput manual assembly operations, ensuring efficiency in production. Discover PUP1000 👉 https://lnkd.in/dUwvBKfU #VisionSolution #IndustrialAutomation
Opportunity
Robotics Software Developer | Robotics Researcher | ROS & ROS2 | Undergraduate Research Student at IIT Delhi & Robotics Research Center (IIIT-H)
Hello Connections, I'm thrilled to share that I'll be mentoring two final-year projects this year! If you're a student looking for a mentor to guide you from scratch to building your project in simulation or on real hardware, this is your chance. Every year I get a ton of requests from students to guide their final-semester projects in ROS. I usually pick two or three teams and guide them from scratch through project completion. This year, that offer is now open. I'm specifically looking for ROS/ROS2-based projects in areas like AMR, AGV, drones, and the integration of ROS with Computer Vision, AI, and Machine Learning algorithms. If you're passionate about robotics and have a project idea, feel free to reach out. We'll discuss your interests, and I'll select two projects to mentor throughout the development journey. Let's collaborate and build something amazing together! Looking forward to connecting! #Robotics #ROS #ROS2 #AMR #Drones #AI #MachineLearning #FinalYearProjects #Mentorship #HappyLearning
One of the projects I am now working on involves some machine learning exercises on traffic light detection. This is part of the line-following project for the Cody Rocky. I set up a simple interface to trigger the Red, Yellow, or Green LED. There is also a capture button which saves the image to a folder. The goal is to capture about 100 images and annotate them in Roboflow. We will train on the laptop with TensorFlow and convert to TensorFlow Lite. Then hopefully we can run inference with the BLE Sense mounted on the mobile robot and make it obey the traffic laws. Let us see how that goes. You will note that the traffic light has built-in resistors and just plugs into the Arduino.
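Before the TensorFlow Lite model is trained, a crude color heuristic can serve as a baseline and sanity-check the captured images. This minimal sketch is an assumption for illustration, not the planned ML pipeline; the `classify_light` helper and its thresholds are made up, and simply compare average channel intensities over pixels cropped from the lit lamp region:

```python
def classify_light(pixels):
    """Classify a list of (r, g, b) pixels from the lit lamp region
    as 'red', 'yellow', or 'green' by average channel intensity.

    A crude baseline; the trained TFLite model replaces this.
    """
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    # A yellow lamp lights up both the red and green channels strongly.
    if r > 1.5 * b and g > 1.5 * b and min(r, g) > 0.6 * max(r, g):
        return "yellow"
    return "red" if r >= g else "green"
```

Comparing the baseline's labels against the Roboflow annotations is a cheap way to catch mislabeled or badly exposed captures before spending time on training.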