📃Scientific paper: Spacecraft Coatings Optimizing LiDAR Debris Tracking and Light Pollution Impacts Abstract: Space safety and astronomy are at odds. The problem posed by space debris and derelict satellites in low Earth orbit is an existential threat to all space operations. These dangerous objects are more easily tracked with ground-based LiDAR if they are highly reflective, especially in the near-infrared (NIR) range. At the same time, reflective objects in orbit are the bane of ground-based astronomers, causing light pollution and marring images with bright streaks. How can this tension be resolved? The hypothesis tested is that a near-infrared-transparent (NIRT) coating, opaque in the visible range and transparent in the NIR, is a promising candidate for use in satellite construction. This experiment tests whether typical spacecraft surfaces, such as anodized aluminum or multi-layer insulation (MLI), absorb visible light and reflect NIR once a NIRT coating is applied. The findings confirm the efficacy of the NIRT coating for this purpose, reducing visible-light reflection by 47% (±3%) and increasing NIR reflection by 7% (±2%). This promising novel NIRT coating may help provide a path forward to resolve the tension between astronomy and the space industry. Continued on ES/IODE ➡️ https://etcse.fr/jAB2 ------- If you find this interesting, feel free to follow, comment and share. We need your help to enhance our visibility, so that our platform continues to serve you.
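As a back-of-envelope illustration of the abstract's reported numbers (the 47% and 7% figures are from the paper; everything else here is assumed), the visible-light reduction can be expressed as a change in streak brightness in astronomical magnitudes, while the NIR LiDAR return scales roughly linearly with reflectance:

```python
import math

def coating_effect(vis_reduction=0.47, nir_gain=0.07):
    """Relative effect of the reported NIRT coating numbers: 47% (+/-3%)
    less visible reflection, 7% (+/-2%) more NIR reflection.
    Returns (magnitudes fainter in visible, LiDAR return multiplier)."""
    vis_factor = 1.0 - vis_reduction           # fraction of visible light still reflected
    delta_mag = -2.5 * math.log10(vis_factor)  # positive = fainter streak for astronomers
    lidar_factor = 1.0 + nir_gain              # NIR return scales ~linearly with reflectance
    return delta_mag, lidar_factor

dmag, lidar = coating_effect()
print(f"streak fainter by {dmag:.2f} mag, LiDAR return x{lidar:.2f}")
```

So under these simplifying assumptions the coating dims the visible streak by roughly 0.7 magnitudes while slightly boosting the NIR signal a tracking LiDAR sees.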
es/iode’s Post
More Relevant Posts
-
In the vast universe of SATELLITE imaging calibration, NASA boasts a colossal “analog checkerboard”—a 92-square-kilometer patch nestled in Nevada. It serves as the grandmaster of reference points for fine-tuning space sensors and cameras: a flat no-man’s land under cloudless skies. At Hexago Technologies, we champion 100% DIGITAL and SMALL-IS-BEAUTIFUL CALIBRATION. Our ReMap station, comparable in size to a MINI-FRIDGE, rectifies geometric distortions, chromatic aberrations, and zoom displacements. It achieves this correction for multiple imaging systems in under 60 SECONDS, addressing every pixel. www.hexago-tech.com revolutionizes the accuracy of imaging systems. #earthobservation
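Hexago's actual ReMap algorithm isn't described in the post; as a generic sketch, the kind of geometric lens distortion such calibration stations correct is commonly modeled with the Brown-Conrady radial polynomial. The coefficients below are hypothetical, not from any real camera:

```python
def apply_radial(xu, yu, k1=-0.12, k2=0.02):
    """Forward Brown-Conrady radial distortion of a normalized image point:
    x_d = x_u * (1 + k1*r^2 + k2*r^4).  Negative k1 gives barrel
    distortion; coefficients here are purely illustrative."""
    r2 = xu * xu + yu * yu
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xu * scale, yu * scale

print(apply_radial(0.0, 0.0))  # the image centre is unmoved
print(apply_radial(0.5, 0.5))  # off-axis points pull inward (barrel)
```

Calibration amounts to estimating coefficients like k1 and k2 from known reference targets, then inverting the mapping for every pixel.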
-
Image sensors have become a crucial component in space missions, capturing high-quality images under challenging conditions for Earth observation, planetary exploration, and astronomy. The rising need to improve the resolution, sensitivity, and durability of image sensors has led to backside-illuminated sensors, time-delay integration sensors, and radiation-hardened sensors. As technology continues to advance, image sensors will become even more critical in advancing our understanding of the universe. Read more: https://buff.ly/3l6bqWJ #machinevision #machinevisionsolutions #imagesensors #space #astronomy #spaceexploration #machinevisionsensors #spacetechnology #sensortechnology #robotics #imaging #hwyl #imagingthefuture
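Time-delay integration (TDI), one of the sensor types mentioned above, sums N successive exposures of the same scene line as it sweeps across the stages: signal grows by N while uncorrelated read noise grows by √N, so SNR improves roughly as √N. A toy Monte-Carlo sketch with illustrative electron counts:

```python
import numpy as np

def tdi_snr_gain(n_stages, signal_e=100.0, read_noise_e=10.0, seed=0):
    """Simulate TDI: the same line is exposed n_stages times and the
    charges are summed.  Signal adds coherently (x N) while read noise
    adds in quadrature (x sqrt(N)), so SNR improves ~sqrt(N).
    The electron counts are illustrative, not from any real sensor."""
    rng = np.random.default_rng(seed)
    single = signal_e + rng.normal(0.0, read_noise_e, size=100_000)
    stacked = sum(signal_e + rng.normal(0.0, read_noise_e, size=100_000)
                  for _ in range(n_stages))
    snr_single = single.mean() / single.std()
    snr_tdi = stacked.mean() / stacked.std()
    return snr_tdi / snr_single

print(f"16-stage TDI SNR gain ~ {tdi_snr_gain(16):.1f} (sqrt(16) = 4)")
```

This is why TDI is popular for push-broom Earth-observation imagers: the platform's own motion provides the line-by-line sweep for free.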
-
NASA's latest experimental jet, the X-59, aims to minimize the sonic boom, a thunderous noise produced when supersonic aircraft break the sound barrier. Traditional jets create shock waves that result in this loud boom, but the X-59's slender, pointed design is engineered to direct sound waves away from the ground. Predicted to fly at Mach 1.4, or 925 miles per hour, the X-59 seeks to make supersonic commercial flight feasible again by reducing the noise to a level that might lift the 1974 ban on non-military supersonic jets flying over land. In 2025, NASA will test the X-59 over its Armstrong Flight Research Center in California, monitoring sound levels with ground recorders. By 2026, the jet will fly over major U.S. cities to gauge public reaction to its noise levels. This data, combined with technical findings, will be presented to regulators to support lifting the ban. Key design features include a T-tail to minimize aft shock, a high-thrust engine mounted on the upper side, and a long, tapered nose to break up shock waves. Inside, the X-59 uses an eXternal Vision System, replacing the central front window with a 4K monitor that displays live camera footage augmented with flight data. This setup helps pilots maintain visibility and control, essential for safely navigating supersonic speeds. NASA uses advanced supercomputers to simulate flight conditions and refine the jet’s design, aiming to reduce sonic booms through precise engineering adjustments. The goal is to create a quieter supersonic experience that could revolutionize air travel.
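The quoted 925 mph for Mach 1.4 checks out for stratospheric cruise, where the speed of sound a = √(γRT) is lower than at sea level. A quick sketch, assuming the ISA stratosphere temperature (216.65 K) as a stand-in for the X-59's actual cruise conditions:

```python
import math

def mach_to_mph(mach, temp_k=216.65):
    """Convert a Mach number to mph at a given air temperature.
    a = sqrt(gamma * R * T); 216.65 K is the ISA temperature above
    ~11 km, roughly the X-59's planned cruise band (an assumption here)."""
    gamma, R = 1.4, 287.05            # air: specific-heat ratio, gas constant J/(kg K)
    a = math.sqrt(gamma * R * temp_k) # local speed of sound, m/s
    return mach * a * 2.23694         # m/s -> mph

print(f"Mach 1.4 in the stratosphere ~ {mach_to_mph(1.4):.0f} mph")
```

At sea level the same Mach number would be over 1,000 mph, which is why Mach-to-mph figures only make sense with an altitude attached.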
-
https://lnkd.in/ddVm_Yc2 Since the first launch of a human-made machine into space, approximately 55,000 satellites had been deployed as of 2023. It is further estimated that there are millions of pieces of debris larger than 1 centimeter, as well as tens of millions of smaller debris particles that are hard to track. A recent study published in IET Radar, Sonar & Navigation reveals that researchers found a way to track these troublesome metal objects using AI – they reportedly used data from the Tracking and Imaging Radar to teach computers how to recognize and track objects. The method is a detection system based on the You-Only-Look-Once (YOLO) algorithm, which, unlike older methods that need several passes or sliding windows, can spot objects in a single pass through the neural network, making it much faster and more efficient. According to Interesting Engineering, the evaluation was conducted in a simulated environment mimicking real-world conditions, and the results showed that the YOLO-based detection system outperformed all traditional approaches, achieving a higher detection rate while maintaining a low rate of false alarms. Co-author of the study Federica Massimi explains, “In addition to improving space surveillance capabilities, artificial intelligence–based systems like YOLO have the potential to revolutionize space debris management. By quickly identifying and tracking hard-to-detect objects, these systems enable proactive decision-making and intervention strategies to mitigate collisions and risks and preserve the integrity of critical space resources.” The reason many space agencies and organizations worldwide track space debris is that removing these particles is essential to maintaining a safe and sustainable space environment. But with the increasing speed and quantity of satellite launches, the debris is only bound to increase.
Satellites today avoid debris by carefully planning orbits to minimize the risk of collision with known objects and by carrying protective shielding to withstand small impacts. Satellite operators also use ground-based tracking systems to monitor debris and maneuver satellites away from potential collisions. #technews #ai #ia #machinelearning #space #defense #security
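The "single pass" property that distinguishes YOLO from sliding-window detectors can be illustrated by decoding a YOLO-style output grid: every cell predicts a box and a confidence in one forward pass, and a threshold keeps the detections. This is a toy layout, not the paper's actual model:

```python
import numpy as np

def decode_grid(pred, conf_thresh=0.5):
    """Decode a YOLO-style S x S x 5 output tensor in a single pass.
    Each cell predicts (x, y, w, h, confidence), with x and y relative
    to the cell; no region re-scanning is needed.  The layout and
    numbers are illustrative only."""
    s = pred.shape[0]
    boxes = []
    for i in range(s):
        for j in range(s):
            x, y, w, h, conf = pred[i, j]
            if conf >= conf_thresh:
                # convert cell-relative centre to image-relative coords
                boxes.append(((j + x) / s, (i + y) / s, w, h, conf))
    return boxes

# toy 4x4 grid with one confident detection in cell (row 1, col 2)
pred = np.zeros((4, 4, 5))
pred[1, 2] = [0.5, 0.5, 0.1, 0.1, 0.9]
print(decode_grid(pred))
```

The expensive part (the network forward pass) happens once for the whole image; decoding is just this cheap loop, which is where the speed advantage over multi-pass methods comes from.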
AI-Powered Solution to Track Space Debris - iHLS
i-hls.com
-
On a mission to democratize computing in space. Currently focused on building high performance compute systems for small satellites. Hardware engineer with 20+ years experience.
The more I learn about the IM-1 mission, the more I'm impressed. The amount of problem solving on-the-fly was incredible. The team overcame many problems that would have ended other missions. The biggest challenge has to be lack of altitude data. Apollo and other missions used radar, but Odysseus planned to use a laser. Unfortunately, the primary system was left inhibited during launch and couldn't be activated in flight. They happened to be carrying a second laser as a tech demo, which got a field promotion to primary navigation. Unfortunately, the software patch didn't fully work. This meant all Odysseus had was an Inertial Measurement Unit (IMU), which gave it a rough idea of where it was, and cameras, which could compensate for the drift in the IMU. That the spacecraft was able to soft land at all (tipping over not withstanding), is a testament to the robustness of their systems. This was a fantastic demonstration of terrain relative navigation. The Ingenuity helicopter on Mars used similar techniques to find its way, combining a camera with an image processor to identify relevant features and track them. As we continue to explore the Moon, this technology will be essential to enable surface rendezvous for more complex missions. Apollo 12 did this in 1969, landing within walking distance of Surveyor 3. They had Pete Conrad onboard to look out the window and fly the lander, though. Robotic missions will require advanced computers to process the image data from the cameras. I'm excited to be working on this enabling technology at Zephyr Computing Systems, building high performance computing systems for the next generation of space exploration. For more information the IM-1 mission, I recommend this article and others on the same topic from Eric Berger. https://lnkd.in/ebR2E9A5
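The IMU-drifts-and-cameras-correct pattern described above can be sketched as a 1-D complementary filter: dead-reckon from biased IMU increments, then pull the estimate toward occasional camera-derived position fixes. This is a toy, not Intuitive Machines' flight software:

```python
def fuse(imu_deltas, camera_fixes, alpha=0.3):
    """Dead-reckon position from IMU increments (whose constant bias
    makes the estimate drift) and blend in occasional camera fixes.
    camera_fixes maps step index -> measured position.  A toy 1-D
    complementary filter with made-up gain alpha."""
    pos = 0.0
    history = []
    for k, d in enumerate(imu_deltas):
        pos += d                              # IMU propagation (drifts)
        if k in camera_fixes:                 # camera fix available this step
            pos = (1 - alpha) * pos + alpha * camera_fixes[k]
        history.append(pos)
    return history

# true motion: +1.0 per step; the IMU reports +1.05 (5% bias)
truth = [float(k + 1) for k in range(20)]
est = fuse([1.05] * 20, {k: truth[k] for k in range(4, 20, 5)})
drift_no_fix = abs(20 * 1.05 - truth[-1])     # 1.0 unit of pure IMU drift
print(f"final error with fixes: {abs(est[-1] - truth[-1]):.2f} "
      f"vs {drift_no_fix:.2f} without")
```

Even sparse, noisy optical fixes bound the otherwise unbounded IMU drift, which is essentially what terrain relative navigation buys a lander.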
It turns out that Odysseus landed on the Moon without any altimetry data
arstechnica.com
-
Climate Change, Sustainability & Technology Strategist | Driving Alignment Between Science, Tech, & Policy to Build Solutions at Scale
🛰️ Have you ever wondered how the unseen particles in our atmosphere are constantly shaping the quality of the air we breathe? Thanks to advancements in earth science data collection and modeling, we're now able to illuminate the invisible forces, such as air quality, shaping our everyday life. 🌏 This recently released animation, fueled by the powerful GEOS-5 model, sheds light on the patterns behind aerosol transport - and the interconnections between Earth's systems. Once airborne, particles like dust, black carbon, and sea salt can travel across continents and over oceans, influencing climate and weather patterns and altering air quality in communities across the globe. 🚀 NASA's GEOS-5 model fuses complex equations with measurements from an array of sources, including satellites, aircraft sensors, and ground instruments to simulate real-world conditions several times a day. Models such as GEOS-5, paired with compelling visualizations, are key to understanding the connections between Earth's systems, and driving informed decision-making. Watch the animation, here: https://ow.ly/mKEe50QwzeY #NASA #GMAO #GEOS5 #EarthObservation #DataScience #ClimateAction #TechForGood #Innovation #gis #maps #dataviz
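GEOS-5 solves vastly more than this, but the kernel of aerosol transport is advecting a tracer field with the wind. A minimal 1-D upwind scheme (with a crude decay term standing in for deposition) shows a dust plume drifting downwind; all parameters here are illustrative:

```python
import numpy as np

def advect(tracer, wind=1.0, dx=1.0, dt=0.5, steps=40, decay=0.01):
    """Move a tracer along a constant wind with a first-order upwind
    scheme (stable for wind*dt/dx <= 1), plus exponential loss as a
    stand-in for deposition.  A toy of transport, nothing like GEOS-5."""
    q = tracer.astype(float).copy()
    c = wind * dt / dx                        # Courant number
    for _ in range(steps):
        q[1:] = q[1:] - c * (q[1:] - q[:-1])  # upwind difference (wind > 0)
        q[0] = 0.0                            # clean-air inflow boundary
        q *= (1.0 - decay * dt)               # deposition sink
    return q

q0 = np.zeros(100)
q0[10:15] = 1.0                               # initial dust plume
q = advect(q0)
print(f"plume peak moved to cell {int(q.argmax())}")
```

The plume travels wind·dt·steps downwind while smearing out and losing mass, which is qualitatively the continental-scale dust transport the animation visualizes.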
-
In the domain of celestial observation, astronomers perceive the atmosphere as a source of noise, whereas for atmospheric physicists, it represents a crucial source of signal. That's why a high Signal-to-Noise Ratio (SNR) is essential from a technical & data point of view! I recently came across a paper published by a group of scientists from the European Space Agency - ESA & Thales Alenia Space about the newest European space mission dedicated to atmospheric observation. The scientific payload was designed to detect, measure and infer chemical composition and suspended particles in the atmosphere. The payload consists of 3 hyperspectral imagers, one of which focuses on the SWIR, where CH4 and CO2 have absorption lines. The HSI has a spectral resolution of ~0.3nm! Their SWIR spectral response is shown in the bottom left of the image below. My Australian SWIR RedEye-1, developed at the ARC Training Centre for CubeSats, UAVs, and their Applications (CUAVA) & SAIL labs, offers the same spectral response in the SWIR with a spectral resolution of ~0.3nm! Its spectral profile is shown at the bottom right of the image! It is very satisfying to see how #RedEye-1, developed with COTS optics, is achieving almost identical spectral responsivity to the CO2M mission. Copernicus CO2M mission paper: https://lnkd.in/gFshFB6u RedEye-1 paper: https://lnkd.in/gWin5S7U
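For context on the ~0.3 nm figure: spectral resolving power is R = λ/Δλ, so at the SWIR bands where CH4 and CO2 absorb this works out to a few thousand. The band centres below are approximate, used only to illustrate the arithmetic:

```python
def resolving_power(center_nm, fwhm_nm):
    """Spectral resolving power R = lambda / delta-lambda."""
    return center_nm / fwhm_nm

# CH4 and CO2 have strong SWIR absorption near these (approximate) bands
for band_nm in (1600.0, 2000.0):
    print(f"R at {band_nm:.0f} nm with 0.3 nm resolution: "
          f"{resolving_power(band_nm, 0.3):.0f}")
```

Resolving power in the thousands is what lets an imager separate the narrow gas absorption lines from the surrounding continuum, which in turn sets how well column CH4 and CO2 can be retrieved.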
-
Electronic Engineer. Sr. Project Manager(PMP), Energy Leader @ CACME (WEC) & Postgraduate Diploma in Hydrogen Economy @ UTN (FRBA)
Free-Flying Robots in Space: How Real-Life Droids Test New Tech. by Clarence Oxford - Los Angeles CA (SPX) Navigating the great unknown of space requires innovative tools, like the Multi-Resolution Scanner (MRS), a novel 3D mapping technology developed under the auspices of the International Space Station (ISS) National Laboratory. This technology leverages NASA's robotic Astrobee system to generate high-resolution maps of remote environments, aiding not only space exploration but also terrestrial industries. A joint effort by Boeing and CSIRO, the Australian scientific research agency, the project underscores the vital role of international cooperation in advancing space technology. Historically, CSIRO has played a significant role in space exploration, having supported the Apollo 11 mission by receiving and broadcasting lunar television signals through its Parkes radio telescope. The MRS aims to produce detailed 3D maps of environments such as the ISS or lunar and Martian terrains. "We will use NASA's free-flying Astrobee robots to test MRS, which will allow us to create 3D maps of the space station's Kibo module," said Marc Elmouttie, research group leader at CSIRO. The system combines multiple sensors to enhance data accuracy and resolution, crucial for the robot's movement and data collection in space. This technology also incorporates advanced photogrammetry and 3D simultaneous localization and mapping (SLAM) technology, enabling autonomous navigation and environment mapping. The first tests are conducted in the Kibo module to compare the technology's performance in microgravity against terrestrial results. Elmouttie and his team aim to validate the effectiveness of their mapping software in space, initially launched aboard a SpaceX resupply mission to the ISS. If successful, MRS could be extended to other space station modules and potentially used in future robotic-led missions, such as NASA's planned lunar Gateway. 
This could facilitate autonomous operations in space when human presence is limited, enhancing the capability and safety of both crewed and robotic space missions. "Boeing is committed to providing improved capabilities and enhancing safety for trips to the Moon and beyond," said Scott Copeland, director for ISS research integration at Boeing. https://lnkd.in/dC7CEwyW
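The MRS internals aren't given in the post, but the photogrammetric core of turning multiple camera views into 3-D structure is stereo triangulation: depth from disparity, Z = f·B/d. The focal length, baseline, and pixel values below are illustrative, not MRS specifications:

```python
def stereo_depth(f_px, baseline_m, x_left_px, x_right_px):
    """Triangulate depth from two rectified camera views:
    Z = f * B / disparity, with f in pixels and the baseline in metres.
    All numbers used below are made up for illustration."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return f_px * baseline_m / disparity

# 600 px focal length, 10 cm baseline, 12 px disparity
print(f"depth = {stereo_depth(600.0, 0.10, 412.0, 400.0):.2f} m")
```

SLAM systems like the one described chain many such triangulated points across frames, simultaneously refining the map and the robot's pose within it.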
-
In a groundbreaking achievement, Insta360 has elevated its innovation to new heights. By sending a pair of Insta360 X2 action cameras into outer space, these cutting-edge devices are now orbiting Earth, capturing unprecedented 360-degree views of our planet, stars, and the Milky Way galaxy. This remarkable milestone marks the first journey of a 360-degree action camera into space. The unique challenges posed by the space environment prompted meticulous engineering efforts. The X2 cameras were adapted to withstand extreme temperature fluctuations, ranging from -70 to 50 degrees Celsius (-94 to 122 degrees Fahrenheit), as the satellite orbits Earth every 90 minutes. During the journey into space, shocks and vibrations posed concerns; although these diminished once in orbit, ensuring that the cameras remained securely positioned for the entire voyage was crucial. The specially adapted X2 cameras are now successfully orbiting Earth, relaying captivating photos and videos back to our planet through a specially developed communication network. This technological feat marks a significant leap forward in capturing the wonders of space through the lens of a 360-degree action camera. 🌌📷 #Insta360InSpace #360DegreeView #SpaceExploration #Space #Photography #Science #Innovation #Tech
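The 90-minute orbital period quoted above pins the satellite to low Earth orbit; Kepler's third law recovers the approximate altitude (using standard values for Earth's gravitational parameter and mean radius):

```python
import math

MU_EARTH = 398_600.4418   # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_371.0         # mean Earth radius, km

def altitude_from_period(period_min):
    """Circular-orbit altitude from the period via Kepler's third law:
    T = 2*pi*sqrt(a^3 / mu)  =>  a = (mu * (T / (2*pi))^2)^(1/3)."""
    t_s = period_min * 60.0
    a = (MU_EARTH * (t_s / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
    return a - R_EARTH

print(f"90-minute orbit ~ {altitude_from_period(90):.0f} km altitude")
```

That puts the cameras a few hundred kilometres up, consistent with the rapid sunlight/shadow cycling behind the -70 to 50 °C swings the post describes.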
First 360-Degree Action Camera in Space Captures New View of Earth
petapixel.com