Receive a world-class education from outstanding faculty on where the endoscopy field is headed on October 25 at the ASGE Endoscopy Course, led by Course Directors Michelle A. Anderson, MD, MSc, FASGE, Nalini Guda, MD, MASGE, and Irving Waxman, MD, FASGE. The program includes a keynote on how AI will shape the future of GI, delivered by ASGE President Prateek Sharma, MD, FASGE. Sign up now! https://hubs.ly/Q02K_lJC0 #GIEndoscopy #Endoscopy #Gastroenterology
🌟 Exciting news! This year, at #ECR2024 in Vienna 🇦🇹, we at Viz.ai are going boothless to enhance flexibility for our valued customers and partners.
Connect with me and my colleague Daniel D'Amour to discover how Viz.ai One accelerates diagnosis and treatment with AI-driven notifications, streamlined communication tools, and prioritized worklists. Don't miss out on the future of AI in healthcare! 🚀💡 #AIHealthcare #AIinRadiology
It all starts tomorrow and runs until Saturday! 📅
👨⚕️ Our study, conducted in collaboration with the University Hospitals and recently published in the journal Academic Radiology, yielded outstanding results in enhancing radiology efficiency.
🏥 We analysed 2,626 radiographs with input from 24 physician readers for a multi-reader, multi-case (MRMC) analysis.
🌟 Performance Study Highlights:
🚀 Negative Predictive Value: 99.6%
🚀 False Negatives Avoided: 67%
🚀 Average Reading Time Reduced: 27% per Exam
🤝 Connect with our team to learn more: https://bit.ly/rayvolve
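For readers curious how headline figures like these are typically derived, here is a minimal Python sketch. It is not the study's code or data; the function names and all counts and times below are hypothetical placeholders, shown only to make the definitions of NPV and reading-time reduction concrete.

```python
# Illustrative sketch only -- NOT the published study's code or data.
# All counts and times are hypothetical placeholders.

def negative_predictive_value(true_negatives: int, false_negatives: int) -> float:
    """NPV = TN / (TN + FN): how often a negative call is actually correct."""
    return true_negatives / (true_negatives + false_negatives)

def reading_time_reduction(unaided_seconds: float, aided_seconds: float) -> float:
    """Relative reduction in mean reading time per exam with AI assistance."""
    return (unaided_seconds - aided_seconds) / unaided_seconds

# Hypothetical example values (not from the paper):
print(f"NPV: {negative_predictive_value(1980, 8):.1%}")                        # ~99.6%
print(f"Reading time reduced: {reading_time_reduction(90, 66):.0%} per exam")  # ~27%
```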
Exciting News! 😃
Our latest research paper, "Exploring the Potentials of Large Language Models in Vascular and Interventional Radiology: Opportunities and Challenges," is now published in the Arab Journal of Interventional Radiology. This work, conducted with my esteemed colleagues Taofeeq Togunwa, Christabel Uche-Orji and Richard Olatunji, explores the transformative potential of AI in enhancing precision and efficiency in healthcare.
In our review, we explore the promising applications and address the inherent challenges of integrating AI into clinical practices and patient care.
Kindly check out our full paper in the comment section below.
#ArtificialIntelligence #Radiology #Healthcare #Research
🏥 Gleamer is thrilled to share a groundbreaking study published in BMJ Open by the esteemed Freiburg hospital's radiology team. 📑
🤔 What's the study about? The radiology team embarked on a fascinating journey to develop their own model for wrist fracture detection, comparing it against the renowned BoneView. 🦴💻
💥 The results? BoneView showcased an impressive sensitivity of 95% on a local dataset derived from ER patient files (versus 87% for the custom model).
🚀 This study underscores that custom models specialized for individual body areas offer no significant advantage, as BoneView matched their performance. Adapting seamlessly to any dataset is simply part of our expertise.
Curious to learn more? Dive into our clinical study gallery: https://lnkd.in/g852BqRD #AI #Radiology #Innovation #Healthcare #Gleamer #BMJStudy #WristFractureDetection
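As a rough illustration of how two detectors' sensitivities can be compared on the same set of positive cases, here is a Python sketch using Wilson 95% confidence intervals. The interval method and the number of fracture cases are my own assumptions for the example, not the analysis reported in the BMJ Open paper.

```python
# Illustrative sketch only -- not the published study's analysis code.
# Compares two detectors' sensitivities with Wilson 95% confidence intervals.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (e.g. sensitivity)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

n_positives = 200  # hypothetical number of true wrist fractures in the test set
for name, sens in [("BoneView", 0.95), ("custom model", 0.87)]:
    detected = round(sens * n_positives)
    lo, hi = wilson_ci(detected, n_positives)
    print(f"{name}: sensitivity {detected / n_positives:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```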
🩻 Pulmonary lesions. Hard to see, hard to detect.
📝 Our second Carebot poster at European Society of Radiology #ecr2024 focuses on the detection of suspicious lesions of the lung parenchyma on chest X-ray. While the CT examination offers a higher sensitivity in detecting lesions, it is often the routinely performed chest X-ray where such changes could be first observed.
🤖 To address this, we designed a deep learning-based assistive system (Carebot AI CXR) that evaluated all X-ray images in a medium-sized hospital (n=956) over one month. Our proposed algorithm achieved a sensitivity of 0.905, significantly higher than that of the six radiologists in a multi-reader comparison study. It also achieved a well-balanced specificity of 0.893, comparable to the radiologists, given the low prevalence of suspicious lesions.
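To make the sensitivity/specificity trade-off concrete, here is a small Python sketch of how these metrics, plus positive predictive value, fall out of a confusion matrix at low prevalence. The case counts are hypothetical, chosen only to be consistent with the figures quoted above; they are not taken from the poster.

```python
# Illustrative sketch only -- not Carebot's evaluation code; all counts are
# hypothetical. Shows why, at low lesion prevalence, specificity drives the
# number of false alarms even when sensitivity is the headline figure.

def confusion_metrics(tp: int, fp: int, tn: int, fn: int) -> dict[str, float]:
    return {
        "sensitivity": tp / (tp + fn),  # share of true lesions flagged
        "specificity": tn / (tn + fp),  # share of normals correctly cleared
        "ppv": tp / (tp + fp),          # precision of the positive alerts
    }

# Hypothetical month of 956 exams with roughly 4% lesion prevalence:
tp, fn = 38, 4      # 42 true lesions, 38 flagged
tn, fp = 816, 98    # 914 normals, 98 false alarms
m = confusion_metrics(tp, fp, tn, fn)
print(f"sensitivity {m['sensitivity']:.3f}, "
      f"specificity {m['specificity']:.3f}, PPV {m['ppv']:.2f}")
```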