Founder & CEO @ hushh 🤫, ex-Google Director of Product - Core Developer & ML, GPU TPU AI/ML Infra, Board Member, Tech Advisor & Investor, ex-Microsoft AI, Splunk AI/ML, E&Y / Capgemini, UBS, ING, Adobe, Sun
Great work here Rohan Sidankar. Thanks to Rohan and Justin, hushh is known for its work optimizing neural network search through dimensionality reduction, specifically with ViT-B-32 CLIP embeddings for various search tasks as we build out our search index. The approach maintains recall while shrinking vector size, which makes search against large datasets like the LAION Benchmark Repository both more efficient and more effective.
Fractional AI Officer, Founder @ 🤫 hushh, ex-Salesforce Principal Data Scientist/Engineer, Advisor for UW Continuing Education and AI/Aeronautical/Health startups.
Rohan Sidankar did a great job diving into dimensionality reduction for neural network search at Hushh. Here are some of his findings on tuning ViT-B-32 CLIP for various search tasks against the LAION Benchmark Repository. Some of the gains may not look like much on their own, but the big benefit is the ability to shrink the vector size while maintaining recall; a rough sketch of that kind of experiment is below.
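This isn't Rohan's actual pipeline, just a minimal sketch of the general idea in Python: take precomputed ViT-B-32 CLIP embeddings, fit PCA on the corpus, project queries and corpus to a smaller dimension, and compare recall@k before and after. The embedding arrays, dimensions, and helper names here are illustrative placeholders, not hushh's code.

```python
# Sketch: PCA-based dimensionality reduction of CLIP-style embeddings,
# comparing recall@k at full size vs. reduced size. Assumes you have
# already extracted L2-normalized 512-d ViT-B-32 CLIP vectors.
import numpy as np
from sklearn.decomposition import PCA

def recall_at_k(query_emb, corpus_emb, ground_truth, k=10):
    """Fraction of queries whose true match appears in the top-k neighbors."""
    scores = query_emb @ corpus_emb.T            # cosine similarity (inputs normalized)
    topk = np.argsort(-scores, axis=1)[:, :k]    # indices of the k best matches per query
    hits = [gt in row for gt, row in zip(ground_truth, topk)]
    return float(np.mean(hits))

# Toy stand-in data; replace with real CLIP embeddings and ground-truth labels.
rng = np.random.default_rng(0)
corpus_emb = rng.normal(size=(10_000, 512)).astype(np.float32)
corpus_emb /= np.linalg.norm(corpus_emb, axis=1, keepdims=True)
query_idx = rng.choice(10_000, size=500, replace=False)
query_emb = corpus_emb[query_idx] + 0.1 * rng.normal(size=(500, 512))
query_emb /= np.linalg.norm(query_emb, axis=1, keepdims=True)

baseline = recall_at_k(query_emb, corpus_emb, query_idx, k=10)

# Fit PCA on the corpus, project both sides down to 128 dims, re-normalize.
pca = PCA(n_components=128).fit(corpus_emb)
corpus_128 = pca.transform(corpus_emb)
corpus_128 /= np.linalg.norm(corpus_128, axis=1, keepdims=True)
query_128 = pca.transform(query_emb)
query_128 /= np.linalg.norm(query_128, axis=1, keepdims=True)

reduced = recall_at_k(query_128, corpus_128, query_idx, k=10)
print(f"recall@10  512-d: {baseline:.3f}   128-d: {reduced:.3f}  (4x smaller vectors)")
```

The point of the comparison is the trade-off the post describes: if recall@k holds up after projection, the index stores vectors a quarter of the size, which cuts memory and speeds up similarity search over large corpora like LAION.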