Arcee.ai

Software Development

San Francisco, California · 6,423 followers

Seamlessly Merge, Train, & Deploy your own Small Language Models (SLMs) in any environment.

About us

Arcee AI believes in a world in which every organization builds their own specialized AI on top of open source general intelligence, maintains ownership of their models, and leverages model merging to efficiently create the most accurate models. We’re making that world a reality with our end-to-end system for merging, training, and deploying Small Language Models (SLMs) that you own and that are adapted to your domain and data. Our solution is user-friendly and enables seamless deployment to any cloud or platform for inference.

Website
https://arcee.ai
Industry
Software Development
Company size
11-50 employees
Headquarters
San Francisco, California
Type
Privately Held
Founded
2023
Specialties
LLM, NLP, AI, Applied NLP, Data, Data Science, and Machine Learning

Updates

    Another day, another Arcee AI Small Language Model (SLM) making waves in the #GenAI and #opensource world 🌊 🌊 🌊 Today, we unveil 𝗔𝗿𝗰𝗲𝗲-𝗠𝗲𝗿𝗮𝗷-𝗠𝗶𝗻𝗶, a 7B Arabic-language model – coming on the heels of our original top-performing Arabic model, 72B Arcee-Meraj. Arcee-Meraj-Mini is designed to make advanced Arabic-language understanding more accessible to everyone. Check out the full blog post here: https://lnkd.in/eBNH_vbv. And explore the model yourself over on Hugging Face (https://lnkd.in/e4qKUxCY) or in this Google Colab notebook (https://lnkd.in/eSRunKSk). Our Arcee-Meraj series of models, and our other multilingual models, are bridging business and cultural gaps in the quest to make #AI accessible to all. Arcee-Meraj-Mini boasts exceptional performance in the following:
    • instruction-following
    • generation of long texts
    • structured data understanding
    • generation of structured outputs
    Give it a try and let us know what you think! 🙌 #AI #ML #NLP #LLM

    7 Billion Reasons to Choose Arcee-Meraj-Mini: The Open-Source Arabic SLM for All
    blog.arcee.ai
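
    For anyone who wants a quick start outside the Colab, here is a minimal sketch of loading and prompting the model with the Hugging Face transformers library. The repository id "arcee-ai/Meraj-Mini" and the generation settings are assumptions for illustration only; check the Hugging Face link above for the exact model name and recommended usage.

        # Minimal sketch: prompting an Arcee Arabic SLM with transformers.
        # The repo id below is an assumption; verify it on Hugging Face.
        import torch
        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_id = "arcee-ai/Meraj-Mini"  # assumed repository id
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(
            model_id, torch_dtype=torch.bfloat16, device_map="auto"
        )

        # Arabic prompt: "Summarize the benefits of small language models."
        messages = [{"role": "user", "content": "لخّص فوائد نماذج اللغة الصغيرة."}]
        inputs = tokenizer.apply_chat_template(
            messages, add_generation_prompt=True, return_tensors="pt"
        ).to(model.device)

        outputs = model.generate(inputs, max_new_tokens=256)
        print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))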

    🤔 If you’re wondering how Arcee AI is consistently releasing Small Language Models (SLMs) whose performance rivals #LLMs – there’s no secret to our success. ✨ It’s because our technology is powered by a world-class research team that’s constantly updating our model training pipeline, making it possible to get MORE out of models that are increasingly SMALL. 🎉 We’re proud to announce the latest publication by our researchers, "𝗠𝗲𝗿𝗴𝗶𝗻𝗴 𝗶𝗻 𝗮 𝗕𝗼𝘁𝘁𝗹𝗲: 𝗗𝗶𝗳𝗳𝗲𝗿𝗲𝗻𝘁𝗶𝗮𝗯𝗹𝗲 𝗔𝗱𝗮𝗽𝘁𝗶𝘃𝗲 𝗠𝗲𝗿𝗴𝗶𝗻𝗴 (𝗗𝗔𝗠) 𝗮𝗻𝗱 𝘁𝗵𝗲 𝗣𝗮𝘁𝗵 𝗳𝗿𝗼𝗺 𝗔𝘃𝗲𝗿𝗮𝗴𝗶𝗻𝗴 𝘁𝗼 𝗔𝘂𝘁𝗼𝗺𝗮𝘁𝗶𝗼𝗻." The paper introduces DAM as an efficient, data-informed, adaptive merging approach that can serve as an alternative to evolutionary merging. DAM optimizes model integration through learned scaling coefficients, minimizing computational demands. The paper also explores model merging techniques across a spectrum of complexity, examining where automated methods like evolutionary strategies stand compared to hyperparameter-driven approaches such as DARE and TIES-Merging, and to simpler methods like Model Soups. Our findings challenge the traditional assumption that more complex methods are inherently superior, showing that straightforward techniques like linear averaging can perform just as well, especially when merged models share similar characteristics. Huge congrats to our researchers Thomas Gauthier-Caron, Shamane Siri, PhD, Elliot Stein, Malikeh Ehghaghi, Charles Goddard, Mark McQuade, and Jacob Solawetz... with a special shout-out to our longtime collaborator and co-author, Maxime Labonne. Read the paper here ⬇ https://lnkd.in/e_DdYBDm #NLP #GenAI

    Merging in a Bottle: Differentiable Adaptive Merging (DAM) and the Path from Averaging to Automation
    arxiv.org
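
    As a rough illustration of the simple end of that spectrum, the sketch below does plain weighted averaging of parameters across fine-tuned checkpoints, the Model Soups-style baseline the paper compares against. DAM itself goes further and learns the per-model scaling coefficients from data; the fixed coefficients and file names here are purely illustrative, not the paper's implementation.

        # Sketch: linear weight averaging across models with identical architectures.
        # DAM learns the scaling coefficients; here they are fixed by hand.
        import torch

        def merge_state_dicts(state_dicts, coefficients):
            """Return a weighted average of parameter tensors, key by key."""
            assert len(state_dicts) == len(coefficients)
            merged = {}
            for name in state_dicts[0]:
                merged[name] = sum(
                    coeff * sd[name].float() for sd, coeff in zip(state_dicts, coefficients)
                )
            return merged

        # Illustrative usage with two hypothetical fine-tunes of the same base model:
        # sd_a = torch.load("finetune_a.pt"); sd_b = torch.load("finetune_b.pt")
        # merged = merge_state_dicts([sd_a, sd_b], coefficients=[0.5, 0.5])
        # torch.save(merged, "merged.pt")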

    Since our launch (only a year ago – it’s hard to believe), Arcee AI has been making waves in the #AI world with our pioneering of Small Language Models (SLMs) as the high-performance, compute-efficient solution to the vast majority of #GenAI use cases. (Yes, LLMs are so 2024 🤣.) But you may be wondering: how do we get our #SLMs to pack such a strong punch? It’s thanks to our world-class model training pipeline, which is based on cutting-edge techniques that include Model Merging and Distillation. The latest 𝗦𝘁𝗮𝘁𝗲 𝗼𝗳 𝗔𝗜 report highlights our team's work on distillation, which we open-sourced via the tool 𝗗𝗶𝘀𝘁𝗶𝗹𝗹𝗞𝗶𝘁 (https://lnkd.in/dsRrG7_6). Thanks for the shout-out! And be sure to check out the report here: https://www.stateof.ai/.

    GitHub - arcee-ai/DistillKit: An Open Source Toolkit For LLM Distillation
    github.com
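
    For readers curious what "distillation" means in practice, below is a minimal sketch of the standard logit-based distillation loss: the student is trained against the teacher's temperature-softened output distribution plus the usual cross-entropy on ground-truth labels. This is the generic technique, not DistillKit's actual API; the temperature and weighting values are illustrative defaults.

        # Sketch: classic knowledge-distillation loss (Hinton-style soft targets).
        # Shapes assume [batch, vocab] logits for simplicity.
        import torch
        import torch.nn.functional as F

        def distillation_loss(student_logits, teacher_logits, labels,
                              temperature=2.0, alpha=0.5):
            # Soft targets: KL divergence between temperature-softened distributions.
            soft_student = F.log_softmax(student_logits / temperature, dim=-1)
            soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
            kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
            # Hard targets: standard cross-entropy on the ground-truth labels.
            ce = F.cross_entropy(student_logits, labels)
            return alpha * kd + (1.0 - alpha) * ce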

    Yet another Arcee AI Small Language Model (SLM) that's punching above its weight. 💪 Our 14B SuperNova-Medius is #1 on the Hugging Face Open LLM Leaderboard for models under 21B parameters!! We released it on Friday and it's already been downloaded nearly 8k times. Check it out here: https://lnkd.in/ega2NEH6. And learn more about how we trained SuperNova-Medius using cutting-edge techniques that we've pioneered, including distillation and model merging, in our blog: https://lnkd.in/gDxf5m9T #NLP #GenAI #LLMs

    First came Arcee AI's flagship 𝟳𝟬𝗕 𝗺𝗼𝗱𝗲𝗹 𝗦𝘂𝗽𝗲𝗿𝗡𝗼𝘃𝗮, followed by the 𝟴𝗕 𝗦𝘂𝗽𝗲𝗿𝗡𝗼𝘃𝗮-𝗟𝗶𝘁𝗲. Today we add to this family of superpower Small Language Models (SLMs) with the release of the 𝟭𝟰𝗕 𝗦𝘂𝗽𝗲𝗿𝗡𝗼𝘃𝗮-𝗠𝗲𝗱𝗶𝘂𝘀. Developed by the team at Arcee Labs, SuperNova-Medius represents a breakthrough in SLMs, combining the power of model merging with the efficiency of knowledge distillation to deliver advanced capabilities with practical efficiency. At its core, SuperNova-Medius is built on the robust Qwen2.5-14B. But what sets it apart is its unique heritage: a carefully orchestrated fusion of knowledge, using our DistillKit, from two AI giants, Qwen2.5-72B-Instruct and Llama-3.1-405B-Instruct. This amalgamation of different AI "minds" results in a model that punches well above its weight class, offering performance that rivals much larger models despite its more manageable size. Read more about the unique development of SuperNova, straight from our Chief of Frontier Research Charles Goddard, here: https://lnkd.in/gDxf5m9T #GenAI #opensource #LLMs

    Introducing Arcee-SuperNova-Medius: Our 14B Small Language Model That Rivals a 70B
    blog.arcee.ai

    What do you need to know about Small Language Models (SLMs)? If you haven’t yet started making the switch from #LLMs to SLMs, here are the initial reasons why companies of all sizes are doing so – with Arcee AI.
    ⚡ 𝗙𝗮𝘀𝘁𝗲𝗿 𝗣𝗿𝗼𝗰𝗲𝘀𝘀𝗶𝗻𝗴 Lightning-fast responses without sacrificing accuracy.
    ⏩ 𝗘𝗮𝘀𝗶𝗲𝗿 𝗗𝗲𝗽𝗹𝗼𝘆𝗺𝗲𝗻𝘁 Seamlessly integrate into existing systems and devices.
    🤔 𝗜𝗺𝗽𝗿𝗼𝘃𝗲𝗱 𝗘𝘅𝗽𝗹𝗮𝗶𝗻𝗮𝗯𝗶𝗹𝗶𝘁𝘆 Get a clearer understanding of how your model makes decisions.
    💴 𝗟𝗼𝘄𝗲𝗿 𝗖𝗼𝘀𝘁𝘀 Save on computational resources while reducing your costs and your carbon footprint.
    🔐 𝗘𝗻𝗵𝗮𝗻𝗰𝗲𝗱 𝗦𝗲𝗰𝘂𝗿𝗶𝘁𝘆 Protect sensitive data with transparent models that are in full regulatory compliance.
    Ready to learn more about Small Language Models? Reach out to our team today (https://lnkd.in/ezf8c-Fd) and get ready for the power of efficient, effective, and responsible #AI with #SLMs. #GenAI #NLP #Productivity

    When you sign a contract for ChatGPT Enterprise or Claude Enterprise, you’re essentially renting THEIR Large Language Model (#LLM). With Arcee AI’s Small Language Models (#SLMs), you get: ☑️ Better performance ☑️ Customization ☑️ A much better price ✨AND✨ you actually OWN your model. The OWNERSHIP factor is so incredible that, in calls with prospects, the most informed buyers question us on this – and when we explain that it is indeed TRUE, that's when they have the 💡moment of realization💡: that Arcee AI is offering the future of secure, transparent #GenAI that enterprises need. It's a reality that's resonating across many industries, including the #mobile world – with huge enthusiasm for our talk on this by Arcee AI's Chief Evangelist Julien SIMON at the Mobile World Congress (#MWC24) in Vegas this week. Ready to learn more about how you can get full ownership of your GenAI, with Arcee AI's Small Language Models – whether you're in #telco or another industry? Book a demo with our team today: https://lnkd.in/eG8DpHYm #AI #NLP #telco

    The beauty – but also the challenge – of working in #AI is that the technology is advancing not just week-to-week, but day-to-day or even hour-to-hour. Today, we’re taking a moment to step back and honor two of the greatest AI visionaries as they receive the Nobel Prize. 🏅 As most of you likely have heard by now, 𝘁𝗵𝗲 𝟮𝟬𝟮𝟰 𝗡𝗼𝗯𝗲𝗹 𝗣𝗿𝗶𝘇𝗲 𝗶𝗻 𝗣𝗵𝘆𝘀𝗶𝗰𝘀 𝗵𝗮𝘀 𝗯𝗲𝗲𝗻 𝗮𝘄𝗮𝗿𝗱𝗲𝗱 𝘁𝗼 𝗚𝗲𝗼𝗳𝗳𝗿𝗲𝘆 𝗛𝗶𝗻𝘁𝗼𝗻 𝗮𝗻𝗱 𝗝𝗼𝗵𝗻 𝗛𝗼𝗽𝗳𝗶𝗲𝗹𝗱, 𝗳𝗼𝗿 𝘁𝗵𝗲𝗶𝗿 𝗽𝗶𝗼𝗻𝗲𝗲𝗿𝗶𝗻𝗴 𝘄𝗼𝗿𝗸 𝗼𝗻 𝗮𝗿𝘁𝗶𝗳𝗶𝗰𝗶𝗮𝗹 𝗻𝗲𝘂𝗿𝗮𝗹 𝗻𝗲𝘁𝘄𝗼𝗿𝗸𝘀. Their research is widely seen as having laid the foundations for AI – enabling breakthroughs in language translation, facial recognition, and more. Hinton is also a vocal advocate for responsible AI development, highlighting its transformative power while also explaining the potential risks. As a company whose focus has always been on efficient, secure, and transparent AI, we applaud his leadership. (And yes, if you were surprised to see AI in the Physics category, you're not alone. 🤔 Perhaps it's time for a new Nobel category – if not a Nobel of AI, maybe of Computer Science?) #NobelPrize #GenAI #MachineLearning #Physics

    Start your week with Arcee AI's replacement for ChatGPT: SuperNova (try it here: https://lnkd.in/e5zAhzyx). Whether you're a developer, researcher, or entrepreneur, SuperNova is a game-changer. Here's why:
    ⚡ Lightning-Fast Processing
    🤝 Seamless Integration with Your Workflows
    😊 Scalability and Flexibility
    💡 Enhanced Explainability and Transparency
    🛡️ Robust Security and Compliance
    Try SuperNova for yourself and get in touch with our team to learn more about the model that companies are choosing over ChatGPT and Claude: https://lnkd.in/eG8DpHYm #AI #GenAI #NLP

    Supernova
    arcee.ai

Funding

Arcee.ai: 2 total rounds
Last round: Series A, US$24.0M
See more info on Crunchbase