Knowledgator’s Post


Many people have asked about the technical nuances of our bi-encoder #GLiNER architecture. If you want to explore the inner details of this work, or you are just looking for efficient fine-tuning tips, here is a blog post for you: https://lnkd.in/eqhhuNsP

Meet the new zero-shot NER architecture

blog.knowledgator.com

Benjamin Anderson

Stanford CS Grad, Chief Scientist, Taylor AI (YC S23)

2mo

This is very cool! It totally makes sense why you use BGE to encode the entity embeddings. Any insight as to why DeBERTa is preferred for the span processing vs. also using sentence-transformers for that? Is it because sentence-transformers don't have good representations for individual tokens?

Yasir Altaf

Data Science/AI Consultant | Big Data Solutions | Cellular Radio & IoT

2mo

Hey Knowledgator, various derivative architectures on top of GLiNER, such as NuMind and GLiNER Multitask, already provide a broad spectrum of use cases. So what is Knowledgator offering that goes beyond the existing features?


Joint training of the sentence transformer and the span representation layer improves the label encoder's semantic ability to understand entity categories. Below you can see projected entity embeddings clustered with the K-means algorithm.

[Image: projected entity label embeddings clustered with K-means]
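A rough sketch of that clustering step (not the exact code behind the figure), assuming a BGE checkpoint from sentence-transformers and an illustrative label list:

```python
# Hedged sketch: embed entity labels with a sentence transformer and
# cluster them with K-means, roughly mirroring the visualization above.
# The checkpoint and label list are illustrative assumptions, not the
# exact setup used for the figure.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

label_encoder = SentenceTransformer("BAAI/bge-small-en-v1.5")  # assumed BGE checkpoint

entity_labels = [
    "person", "politician", "athlete",
    "city", "country", "river",
    "company", "university", "government agency",
]

# Encode each entity label into a dense vector.
embeddings = label_encoder.encode(entity_labels, normalize_embeddings=True)

# Cluster the label embeddings; semantically related labels should group together.
kmeans = KMeans(n_clusters=3, random_state=0, n_init="auto").fit(embeddings)
for label, cluster in zip(entity_labels, kmeans.labels_):
    print(f"{label:20s} -> cluster {cluster}")
```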

The final model consists of two encoder models, which took a long path to become a strong bi-encoder GLiNER model.


The main difference from the original GLiNER architecture is the use of a separate encoder for entity label representation. In our work, we explored sentence transformers such as BGE.

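A minimal sketch of the bi-encoder idea, assuming BGE as the label encoder, DeBERTa-v3 as the token encoder, and simplified mean-pooling plus cosine scoring in place of the real span representation layer:

```python
# Hedged sketch of the bi-encoder idea: entity labels and text spans are
# embedded by two separate encoders and matched by similarity.
# Checkpoints, mean-pooling of span tokens, and cosine scoring are
# simplifying assumptions, not the exact GLiNER implementation.
import torch
from transformers import AutoModel, AutoTokenizer
from sentence_transformers import SentenceTransformer

label_encoder = SentenceTransformer("BAAI/bge-small-en-v1.5")        # entity label encoder (assumed)
text_tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")
text_encoder = AutoModel.from_pretrained("microsoft/deberta-v3-small")

text = "Barack Obama visited Paris last week."
labels = ["person", "location", "date"]

# 1) Encode entity labels once; they can be cached and reused for any text.
label_emb = torch.tensor(label_encoder.encode(labels, normalize_embeddings=True))

# 2) Encode the text tokens with the token-level encoder.
inputs = text_tokenizer(text, return_tensors="pt")
with torch.no_grad():
    token_emb = text_encoder(**inputs).last_hidden_state[0]          # (seq_len, hidden)

# 3) Represent a candidate span by mean-pooling its token embeddings
#    (a simplification of GLiNER's span representation layer).
span = slice(1, 3)                                                   # tokens covering "Barack Obama" (approx.)
span_emb = token_emb[span].mean(dim=0)

# Project the span into the label-embedding space; the projection here is
# random for illustration, whereas the real model learns it during training.
projection = torch.nn.Linear(span_emb.shape[-1], label_emb.shape[-1])
span_vec = torch.nn.functional.normalize(projection(span_emb), dim=-1)

# 4) Score the span against each entity label by cosine similarity.
scores = span_vec @ label_emb.T
for label, score in zip(labels, scores.tolist()):
    print(f"{label}: {score:.3f}")
```

Because the label embeddings are computed independently of the text, they can be precomputed and cached, which is where the efficiency of the bi-encoder design comes from.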

As a result, our models demonstrate efficiency and scalability while beating the original GLiNER v2.1 and coming close to other uni-encoder models.

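For anyone who wants to try such a model, here is a hedged usage sketch with the GLiNER Python library; the checkpoint name and label list are illustrative placeholders, not a pointer to a specific release:

```python
# Hedged usage sketch with the GLiNER library; the checkpoint name and
# entity labels are illustrative placeholders.
from gliner import GLiNER

model = GLiNER.from_pretrained("knowledgator/gliner-bi-small-v1.0")  # example checkpoint name

text = "Knowledgator released a bi-encoder GLiNER model trained with DeBERTa and BGE."
labels = ["organization", "model", "architecture"]

# Zero-shot NER: the label set is provided at inference time.
entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(ent["text"], "->", ent["label"])
```

The predicted entities come back as dictionaries with the matched text, label, character offsets, and a confidence score.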