PolyTransNet integrates various neural network models, including:
- Transformer-based Models: Used for both translation and embedding generation. The model benefits from the self-attention mechanism of transformers, which enables it to capture long-range dependencies within text (see the first sketch after this list).
- Bidirectional Encoder Representations from Transformers (BERT): Used for text classification tasks by leveraging pre-trained language models to understand context and semantic relationships in text (see the second sketch after this list).
- Custom Wrappers and Logic: DeepQuery integrates custom wrappers that fine-tune the core Krutrim platform for specific use cases, including cross-lingual embeddings and domain-specific text processing (see the third sketch after this list).
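The sketch below shows one way the first item could work in practice: a single seq2seq transformer serving both translation and embedding generation, with the encoder's self-attention output mean-pooled into a sentence vector. The section does not name PolyTransNet's actual models, so the `Helsinki-NLP/opus-mt-en-hi` checkpoint and the pooling strategy are illustrative assumptions, not the project's implementation.

```python
# Hypothetical sketch: one seq2seq transformer for both translation and
# embedding generation. The checkpoint is an assumption, not PolyTransNet's.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "Helsinki-NLP/opus-mt-en-hi"  # assumed English->Hindi model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def translate(text: str) -> str:
    """Translate a sentence with beam search over the decoder."""
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the encoder's last hidden state into one vector.

    Self-attention lets every token attend to every other token, so the
    pooled vector reflects long-range context across the sentence.
    """
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        encoder_out = model.get_encoder()(**inputs)
    hidden = encoder_out.last_hidden_state          # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)   # zero out padding
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

print(translate("The model captures long-range dependencies."))
print(embed("The model captures long-range dependencies.").shape)
```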
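For the BERT-based classifier, a minimal sketch under stated assumptions: the checkpoint (`bert-base-uncased`) and the two-class label set are placeholders, since the section does not specify which tasks or labels PolyTransNet uses.

```python
# Hypothetical sketch: BERT text classification. Checkpoint and label
# count are assumptions; the classification head is randomly initialized
# here and would need fine-tuning before its predictions mean anything.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # assumed binary label set
)

def classify(text: str) -> int:
    """Return the index of the highest-scoring class.

    BERT's bidirectional attention encodes the whole input at once, so
    the pooled representation feeding the classifier head carries
    context from both the left and the right of every token.
    """
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, num_labels)
    return int(logits.argmax(dim=-1))

print(classify("This release fixes the tokenizer bug."))
```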
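Finally, a sketch of what a "custom wrapper" over an embedding backend might look like: domain-specific normalization applied before delegating to the underlying model. The Krutrim client API is not documented in this section, so `Embedder`, `DomainWrapper`, the glossary, and the toy backend are all hypothetical stand-ins rather than the real platform interface.

```python
# Hypothetical sketch of a custom wrapper: domain-specific preprocessing
# layered over a generic embedding backend. `Embedder` is a stand-in for
# whatever client the real platform (e.g. Krutrim) exposes.
import re
from typing import Callable, Dict, List

Embedder = Callable[[str], List[float]]

class DomainWrapper:
    """Normalizes domain jargon before delegating to a base embedder."""

    def __init__(self, base: Embedder, glossary: Dict[str, str]):
        self.base = base
        self.glossary = glossary  # maps domain terms to canonical forms

    def preprocess(self, text: str) -> str:
        text = text.lower().strip()
        for term, canonical in self.glossary.items():
            text = re.sub(rf"\b{re.escape(term)}\b", canonical, text)
        return text

    def embed(self, text: str) -> List[float]:
        return self.base(self.preprocess(text))

# Toy backend standing in for the real model: length-based features only.
def toy_embedder(text: str) -> List[float]:
    return [float(len(text)), float(text.count(" "))]

wrapper = DomainWrapper(toy_embedder, {"ekyc": "electronic kyc"})
print(wrapper.embed("eKYC onboarding flow"))
```

Keeping preprocessing in a wrapper rather than in the base model means the same backend can serve several domains, each with its own glossary, which matches the cross-lingual and domain-specific use cases named above.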