
HY-MT1.5-1.8B

Model Description

HY-MT1.5-1.8B is a 1.8B-parameter multilingual neural machine translation model released by Tencent as part of the Hunyuan Translation Model v1.5 series.

The model focuses on high-quality, high-speed translation across 33 major languages plus 5 ethnic/dialect variants, achieving performance comparable to the larger 7B model while remaining lightweight. After quantization, HY-MT1.5-1.8B is suitable for edge and on-device deployment, enabling real-time translation scenarios.

Quickstart

  1. Install NexaSDK and create a free account at sdk.nexa.ai
  2. Activate your device with your access token:
nexa config set license '<access_token>'
  3. Run the model on Qualcomm NPU in one line:
nexa infer NexaAI/HY-MT1.5-1.8B-npu
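
Running the command above opens an interactive console where translation requests are issued as plain prompts. The lines below are a minimal sketch; the instruction wording follows the Hunyuan-MT prompt convention and is an assumption rather than a documented template for this build:

Translate the following segment into Spanish, without additional explanation.

Edge devices can now run high-quality machine translation locally.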

Features

  • Multilingual translation: supports 33 major languages plus 5 ethnic/dialect variants, including Chinese, English, and European, Middle Eastern, South Asian, and Southeast Asian languages.

  • Small yet strong: industry-leading translation quality among ~2B-parameter models, surpassing many commercial translation APIs.

  • Edge-ready: supports FP8 and INT4 quantization for efficient deployment on resource-constrained devices.

  • Advanced translation controls (see the prompt sketch after this list):

    • Terminology intervention
    • Context-aware translation
    • Formatted translation with tag preservation
  • Robust scenarios: strong performance on mixed-language and explanatory translation tasks.
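
The translation controls above are typically expressed in the prompt itself. As a rough illustration (the exact template is not documented in this card, so the wording is an assumption), a terminology-intervention request typed at the nexa infer console might look like:

Translate the following segment into German, keeping the terminology mapping "edge device" -> "Edge-Gerät", without additional explanation.

Our edge devices ship with on-device translation enabled by default.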

Use Cases

  • Real-time translation on mobile, PC, and edge devices
  • Multilingual chat and customer support systems
  • Document and webpage translation
  • Cross-lingual content creation and localization
  • Embedded translation for IoT and smart devices

License

This repository is licensed under the Creative Commons Attribution–NonCommercial 4.0 (CC BY-NC 4.0) license, which allows use, sharing, and modification only for non-commercial purposes with proper attribution. All NPU-related models, runtimes, and code in this project are covered by this non-commercial license and cannot be used in any commercial or revenue-generating applications. Commercial licensing or enterprise usage requires a separate agreement. For inquiries, please contact hello@nexa.ai.
