
Apertus-70B-2509

Apertus‑70B‑2509 is a 70B-parameter, fully open multilingual transformer from the Swiss AI Initiative, trained on 15 trillion tokens of compliance-filtered data. It supports more than 1,800 languages and delivers competitive benchmark performance among open‑weight models.
Core Model
Chat

How to Get Started

pip install requests
import requests

api_url = "https://api.regolo.ai/v1/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_REGOLO_KEY"
}
data = {
  "model": "Apertus-70B-2509",
  "messages": [
    {
      "role": "user",
      "content": "What is the capital of Italy, and which region does it belong to?"
    }
  ]
}

response = requests.post(api_url, headers=headers, json=data)
print(response.json())
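The endpoint returns the standard OpenAI-compatible chat-completions shape, so the assistant's reply sits under `choices[0].message.content`. A minimal sketch of extracting it; the `sample` dict below is an illustrative response shape, not output captured from the live API:

```python
def extract_reply(completion: dict) -> str:
    """Return the assistant's text from an OpenAI-style chat completion."""
    return completion["choices"][0]["message"]["content"]

# Illustrative response shape (not a real API response).
sample = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "The capital of Italy is Rome, in the Lazio region.",
            }
        }
    ]
}

print(extract_reply(sample))
```

In practice you would call `extract_reply(response.json())` after checking `response.status_code`.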


Applications & Use Cases

  • Multilingual chat and assistant backends that must serve users across 1,800+ languages with open, auditable training data.
  • Research and benchmarking projects that need fully documented data and training pipelines, enabling reproducible experiments and model audits.
  • Fine‑tuned domain experts for law, healthcare, government, or finance where strict data compliance and non‑memorization objectives are critical.
  • Code, math, and technical writing assistants leveraging the model’s web/code/math curriculum and strong open‑weight benchmark scores.
  • Teacher models for distillation or alignment of smaller LLMs, using Apertus’s transparent training artifacts and multilingual strength as a reference.
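For the multilingual-backend use case, the request shape stays identical regardless of input language; only the message content changes. A hedged sketch of building one payload per language, assuming the same chat-completions schema as the quick-start above (the system prompt and helper name are illustrative, not prescribed by this page):

```python
def build_payload(model: str, user_message: str,
                  system_prompt: str = "You are a helpful multilingual assistant.") -> dict:
    """Assemble an OpenAI-style chat-completions payload (illustrative helper)."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

# The same question in three of the supported languages.
questions = {
    "en": "What is the capital of Italy?",
    "de": "Was ist die Hauptstadt Italiens?",
    "it": "Qual è la capitale d'Italia?",
}

payloads = {lang: build_payload("Apertus-70B-2509", q)
            for lang, q in questions.items()}
# Each payload can be POSTed to the same /v1/chat/completions endpoint.
```

The design point is that no per-language routing is needed: one model, one endpoint, one payload schema.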