
InternLM

Open-source foundation model series from Shanghai AI Lab with strong reasoning, coding, math, and bilingual Chinese-English capabilities

Free open-weight models on Hugging Face; API access via Shanghai AI Lab and partners


Overview

InternLM is an open-source large language model family developed by Shanghai AI Laboratory (Shanghai AI Lab). The InternLM2 and InternLM3 series are competitive with leading open-source models on Chinese and English benchmarks, with particular strengths in long-context reasoning, code, and math tasks.

Key Features

  • Open weights with permissive commercial license
  • InternLM3: strong performance on reasoning, code, and multilingual tasks
  • Extended context windows up to 200K tokens
  • InternVL: vision-language models for image and video understanding
  • InternLM-Math: specialized mathematical reasoning capabilities
  • Optimized for efficient inference with quantized variants

Pricing: Free and open-source; model weights are available on Hugging Face, and hosted access is offered via Shanghai AI Lab's InternStudio cloud.
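Since the weights are published on Hugging Face, the models can be used locally with the `transformers` library. Below is a minimal sketch; the repo id `internlm/internlm2-chat-7b` and the `pip install` requirements are assumptions based on the typical Hugging Face workflow, so check the InternLM organization page for current model names before running it.

```python
# Hypothetical sketch of loading an open-weight InternLM checkpoint from
# Hugging Face. Requires `pip install transformers torch` and a download
# of the model weights on first use. The default repo id is an assumption.

def load_internlm(model_id: str = "internlm/internlm2-chat-7b"):
    """Return (tokenizer, model) for an InternLM checkpoint.

    Imports are deferred so this module can be inspected without
    `transformers` installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # InternLM repos ship custom modeling code, hence trust_remote_code=True.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        device_map="auto",  # place layers on available GPU(s)/CPU
    )
    return tokenizer, model
```

Usage would then follow the standard `transformers` generation loop (tokenize a prompt, call `model.generate`, decode the output); quantized variants on the same Hugging Face organization can be substituted for the repo id to reduce memory use.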

Pros

  • Strong cross-domain performance: reasoning, coding, math, and bilingual understanding
  • Open-weight with Apache 2.0 license — commercial use permitted
  • Active research group releasing regular improvements and specialized variants
  • Rich ecosystem including InternVL multimodal and InternLM-Math

Cons

  • Less mainstream adoption outside of China and research communities
  • Fewer integrations and tooling compared to the Llama ecosystem
  • Documentation and community resources primarily in Chinese

Tags

open-source, chinese, multilingual, reasoning, coding, mathematics, research, shanghai-ai-lab, apache-license
