
Merge pull request #3900 from opendatalab/release-2.6.3

Release 2.6.3
Xiaomeng Zhao, 2 weeks ago
commit 374ace0a34
2 files changed with 3 additions and 3 deletions
  1. README.md (+1 -1)
  2. README_zh-CN.md (+2 -2)

README.md (+1 -1)

@@ -684,7 +684,7 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
  
 <sup>1</sup> Accuracy metric is the End-to-End Evaluation Overall score of OmniDocBench (v1.5), tested on the latest `MinerU` version.   
 <sup>2</sup> Linux supports only distributions released in 2019 or later.  
-<sup>3</sup> MLX requires macOS 13.5 or later, recommended for use with version 14.0 or higher.
+<sup>3</sup> MLX requires macOS 13.5 or later, recommended for use with version 14.0 or higher.  
 <sup>4</sup> Windows vLLM support via WSL2(Windows Subsystem for Linux).  
 <sup>5</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
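
Note on the change above: the only difference is two trailing spaces appended to footnote 3. In Markdown, a line ending in two or more spaces is rendered with a hard line break, so each footnote stays on its own line like footnotes 1, 2, and 4. A minimal sketch of the convention (the `␣` characters stand in for the literal trailing spaces and are not part of the file):

```markdown
<sup>3</sup> MLX requires macOS 13.5 or later.␣␣
<sup>4</sup> Windows vLLM support via WSL2 (Windows Subsystem for Linux).
```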
 

README_zh-CN.md (+2 -2)

@@ -670,8 +670,8 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
 </table> 
 
 <sup>1</sup> 精度指标为OmniDocBench (v1.5)的End-to-End Evaluation Overall分数,基于`MinerU`最新版本测试  
-<sup>2</sup> Linux仅支持2019年及以后发行版
-<sup>3</sup> MLX需macOS 13.5及以上版本支持,推荐14.0以上版本使用
+<sup>2</sup> Linux仅支持2019年及以后发行版  
+<sup>3</sup> MLX需macOS 13.5及以上版本支持,推荐14.0以上版本使用  
 <sup>4</sup> Windows vLLM通过WSL2(适用于 Linux 的 Windows 子系统)实现支持  
 <sup>5</sup> 兼容OpenAI API的服务器,如通过`vLLM`/`SGLang`/`LMDeploy`等推理框架部署的本地模型服务器或远程模型服务