
docs: update backend features and CPU inference support sections in README and README_zh-CN

myhloli · 2 weeks ago · commit d654238115

2 changed files with 17 additions and 17 deletions:

1. README.md (+7 −7)
2. README_zh-CN.md (+10 −10)

README.md (+7 −7)

@@ -637,7 +637,7 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
         <tr>
             <th>Backend Features</th>
             <td>Fast, no hallucinations</td>
-            <td>Good compatibility, slower</td>
+            <td>Good compatibility, <br>but slower</td>
             <td>Faster than transformers</td>
             <td>Fast, compatible with the vLLM ecosystem</td>
             <td>No configuration required, suitable for OpenAI-compatible servers</td>
@@ -651,8 +651,8 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
         </tr>
         <tr>
             <th>CPU inference support</th>
-            <td colspan="3" style="text-align:center;">✅</td>
-            <td>❌</td>
+            <td colspan="2" style="text-align:center;">✅</td>
+            <td colspan="2" style="text-align:center;">❌</td>
             <td>Not required</td>
         </tr>
         <tr>
@@ -678,10 +678,10 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
     </tbody>
 </table>
  
-1. Accuracy metric is the End-to-End Evaluation Overall score of OmniDocBench (v1.5)  
-2. Linux supports only distributions released in 2019 or later  
-3. Requires macOS 13.5 or later  
-4. Windows vLLM support via WSL2
+<sup>1</sup> Accuracy metric is the End-to-End Evaluation Overall score of OmniDocBench (v1.5)  
+<sup>2</sup> Linux supports only distributions released in 2019 or later  
+<sup>3</sup> Requires macOS 13.5 or later  
+<sup>4</sup> Windows vLLM support via WSL2
 
 
 ### Install MinerU

README_zh-CN.md (+10 −10)

@@ -625,7 +625,7 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
             <th>后端特性</th>
             <td>速度快, 无幻觉</td>
             <td>兼容性好, 速度较慢</td>
-            <td>比transformers快</td>
+            <td>比transformers快</td>
             <td>速度快, 兼容vllm生态</td>
             <td>无配置要求, 适用openai兼容服务器</td>
         </tr>
@@ -637,9 +637,9 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
             <td>不限</td>
         </tr>
         <tr>
-            <th>cpu推理支持</th>
-            <td colspan="3" style="text-align:center;">✅</td>
-            <td >❌</td>
+            <th>纯CPU推理支持</th>
+            <td colspan="2" style="text-align:center;">✅</td>
+            <td colspan="2" style="text-align:center;">❌</td>
             <td >不需要</td>
         </tr>
         <tr>
@@ -663,12 +663,12 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
             <td colspan="5" style="text-align:center;">3.10-3.13</td>
         </tr>
     </tbody>
-</table>
- 
-1. 精度指标为OmniDocBench (v1.5)的End-to-End Evaluation Overall分数  
-2. Linux仅支持2019年及以后发行版  
-3. 需macOS 13.5及以上版本  
-4. 通过WSL2实现Windows vLLM支持
+</table> 
+
+<sup>1</sup> 精度指标为OmniDocBench (v1.5)的End-to-End Evaluation Overall分数  
+<sup>2</sup> Linux仅支持2019年及以后发行版  
+<sup>3</sup> 需macOS 13.5及以上版本  
+<sup>4</sup> 通过WSL2实现Windows vLLM支持
 
 > [!TIP]
 > 除以上主流环境与平台外,我们也收录了一些社区用户反馈的其他平台支持情况,详情请参考[其他加速卡适配](https://opendatalab.github.io/MinerU/zh/usage/)。