Quote pip install arguments in extension module docs

Updated the pip install commands in both the English and Chinese quick start guides to quote the mineru extras arguments, so that the shell (notably zsh, which treats unquoted square brackets as glob patterns) passes the specifier to pip intact.
aopstudio, 2 weeks ago
Commit 5cd31f97b6

+ 2 - 2
docs/en/quick_start/extension_modules.md

@@ -6,7 +6,7 @@ MinerU supports installing extension modules on demand based on different needs
 ### Core Functionality Installation
 The `core` module is the core dependency of MinerU, containing all functional modules except `vllm`. Installing this module ensures the basic functionality of MinerU works properly.
 ```bash
-uv pip install mineru[core]
+uv pip install "mineru[core]"
 ```
 
 ---
@@ -15,7 +15,7 @@ uv pip install mineru[core]
 The `vllm` module provides acceleration support for VLM model inference, suitable for graphics cards with Turing architecture and later (8GB+ VRAM). Installing this module can significantly improve model inference speed.
 In the configuration, `all` includes both `core` and `vllm` modules, so `mineru[all]` and `mineru[core,vllm]` are equivalent.
 ```bash
-uv pip install mineru[all]
+uv pip install "mineru[all]"
 ```
 > [!TIP]
 > If exceptions occur during installation of the complete package including vllm, please refer to the [vllm official documentation](https://docs.vllm.ai/en/latest/getting_started/installation/index.html) to try to resolve the issue, or directly use the [Docker](./docker_deployment.md) deployment method.
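
The quoting matters because some shells, most notably zsh, treat unquoted square brackets as glob patterns and abort when no matching file exists, so the extras specifier never reaches pip. A minimal sketch of the failure and the fix (the error line is zsh's standard no-match message):

```bash
# In zsh, [core] is parsed as a character-class glob; with no file matching the pattern the command aborts:
uv pip install mineru[core]
# zsh: no matches found: mineru[core]

# Quoting the argument passes the extras specifier to pip verbatim and works in any shell:
uv pip install "mineru[core]"
```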

+ 2 - 2
docs/zh/quick_start/extension_modules.md

@@ -6,7 +6,7 @@ MinerU 支持根据不同需求,按需安装扩展模块,以增强功能或
 ### 核心功能安装
 `core` 模块是 MinerU 的核心依赖,包含了除`vllm`外的所有功能模块。安装此模块可以确保 MinerU 的基本功能正常运行。
 ```bash
-uv pip install mineru[core]
+uv pip install "mineru[core]"
 ```
 
 ---
@@ -15,7 +15,7 @@ uv pip install mineru[core]
 `vllm` 模块提供了对 VLM 模型推理的加速支持,适用于具有 Turing 及以后架构的显卡(8G 显存及以上)。安装此模块可以显著提升模型推理速度。
 在配置中,`all`包含了`core`和`vllm`模块,因此`mineru[all]`和`mineru[core,vllm]`是等价的。
 ```bash
-uv pip install mineru[all]
+uv pip install "mineru[all]"
 ```
 > [!TIP]
 > 如在安装包含vllm的完整包过程中发生异常,请参考 [vllm 官方文档](https://docs.vllm.ai/en/latest/getting_started/installation/index.html) 尝试解决,或直接使用 [Docker](./docker_deployment.md) 方式部署镜像。