
update note about Model Source (#4459)

Tingquan Gao 3 months ago
commit ae22fc95ec
38 changed files with 38 additions and 38 deletions
  1. docs/module_usage/tutorials/ocr_modules/doc_img_orientation_classification.en.md (+1 -1)
  2. docs/module_usage/tutorials/ocr_modules/doc_img_orientation_classification.md (+1 -1)
  3. docs/module_usage/tutorials/ocr_modules/formula_recognition.en.md (+1 -1)
  4. docs/module_usage/tutorials/ocr_modules/formula_recognition.md (+1 -1)
  5. docs/module_usage/tutorials/ocr_modules/layout_detection.en.md (+1 -1)
  6. docs/module_usage/tutorials/ocr_modules/layout_detection.md (+1 -1)
  7. docs/module_usage/tutorials/ocr_modules/seal_text_detection.en.md (+1 -1)
  8. docs/module_usage/tutorials/ocr_modules/seal_text_detection.md (+1 -1)
  9. docs/module_usage/tutorials/ocr_modules/table_cells_detection.en.md (+1 -1)
  10. docs/module_usage/tutorials/ocr_modules/table_cells_detection.md (+1 -1)
  11. docs/module_usage/tutorials/ocr_modules/table_classification.en.md (+1 -1)
  12. docs/module_usage/tutorials/ocr_modules/table_classification.md (+1 -1)
  13. docs/module_usage/tutorials/ocr_modules/table_structure_recognition.en.md (+1 -1)
  14. docs/module_usage/tutorials/ocr_modules/table_structure_recognition.md (+1 -1)
  15. docs/module_usage/tutorials/ocr_modules/text_detection.en.md (+1 -1)
  16. docs/module_usage/tutorials/ocr_modules/text_detection.md (+1 -1)
  17. docs/module_usage/tutorials/ocr_modules/text_image_unwarping.en.md (+1 -1)
  18. docs/module_usage/tutorials/ocr_modules/text_image_unwarping.md (+1 -1)
  19. docs/module_usage/tutorials/ocr_modules/text_recognition.en.md (+1 -1)
  20. docs/module_usage/tutorials/ocr_modules/text_recognition.md (+1 -1)
  21. docs/module_usage/tutorials/ocr_modules/textline_orientation_classification.en.md (+1 -1)
  22. docs/module_usage/tutorials/ocr_modules/textline_orientation_classification.md (+1 -1)
  23. docs/pipeline_usage/tutorials/ocr_pipelines/OCR.en.md (+1 -1)
  24. docs/pipeline_usage/tutorials/ocr_pipelines/OCR.md (+1 -1)
  25. docs/pipeline_usage/tutorials/ocr_pipelines/PP-StructureV3.en.md (+1 -1)
  26. docs/pipeline_usage/tutorials/ocr_pipelines/PP-StructureV3.md (+1 -1)
  27. docs/pipeline_usage/tutorials/ocr_pipelines/doc_preprocessor.en.md (+1 -1)
  28. docs/pipeline_usage/tutorials/ocr_pipelines/doc_preprocessor.md (+1 -1)
  29. docs/pipeline_usage/tutorials/ocr_pipelines/formula_recognition.en.md (+1 -1)
  30. docs/pipeline_usage/tutorials/ocr_pipelines/formula_recognition.md (+1 -1)
  31. docs/pipeline_usage/tutorials/ocr_pipelines/layout_parsing.en.md (+1 -1)
  32. docs/pipeline_usage/tutorials/ocr_pipelines/layout_parsing.md (+1 -1)
  33. docs/pipeline_usage/tutorials/ocr_pipelines/seal_recognition.en.md (+1 -1)
  34. docs/pipeline_usage/tutorials/ocr_pipelines/seal_recognition.md (+1 -1)
  35. docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition.en.md (+1 -1)
  36. docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition.md (+1 -1)
  37. docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition_v2.en.md (+1 -1)
  38. docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition_v2.md (+1 -1)
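Every hunk below replaces the same note about where official models are downloaded from: they are now fetched from HuggingFace first, and a preferred hosting platform can be chosen via the `PADDLE_PDX_MODEL_SOURCE` environment variable (`huggingface`, `aistudio`, `bos`, or `modelscope`). As a minimal sketch of the documented behavior, assuming a POSIX shell (the script name below is a placeholder, not part of this commit), the variable can be exported before running any of the Python examples in these docs:

```bash
# Prefer BOS when downloading official PaddleX models in this shell session.
# "bos" is one of the values listed in the updated note; any of the other
# listed values could be used instead.
export PADDLE_PDX_MODEL_SOURCE="bos"

# Placeholder for one of the module examples from the docs listed above.
python run_text_detection_example.py
```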

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/doc_img_orientation_classification.en.md

@@ -101,7 +101,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/doc_img_orientation_classification.md

@@ -100,7 +100,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 ```bash

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/formula_recognition.en.md

@@ -158,7 +158,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/formula_recognition.md

@@ -157,7 +157,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 ```bash

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/layout_detection.en.md

@@ -324,7 +324,7 @@ for res in output:
 
 
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/layout_detection.md

@@ -326,7 +326,7 @@ for res in output:
 
 
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result obtained is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/seal_text_detection.en.md

@@ -110,7 +110,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/seal_text_detection.md

@@ -107,7 +107,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_cells_detection.en.md

@@ -100,7 +100,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_cells_detection.md

@@ -99,7 +99,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result obtained is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_classification.en.md

@@ -92,7 +92,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running the code, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_classification.md

@@ -91,7 +91,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 ```

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_structure_recognition.en.md

@@ -119,7 +119,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/table_structure_recognition.md

@@ -115,7 +115,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result obtained is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_detection.en.md

@@ -144,7 +144,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_detection.md

@@ -143,7 +143,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_image_unwarping.en.md

@@ -103,7 +103,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_image_unwarping.md

@@ -97,7 +97,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_recognition.en.md

@@ -476,7 +476,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 For more information on using PaddleX's single-model inference APIs, please refer to the [PaddleX Single-Model Python Script Usage Instructions](../../instructions/model_python_API.en.md).
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/text_recognition.md

@@ -501,7 +501,7 @@ for res in output:
     res.save_to_json(save_path="./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 ```bash

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/textline_orientation_classification.en.md

@@ -112,7 +112,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 
 

+ 1 - 1
docs/module_usage/tutorials/ocr_modules/textline_orientation_classification.md

@@ -112,7 +112,7 @@ for res in output:
     res.save_to_json("./output/res.json")
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 After running, the result obtained is:
 ```bash

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/OCR.en.md

@@ -1851,7 +1851,7 @@ paddlex --pipeline OCR \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 For details on the relevant parameter descriptions, please refer to the parameter descriptions in [2.2.2 Python Script Integration](#222-python-script-integration). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
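For one-off CLI runs such as the `paddlex --pipeline OCR` command in this hunk, the variable can also be scoped to a single invocation instead of being exported. A minimal sketch, assuming a POSIX shell and abbreviating the flags shown above:

```bash
# Prefer AIStudio for model downloads in this single pipeline run only;
# the remaining flags follow the full command shown in the hunk above.
PADDLE_PDX_MODEL_SOURCE="aistudio" paddlex --pipeline OCR \
        --device gpu:0
```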
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/OCR.md

@@ -1843,7 +1843,7 @@ paddlex --pipeline OCR \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2.2 Python Script Integration](#222-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/PP-StructureV3.en.md

@@ -1340,7 +1340,7 @@ paddlex --pipeline PP-StructureV3 \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The parameter description can be found in [2.2.2 Python Script Integration](#222-python-script-integration). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/PP-StructureV3.md

@@ -1305,7 +1305,7 @@ paddlex --pipeline PP-StructureV3 \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2.2 Python Script Integration](#222-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/doc_preprocessor.en.md

@@ -504,7 +504,7 @@ paddlex --pipeline doc_preprocessor \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 You can refer to the parameter descriptions in [2.1.2 Python Script Integration](#212-python-script-integration) for related parameter details. Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/doc_preprocessor.md

@@ -502,7 +502,7 @@ paddlex --pipeline doc_preprocessor \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.1.2 Python Script Integration](#212-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/formula_recognition.en.md

@@ -1221,7 +1221,7 @@ paddlex --pipeline formula_recognition \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be referenced from [2.2 Integration via Python Script](#22-integration-via-python-script). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/formula_recognition.md

@@ -1223,7 +1223,7 @@ paddlex --pipeline formula_recognition \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2 Python Script Integration](#22-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/layout_parsing.en.md

@@ -749,7 +749,7 @@ paddlex --pipeline layout_parsing \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 For parameter descriptions, refer to the parameter explanations in [2.2.2 Integration via Python Script](#222-integration-via-python-script). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/layout_parsing.md

@@ -697,7 +697,7 @@ paddlex --pipeline layout_parsing \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2.2 Python Script Integration](#222-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/seal_recognition.en.md

@@ -1364,7 +1364,7 @@ paddlex --pipeline seal_recognition \
     --save_path ./output
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be referred to in the parameter explanations of [2.1.2 Integration via Python Script](#212-integration-via-python-script). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/seal_recognition.md

@@ -1383,7 +1383,7 @@ paddlex --pipeline seal_recognition \
     --save_path ./output
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.1.2 Python Script Integration](#212-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition.en.md

@@ -717,7 +717,7 @@ paddlex --pipeline table_recognition \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The content of the parameters can refer to the parameter description in [2.2 Python Script Method](#22-python-script-method-integration). Supports specifying multiple devices simultaneously for parallel inference. For details, please refer to the documentation on pipeline parallel inference.
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition.md

@@ -656,7 +656,7 @@ paddlex --pipeline table_recognition \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2 Python Script Method](#22-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition_v2.en.md

@@ -1121,7 +1121,7 @@ paddlex --pipeline table_recognition_v2 \
         --device gpu:0
 ```
 
-<b>Note: </b>The official models would be download from HuggingFace by default. If can't access to HuggingFace, please set the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"` to change the model source to BOS. In the future, more model sources will be supported.
+<b>Note: </b>By default, the official models are downloaded from HuggingFace first. PaddleX also supports specifying the preferred source by setting the environment variable `PADDLE_PDX_MODEL_SOURCE`. The supported values are `huggingface`, `aistudio`, `bos`, and `modelscope`. For example, to prioritize `bos`, set `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 <details><summary>👉 <b>After running, the result obtained is: (Click to expand)</b></summary>
 
 

+ 1 - 1
docs/pipeline_usage/tutorials/ocr_pipelines/table_recognition_v2.md

@@ -1139,7 +1139,7 @@ paddlex --pipeline table_recognition_v2 \
         --device gpu:0
 ```
 
-<b>Note: </b>PaddleX official models are fetched from HuggingFace by default. If accessing HuggingFace is inconvenient in the runtime environment, the model source can be switched to BOS via the environment variable `PADDLE_PDX_MODEL_SOURCE="BOS"`; more mainstream model sources will be supported in the future.
+<b>Note: </b>PaddleX supports multiple model hosting platforms, and official models are downloaded from HuggingFace first by default. The preferred platform can also be set via the environment variable `PADDLE_PDX_MODEL_SOURCE`; currently `huggingface`, `aistudio`, `bos`, and `modelscope` are supported. For example, to prefer `bos`: `PADDLE_PDX_MODEL_SOURCE="bos"`.
 
 The relevant parameter descriptions can be found in [2.2 Python Script Integration](#22-python脚本方式集成). Multiple devices can be specified simultaneously for parallel inference; for details, see [Pipeline Parallel Inference](../../instructions/parallel_inference.md#指定多个推理设备).