
update_docs_1026

yzl19940819, 4 years ago
parent
commit 50b19630fe
100 files changed, with 5526 additions and 1662 deletions
  1. README.md (+82 -87)
  2. deploy/Python/README.md (+0 -0)
  3. deploy/README.md (+22 -0)
  4. deploy/cpp/README.md (+4 -43)
  5. deploy/cpp/docs/CSharp_deploy/C#/Form1.Designer.cs (+0 -0)
  6. deploy/cpp/docs/CSharp_deploy/C#/Form1.cs (+215 -274)
  7. deploy/cpp/docs/CSharp_deploy/C#/Form1.resx (+0 -0)
  8. deploy/cpp/docs/CSharp_deploy/C#/Program.cs (+0 -0)
  9. deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.csproj (+0 -0)
  10. deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.csproj.user (+0 -0)
  11. deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.sln (+0 -0)
  12. deploy/cpp/docs/CSharp_deploy/README.md (+0 -0)
  13. deploy/cpp/docs/CSharp_deploy/images/1.png (+0 -0)
  14. deploy/cpp/docs/CSharp_deploy/images/10.png (+0 -0)
  15. deploy/cpp/docs/CSharp_deploy/images/11.png (+0 -0)
  16. deploy/cpp/docs/CSharp_deploy/images/12.png (+0 -0)
  17. deploy/cpp/docs/CSharp_deploy/images/13.png (+0 -0)
  18. deploy/cpp/docs/CSharp_deploy/images/14.png (+0 -0)
  19. deploy/cpp/docs/CSharp_deploy/images/15.png (+0 -0)
  20. deploy/cpp/docs/CSharp_deploy/images/16.png (+0 -0)
  21. deploy/cpp/docs/CSharp_deploy/images/17.png (+0 -0)
  22. deploy/cpp/docs/CSharp_deploy/images/18.png (+0 -0)
  23. deploy/cpp/docs/CSharp_deploy/images/19.png (+0 -0)
  24. deploy/cpp/docs/CSharp_deploy/images/2.png (+0 -0)
  25. deploy/cpp/docs/CSharp_deploy/images/20.png (+0 -0)
  26. deploy/cpp/docs/CSharp_deploy/images/21.png (+0 -0)
  27. deploy/cpp/docs/CSharp_deploy/images/22.png (+0 -0)
  28. deploy/cpp/docs/CSharp_deploy/images/23.png (+0 -0)
  29. deploy/cpp/docs/CSharp_deploy/images/24.png (+0 -0)
  30. deploy/cpp/docs/CSharp_deploy/images/25.png (+0 -0)
  31. deploy/cpp/docs/CSharp_deploy/images/26.png (BIN)
  32. deploy/cpp/docs/CSharp_deploy/images/27.png (BIN)
  33. deploy/cpp/docs/CSharp_deploy/images/28.png (BIN)
  34. deploy/cpp/docs/CSharp_deploy/images/29.png (BIN)
  35. deploy/cpp/docs/CSharp_deploy/images/3.png (+0 -0)
  36. deploy/cpp/docs/CSharp_deploy/images/4.png (+0 -0)
  37. deploy/cpp/docs/CSharp_deploy/images/5.png (+0 -0)
  38. deploy/cpp/docs/CSharp_deploy/images/6.png (+0 -0)
  39. deploy/cpp/docs/CSharp_deploy/images/7.png (+0 -0)
  40. deploy/cpp/docs/CSharp_deploy/images/8.5.png (+0 -0)
  41. deploy/cpp/docs/CSharp_deploy/images/8.png (+0 -0)
  42. deploy/cpp/docs/CSharp_deploy/images/9.png (+0 -0)
  43. deploy/cpp/docs/CSharp_deploy/model_infer.cpp (+52 -45)
  44. deploy/cpp/docs/apis/model.md (+9 -0)
  45. deploy/cpp/docs/compile/openvino/README.md (+1 -1)
  46. deploy/cpp/docs/compile/openvino/openvino_windows.md (+142 -0)
  47. deploy/cpp/docs/compile/paddle/linux.md (+1 -1)
  48. deploy/cpp/docs/demo/model_infer.md (+33 -0)
  49. deploy/cpp/docs/file_format.md (+9 -0)
  50. deploy/cpp/docs/images/cmakelist_set.png (BIN)
  51. deploy/cpp/docs/images/cpu_infer.png (BIN)
  52. deploy/cpp/docs/images/deploy_build_sh.png (BIN)
  53. deploy/cpp/docs/images/infer_demo_cmakelist.png (BIN)
  54. deploy/cpp/docs/images/paddleinference_filelist.png (BIN)
  55. deploy/cpp/docs/images/show_menu.png (BIN)
  56. deploy/cpp/docs/images/tensorrt.png (BIN)
  57. deploy/cpp/docs/jetson-deploy/CMakeLists.txt (+157 -0)
  58. deploy/cpp/docs/jetson-deploy/Deploy_infer/Deploy_infer.pro (+44 -0)
  59. deploy/cpp/docs/jetson-deploy/Deploy_infer/Deploy_infer.pro.user (+336 -0)
  60. deploy/cpp/docs/jetson-deploy/Deploy_infer/inferthread.cpp (+1872 -0)
  61. deploy/cpp/docs/jetson-deploy/Deploy_infer/inferthread.h (+135 -0)
  62. deploy/cpp/docs/jetson-deploy/Deploy_infer/main.cpp (+11 -0)
  63. deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.cpp (+577 -0)
  64. deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.h (+126 -0)
  65. deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.ui (+829 -0)
  66. deploy/cpp/docs/jetson-deploy/README.md (+475 -0)
  67. deploy/cpp/docs/jetson-deploy/images/cmakelist_set.png (BIN)
  68. deploy/cpp/docs/jetson-deploy/images/cpu_infer.png (BIN)
  69. deploy/cpp/docs/jetson-deploy/images/debug_incode.png (BIN)
  70. deploy/cpp/docs/jetson-deploy/images/deploy_build_sh.png (BIN)
  71. deploy/cpp/docs/jetson-deploy/images/dong_tai_lianjieku.png (BIN)
  72. deploy/cpp/docs/jetson-deploy/images/gpu_infer.png (BIN)
  73. deploy/cpp/docs/jetson-deploy/images/import_opencv.png (BIN)
  74. deploy/cpp/docs/jetson-deploy/images/infer_demo_cmakelist.png (BIN)
  75. deploy/cpp/docs/jetson-deploy/images/main_slot.png (BIN)
  76. deploy/cpp/docs/jetson-deploy/images/output_info.png (BIN)
  77. deploy/cpp/docs/jetson-deploy/images/pro_set_libpath.png (BIN)
  78. deploy/cpp/docs/jetson-deploy/images/project_list.png (BIN)
  79. deploy/cpp/docs/jetson-deploy/images/qt_add_newfile.png (BIN)
  80. deploy/cpp/docs/jetson-deploy/images/qt_create_class.png (BIN)
  81. deploy/cpp/docs/jetson-deploy/images/qt_create_guipro.png (BIN)
  82. deploy/cpp/docs/jetson-deploy/images/qt_importppipeline.png (BIN)
  83. deploy/cpp/docs/jetson-deploy/images/qt_inferthreadcpp_set.png (BIN)
  84. deploy/cpp/docs/jetson-deploy/images/qt_mainwindowset.png (BIN)
  85. deploy/cpp/docs/jetson-deploy/images/qt_project_inferlib.png (BIN)
  86. deploy/cpp/docs/jetson-deploy/images/qt_set_proname.png (BIN)
  87. deploy/cpp/docs/jetson-deploy/images/qt_start_new_pro.png (BIN)
  88. deploy/cpp/docs/jetson-deploy/images/show_menu.png (BIN)
  89. deploy/cpp/docs/jetson-deploy/images/thread_import_opencv.png (BIN)
  90. deploy/cpp/docs/jetson-deploy/images/thread_signal.png (BIN)
  91. deploy/cpp/docs/jetson-deploy/images/tupian_suofang.png (BIN)
  92. deploy/cpp/docs/jetson-deploy/images/yaml_cmakelist.png (BIN)
  93. deploy/cpp/docs/jetson-deploy/model_infer.cpp (+312 -0)
  94. deploy/cpp/docs/manufacture_sdk/README.md (+3 -3)
  95. deploy/cpp/docs/models/paddledetection.md (+3 -2)
  96. deploy/cpp/docs/models/paddleseg.md (+3 -2)
  97. deploy/cpp/docs/models/paddlex.md (+4 -5)
  98. deploy/cpp/scripts/jetson_build.sh (+39 -0)
  99. deploy/resources/resnet50_imagenet.yml (+0 -1026)
  100. docs/CHANGELOG.md (+30 -173)

+ 82 - 87
README.md

@@ -1,133 +1,128 @@
-# PaddleX全面升级动态图,v2.0.0正式发布!
-
-
-
 <p align="center">
   <img src="./docs/gui/images/paddlex.png" width="360" height ="55" alt="PaddleX" align="middle" />
 </p>
  <p align= "center"> PaddleX -- 飞桨全流程开发工具,以低代码的形式支持开发者快速实现产业实际项目落地 </p>
 
-## :heart:重磅功能升级
-### 全新发布Manufacture SDK,提供工业级多端多平台部署加速的预编译飞桨部署开发包(SDK),通过配置业务逻辑流程文件即可以低代码方式快速完成推理部署。[欢迎体验](./deploy/cpp/docs/manufacture_sdk)
-
-### PaddleX部署全面升级,支持飞桨视觉套件PaddleDetection、PaddleClas、PaddleSeg、PaddleX的端到端统一部署能力。[欢迎体验](./deploy/cpp)
-
-
-### 发布产业实践案例:钢筋计数、缺陷检测、机械手抓取、工业表计读数、Windows系统下使用C#语言部署。[欢迎体验](./examples)
+<p align="left">
+    <a href="./LICENSE"><img src="https://img.shields.io/badge/license-Apache%202-red.svg"></a>
+    <a href="https://github.com/PaddlePaddle/PaddleOCR/releases"><img src="https://img.shields.io/github/release/PaddlePaddle/PaddleX.svg"></a>
+    <a href=""><img src="https://img.shields.io/badge/python-3.6+-orange.svg"></a>
+    <a href=""><img src="https://img.shields.io/badge/os-linux%2C%20win%2C%20mac-yellow.svg"></a>
+    <a href=""><img src="https://img.shields.io/badge/QQ_Group-957286141-52B6EF?style=social&logo=tencent-qq&logoColor=000&logoWidth=20"></a>
+</p>
 
-### 升级PaddleX GUI,支持30系列显卡、新增模型PP-YOLO V2、PP-YOLO Tiny 、BiSeNetV2,新增导出API训练脚本功能,无缝切换PaddleX API训练。[欢迎体验](https://github.com/PaddlePaddle/PaddleX/blob/develop/docs/install.md#2-padldex-gui%E5%BC%80%E5%8F%91%E6%A8%A1%E5%BC%8F%E5%AE%89%E8%A3%85)
+## 近期动态
+2021.09.10 PaddleX发布2.0.0正式版本。
+- 全新发布Manufacture SDK,提供工业级多端多平台部署加速的预编译飞桨部署开发包(SDK),通过配置业务逻辑流程文件即可以低代码方式快速完成推理部署。[欢迎体验](./deploy/cpp/docs/manufacture_sdk)
+- PaddleX部署全面升级,支持飞桨视觉套件PaddleDetection、PaddleClas、PaddleSeg、PaddleX的端到端统一部署能力。[欢迎体验](./deploy/cpp/docs/deployment.md)
+- 发布产业实践案例:钢筋计数、缺陷检测、机械手抓取、工业表计读数。[欢迎体验](./examples)
+- 升级PaddleX GUI,支持30系列显卡、新增模型PP-YOLO V2、PP-YOLO Tiny 、BiSeNetV2。[欢迎体验](https://github.com/PaddlePaddle/PaddleX/blob/develop/docs/install.md#2-padldex-gui%E5%BC%80%E5%8F%91%E6%A8%A1%E5%BC%8F%E5%AE%89%E8%A3%85)
 
-[![License](https://img.shields.io/badge/license-Apache%202-red.svg)](LICENSE) [![Version](https://img.shields.io/github/release/PaddlePaddle/PaddleX.svg)](https://github.com/PaddlePaddle/PaddleX/releases) ![python version](https://img.shields.io/badge/python-3.6+-orange.svg) ![support os](https://img.shields.io/badge/os-linux%2C%20win%2C%20mac-yellow.svg)
- ![QQGroup](https://img.shields.io/badge/QQ_Group-1045148026-52B6EF?style=social&logo=tencent-qq&logoColor=000&logoWidth=20)
+详情内容请参考[版本更新文档](./docs/CHANGELOG.md)。
 
+## 产品介绍
 :hugs: PaddleX 集成飞桨智能视觉领域**图像分类**、**目标检测**、**语义分割**、**实例分割**任务能力,将深度学习开发全流程从**数据准备**、**模型训练与优化**到**多端部署**端到端打通,并提供**统一任务API接口**及**图形化开发界面Demo**。开发者无需分别安装不同套件,以**低代码**的形式即可快速完成飞桨全流程开发。
 
 :factory: **PaddleX** 经过**质检**、**安防**、**巡检**、**遥感**、**零售**、**医疗**等十多个行业实际应用场景验证,沉淀产业实际经验,**并提供丰富的案例实践教程**,全程助力开发者产业实践落地。
 
-![](../docs/gui/images/paddlexoverview.png)
-
-
-## PaddleX 使用文档
-
-
-### 1. 快速上手PaddleX
+<p align="center">
+  <img src="./docs/paddlex_whole.png" width="800"  />
+</p>
 
-* [快速安装PaddleX](./docs/install.md)
-  * [PaddleX API开发模式安装](./docs/install.md#1-paddlex-api开发模式安装)
-  * [PadldeX GUI开发模式安装](./docs/install.md#2-padldex-gui开发模式安装)
-  * [PaddleX Restful开发模式安装](./docs/install.md#3-paddlex-restful开发模式安装)
-* [10分钟快速上手使用](./docs/quick_start.md)
-* [AIStudio在线项目示例](https://aistudio.baidu.com/aistudio/projectdetail/2159977)
-* [常见问题汇总](./docs/FAQ/FAQ.md)
+## 安装与快速体验
+PaddleX提供了图像化开发界面、本地API、Restful-API三种开发模式。用户可根据自己的需求选择任意一种开始体验
+- [PadldeX GUI开发模式](./docs/quick_start_GUI.md)
+- [PaddleX API开发模式](./docs/quick_start_API.md)
+- [PaddleX Restful API开发模式](./docs/Resful_API/docs/readme.md)
+- [快速产业部署](#4-模型部署)
 
+## 产业级应用示例
 
-### 2. 数据准备
+- 安防
+    - [安全帽检测](./examples/helmet_detection)  
+- 工业视觉
+    -   [表计读数](./examples/meter_reader)  |  [钢筋计数](./examples/rebar_count)  |  [视觉辅助定位抓取](./examples/robot_grab)
 
-* [数据格式说明](./docs/data/format/README.md)
-* [标注工具LabelMe的安装和启动](./docs/data/annotation/labelme.md)
-* [数据标注](./docs/data/annotation/README.md)
-  * [手机拍照图片旋转](./docs/data/annotation/README.md)
-  * [开始数据标注](./docs/data/annotation/README.md)
-* [数据格式转换](./docs/data/convert.md)
-* [数据划分](./docs/data/split.md)
 
 
-### 3. 模型训练/评估/预测
+## PaddleX 使用文档
+本文档介绍了PaddleX从数据准备、模型训练到模型裁剪量化,及最终部署的全流程使用方法。
+<p align="center">
+  <img src="https://user-images.githubusercontent.com/53808988/134840580-c3152f00-6b0e-44dc-97d0-a82a027d0183.png" width="800"  />
+</p>
 
-* **PaddleX API开发模式:**
+### 1. 数据准备
 
-    * [API文档](./docs/apis)
-      * [数据集读取API](./docs/apis/datasets.md)
-      * [数据预处理和数据增强API](./docs/apis/transforms/transforms.md)
-      * [模型API/模型加载API](./docs/apis/models/README.md)
-      * [预测结果可视化API](./docs/apis/visualize.md)
-    * [模型训练与参数调整](tutorials/train)
-      * [模型训练](tutorials/train)
-      * [训练参数调整](./docs/parameters.md)
-    * [VisualDL可视化训练指标](./docs/visualdl.md)
-    * [加载训好的模型完成预测及预测结果可视化](./docs/apis/prediction.md)
+- [数据准备流程说明](./docs/data)
+- [数据标注](./docs/data/annotation/README.md)
+- [数据格式转换](./docs/data/convert.md)
+- [数据划分](./docs/data/split.md)
 
-* **PaddleX GUI开发模式:**
+### 2. 模型训练/评估/预测
 
-    - [图像分类](https://www.bilibili.com/video/BV1nK411F7J9?from=search&seid=3068181839691103009)
-    - [目标检测](https://www.bilibili.com/video/BV1HB4y1A73b?from=search&seid=3068181839691103009)
-    - [实例分割](https://www.bilibili.com/video/BV1M44y1r7s6?from=search&seid=3068181839691103009)
-    - [语义分割](https://www.bilibili.com/video/BV1qQ4y1Z7co?from=search&seid=3068181839691103009)
+- [GUI开发模式](./docs/quick_start_GUI.md)
+  - 视频教程:[图像分类](./docs/quick_start_GUI.md/#视频教程) | [目标检测](./docs/quick_start_GUI.md/#视频教程) | [语义分割](./docs/quick_start_GUI.md/#视频教程) | [实例分割](./docs/quick_start_GUI.md/#视频教程)
+- API开发模式
+  - [API文档](./docs/apis)
+    - [数据集读取API](./docs/apis/datasets.md)
+    - [数据预处理和数据增强API](./docs/apis/transforms/transforms.md)
+    - [模型API/模型加载API](./docs/apis/models/README.md)
+    - [预测结果可视化API](./docs/apis/visualize.md)
+  - [模型训练与参数调整](tutorials/train)
+    - [模型训练](tutorials/train)
+    - [训练参数调整](./docs/parameters.md)
+  - [VisualDL可视化训练指标](./docs/visualdl.md)
+  - [加载训好的模型完成预测及预测结果可视化](./docs/apis/prediction.md)
+- [Restful API开发模式](./docs/Resful_API/docs)
+  - [使用说明](./docs/Resful_API/docs)
 
 
-### 4. 模型剪裁和量化
+### 3. 模型压缩
 
 - [模型剪裁](tutorials/slim/prune)
 - [模型量化](tutorials/slim/quantize)
 
-### 5. 模型部署
+### 4. 模型部署
 
 - [部署模型导出](./docs/apis/export_model.md)
-- [PaddleX python高性能部署](./docs/python_deploy.md)
-- [PaddleX Manufacture SDK低代码高效C++部署](./deploy/cpp/docs/manufacture_sdk)
-- [PaddleX/PaddleClas/PaddleDetection/PaddleSeg端到端高性能统一C++部署](./deploy/cpp)
-- [PaddleX python轻量级服务化部署](./docs/hub_serving_deploy.md)
-
-### 6. 产业级应用示例
-
-- [钢筋计数](examples/rebar_count)
-- [缺陷检测](examples/defect_detection)
-- [机械手抓取](examples/robot_grab)
-- [工业表计读数](examples/meter_reader)
-- [Windows系统下使用C#语言部署](examples/C%23_deploy)
-
-### 7. 附录
+- [部署方式概览](./deploy)
+  - 本地部署
+    - [OpenVINO](./deploy/cpp/docs/compile/openvino/README.md)(C++)
+    - [C++部署](./deploy/cpp)
+      - [Manufacture SDK](./deploy/cpp/docs/manufacture_sdk)
+      - [Deployment SDK](./deploy/cpp/docs/deployment.md)
+      - [C#工程化部署](./deploy/cpp/docs/CSharp_deploy)
+    - [Python部署](./docs/python_deploy.md)
+  - 边缘侧部署
+    - [NVIDIA-JetsonQT部署](./deploy/cpp/docs/jetson-deploy)
+  - 服务化部署
+    - [HubServing部署](./docs/hub_serving_deploy.md)
+  - Docker部署(C++)
+    - [Triton部署](./deploy/cpp/docs/compile/triton/docker.md)
+    - [TensorRT部署](./deploy/cpp/docs/compile/tensorrt/trt.md)
+
+- [模型加密](./deploy/cpp/docs/demo/decrypt_infer.md)
+- [ONNX格式转换](./deploy/cpp/docs/compile)
+
+### 5. 附录
 
 - [PaddleX模型库](./docs/appendix/model_zoo.md)
 - [PaddleX指标及日志](./docs/appendix/metrics.md)
 - [无联网模型训练](./docs/how_to_offline_run.md)
 
-## 版本更新
-
-- **2021.09.10 v2.0.0**
-
-  PaddleX 2.0动态图版本正式发布,PaddleX API、PaddleX GUI开发模式全面支持飞桨2.0动态图。PaddleX GUI新增导出API训练脚本功能,无缝切换PaddleX API训练。PaddleX python预测部署完备, PaddleX模型使用2个API即可快速完成部署。详细内容请参考[版本更新文档](./docs/CHANGELOG.md)
-
-- **2021.07.06 v2.0.0-rc3**
-
-  PaddleX部署全面升级,支持飞桨视觉套件PaddleDetection、PaddleClas、PaddleSeg、PaddleX的端到端统一部署能力。全新发布Manufacture SDK,提供工业级多端多平台部署加速的预编译飞桨部署开发包(SDK),通过配置业务逻辑流程文件即可以低代码方式快速完成推理部署。发布产业实践案例:钢筋计数、缺陷检测、机械手抓取、工业表计读数、Windows系统下使用C#语言部署。升级PaddleX GUI,支持30系列显卡、新增模型PP-YOLO V2、PP-YOLO Tiny 、BiSeNetV2。详细内容请参考[版本更新文档](./docs/CHANGELOG.md)
-
-- **2021.05.19 v2.0.0-rc**
-
-  全面支持飞桨2.0动态图,更易用的开发模式。 目标检测任务新增PP-YOLOv2, COCO test数据集精度达到49.5%、V100预测速度达到68.9 FPS。目标检测任务新增4.2MB的超轻量级模型PP-YOLO tiny。语义分割任务新增实时分割模型BiSeNetV2。C++部署模块全面升级,PaddleInference部署适配2.0预测库,支持飞桨PaddleDetection、PaddleSeg、PaddleClas以及PaddleX的模型部署;新增基于PaddleInference的GPU多卡预测;GPU部署新增基于ONNX的的TensorRT高性能加速引擎部署方式;GPU部署新增基于ONNX的Triton服务化部署方式。详情内容请参考[版本更新文档](./docs/CHANGELOG.md)。
-
+## 常见问题汇总
+- [GUI相关问题](./docs/FAQ/FAQ.md/#GUI相关问题)
+- [API训练相关问题](./docs/FAQ/FAQ.md/#API训练相关问题)
+- [推理部署问题](./docs/FAQ/FAQ.md/#推理部署问题)
 
 ## 交流与反馈
 
 - 项目官网:https://www.paddlepaddle.org.cn/paddle/paddlex
-
 - PaddleX用户交流群:957286141 (手机QQ扫描如下二维码快速加入)  
-
   <p align="center">
-    <img src="./docs/gui/images/QR2.jpg" width="250" height ="360" alt="QR" align="middle" />
+    <img src="./docs/gui/images/QR2.png" width="180" height ="180" alt="QR" align="middle" />
   </p>
 
-
 ## :hugs: 贡献代码:hugs:
 
 我们非常欢迎您为PaddleX贡献代码或者提供使用建议。如果您可以修复某个issue或者增加一个新功能,欢迎给我们提交Pull Requests。

+ 0 - 0
deploy/Python/README.md


+ 22 - 0
deploy/README.md

@@ -0,0 +1,22 @@
+# 部署方式概览
+
+PaddleX提供了多种部署方式,用户可根据实际需要选择本地部署、边缘侧部署、服务化部署、Docker部署。部署方式目录如下:
+
+- [部署方式概览](./deploy)
+  - 本地部署
+    - [OpenVINO](./deploy/cpp/docs/compile/openvino/README.md)(C++)
+    - [C++部署](./deploy/cpp)
+      - [Manufacture SDK](./deploy/cpp/docs/manufacture_sdk)
+      - [Deployment SDK](./deploy/cpp/docs/deployment.md)
+      - [C#工程化部署](./deploy/cpp/docs/C#_deploy)
+    - [Python部署](./docs/python_deploy.md)
+  - 边缘侧部署
+    - [NVIDIA-JetsonQT部署](./deploy/cpp/docs/jetson-deploy)
+  - 服务化部署
+    - [HubServing部署](./docs/hub_serving_deploy.md)
+  - Docker部署(C++)
+    - [Triton部署](./deploy/cpp/docs/compile/triton/docker.md)
+    - [TensorRT部署](./deploy/cpp/docs/compile/tensorrt/trt.md)
+
+- [模型加密](./deploy/cpp/docs/demo/decrypt_infer.md)
+- [ONNX格式转换](./deploy/cpp/docs/compile)

+ 4 - 43
deploy/cpp/README.md

@@ -1,44 +1,5 @@
-## PaddlePaddle模型C++部署
+# C++部署
 
-本目录下代码,目前支持以下飞桨官方套件基于PaddleInference的部署。
-
-## 模型套件支持
-- PaddleDetection([release/2.1](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.1))
-- PaddleSeg([release/2.1](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.1))
-- PaddleClas([release/2.1](https://github.com/PaddlePaddle/PaddleClas/tree/release/2.1))
-- PaddleX([release/2.0-rc](https://github.com/PaddlePaddle/PaddleX))
-
-## 硬件支持
-- CPU(linux/windows)
-- GPU(linux/windows)
-- Jetson(TX2/Nano/Xavier)
-
-## 文档
-### PaddleInference编译说明
-- [Linux编译(支持加密)指南](./docs/compile/paddle/linux.md)
-- [Windows编译(支持加密)指南](./docs/compile/paddle/windows.md)
-- [Jetson编译指南](./docs/compile/paddle/jetson.md)
-
-### 模型部署说明
-- [PaddleX部署指南](./docs/models/paddlex.md)
-- [PaddleDetection部署指南](./docs/models/paddledetection.md)
-- [PaddleSeg部署指南](./docs/models/paddleseg.md)
-- [PaddleClas部署指南](./docs/models/paddleclas.md)
-
-### 模型预测示例
-- [单卡加载模型预测示例](./docs/demo/model_infer.md)
-- [多卡加载模型预测示例](./docs/demo/multi_gpu_model_infer.md)
-- [PaddleInference集成TensorRT加载模型预测示例](./docs/demo/tensorrt_infer.md)
-- [模型加密预测示例](./docs/demo/decrypt_infer.md)
-
-### API说明
-
-- [部署相关API说明](./docs/apis/model.md)
-- [模型配置文件说明](./docs/apis/yaml.md)
-
-
-## ONNX模型部署
-Paddle的模型除了直接通过PaddleInference部署外,还可以通过[Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX.git)转为ONNX后使用第三方推理引擎进行部署,在本目录下,我们提供了基于OpenVINO、Triton和TensorRT三个引擎的部署支持。
-- [OpenVINO部署](./docs/compile/openvino/README.md)
-- [Triton部署](./docs/compile/triton/docker.md)
-- [TensorRT部署](./docs/compile/tensorrt/trt.md)
+在C++部署部署方式中,我们提供了两类SDK,用户可根据SDK特点及实际需要制定部署方案:
+- [Manufacture SDK](./docs/manufacture_sdk) : 工业级多端多平台部署加速的预编译飞桨部署开发包,支持多模型串联
+- [Deployment SDK](./docs/deployment.md) : 支持飞桨视觉套件PaddleX、PaddleDetection、PaddleClas、PaddleSeg的统一部署能力

+ 0 - 0
examples/C#_deploy/C#/Form1.Designer.cs → deploy/cpp/docs/CSharp_deploy/C#/Form1.Designer.cs


The file diff has been suppressed because it is too large
+ 215 - 274
deploy/cpp/docs/CSharp_deploy/C#/Form1.cs


+ 0 - 0
examples/C#_deploy/C#/Form1.resx → deploy/cpp/docs/CSharp_deploy/C#/Form1.resx


+ 0 - 0
examples/C#_deploy/C#/Program.cs → deploy/cpp/docs/CSharp_deploy/C#/Program.cs


+ 0 - 0
examples/C#_deploy/C#/WinFormsApp_final.csproj → deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.csproj


+ 0 - 0
examples/C#_deploy/C#/WinFormsApp_final.csproj.user → deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.csproj.user


+ 0 - 0
examples/C#_deploy/C#/WinFormsApp_final.sln → deploy/cpp/docs/CSharp_deploy/C#/WinFormsApp_final.sln


+ 0 - 0
examples/C#_deploy/README.md → deploy/cpp/docs/CSharp_deploy/README.md


+ 0 - 0
examples/C#_deploy/images/1.png → deploy/cpp/docs/CSharp_deploy/images/1.png


+ 0 - 0
examples/C#_deploy/images/10.png → deploy/cpp/docs/CSharp_deploy/images/10.png


+ 0 - 0
examples/C#_deploy/images/11.png → deploy/cpp/docs/CSharp_deploy/images/11.png


+ 0 - 0
examples/C#_deploy/images/12.png → deploy/cpp/docs/CSharp_deploy/images/12.png


+ 0 - 0
examples/C#_deploy/images/13.png → deploy/cpp/docs/CSharp_deploy/images/13.png


+ 0 - 0
examples/C#_deploy/images/14.png → deploy/cpp/docs/CSharp_deploy/images/14.png


+ 0 - 0
examples/C#_deploy/images/15.png → deploy/cpp/docs/CSharp_deploy/images/15.png


+ 0 - 0
examples/C#_deploy/images/16.png → deploy/cpp/docs/CSharp_deploy/images/16.png


+ 0 - 0
examples/C#_deploy/images/17.png → deploy/cpp/docs/CSharp_deploy/images/17.png


+ 0 - 0
examples/C#_deploy/images/18.png → deploy/cpp/docs/CSharp_deploy/images/18.png


+ 0 - 0
examples/C#_deploy/images/19.png → deploy/cpp/docs/CSharp_deploy/images/19.png


+ 0 - 0
examples/C#_deploy/images/2.png → deploy/cpp/docs/CSharp_deploy/images/2.png


+ 0 - 0
examples/C#_deploy/images/20.png → deploy/cpp/docs/CSharp_deploy/images/20.png


+ 0 - 0
examples/C#_deploy/images/21.png → deploy/cpp/docs/CSharp_deploy/images/21.png


+ 0 - 0
examples/C#_deploy/images/22.png → deploy/cpp/docs/CSharp_deploy/images/22.png


+ 0 - 0
examples/C#_deploy/images/23.png → deploy/cpp/docs/CSharp_deploy/images/23.png


+ 0 - 0
examples/C#_deploy/images/24.png → deploy/cpp/docs/CSharp_deploy/images/24.png


+ 0 - 0
examples/C#_deploy/images/25.png → deploy/cpp/docs/CSharp_deploy/images/25.png


BIN
deploy/cpp/docs/CSharp_deploy/images/26.png


BIN
deploy/cpp/docs/CSharp_deploy/images/27.png


BIN
deploy/cpp/docs/CSharp_deploy/images/28.png


BIN
deploy/cpp/docs/CSharp_deploy/images/29.png


+ 0 - 0
examples/C#_deploy/images/3.png → deploy/cpp/docs/CSharp_deploy/images/3.png


+ 0 - 0
examples/C#_deploy/images/4.png → deploy/cpp/docs/CSharp_deploy/images/4.png


+ 0 - 0
examples/C#_deploy/images/5.png → deploy/cpp/docs/CSharp_deploy/images/5.png


+ 0 - 0
examples/C#_deploy/images/6.png → deploy/cpp/docs/CSharp_deploy/images/6.png


+ 0 - 0
examples/C#_deploy/images/7.png → deploy/cpp/docs/CSharp_deploy/images/7.png


+ 0 - 0
examples/C#_deploy/images/8.5.png → deploy/cpp/docs/CSharp_deploy/images/8.5.png


+ 0 - 0
examples/C#_deploy/images/8.png → deploy/cpp/docs/CSharp_deploy/images/8.png


+ 0 - 0
examples/C#_deploy/images/9.png → deploy/cpp/docs/CSharp_deploy/images/9.png


+ 52 - 45
examples/C#_deploy/model_infer.cpp → deploy/cpp/docs/CSharp_deploy/model_infer.cpp

@@ -1,29 +1,30 @@
-#include <gflags/gflags.h>
 #include <string>
 #include <vector>
 
 #include "model_deploy/common/include/paddle_deploy.h"
 
+// Global model pointer
 PaddleDeploy::Model* model;
 
 /*
-* 模型初始化/注册接口
+* Model initialization / registration API
 * 
-* model_type: 初始化模型类型: det,seg,clas,paddlex
+* model_type: det,seg,clas,paddlex
 * 
-* model_filename: 模型文件路径
+* model_filename: Model file path
 * 
-* params_filename: 参数文件路径
+* params_filename: Parameter file path
 * 
-* cfg_file: 配置文件路径
+* cfg_file: Configuration file path
 * 
-* use_gpu: 是否使用GPU
+* use_gpu: Whether to use GPU
 * 
-* gpu_id: 指定第x号GPU
+* gpu_id: Specify GPU x
+* 
+* paddlex_model_type: When Model_Type is paddlx, the type of actual Paddlex model returned - det, seg, clas
 * 
-* paddlex_model_type: model_type为paddlx时,返回的实际paddlex模型的类型: det, seg, clas
 */
-extern "C" __declspec(dllexport) void InitModel(const char* model_type, const char* model_filename, const char* params_filename, const char* cfg_file, bool use_gpu, int gpu_id, char* paddlex_model_type)
+extern "C" void InitModel(const char* model_type, const char* model_filename, const char* params_filename, const char* cfg_file, bool use_gpu, int gpu_id, char* paddlex_model_type)
 {
 	// create model
 	model = PaddleDeploy::CreateModel(model_type);  //FLAGS_model_type
@@ -44,7 +45,7 @@ extern "C" __declspec(dllexport) void InitModel(const char* model_type, const ch
 	}
 
 	// det, seg, clas, paddlex
-	if (strcmp(model_type, "paddlex") == 0) // 是paddlex模型,则返回具体支持的模型类型: det, seg, clas
+	if (strcmp(model_type, "paddlex") == 0) // If it is a PADDLEX model, return the specifically supported model type: det, seg, clas
 	{
 		// detector
 		if (model->yaml_config_["model_type"].as<std::string>() == std::string("detector"))
@@ -64,7 +65,7 @@ extern "C" __declspec(dllexport) void InitModel(const char* model_type, const ch
 
 
 /*
-* 检测推理接口
+* Detection inference API
 * 
 * img: input for predicting.
 *
@@ -79,8 +80,10 @@ extern "C" __declspec(dllexport) void InitModel(const char* model_type, const ch
 * nBoxesNum: number of box
 *
 * LabelList: label list of result
+* 
+* extern "C"
 */
-extern "C" __declspec(dllexport) void Det_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* output, int* nBoxesNum, char* LabelList)
+extern "C" void Det_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* output, int* nBoxesNum, char* LabelList)
 {
 	// prepare data
 	std::vector<cv::Mat> imgs;
@@ -98,28 +101,26 @@ extern "C" __declspec(dllexport) void Det_ModelPredict(const unsigned char* img,
 
 	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
 	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
-	//cv::imwrite("./1.png", input);
 	imgs.push_back(std::move(input));
 
 	// predict
 	std::vector<PaddleDeploy::Result> results;
 	model->Predict(imgs, &results, 1);
 
-	// nBoxesNum[0] = results.size();  // results.size()得到的是batch_size
-	nBoxesNum[0] = results[0].det_result->boxes.size();  // 得到单张图片预测的bounding box数
+	// nBoxesNum[0] = results.size();  // results.size() is returning batch_size
+	nBoxesNum[0] = results[0].det_result->boxes.size();  // Get the predicted Bounding Box number of a single image
 	std::string label = "";
 	//std::cout << "res: " << results[num] << std::endl;
-	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // 得到所有框的数据
+	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // Get the data for all the boxes
 	{
-		//std::cout << "category: " << results[num].det_result->boxes[i].category << std::endl;
 		label = label + results[0].det_result->boxes[i].category + " ";
 		// labelindex
-		output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // 类别的id
+		output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // Category ID
 		// score
-		output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // 得分
+		output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // Score
 		//// box
 		output[i * 6 + 2] = results[0].det_result->boxes[i].coordinate[0]; // x1, y1, x2, y2
-		output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // 左上、右下的顶点
+		output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // Upper left and lower right vertices
 		output[i * 6 + 4] = results[0].det_result->boxes[i].coordinate[2];
 		output[i * 6 + 5] = results[0].det_result->boxes[i].coordinate[3];
 	}
@@ -128,7 +129,7 @@ extern "C" __declspec(dllexport) void Det_ModelPredict(const unsigned char* img,
 
 
 /*
-* 分割推理接口
+* Segmented inference 
 * 
 * img: input for predicting.
 *
@@ -139,8 +140,10 @@ extern "C" __declspec(dllexport) void Det_ModelPredict(const unsigned char* img,
 * nChannel: channel of img.
 *
 * output: result of pridict ,include label_map
+* 
+* extern "C"
 */
-extern "C" __declspec(dllexport) void Seg_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, unsigned char* output)
+extern "C" void Seg_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, unsigned char* output)
 {
 	// prepare data
 	std::vector<cv::Mat> imgs;
@@ -158,21 +161,20 @@ extern "C" __declspec(dllexport) void Seg_ModelPredict(const unsigned char* img,
 
 	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
 	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
-	//cv::imwrite("./1.png", input);
 	imgs.push_back(std::move(input));
 
 	// predict
 	std::vector<PaddleDeploy::Result> results;
 	model->Predict(imgs, &results, 1);
 
-	std::vector<uint8_t> result_map = results[0].seg_result->label_map.data; // vector<uint8_t> -- 结果map
-	// 拷贝输出结果到输出上返回 -- 将vector<uint8_t>转成unsigned char *
+	std::vector<uint8_t> result_map = results[0].seg_result->label_map.data; // vector<uint8_t> -- Result Map
+	// Copy output result to the output back -- from vector<uint8_t> to unsigned char *
 	memcpy(output, &result_map[0], result_map.size() * sizeof(uchar));
 }
 
 
 /*
-* 识别推理接口
+* Recognition inference API
 * 
 * img: input for predicting.
 *
@@ -187,8 +189,10 @@ extern "C" __declspec(dllexport) void Seg_ModelPredict(const unsigned char* img,
 * category: result of pridict ,include category_string
 * 
 * category_id: result of pridict ,include category_id
+* 
+* extern "C" 
 */
-extern "C" __declspec(dllexport) void Cls_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* score, char* category, int* category_id)
+extern "C" void Cls_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* score, char* category, int* category_id)
 {
 	// prepare data
 	std::vector<cv::Mat> imgs;
@@ -206,7 +210,6 @@ extern "C" __declspec(dllexport) void Cls_ModelPredict(const unsigned char* img,
 
 	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
 	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
-	//cv::imwrite("./1.png", input);
 	imgs.push_back(std::move(input));
 
 	// predict
@@ -214,15 +217,15 @@ extern "C" __declspec(dllexport) void Cls_ModelPredict(const unsigned char* img,
 	model->Predict(imgs, &results, 1);
 
 	*category_id = results[0].clas_result->category_id;
-	// 拷贝输出类别结果到输出上返回 -- string --> char* 
+	// Copy output category result to output -- string --> char* 
 	memcpy(category, results[0].clas_result->category.c_str(), strlen(results[0].clas_result->category.c_str()));
-	// 拷贝输出概率值返回
+	// Copy output probability value
 	*score = results[0].clas_result->score;
 }	
 
 
 /*
-* MaskRCNN推理接口
+* MaskRCNN Reasoning 
 * 
 * img: input for predicting.
 *
@@ -239,8 +242,10 @@ extern "C" __declspec(dllexport) void Cls_ModelPredict(const unsigned char* img,
 * nBoxesNum: result of pridict ,include BoxesNum
 * 
 * LabelList: result of pridict ,include LabelList
+* 
+* extern "C"
 */
-extern "C" __declspec(dllexport) void Mask_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* box_output, unsigned char* mask_output, int* nBoxesNum, char* LabelList)
+extern "C" void Mask_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* box_output, unsigned char* mask_output, int* nBoxesNum, char* LabelList)
 {
 	// prepare data
 	std::vector<cv::Mat> imgs;
@@ -260,28 +265,28 @@ extern "C" __declspec(dllexport) void Mask_ModelPredict(const unsigned char* img
 	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
 	imgs.push_back(std::move(input));
 
-	// predict  -- 多次点击单张推理时会出错
+	// predict
 	std::vector<PaddleDeploy::Result> results;
-	model->Predict(imgs, &results, 1);  // 在Infer处发生错误
+	model->Predict(imgs, &results, 1);
 
-	nBoxesNum[0] = results[0].det_result->boxes.size();  // 得到单张图片预测的bounding box数
+	nBoxesNum[0] = results[0].det_result->boxes.size();  // Get the predicted Bounding Box number of a single image
 	std::string label = "";
 
-	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // 得到所有框的数据
+	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // Get the data for all the boxes
 	{
-		// 边界框预测结果
+		// prediction results
 		label = label + results[0].det_result->boxes[i].category + " ";
 		// labelindex
-		box_output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // 类别的id
+		box_output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // Category ID
 		// score
-		box_output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // 得分
+		box_output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // Score
 		//// box
 		box_output[i * 6 + 2] = results[0].det_result->boxes[i].coordinate[0]; // x1, y1, x2, y2
-		box_output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // 左上、右下的顶点
+		box_output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // Upper left and lower right vertices
 		box_output[i * 6 + 4] = results[0].det_result->boxes[i].coordinate[2];
 		box_output[i * 6 + 5] = results[0].det_result->boxes[i].coordinate[3];
 		
-		//Mask预测结果
+		// Mask prediction results
 		for (int j = 0; j < results[0].det_result->boxes[i].mask.data.size(); j++)
 		{
 			if (mask_output[j] == 0)
@@ -296,10 +301,12 @@ extern "C" __declspec(dllexport) void Mask_ModelPredict(const unsigned char* img
 
 
 /*
-* 模型销毁/注销接口
+* Model destruction API
+* 
+* extern "C" 
 */
-extern "C" __declspec(dllexport) void DestructModel()
+extern "C" void DestructModel()
 {
 	delete model;
 	std::cout << "destruct model success" << std::endl;
-}
+}
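Note: since the exports above now use plain `extern "C"` (the Windows-only `__declspec(dllexport)` is gone), a Linux/Jetson program can resolve them from the built shared library at runtime. Below is a minimal, hypothetical C++ caller for illustration only; the library name `libmodel_infer.so`, the model/config paths, the test image and the box-buffer size are assumptions, not part of this commit.

```cpp
// Hypothetical caller: loads the shared library at runtime and runs one detection.
#include <dlfcn.h>
#include <cstdio>
#include <vector>
#include <opencv2/opencv.hpp>

// Signatures mirror the extern "C" exports in model_infer.cpp above.
using InitModelFn  = void (*)(const char*, const char*, const char*, const char*,
                              bool, int, char*);
using DetPredictFn = void (*)(const unsigned char*, int, int, int,
                              float*, int*, char*);
using DestructFn   = void (*)();

int main() {
    void* handle = dlopen("./libmodel_infer.so", RTLD_LAZY);  // assumed library name
    if (!handle) { std::fprintf(stderr, "dlopen failed: %s\n", dlerror()); return 1; }

    auto InitModel     = reinterpret_cast<InitModelFn>(dlsym(handle, "InitModel"));
    auto Det_Predict   = reinterpret_cast<DetPredictFn>(dlsym(handle, "Det_ModelPredict"));
    auto DestructModel = reinterpret_cast<DestructFn>(dlsym(handle, "DestructModel"));
    if (!InitModel || !Det_Predict || !DestructModel) {
        std::fprintf(stderr, "dlsym failed\n"); return 1;
    }

    char paddlex_type[32] = {0};
    InitModel("det", "model/model.pdmodel", "model/model.pdiparams",
              "model/infer_cfg.yml", /*use_gpu=*/true, /*gpu_id=*/0, paddlex_type);

    cv::Mat img = cv::imread("test.jpg");        // BGR, HWC, uint8 -- placeholder image
    std::vector<float> boxes(6 * 1000);          // assumed upper bound on boxes
    char labels[4096] = {0};
    int num_boxes = 0;
    Det_Predict(img.data, img.cols, img.rows, img.channels(),
                boxes.data(), &num_boxes, labels);

    // Each box occupies 6 floats: category_id, score, x1, y1, x2, y2.
    for (int i = 0; i < num_boxes; ++i)
        std::printf("box %d: cls=%d score=%.3f\n", i,
                    static_cast<int>(boxes[i * 6 + 0]), boxes[i * 6 + 1]);

    DestructModel();
    dlclose(handle);
    return 0;
}
```

On the target this would be compiled with something like `g++ caller.cpp -ldl` plus the OpenCV include/link flags used elsewhere in this deployment.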

+ 9 - 0
deploy/cpp/docs/apis/model.md

@@ -0,0 +1,9 @@
+## PaddleInference模型部署
+- [Linux编译(支持加密)指南](./paddle/linux.md)
+- [Windows编译(支持加密)指南](./paddle/windows.md)
+
+## ONNX模型部署
+Paddle的模型除了直接通过PaddleInference部署外,还可以通过[Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX.git)转为ONNX后使用第三方推理引擎进行部署,在本目录下,我们提供了基于OpenVINO、Triton和TensorRT三个引擎的部署支持。
+- [OpenVINO部署](./openvino/README.md)
+- [Triton部署](./triton/docker.md)
+- [TensorRT部署](./tensorrt/trt.md)

+ 1 - 1
deploy/cpp/docs/compile/openvino/README.md

@@ -2,7 +2,7 @@
 
 本文档指引用户如何基于OpenVINO对飞桨模型进行推理,并编译执行。进行以下编译操作前请先安装好OpenVINO,OpenVINO安装请参考官网[OpenVINO-Linux](https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_linux.html)
 
-**注意:** 
+**注意:**
 
 - 我们测试的openvino版本为2021.3,如果你使用其它版本遇到问题,可以尝试切换到该版本
 - 当前检测模型转换为openvino格式是有问题的,暂时只支持分割和分类模型

+ 142 - 0
deploy/cpp/docs/compile/openvino/openvino_windows.md

@@ -1 +1,143 @@
 # 基于PaddleInference的推理-Jetson环境编译
+本文档指引用户如何基于PaddleInference在Jetson平台上对飞桨模型进行推理,并编译执行。
+
+## 环境依赖
+gcc >= 5.4.0
+cmake >= 3.5.1
+
+(Jetson环境下)Ubuntu 16.04/18.04
+
+## 编译步骤
+### Step1: 获取部署代码
+```
+git clone https://github.com/PaddlePaddle/PaddleX.git
+cd PaddleX/deploy/cpp
+```
+**说明**:`C++`预测代码在`PaddleX/deploy/cpp` 目录,该目录不依赖任何`PaddleX`下其他目录。所有的公共实现代码在`model_deploy`目录下,所有示例代码都在`demo`目录下。
+
+> 也可手动下载完整的`PaddleX`,进行离线安装(接下来的步骤都相通)
+
+### Step 2. 下载Jetson下PaddlePaddle C++ 预编译预测库
+PaddlePaddle C++ 预测库针对是否使用GPU、是否支持TensorRT、以及不同的CUDA版本提供了已经编译好的预测库,目前PaddleX支持Paddle预测库2.0+,最新2.1版本下载链接如下所示:
+
+| 版本说明                               | 预测库(2.1)                                                                                                                   | 编译器  |
+| -------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- | ------- |
+| Jetpack4.4: nv-jetson-cuda10.2-cudnn8-trt7(all) | [paddle_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/2.1.1-nv-jetson-jetpack4.4-all/paddle_inference_install_dir.tgz)   | gcc 8.2 |
+| Jetpack4.4: nv-jetson-cuda10.2-cudnn8-trt7(nano) | [ paddle_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/2.1.1-nv-jetson-jetpack4.4-nano/paddle_inference_install_dir.tgz)  | gcc 8.2 |
+| Jetpack4.4: nv-jetson-cuda10.2-cudnn8-trt7(tx2)	 | [ paddle_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/2.1.1-nv-jetson-jetpack4.4-tx2/paddle_inference_install_dir.tgz) | gcc 8.2 |
+| Jetpack4.4: nv-jetson-cuda10.2-cudnn8-trt7(xavier) | [ paddle_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/2.1.1-nv-jetson-jetpack4.4-xavier/paddle_inference_install_dir.tgz) | gcc 8.2 |
+
+请根据实际情况选择下载,如若以上版本不满足您的需求,请至[C++预测库下载列表](https://paddleinference.paddlepaddle.org.cn/v2.1/user_guides/download_lib.html)选择符合的版本。
+
+将预测库解压后,其所在目录(例如解压至`PaddleX/deploy/cpp/paddle_inferenc/`)下主要包含的内容有:
+```
+|—— CmakeCache.txt
+|
+├── paddle/ # paddle核心库和头文件
+|
+├── third_party # 第三方依赖库和头文件
+|
+└── version.txt # 版本和编译信息(里边有编译时gcc、cuda、cudnn的版本信息)
+```
+<div>
+  <img src="../../images/paddleinference_filelist.png">
+  </div>
+
+### Step 3. 修改编译参数
+
+根据自己的系统环境,修改`PaddleX/deploy/cpp/script/jetson_build.sh`脚本中的参数,主要修改的参数为以下几个
+| 参数          | 说明                                                                                 |
+| :------------ | :----------------------------------------------------------------------------------- |
+| WITH_GPU      | ON或OFF,表示是否使用GPU,当下载的为CPU预测库时,设为OFF                             |
+| PADDLE_DIR    | 预测库所在路径,默认为`PaddleX/deploy/cpp/paddle_inference`目录下                    |
+| CUDA_LIB      | cuda相关lib文件所在的目录路径 -- 请注意jetson预装的cuda所在路径(如:/usr/local/cuda/lib64) |
+| CUDNN_LIB     | cudnn相关lib文件所在的目录路径 -- 请注意jetson预装的cuda所在路径(如:/usr/lib/aarch64-linux-gnu)    |
+| WITH_TENSORRT | ON或OFF,表示是否使用开启TensorRT                                                    |
+| TENSORRT_DIR  | TensorRT 的路径,如果开启TensorRT开关WITH_TENSORRT,需修改为您实际安装的TensorRT路径     |
+| WITH_ENCRYPTION      | ON或OFF,表示是否开启加密模块                             |
+| OPENSSL_DIR    | OPENSSL所在路径,解密所需。默认为`PaddleX/deploy/cpp/deps/penssl-1.1.0k`目录下        |
+
+> **要注意相关参数路径不要有误——特别是CUDA_LIB以及CUDNN_LIB,如果需要启动TensorRt,也需指定当前的路径。**
+
+<div>
+  <img src="../../images/deploy_build_sh.png">
+  </div>
+
+> 不需要添加oepncv路径,在jetson中编译可直接使用环境本身预装的opencv进行deploy编译——具体配置在Step4中。
+
+
+### Step 4. 修改build时需对应的CMakeLists.txt
+根据自己的系统环境,修改`PaddleX/deploy/cpp/CMakeLists.txt`脚本中的参数,主要修改的参数为以下几个:位于其中注释`#OPENCV`之后的部分
+| 参数          | 说明                                                                                 |
+| :------------ | :----------------------------------------------------------------------------------- |
+| set(OpenCV_INCLUDE_DIRS "/usr/include/opencv")      | 配置Jetson预置opencv的include路径    |
+| file(GLOB OpenCV_LIBS /usr/lib/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX})    | 配置opencv动态链接库*.so    |
+
+替换具体如下:(xavier为例)
+
+1. /usr/include/opencv --> /usr/include/opencv4
+  > 具体路径,以部署环境中opencv的include路径为准。
+  > opencv4 中包含: opencv, opencv2
+
+2. /usr/lib/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX} --> /usr/lib/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX}
+  > 具体路径,以部署环境中opencv的*.so路径为准, 主要修改libopencv_前的路径。
+
+<div>
+  <img src="../../images/cmakelist_set.png">
+  </div>
+
+### Step 5. 添加yaml库源码
+由于Jetson环境下编译还需要yaml,所以这里需要手动下载yaml包,保证编译的正常运行。
+
+> 1. 点击[下载yaml依赖包](https://bj.bcebos.com/paddlex/deploy/deps/yaml-cpp.zip),无需解压
+> 2. 修改`PaddleX/deploy/cpp/cmake/yaml.cmake`文件,将`URL https://bj.bcebos.com/paddlex/deploy/deps/yaml-cpp.zip`中网址替换为第3步中下载的路径,如改为`URL /Users/Download/yaml-cpp.zip`
+
+**这里yaml存放路径为了确保使用最好保证全英文路径**
+eg:
+
+<div>
+  <img src="../../images/yaml_cmakelist.png">
+  </div>
+  
+> 其它支持的加密操作以及TensorRT,可参考[Linux环境编译指南](./linux.md).
+
+### Step 6. 编译
+以上yaml库添加完成,同时也修改完jetson_build.sh后,即可执行编译, **[注意]**: 以下命令在`PaddleX/deploy/cpp`目录下进行执行
+
+```
+sh script/jetson_build.sh
+```
+
+> 编译时,如果存在cmake多线程问题——请前往`jetson_build.sh`末尾,将`make -j8`改为`make`或者小于8.
+
+
+### Step 7. 编译结果
+
+编译后会在`PaddleX/deploy/cpp/build/demo`目录下生成`model_infer`、`multi_gpu_model_infer`和`batch_infer`等几个可执行二进制文件示例,分别用于在单卡/多卡/多batch上加载模型进行预测,示例使用参考如下文档:
+
+- [单卡加载模型预测示例](../../demo/model_infer.md)
+- [多卡加载模型预测示例](../../demo/multi_gpu_model_infer.md)
+
+如果编译时开启TensorRT, 会多成一个`tensorrt_infer`二进制文件示例。示例使用参考如下文档:
+- [PaddleInference集成TensorRT加载模型预测示例](../../demo/tensorrt_infer.md)
+
+如果编译时开启加密, 会多成一个`decrypt_infer`二进制文件示例。示例使用参考如下文档:
+- [模型加密预测示例](../../demo/decrypt_infer.md)
+
+
+### Step 8. QT界面部署应用Demo
+
+通过修改`PaddleX/deploy/cpp/demo/model_infer.cpp`以及`PaddleX/deploy/cpp/demo/CMakeLists.txt`, 再执行`jetson_build.sh`生成`libmodel_infer.so`动态链接库,用于QT应用调用,执行模型初始化、模型推理预测、模型注销等操作。[现已经在Jetson Xavier上利用原生编译的opencv实现了模型单张预测与文件夹连续预测,由于预编译opencv不支持解析视频格式,因此暂未对视频进行测试——仅在windows上完成了单张图片-文件夹连续预测-视频流预测的全流程验证。]
+
+> 该版本对于MaskRCNN模型的推理需要使用GPU进行推理——如果CPU下进行推理可能由于内存使用问题报错。
+> 
+> 鉴于Qt跨平台属性,因此如果部署环境下opencv支持视频格式,则该Demo-Gui程序可启动完整的推理可视化功能。
+
+<div>
+  <img src="../../images/show_menu.png">
+  <img src="../../images/cpu_infer.png">
+  </div>
+
+具体Demo信息可前往如下文档链接:
+- [基于QT的Jetson部署Demo](../../jetson-deploy/README.md)
+
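Note: Step 8 drives the generated `libmodel_infer.so` from a Qt GUI. The sketch below is not the repository's `inferthread.cpp`; it only illustrates, under assumed buffer sizes, how a Qt worker thread might call the exported detector off the GUI thread so the interface stays responsive.

```cpp
// Minimal sketch of a Qt worker thread wrapping the exported detection call.
// The extern "C" signature matches model_infer.cpp above; everything else is assumed.
#include <QThread>
#include <QString>
#include <opencv2/opencv.hpp>
#include <utility>
#include <vector>

extern "C" void Det_ModelPredict(const unsigned char* img, int w, int h, int c,
                                 float* output, int* nBoxesNum, char* LabelList);

class DetWorker : public QThread {
    Q_OBJECT
public:
    explicit DetWorker(QString imagePath, QObject* parent = nullptr)
        : QThread(parent), path_(std::move(imagePath)) {}

signals:
    void resultReady(int numBoxes, QString labels);  // delivered to the GUI thread via queued connection

protected:
    void run() override {
        cv::Mat img = cv::imread(path_.toStdString());   // BGR uint8, as the API expects
        if (img.empty()) { emit resultReady(0, QString()); return; }
        std::vector<float> boxes(6 * 1000);              // assumed upper bound on boxes
        char labels[4096] = {0};
        int num = 0;
        Det_ModelPredict(img.data, img.cols, img.rows, img.channels(),
                         boxes.data(), &num, labels);
        emit resultReady(num, QString::fromUtf8(labels));
    }

private:
    QString path_;
};
```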

+ 1 - 1
deploy/cpp/docs/compile/paddle/linux.md

@@ -1,4 +1,4 @@
-# 模型加密预测示例
+# 模型加密预测
 
 本文档说明如何对模型进行加密解密部署,仅供用户参考进行使用,开发者可基于此demo示例进行二次开发,满足集成的需求。
 

+ 33 - 0
deploy/cpp/docs/demo/model_infer.md

@@ -0,0 +1,33 @@
+# PaddleX Deployment部署方式
+
+PaddleX Deployment适配业界常用的CPU、GPU(包括NVIDIA Jetson)、树莓派等硬件,支持[PaddleClas](https://github.com/PaddlePaddle/PaddleClas)、[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection)、[PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg)三个套件的训练的部署,支持用户采用OpenVINO或TensorRT进行推理加速。完备支持工业最常使用的Windows系统,且提供C#语言进行部署的方式!
+## 模型套件支持
+本目录下代码,目前支持以下飞桨官方套件基于PaddleInference的部署。用户可参考[文件夹结构](./file_format.md)了解模型导出前后文件夹状态。
+
+| 套件名称 | 版本号   | 支持模型 |
+| -------- | -------- | ------- |
+| PaddleDetection  | [release/2.1](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.1)、[release/0.5](https://github.com/PaddlePaddle/PaddleDetection/tree/release/0.5) |  FasterRCNN / MaskRCNN / PPYOLO / PPYOLOv2 / YOLOv3   |  
+| PaddleSeg        | [release/2.1](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.1)       |  全部分割模型  |
+| PaddleClas       | [release/2.1](https://github.com/PaddlePaddle/PaddleClas/tree/release/2.1)      |  全部分类模型  |
+| PaddleX          | [release/2.0.0](https://github.com/PaddlePaddle/PaddleX)                        |  全部静态图、动态图模型   |
+
+## 硬件支持
+- CPU(linux/windows)
+- GPU(linux/windows)
+
+## 各套件部署方式说明
+
+- [PaddleX部署指南](./models/paddlex.md)
+- [PaddleDetection部署指南](./models/paddledetection.md)
+- [PaddleSeg部署指南](./models/paddleseg.md)
+- [PaddleClas部署指南](./models/paddleclas.md)
+
+## 模型加密与预测加速
+
+- [模型加密预测示例](./demo/decrypt_infer.md)
+- [PaddleInference集成TensorRT加载模型预测示例](./demo/tensorrt_infer.md)
+
+## <h2 id="1">C++代码预测说明</h2>
+
+- [部署相关API说明](./apis/model.md)
+- [模型配置文件说明](./apis/yaml.md)

+ 9 - 0
deploy/cpp/docs/file_format.md

@@ -0,0 +1,9 @@
+# 各套件模型导出前后文件夹状态
+
+| 套件名称 | 导出前文件 | 导出后文件 |
+| :-- | :-- | :-- |
+| PaddleX静态图 | model.pdparams<br>model.pdmodel<br>model.yml | __ model__<br>__ params__<br>model.yml |
+| PaddleX动态图 | model.pdparams<br>model.pdopt<br>model.yml | model.pdmodel<br>model.pdiparams<br>model.pdiparams.info<br>model.yml<br>pipeline.yml |
+| PaddleDetection | XXX.pdparams |  infer_cfg.yml<br>model.pdiparams<br>model.pdiparams.info<br>model.pdmodel |
+| PaddleSeg | XXX.pdparams | deploy.yaml<br>model.pdiparams<br>model.pdiparams.info<br>model.pdmodel |
+| PaddleClas | XXX.pdparams | model.pdiparams<br>model.pdiparams.info<br>model.pdmodel |

BIN
deploy/cpp/docs/images/cmakelist_set.png


BIN
deploy/cpp/docs/images/cpu_infer.png


BIN
deploy/cpp/docs/images/deploy_build_sh.png


BIN
deploy/cpp/docs/images/infer_demo_cmakelist.png


BIN
deploy/cpp/docs/images/paddleinference_filelist.png


BIN
deploy/cpp/docs/images/show_menu.png


BIN
deploy/cpp/docs/images/tensorrt.png


+ 157 - 0
deploy/cpp/docs/jetson-deploy/CMakeLists.txt

@@ -0,0 +1,157 @@
+#paddle inference
+if (NOT DEFINED PADDLE_DIR OR ${PADDLE_DIR} STREQUAL "")
+    message(FATAL_ERROR "please set PADDLE_DIR with -DPADDLE_DIR=/path/paddle_influence_dir")
+endif()
+
+#paddle inference third party
+include_directories("${PADDLE_DIR}")
+include_directories("${PADDLE_DIR}/third_party/install/protobuf/include")
+include_directories("${PADDLE_DIR}/third_party/install/glog/include")
+include_directories("${PADDLE_DIR}/third_party/install/gflags/include")
+include_directories("${PADDLE_DIR}/third_party/install/xxhash/include")
+include_directories("${PADDLE_DIR}/third_party/install/cryptopp/include")
+
+link_directories("${PADDLE_DIR}/paddle/lib/")
+link_directories("${PADDLE_DIR}/third_party/install/protobuf/lib")
+link_directories("${PADDLE_DIR}/third_party/install/glog/lib")
+link_directories("${PADDLE_DIR}/third_party/install/gflags/lib")
+link_directories("${PADDLE_DIR}/third_party/install/xxhash/lib")
+link_directories("${PADDLE_DIR}/third_party/install/cryptopp/lib")
+
+if (WIN32)
+  set(DEPS ${DEPS} ${PADDLE_DIR}/paddle/lib/paddle_inference.lib)
+  set(DEPS ${DEPS} glog gflags_static libprotobuf xxhash cryptopp-static libyaml-cppmt shlwapi)
+else()
+  if (WITH_STATIC_LIB)
+    set(DEPS ${DEPS} ${PADDLE_DIR}/paddle/lib/libpaddle_inference${CMAKE_STATIC_LIBRARY_SUFFIX})
+  else()
+    set(DEPS ${DEPS} ${PADDLE_DIR}/paddle/lib/libpaddle_inference${CMAKE_SHARED_LIBRARY_SUFFIX})
+  endif()
+  set(DEPS ${DEPS} glog gflags protobuf xxhash cryptopp yaml-cpp)
+endif(WIN32)
+
+#MKL
+if(WITH_MKL)
+  ADD_DEFINITIONS(-DUSE_MKL)
+  set(MKLML_PATH "${PADDLE_DIR}/third_party/install/mklml")
+  include_directories("${MKLML_PATH}/include")
+  if (WIN32)
+    set(MATH_LIB ${MKLML_PATH}/lib/mklml.lib ${MKLML_PATH}/lib/libiomp5md.lib)
+  else ()
+    set(MATH_LIB ${MKLML_PATH}/lib/libmklml_intel${CMAKE_SHARED_LIBRARY_SUFFIX} ${MKLML_PATH}/lib/libiomp5${CMAKE_SHARED_LIBRARY_SUFFIX})
+    execute_process(COMMAND cp -r ${MKLML_PATH}/lib/libmklml_intel${CMAKE_SHARED_LIBRARY_SUFFIX} /usr/lib)
+  endif ()
+  set(MKLDNN_PATH "${PADDLE_DIR}/third_party/install/mkldnn")
+  if(EXISTS ${MKLDNN_PATH})
+    include_directories("${MKLDNN_PATH}/include")
+    if (WIN32)
+      set(MKLDNN_LIB ${MKLDNN_PATH}/lib/mkldnn.lib)
+    else ()
+      set(MKLDNN_LIB ${MKLDNN_PATH}/lib/libmkldnn.so.0)
+    endif ()
+  endif()
+else()
+  set(MATH_LIB ${PADDLE_DIR}/third_party/install/openblas/lib/libopenblas${CMAKE_STATIC_LIBRARY_SUFFIX})
+endif()
+
+set(DEPS ${DEPS} ${MATH_LIB} ${MKLDNN_LIB})
+
+#set GPU
+if (WITH_PADDLE_TENSORRT AND WITH_GPU)
+  include_directories("${TENSORRT_DIR}/include")
+  link_directories("${TENSORRT_DIR}/lib")
+
+  file(READ ${TENSORRT_DIR}/include/NvInfer.h TENSORRT_VERSION_FILE_CONTENTS)
+  string(REGEX MATCH "define NV_TENSORRT_MAJOR +([0-9]+)" TENSORRT_MAJOR_VERSION
+    "${TENSORRT_VERSION_FILE_CONTENTS}")
+  if("${TENSORRT_MAJOR_VERSION}" STREQUAL "")
+    file(READ ${TENSORRT_DIR}/include/NvInferVersion.h TENSORRT_VERSION_FILE_CONTENTS)
+    string(REGEX MATCH "define NV_TENSORRT_MAJOR +([0-9]+)" TENSORRT_MAJOR_VERSION
+      "${TENSORRT_VERSION_FILE_CONTENTS}")
+  endif()
+  if("${TENSORRT_MAJOR_VERSION}" STREQUAL "")
+    message(SEND_ERROR "Failed to detect TensorRT version.")
+  endif()
+  string(REGEX REPLACE "define NV_TENSORRT_MAJOR +([0-9]+)" "\\1"
+    TENSORRT_MAJOR_VERSION "${TENSORRT_MAJOR_VERSION}")
+  message(STATUS "Current TensorRT header is ${TENSORRT_INCLUDE_DIR}/NvInfer.h. "
+    "Current TensorRT version is v${TENSORRT_MAJOR_VERSION}. ")
+endif()
+
+if(WITH_GPU)
+  if (NOT DEFINED CUDA_LIB OR ${CUDA_LIB} STREQUAL "")
+    message(FATAL_ERROR "please set CUDA_LIB with -DCUDA_LIB=/path/cuda/lib64")
+  endif()
+
+  
+  if(NOT WIN32)
+    if (NOT DEFINED CUDNN_LIB)
+      message(FATAL_ERROR "please set CUDNN_LIB with -DCUDNN_LIB=/path/cudnn/")
+    endif()
+
+    set(DEPS ${DEPS} ${CUDA_LIB}/libcudart${CMAKE_SHARED_LIBRARY_SUFFIX})
+    set(DEPS ${DEPS} ${CUDNN_LIB}/libcudnn${CMAKE_SHARED_LIBRARY_SUFFIX})
+
+    if (WITH_PADDLE_TENSORRT)
+      set(DEPS ${DEPS} ${TENSORRT_DIR}/lib/libnvinfer${CMAKE_SHARED_LIBRARY_SUFFIX})
+      set(DEPS ${DEPS} ${TENSORRT_DIR}/lib/libnvinfer_plugin${CMAKE_SHARED_LIBRARY_SUFFIX})
+    endif()
+
+  else()
+    set(DEPS ${DEPS} ${CUDA_LIB}/cudart${CMAKE_STATIC_LIBRARY_SUFFIX} )
+    set(DEPS ${DEPS} ${CUDA_LIB}/cublas${CMAKE_STATIC_LIBRARY_SUFFIX} )
+    set(DEPS ${DEPS} ${CUDA_LIB}/cudnn${CMAKE_STATIC_LIBRARY_SUFFIX})
+
+    if (WITH_PADDLE_TENSORRT)
+      set(DEPS ${DEPS} ${TENSORRT_DIR}/lib/nvinfer${CMAKE_STATIC_LIBRARY_SUFFIX})
+      set(DEPS ${DEPS} ${TENSORRT_DIR}/lib/nvinfer_plugin${CMAKE_STATIC_LIBRARY_SUFFIX})
+      if(${TENSORRT_MAJOR_VERSION} GREATER_EQUAL 7)
+        set(DEPS ${DEPS} ${TENSORRT_DIR}/lib/myelin64_1${CMAKE_STATIC_LIBRARY_SUFFIX})
+      endif()
+    endif()
+  endif()
+endif()
+
+message("-----DEPS = ${DEPS}")
+
+# engine src
+set(ENGINE_SRC ${PROJECT_SOURCE_DIR}/model_deploy/engine/src/ppinference_engine.cpp)
+
+ADD_library(model_infer SHARED ${PROJECT_SOURCE_DIR}/demo/model_infer.cpp ${SRC} ${ENGINE_SRC} ${DETECTOR_SRC} ${ENCRYPTION_SRC})
+# add_executable(model_infer model_infer.cpp ${SRC} ${ENGINE_SRC} ${DETECTOR_SRC} ${ENCRYPTION_SRC})
+ADD_DEPENDENCIES(model_infer ext-yaml-cpp)
+target_link_libraries(model_infer ${DEPS})
+
+add_executable(batch_infer batch_infer.cpp ${SRC} ${ENGINE_SRC} ${DETECTOR_SRC} ${ENCRYPTION_SRC})
+ADD_DEPENDENCIES(batch_infer ext-yaml-cpp)
+target_link_libraries(batch_infer ${DEPS})
+
+add_executable(multi_gpu_model_infer multi_gpu_model_infer.cpp ${SRC} ${ENGINE_SRC} ${DETECTOR_SRC} ${ENCRYPTION_SRC})
+ADD_DEPENDENCIES(multi_gpu_model_infer ext-yaml-cpp)
+target_link_libraries(multi_gpu_model_infer ${DEPS})
+
+if (WITH_PADDLE_TENSORRT)
+  add_executable(tensorrt_infer tensorrt_infer.cpp ${SRC} ${ENGINE_SRC} ${DETECTOR_SRC} ${ENCRYPTION_SRC})
+  ADD_DEPENDENCIES(tensorrt_infer ext-yaml-cpp)
+  target_link_libraries(tensorrt_infer ${DEPS})
+endif()
+
+if(WIN32)
+  add_custom_command(TARGET model_infer POST_BUILD
+    COMMAND ${CMAKE_COMMAND} -E copy ${PADDLE_DIR}/third_party/install/mklml/lib/mklml.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+    COMMAND ${CMAKE_COMMAND} -E copy ${PADDLE_DIR}/third_party/install/mklml/lib/libiomp5md.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+    COMMAND ${CMAKE_COMMAND} -E copy ${PADDLE_DIR}/third_party/install/mkldnn/lib/mkldnn.dll  ${CMAKE_BINARY_DIR}/paddle_deploy
+    COMMAND ${CMAKE_COMMAND} -E copy ${PADDLE_DIR}/paddle/lib/paddle_inference.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+  )
+  if (WITH_PADDLE_TENSORRT)
+    add_custom_command(TARGET model_infer POST_BUILD
+      COMMAND ${CMAKE_COMMAND} -E copy ${TENSORRT_DIR}/lib/nvinfer.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+      COMMAND ${CMAKE_COMMAND} -E copy ${TENSORRT_DIR}/lib/nvinfer_plugin.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+    )
+    if(${TENSORRT_MAJOR_VERSION} GREATER_EQUAL 7)
+      add_custom_command(TARGET model_infer POST_BUILD
+        COMMAND ${CMAKE_COMMAND} -E copy ${TENSORRT_DIR}/lib/myelin64_1.dll ${CMAKE_BINARY_DIR}/paddle_deploy
+      )
+    endif()
+  endif()
+endif()

+ 44 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/Deploy_infer.pro

@@ -0,0 +1,44 @@
+#-------------------------------------------------
+#
+# Project created by QtCreator 2021-10-08T16:09:04
+#
+#-------------------------------------------------
+
+QT       += core gui
+
+greaterThan(QT_MAJOR_VERSION, 4): QT += widgets
+
+TARGET = Deploy_infer
+TEMPLATE = app
+
+# The following define makes your compiler emit warnings if you use
+# any feature of Qt which has been marked as deprecated (the exact warnings
+# depend on your compiler). Please consult the documentation of the
+# deprecated API in order to know how to port your code away from it.
+DEFINES += QT_DEPRECATED_WARNINGS
+
+# You can also make your code fail to compile if you use deprecated APIs.
+# In order to do so, uncomment the following line.
+# You can also select to disable deprecated APIs only up to a certain version of Qt.
+#DEFINES += QT_DISABLE_DEPRECATED_BEFORE=0x060000    # disables all the APIs deprecated before Qt 6.0.0
+
+
+SOURCES += \
+        main.cpp \
+        mainwindow.cpp \
+    inferthread.cpp
+
+HEADERS += \
+        mainwindow.h \
+    inferthread.h
+
+FORMS += \
+        mainwindow.ui
+
+INCLUDEPATH += /usr/include/opencv4 \
+               /usr/include/opencv4/opencv \
+               /usr/include/opencv4/opencv2
+
+LIBS += /usr/lib/libopencv_*.so
+
+
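Note: the `INCLUDEPATH`/`LIBS` lines above rely on the OpenCV preinstalled on the Jetson image. A quick, hypothetical check that those headers and libraries resolve, and that an image is in the tightly packed BGR/uint8 layout the exported functions copy byte-for-byte (`memcpy` of `h * w * c` bytes), could look like this; the image path is a placeholder.

```cpp
// Sanity check for the OpenCV configured in Deploy_infer.pro above.
#include <cstdio>
#include <opencv2/opencv.hpp>

int main() {
    std::printf("OpenCV version: %s\n", CV_VERSION);
    cv::Mat img = cv::imread("test.jpg");
    if (img.empty()) { std::printf("failed to read test.jpg\n"); return 1; }
    if (!img.isContinuous()) img = img.clone();  // the exported API assumes contiguous h*w*c bytes
    std::printf("w=%d h=%d channels=%d continuous=%d\n",
                img.cols, img.rows, img.channels(),
                static_cast<int>(img.isContinuous()));
    return 0;
}
```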

+ 336 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/Deploy_infer.pro.user

@@ -0,0 +1,336 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE QtCreatorProject>
+<!-- Written by QtCreator 4.5.2, 2021-10-09T20:31:16. -->
+<qtcreator>
+ <data>
+  <variable>EnvironmentId</variable>
+  <value type="QByteArray">{cd61ab8b-668c-491d-9e63-a017e466d6ff}</value>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.ActiveTarget</variable>
+  <value type="int">0</value>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.EditorSettings</variable>
+  <valuemap type="QVariantMap">
+   <value type="bool" key="EditorConfiguration.AutoIndent">true</value>
+   <value type="bool" key="EditorConfiguration.AutoSpacesForTabs">false</value>
+   <value type="bool" key="EditorConfiguration.CamelCaseNavigation">true</value>
+   <valuemap type="QVariantMap" key="EditorConfiguration.CodeStyle.0">
+    <value type="QString" key="language">Cpp</value>
+    <valuemap type="QVariantMap" key="value">
+     <value type="QByteArray" key="CurrentPreferences">CppGlobal</value>
+    </valuemap>
+   </valuemap>
+   <valuemap type="QVariantMap" key="EditorConfiguration.CodeStyle.1">
+    <value type="QString" key="language">QmlJS</value>
+    <valuemap type="QVariantMap" key="value">
+     <value type="QByteArray" key="CurrentPreferences">QmlJSGlobal</value>
+    </valuemap>
+   </valuemap>
+   <value type="int" key="EditorConfiguration.CodeStyle.Count">2</value>
+   <value type="QByteArray" key="EditorConfiguration.Codec">UTF-8</value>
+   <value type="bool" key="EditorConfiguration.ConstrainTooltips">false</value>
+   <value type="int" key="EditorConfiguration.IndentSize">4</value>
+   <value type="bool" key="EditorConfiguration.KeyboardTooltips">false</value>
+   <value type="int" key="EditorConfiguration.MarginColumn">80</value>
+   <value type="bool" key="EditorConfiguration.MouseHiding">true</value>
+   <value type="bool" key="EditorConfiguration.MouseNavigation">true</value>
+   <value type="int" key="EditorConfiguration.PaddingMode">1</value>
+   <value type="bool" key="EditorConfiguration.ScrollWheelZooming">true</value>
+   <value type="bool" key="EditorConfiguration.ShowMargin">false</value>
+   <value type="int" key="EditorConfiguration.SmartBackspaceBehavior">0</value>
+   <value type="bool" key="EditorConfiguration.SmartSelectionChanging">true</value>
+   <value type="bool" key="EditorConfiguration.SpacesForTabs">true</value>
+   <value type="int" key="EditorConfiguration.TabKeyBehavior">0</value>
+   <value type="int" key="EditorConfiguration.TabSize">8</value>
+   <value type="bool" key="EditorConfiguration.UseGlobal">true</value>
+   <value type="int" key="EditorConfiguration.Utf8BomBehavior">1</value>
+   <value type="bool" key="EditorConfiguration.addFinalNewLine">true</value>
+   <value type="bool" key="EditorConfiguration.cleanIndentation">true</value>
+   <value type="bool" key="EditorConfiguration.cleanWhitespace">true</value>
+   <value type="bool" key="EditorConfiguration.inEntireDocument">false</value>
+  </valuemap>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.PluginSettings</variable>
+  <valuemap type="QVariantMap"/>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.Target.0</variable>
+  <valuemap type="QVariantMap">
+   <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Desktop</value>
+   <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName">Desktop</value>
+   <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">{a77c2a71-4b3f-4a86-9f9a-11225dc72ef8}</value>
+   <value type="int" key="ProjectExplorer.Target.ActiveBuildConfiguration">0</value>
+   <value type="int" key="ProjectExplorer.Target.ActiveDeployConfiguration">0</value>
+   <value type="int" key="ProjectExplorer.Target.ActiveRunConfiguration">0</value>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.BuildConfiguration.0">
+    <value type="QString" key="ProjectExplorer.BuildConfiguration.BuildDirectory">/home/nvidia/Desktop/build-Deploy_infer-Desktop-Debug</value>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.0">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">qmake</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">QtProjectManager.QMakeBuildStep</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.LinkQmlDebuggingLibrary">true</value>
+      <value type="QString" key="QtProjectManager.QMakeBuildStep.QMakeArguments"></value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.QMakeForced">false</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.SeparateDebugInfo">false</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.UseQtQuickCompiler">false</value>
+     </valuemap>
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.1">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">false</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments"></value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">2</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Build</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Build</value>
+    </valuemap>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.1">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">true</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments">clean</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">1</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Clean</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Clean</value>
+    </valuemap>
+    <value type="int" key="ProjectExplorer.BuildConfiguration.BuildStepListCount">2</value>
+    <value type="bool" key="ProjectExplorer.BuildConfiguration.ClearSystemEnvironment">false</value>
+    <valuelist type="QVariantList" key="ProjectExplorer.BuildConfiguration.UserEnvironmentChanges"/>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Debug</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.Qt4BuildConfiguration</value>
+    <value type="int" key="Qt4ProjectManager.Qt4BuildConfiguration.BuildConfiguration">2</value>
+    <value type="bool" key="Qt4ProjectManager.Qt4BuildConfiguration.UseShadowBuild">true</value>
+   </valuemap>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.BuildConfiguration.1">
+    <value type="QString" key="ProjectExplorer.BuildConfiguration.BuildDirectory">/home/nvidia/Desktop/build-Deploy_infer-Desktop-Release</value>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.0">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">qmake</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">QtProjectManager.QMakeBuildStep</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.LinkQmlDebuggingLibrary">false</value>
+      <value type="QString" key="QtProjectManager.QMakeBuildStep.QMakeArguments"></value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.QMakeForced">false</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.SeparateDebugInfo">false</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.UseQtQuickCompiler">false</value>
+     </valuemap>
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.1">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">false</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments"></value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">2</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Build</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Build</value>
+    </valuemap>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.1">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">true</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments">clean</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">1</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Clean</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Clean</value>
+    </valuemap>
+    <value type="int" key="ProjectExplorer.BuildConfiguration.BuildStepListCount">2</value>
+    <value type="bool" key="ProjectExplorer.BuildConfiguration.ClearSystemEnvironment">false</value>
+    <valuelist type="QVariantList" key="ProjectExplorer.BuildConfiguration.UserEnvironmentChanges"/>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Release</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.Qt4BuildConfiguration</value>
+    <value type="int" key="Qt4ProjectManager.Qt4BuildConfiguration.BuildConfiguration">0</value>
+    <value type="bool" key="Qt4ProjectManager.Qt4BuildConfiguration.UseShadowBuild">true</value>
+   </valuemap>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.BuildConfiguration.2">
+    <value type="QString" key="ProjectExplorer.BuildConfiguration.BuildDirectory">/home/nvidia/Desktop/build-Deploy_infer-Desktop-Profile</value>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.0">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">qmake</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">QtProjectManager.QMakeBuildStep</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.LinkQmlDebuggingLibrary">true</value>
+      <value type="QString" key="QtProjectManager.QMakeBuildStep.QMakeArguments"></value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.QMakeForced">false</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.SeparateDebugInfo">true</value>
+      <value type="bool" key="QtProjectManager.QMakeBuildStep.UseQtQuickCompiler">false</value>
+     </valuemap>
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.1">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">false</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments"></value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">2</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Build</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Build</value>
+    </valuemap>
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.1">
+     <valuemap type="QVariantMap" key="ProjectExplorer.BuildStepList.Step.0">
+      <value type="bool" key="ProjectExplorer.BuildStep.Enabled">true</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Make</value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+      <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.MakeStep</value>
+      <valuelist type="QVariantList" key="Qt4ProjectManager.MakeStep.AutomaticallyAddedMakeArguments">
+       <value type="QString">-w</value>
+       <value type="QString">-r</value>
+      </valuelist>
+      <value type="bool" key="Qt4ProjectManager.MakeStep.Clean">true</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeArguments">clean</value>
+      <value type="QString" key="Qt4ProjectManager.MakeStep.MakeCommand"></value>
+     </valuemap>
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">1</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Clean</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Clean</value>
+    </valuemap>
+    <value type="int" key="ProjectExplorer.BuildConfiguration.BuildStepListCount">2</value>
+    <value type="bool" key="ProjectExplorer.BuildConfiguration.ClearSystemEnvironment">false</value>
+    <valuelist type="QVariantList" key="ProjectExplorer.BuildConfiguration.UserEnvironmentChanges"/>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Profile</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.Qt4BuildConfiguration</value>
+    <value type="int" key="Qt4ProjectManager.Qt4BuildConfiguration.BuildConfiguration">0</value>
+    <value type="bool" key="Qt4ProjectManager.Qt4BuildConfiguration.UseShadowBuild">true</value>
+   </valuemap>
+   <value type="int" key="ProjectExplorer.Target.BuildConfigurationCount">3</value>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.DeployConfiguration.0">
+    <valuemap type="QVariantMap" key="ProjectExplorer.BuildConfiguration.BuildStepList.0">
+     <value type="int" key="ProjectExplorer.BuildStepList.StepsCount">0</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Deploy</value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+     <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.BuildSteps.Deploy</value>
+    </valuemap>
+    <value type="int" key="ProjectExplorer.BuildConfiguration.BuildStepListCount">1</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Deploy locally</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">ProjectExplorer.DefaultDeployConfiguration</value>
+   </valuemap>
+   <value type="int" key="ProjectExplorer.Target.DeployConfigurationCount">1</value>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.PluginSettings"/>
+   <valuemap type="QVariantMap" key="ProjectExplorer.Target.RunConfiguration.0">
+    <value type="bool" key="Analyzer.QmlProfiler.AggregateTraces">false</value>
+    <value type="bool" key="Analyzer.QmlProfiler.FlushEnabled">false</value>
+    <value type="uint" key="Analyzer.QmlProfiler.FlushInterval">1000</value>
+    <value type="QString" key="Analyzer.QmlProfiler.LastTraceFile"></value>
+    <value type="bool" key="Analyzer.QmlProfiler.Settings.UseGlobalSettings">true</value>
+    <valuelist type="QVariantList" key="Analyzer.Valgrind.AddedSuppressionFiles"/>
+    <value type="bool" key="Analyzer.Valgrind.Callgrind.CollectBusEvents">false</value>
+    <value type="bool" key="Analyzer.Valgrind.Callgrind.CollectSystime">false</value>
+    <value type="bool" key="Analyzer.Valgrind.Callgrind.EnableBranchSim">false</value>
+    <value type="bool" key="Analyzer.Valgrind.Callgrind.EnableCacheSim">false</value>
+    <value type="bool" key="Analyzer.Valgrind.Callgrind.EnableEventToolTips">true</value>
+    <value type="double" key="Analyzer.Valgrind.Callgrind.MinimumCostRatio">0.01</value>
+    <value type="double" key="Analyzer.Valgrind.Callgrind.VisualisationMinimumCostRatio">10</value>
+    <value type="bool" key="Analyzer.Valgrind.FilterExternalIssues">true</value>
+    <value type="int" key="Analyzer.Valgrind.LeakCheckOnFinish">1</value>
+    <value type="int" key="Analyzer.Valgrind.NumCallers">25</value>
+    <valuelist type="QVariantList" key="Analyzer.Valgrind.RemovedSuppressionFiles"/>
+    <value type="int" key="Analyzer.Valgrind.SelfModifyingCodeDetection">1</value>
+    <value type="bool" key="Analyzer.Valgrind.Settings.UseGlobalSettings">true</value>
+    <value type="bool" key="Analyzer.Valgrind.ShowReachable">false</value>
+    <value type="bool" key="Analyzer.Valgrind.TrackOrigins">true</value>
+    <value type="QString" key="Analyzer.Valgrind.ValgrindExecutable">valgrind</value>
+    <valuelist type="QVariantList" key="Analyzer.Valgrind.VisibleErrorKinds">
+     <value type="int">0</value>
+     <value type="int">1</value>
+     <value type="int">2</value>
+     <value type="int">3</value>
+     <value type="int">4</value>
+     <value type="int">5</value>
+     <value type="int">6</value>
+     <value type="int">7</value>
+     <value type="int">8</value>
+     <value type="int">9</value>
+     <value type="int">10</value>
+     <value type="int">11</value>
+     <value type="int">12</value>
+     <value type="int">13</value>
+     <value type="int">14</value>
+    </valuelist>
+    <value type="int" key="PE.EnvironmentAspect.Base">2</value>
+    <valuelist type="QVariantList" key="PE.EnvironmentAspect.Changes"/>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DefaultDisplayName">Deploy_infer</value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.DisplayName"></value>
+    <value type="QString" key="ProjectExplorer.ProjectConfiguration.Id">Qt4ProjectManager.Qt4RunConfiguration:/home/nvidia/Desktop/Deploy_infer/Deploy_infer.pro</value>
+    <value type="bool" key="QmakeProjectManager.QmakeRunConfiguration.UseLibrarySearchPath">true</value>
+    <value type="QString" key="Qt4ProjectManager.Qt4RunConfiguration.CommandLineArguments"></value>
+    <value type="QString" key="Qt4ProjectManager.Qt4RunConfiguration.ProFile">Deploy_infer.pro</value>
+    <value type="bool" key="Qt4ProjectManager.Qt4RunConfiguration.UseDyldImageSuffix">false</value>
+    <value type="QString" key="Qt4ProjectManager.Qt4RunConfiguration.UserWorkingDirectory"></value>
+    <value type="QString" key="Qt4ProjectManager.Qt4RunConfiguration.UserWorkingDirectory.default">/home/nvidia/Desktop/build-Deploy_infer-Desktop-Debug</value>
+    <value type="uint" key="RunConfiguration.QmlDebugServerPort">3768</value>
+    <value type="bool" key="RunConfiguration.UseCppDebugger">false</value>
+    <value type="bool" key="RunConfiguration.UseCppDebuggerAuto">true</value>
+    <value type="bool" key="RunConfiguration.UseMultiProcess">false</value>
+    <value type="bool" key="RunConfiguration.UseQmlDebugger">false</value>
+    <value type="bool" key="RunConfiguration.UseQmlDebuggerAuto">true</value>
+   </valuemap>
+   <value type="int" key="ProjectExplorer.Target.RunConfigurationCount">1</value>
+  </valuemap>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.TargetCount</variable>
+  <value type="int">1</value>
+ </data>
+ <data>
+  <variable>ProjectExplorer.Project.Updater.FileVersion</variable>
+  <value type="int">18</value>
+ </data>
+ <data>
+  <variable>Version</variable>
+  <value type="int">18</value>
+ </data>
+</qtcreator>

+ 1872 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/inferthread.cpp

@@ -0,0 +1,1872 @@
+#include "inferthread.h"
+#include <QTimer>
+#include <ctime>
+
+void InferThread::setStopBtn(QPushButton *btn)
+{
+    btnStop = btn;
+}
+
+void InferThread::setInferBtn(QPushButton *btn)
+{
+    btnInfer = btn;
+}
+
+void InferThread::setDetThreshold(float threshold)
+{
+    det_Threshold = threshold;
+}
+
+void InferThread::setInferDelay(int delay)
+{
+    infer_Delay = delay;
+}
+
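+// Build a flat RGB palette (3 bytes per class) using the PASCAL-VOC-style bit-spreading
+// scheme: the low bits of each class id are distributed across the R, G and B channels at
+// progressively lower bit positions, so neighbouring ids get visually distinct colors.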
+uchar *InferThread::get_color_map_list(int num_classes)
+{
+    uchar *color_list = new uchar[num_classes * 3]();  // value-initialize to zero: the loop below builds each color with |=
+    num_classes += 1;
+    for (int i = 1; i < num_classes; i++)
+    {
+        int j = 0;
+        int lab = i;
+        while (lab != 0)
+        {
+            color_list[(i-1) * 3] |= (uchar)(((lab >> 0) & 1) << (7 - j));
+            color_list[(i-1) * 3 + 1] |= (uchar)(((lab >> 1) & 1) << (7 - j));
+            color_list[(i-1) * 3 + 2] |= (uchar)(((lab >> 2) & 1) << (7 - j));
+
+            j += 1;
+            lab >>= 3;
+        }
+    }
+    return color_list;
+}
+
+InferThread::InferThread(QObject *parent) : QThread(parent)
+{
+    doing_Infer = false;
+    break_Infer = false;
+    dataLoaded = false;  // false: Unloaded data
+    color_map = get_color_map_list();  // relies on a default class-count argument declared in the header (not shown in this diff)
+
+    model_Type = "det";
+    image_path = "";
+    images_path = QStringList();
+    video_path = "";
+
+    label1_image = nullptr;
+    label2_image = nullptr;
+
+    image1 = nullptr;
+    image2 = nullptr;
+}
+
+void InferThread::setModelType(QString &model_type)
+{
+    if (model_type=="det") // Check whether the type is met, otherwise set ""
+    {
+        model_Type = model_type;
+        return;
+    }
+    else if (model_type=="seg")
+    {
+        model_Type = model_type;
+        return;
+    }
+    else if (model_type=="clas")
+    {
+        model_Type = model_type;
+        return;
+    }
+    else if (model_type=="mask")
+    {
+        model_Type = model_type;
+        return;
+    }
+    else
+    {
+        // set empty
+        model_Type = "";
+    }
+}
+
+void InferThread::setInputImage(QString &image_path)
+{
+    this->image_path = image_path;
+    this->images_path = QStringList();
+    this->video_path = "";
+
+    dataLoaded = true;
+}
+
+void InferThread::setInputImages(QStringList &images_path)
+{
+    this->images_path = images_path;
+    this->image_path = "";
+    this->video_path = "";
+
+    dataLoaded = true;
+}
+
+void InferThread::setInputVideo(QString &video_path)
+{
+    this->video_path = video_path;
+    this->image_path = "";
+    this->images_path = QStringList();
+
+    dataLoaded = true;
+}
+
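+// Store the four prediction callbacks. These are plain function pointers; presumably they are
+// resolved at runtime from the exported C API of the compiled inference library (an assumption,
+// since the loading code lives outside this file).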
+void InferThread::setInferFuncs(Det_ModelPredict det_Inferfunc, Seg_ModelPredict seg_Inferfunc, Cls_ModelPredict cls_Inferfunc, Mask_ModelPredict mask_Inferfunc)
+{
+    det_ModelPredict = det_Inferfunc;
+    seg_ModelPredict = seg_Inferfunc;
+    cls_ModelPredict = cls_Inferfunc;
+    mask_ModelPredict = mask_Inferfunc;
+}
+
+void InferThread::runInferDet()
+{
+    if (doing_Infer == false)
+    {
+        if (is_InferImage())
+        {
+            Det_Image();
+        }
+        else if (is_InferImages())
+        {
+            Det_Images();
+        }
+        else if (is_InferVideo())
+        {
+            Det_Video();
+        }
+    }
+    else
+    {
+        // TODO
+    }
+}
+
+void InferThread::runInferSeg()
+{
+    if (doing_Infer == false)
+    {
+        if (is_InferImage())
+        {
+            Seg_Image();
+        }
+        else if (is_InferImages())
+        {
+            Seg_Images();
+        }
+        else if (is_InferVideo())
+        {
+            Seg_Video();
+        }
+    }
+    else
+    {
+        // TODO
+    }
+}
+
+void InferThread::runInferCls()
+{
+    if (doing_Infer == false)
+    {
+        if (is_InferImage())
+        {
+            Cls_Image();
+        }
+        else if (is_InferImages())
+        {
+            Cls_Images();
+        }
+        else if (is_InferVideo())
+        {
+            Cls_Video();
+        }
+    }
+    else
+    {
+        // TODO
+    }
+}
+
+void InferThread::runInferMask()
+{
+    if (doing_Infer == false)
+    {
+        if (is_InferImage())
+        {
+            Mask_Image();
+        }
+        else if (is_InferImages())
+        {
+            Mask_Images();
+        }
+        else if (is_InferVideo())
+        {
+            Mask_Video();
+        }
+    }
+    else
+    {
+        // TODO
+    }
+}
+
+// Thread entry point: dispatch to the inference routine matching the configured model type
+void InferThread::run()
+{
+    if (model_Type == "det")
+    {
+        runInferDet();
+    }
+    else if (model_Type == "seg")
+    {
+        runInferSeg();
+    }
+    else if (model_Type == "clas")
+    {
+        runInferCls();
+    }
+    else if (model_Type == "mask")
+    {
+        runInferMask();
+    }
+
+}
+
+bool InferThread::is_InferImage()
+{
+    if (image_path.isEmpty()) return false;
+    else return true;
+}
+
+bool InferThread::is_InferImages()
+{
+    if (images_path.isEmpty()) return false;
+    else return true;
+}
+
+bool InferThread::is_InferVideo()
+{
+    if (video_path.isEmpty()) return false;
+    else return true;
+}
+
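+// Compose the overlay caption in the form "<id>:<label>-<score>",
+// e.g. makeLabelInfo("person", 0, 0.87f) would yield "0:person-0.87".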
+QString InferThread::makeLabelInfo(QString label, int id, float score)
+{
+    QString describe_str = QString::number(id) + ":";
+    describe_str += label + "-";
+    describe_str += QString::number(score);
+
+    return describe_str;
+}
+
+void InferThread::Det_Image()
+{
+    // Read the picture
+    Mat image = imread(image_path.toLocal8Bit().toStdString());  //BGR
+
+    if (image.cols > 512 || image.rows > 512)
+    {
+        float ratio = min(image.cols, image.rows) / 512.;
+        int new_h = image.cols / ratio;
+        int new_w = image.rows / ratio;
+
+        cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4)); // Round dimensions down to a multiple of 4 so the scaled image displays correctly in the QPixmap
+    }
+
+    // Predict output result
+    float bboxs[600];
+    int bbox_num[1];
+    char labellist[1000];
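+    // Output layout (as consumed below): bboxs holds 6 floats per detection
+    // [class_id, score, left, top, width, height], bbox_num[0] is the detection count,
+    // and labellist is a space-separated list with one label per detection.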
+
+    // Mark inference as started
+    doing_Infer = true;
+    try {
+        clock_t start_infer_time = clock();
+        // Run inference and collect the results
+        qDebug() << "Doing Det-Infer." << "\n";
+        det_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, bboxs, bbox_num, labellist);
+        double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+        emit SetCostTime(cost_time);
+    } catch (QException &e) {
+        // Mark inference as finished
+        doing_Infer = false;
+        qDebug() << "Finished Det-Infer, but an exception was raised." << "\n";
+
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+//    btnStop->setEnabled(false);  // When inference finishes, disable the Stop button to prevent late clicks
+//    btnInfer->setEnabled(true);  // When inference finishes, re-enable the Infer button so inference can be run again
+        return;
+    }
+
+    // Post-processing
+    cvtColor(image, image, COLOR_BGR2RGB);
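+    // Keep a heap-allocated copy of the frame: the QImage constructed below wraps the Mat's
+    // pixel buffer without copying it, so image1/image2 must stay alive for as long as
+    // label1_image/label2_image are displayed.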
+    if (image1 == nullptr)
+    {
+        image1 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image1;
+        image1 = new Mat(image.clone());
+    }
+    if (label1_image == nullptr)
+    {
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label1_image;
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+
+    QString labels(labellist);
+    QStringList label_list = labels.split(' ');  // Get Label
+    for (int i = 0; i < bbox_num[0]; i++)
+    {
+        int categry_id = (int)bboxs[i*6];
+        float score = bboxs[i*6 + 1];
+        int left_topx = (int)bboxs[i*6 + 2];
+        int left_topy = (int)bboxs[i*6 + 3];
+        int right_downx = left_topx + (int)bboxs[i*6 + 4];  // Fields 4 and 5 are width and height (the C# demo using the same DLL returns the bottom-right vertex instead)
+        int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+        if (score >= det_Threshold)
+        {
+            int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                              (int)(color_map[(categry_id % 256) * 3 + 1]),
+                              (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+            QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+            int baseline[1];
+            auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                              1.0, 2, baseline);
+            int text_left_downx = left_topx; // could be nudged right by a small offset, e.g. text_size.width / 10
+            int text_left_downy = left_topy + text_size.height;
+
+            rectangle(image, Point(left_topx, left_topy),
+                      Point(right_downx, right_downy),
+                      Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                    FONT_HERSHEY_SIMPLEX, 1.0,
+                    Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+        }
+    }
+
+    if (image2 == nullptr)
+    {
+        image2 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image2;
+        image2 = new Mat(image.clone());
+    }
+    if (label2_image == nullptr)
+    {
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label2_image;
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+
+    emit InferFinished(label1_image, label2_image);
+
+    // Mark inference as finished
+    doing_Infer = false;
+    qDebug() << "Finished Det-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+//    btnStop->setEnabled(false);
+//    btnInfer->setEnabled(true);
+}
+
+void InferThread::Det_Images()
+{
+    doing_Infer = true;
+
+    for (int j = 0; j < images_path.count(); j++)
+    {
+        if (break_Infer) // Exit continuous detection
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Det-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+            return;
+        }
+
+        QString img_file = images_path[j]; // Get image path
+        Mat image = imread(img_file.toLocal8Bit().toStdString());  // toLocal8Bit() so non-ASCII (e.g. Chinese) paths are read correctly
+
+        if (image.cols > 512 || image.rows > 512)
+        {
+            float ratio = min(image.cols, image.rows) / 512.;
+            int new_h = image.cols / ratio;
+            int new_w = image.rows / ratio;
+
+            cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        float bboxs[600];
+        int bbox_num[1];
+        char labellist[1000];
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Det-Infer." << "\n";
+            det_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, bboxs, bbox_num, labellist);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Det-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+            return;
+        }
+
+        cvtColor(image, image, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(image.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        QString labels(labellist);
+        QStringList label_list = labels.split(' ');
+        for (int i = 0; i < bbox_num[0]; i++)
+        {
+            int categry_id = (int)bboxs[i*6];
+            float score = bboxs[i*6 + 1];
+            int left_topx = (int)bboxs[i*6 + 2];
+            int left_topy = (int)bboxs[i*6 + 3];
+            int right_downx = left_topx + (int)bboxs[i*6 + 4];
+            int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+            if (score >= det_Threshold)
+            {
+                int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 1]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+                QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+                int baseline[1];
+                auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                                  1.0, 2, baseline);
+                int text_left_downx = left_topx;
+                int text_left_downy = left_topy + text_size.height;
+
+                rectangle(image, Point(left_topx, left_topy),
+                          Point(right_downx, right_downy),
+                          Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+                putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                        FONT_HERSHEY_SIMPLEX, 1.0,
+                        Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            }
+        }
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(image.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        this->msleep(infer_Delay); // Thread sleep wait
+    }
+
+    doing_Infer = false;
+    qDebug() << "Finished Det-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+}
+
+void InferThread::Det_Video()
+{
+    doing_Infer = true;
+
+    VideoCapture cap = VideoCapture(video_path.toLocal8Bit().toStdString());
+    if(!cap.isOpened()) return; // Return if the video does not open properly
+
+    Mat frame;
+    cap >> frame;
+    while(!frame.empty()) // Exit the loop if a frame is empty
+    {
+        if (frame.cols > 512 || frame.rows > 512)
+        {
+            float ratio = min(frame.cols, frame.rows) / 512.;
+            int new_h = frame.cols / ratio;
+            int new_w = frame.rows / ratio;
+
+            cv::resize(frame, frame, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Det-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+            return;
+        }
+
+        float bboxs[600];
+        int bbox_num[1];
+        char labellist[1000];
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Det-Infer." << "\n";
+            det_ModelPredict((const uchar*)frame.data, frame.cols, frame.rows, 3, bboxs, bbox_num, labellist);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Det-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        cvtColor(frame, frame, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(frame.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        QString labels(labellist);
+        QStringList label_list = labels.split(' ');
+        for (int i = 0; i < bbox_num[0]; i++)
+        {
+            int categry_id = (int)bboxs[i*6];
+            float score = bboxs[i*6 + 1];
+            int left_topx = (int)bboxs[i*6 + 2];
+            int left_topy = (int)bboxs[i*6 + 3];
+            int right_downx = left_topx + (int)bboxs[i*6 + 4];
+            int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+            if (score >= det_Threshold)
+            {
+                int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 1]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+                QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+                int baseline[1];
+                auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                                  1.0, 2, baseline);
+                int text_left_downx = left_topx;
+                int text_left_downy = left_topy + text_size.height;
+
+                rectangle(frame, Point(left_topx, left_topy),
+                          Point(right_downx, right_downy),
+                          Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+                putText(frame, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                        FONT_HERSHEY_SIMPLEX, 1.0,
+                        Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            }
+        }
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(frame.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        cap >> frame;
+    }
+
+    doing_Infer = false;
+    qDebug() << "Finished Det-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+}
+
+void InferThread::Seg_Image()
+{
+    Mat image = imread(image_path.toLocal8Bit().toStdString());  //BGR
+
+    if (image.cols > 512 || image.rows > 512)
+    {
+        float ratio = min(image.cols, image.rows) / 512.;
+        int new_h = image.cols / ratio;
+        int new_w = image.rows / ratio;
+
+        cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+    }
+
+    // Predict output result
+    unsigned char out_image[image.cols * image.rows];
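+    // out_image receives one class id per pixel (row-major, height x width). Note that this is
+    // a variable-length array, a GCC/Clang extension rather than standard C++.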
+
+    doing_Infer = true;
+    try {
+        clock_t start_infer_time = clock();
+
+        qDebug() << "Doing Seg-Infer." << "\n";
+        seg_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, out_image);
+        double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+        emit SetCostTime(cost_time);
+    } catch (QException &e) {
+
+        doing_Infer = false;
+        qDebug() << "Finished Seg-Infer, but it is raise a exception." << "\n";
+
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+        return;
+    }
+
+    // Generate the mask three-channel image
+    Mat out3c_image = Mat(image.clone());
+    for (int i = 0; i < out3c_image.rows; i++)   // height
+    {
+        for (int j = 0; j < out3c_image.cols; j++)  // width
+        {
+            int indexSrc = i*out3c_image.cols + j;
+
+            unsigned char color_id = (int)out_image[indexSrc] % 256; // Pixel category ID
+
+            if (color_id == 0)
+                out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+            else
+                out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+        }
+    }
+
+    cvtColor(image, image, COLOR_BGR2RGB);
+    if (image1 == nullptr)
+    {
+        image1 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image1;
+        image1 = new Mat(image.clone());
+    }
+    if (label1_image == nullptr)
+    {
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label1_image;
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+
+    // merge images
+    addWeighted(image, 0.5, out3c_image, 0.5, 0, image);
+
+    if (image2 == nullptr)
+    {
+        image2 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image2;
+        image2 = new Mat(image.clone());
+    }
+    if (label2_image == nullptr)
+    {
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label2_image;
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+
+    emit InferFinished(label1_image, label2_image);
+
+    doing_Infer = false;
+    qDebug() << "Finished Seg-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Seg_Images()
+{
+
+    doing_Infer = true;
+
+    for (int j = 0; j < images_path.count(); j++)
+    {
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Seg-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        QString img_file = images_path[j];
+        Mat image = imread(img_file.toLocal8Bit().toStdString());
+
+        if (image.cols > 512 || image.rows > 512)
+        {
+            float ratio = min(image.cols, image.rows) / 512.;
+            int new_h = image.cols / ratio;
+            int new_w = image.rows / ratio;
+
+            cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        unsigned char out_image[image.cols * image.rows];
+        memset(out_image, 0, sizeof (out_image));
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing --Seg Infer." << "\n";
+            seg_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, out_image);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Seg-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        Mat out3c_image = Mat(image.clone());
+        for (int i = 0; i < out3c_image.rows; i++)   // height
+        {
+            for (int j = 0; j < out3c_image.cols; j++)  // width
+            {
+                int indexSrc = i*out3c_image.cols + j;
+
+                unsigned char color_id = (int)out_image[indexSrc] % 256;
+
+                if (color_id == 0)
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+                else
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+            }
+        }
+
+
+        cvtColor(image, image, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(image.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+
+        addWeighted(image, 0.5, out3c_image, 0.5, 0, image);
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(image.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        this->msleep(infer_Delay);
+    }
+
+
+    doing_Infer = false;
+    qDebug() << "Finished Seg-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Seg_Video()
+{
+
+    doing_Infer = true;
+
+    VideoCapture cap = VideoCapture(video_path.toLocal8Bit().toStdString());
+    if(!cap.isOpened()) return;
+
+    Mat frame;
+    cap >> frame;
+    while(!frame.empty())
+    {
+        if (frame.cols > 512 || frame.rows > 512)
+        {
+            float ratio = min(frame.cols, frame.rows) / 512.;
+            int new_h = frame.cols / ratio;
+            int new_w = frame.rows / ratio;
+
+            cv::resize(frame, frame, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Seg-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        unsigned char out_image[frame.cols * frame.rows];
+        memset(out_image, 0, sizeof (out_image));
+
+        try {
+            clock_t start_infer_time = clock();
+            
+            qDebug() << "Doing Seg-Infer." << "\n";
+            seg_ModelPredict((const uchar*)frame.data, frame.cols, frame.rows, 3, out_image);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Seg-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+            
+            return;
+        }
+
+
+        Mat out3c_image = Mat(frame.clone());
+        for (int i = 0; i < out3c_image.rows; i++)   // height
+        {
+            for (int j = 0; j < out3c_image.cols; j++)  // width
+            {
+                int indexSrc = i*out3c_image.cols + j;
+
+                unsigned char color_id = (int)out_image[indexSrc] % 256;
+
+                if (color_id == 0)
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+                else
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+            }
+        }
+
+
+        cvtColor(frame, frame, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(frame.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+
+        addWeighted(frame, 0.5, out3c_image, 0.5, 0, frame);
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(frame.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        cap >> frame;
+    }
+
+
+    doing_Infer = false;
+    qDebug() << "Finished Seg-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Cls_Image()
+{
+
+    Mat image = imread(image_path.toLocal8Bit().toStdString());  //BGR
+
+    if (image.cols > 512 || image.rows > 512)
+    {
+        float ratio = min(image.cols, image.rows) / 512.;
+        int new_h = image.cols / ratio;
+        int new_w = image.rows / ratio;
+
+        cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+    }
+
+    // Predict output result
+    float pre_score[1];
+    int pre_category_id[1];
+    char pre_category[200];
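+    // Top-1 classification result: pre_score[0] is the confidence, pre_category the label text,
+    // and pre_category_id[0] the class index used to pick the overlay color below.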
+
+    doing_Infer = true;
+    try {
+        clock_t start_infer_time = clock();
+
+        qDebug() << "Doing Clas-Infer." << "\n";
+        cls_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, pre_score, pre_category, pre_category_id);
+        double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+        emit SetCostTime(cost_time);
+    } catch (QException &e) {
+
+        doing_Infer = false;
+        qDebug() << "Finished Clas-Infer, but it is raise a exception." << "\n";
+
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+        return;
+    }
+
+
+    cvtColor(image, image, COLOR_BGR2RGB);
+
+    float ratio = min(image.cols, image.rows) / 512.;
+    int new_h = image.cols / ratio;
+    int new_w = image.rows / ratio;
+    cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+
+    if (image1 == nullptr)
+    {
+        image1 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image1;
+        image1 = new Mat(image.clone());
+    }
+    if (label1_image == nullptr)
+    {
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label1_image;
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+
+    int color_[3] = { (int)(color_map[(pre_category_id[0] % 256) * 3]),
+                      (int)(color_map[(pre_category_id[0] % 256) * 3 + 1]),
+                      (int)(color_map[(pre_category_id[0] % 256) * 3 + 2]) };
+
+    QString disscribe_str = makeLabelInfo(QString(pre_category), pre_category_id[0], pre_score[0]);
+    int baseline[1];
+    auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                      1.0, 2, baseline);
+    int text_left_downx = 0;
+    int text_left_downy = 0 + text_size.height;
+
+    putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+            FONT_HERSHEY_SIMPLEX, 1.0,
+            Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+
+
+    if (image2 == nullptr)
+    {
+        image2 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image2;
+        image2 = new Mat(image.clone());
+    }
+    if (label2_image == nullptr)
+    {
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label2_image;
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+
+    emit InferFinished(label1_image, label2_image);
+
+    doing_Infer = false;
+    qDebug() << "Finished Clas-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Cls_Images()
+{
+
+    doing_Infer = true;
+
+    for (int j = 0; j < images_path.count(); j++)
+    {
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Clas-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        QString img_file = images_path[j];
+        Mat image = imread(img_file.toLocal8Bit().toStdString());
+
+        if (image.cols > 512 || image.rows > 512)
+        {
+            float ratio = min(image.cols, image.rows) / 512.;
+            int new_h = image.cols / ratio;
+            int new_w = image.rows / ratio;
+
+            cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+
+        float pre_score[1];
+        int pre_category_id[1];
+        char pre_category[200];
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Clas-Infer." << "\n";
+            cls_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, pre_score, pre_category, pre_category_id);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Clas-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        cvtColor(image, image, COLOR_BGR2RGB);
+
+        float ratio = min(image.cols, image.rows) / 512.;
+        int new_h = image.cols / ratio;
+        int new_w = image.rows / ratio;
+        cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(image.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        int color_[3] = { (int)(color_map[(pre_category_id[0] % 256) * 3]),
+                          (int)(color_map[(pre_category_id[0] % 256) * 3 + 1]),
+                          (int)(color_map[(pre_category_id[0] % 256) * 3 + 2]) };
+
+        QString disscribe_str = makeLabelInfo(QString(pre_category), pre_category_id[0], pre_score[0]);
+        int baseline[1];
+        auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                          1.0, 2, baseline);
+        int text_left_downx = 0;
+        int text_left_downy = 0 + text_size.height;
+
+        putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                FONT_HERSHEY_SIMPLEX, 1.0,
+                Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(image.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        this->msleep(infer_Delay);
+    }
+
+
+    doing_Infer = false;
+    qDebug() << "Finished Clas-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Cls_Video()
+{
+
+    doing_Infer = true;
+
+    VideoCapture cap = VideoCapture(video_path.toLocal8Bit().toStdString());
+    if(!cap.isOpened())
+    {
+        doing_Infer = false;  // reset state so the UI is not left stuck if the video cannot be opened
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+        return;
+    }
+
+    Mat frame;
+    cap >> frame;
+    while(!frame.empty())
+    {
+        if (frame.cols > 512 || frame.rows > 512)
+        {
+            float ratio = min(frame.cols, frame.rows) / 512.;
+            int new_h = frame.cols / ratio;
+            int new_w = frame.rows / ratio;
+
+            cv::resize(frame, frame, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Clas-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        float pre_score[1];
+        int pre_category_id[1];
+        char pre_category[200];
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Clas-Infer." << "\n";
+            cls_ModelPredict((const uchar*)frame.data, frame.cols, frame.rows, 3, pre_score, pre_category, pre_category_id);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Clas-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        cvtColor(frame, frame, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(frame.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        int color_[3] = { (int)(color_map[(pre_category_id[0] % 256) * 3]),
+                          (int)(color_map[(pre_category_id[0] % 256) * 3 + 1]),
+                          (int)(color_map[(pre_category_id[0] % 256) * 3 + 2]) };
+
+        QString disscribe_str = makeLabelInfo(QString(pre_category), pre_category_id[0], pre_score[0]);
+        int baseline[1];
+        auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                          1.0, 2, baseline);
+        int text_left_downx = 0;
+        int text_left_downy = 0 + text_size.height;
+
+        putText(frame, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                FONT_HERSHEY_SIMPLEX, 1.0,
+                Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(frame.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        cap >> frame;
+    }
+
+
+    doing_Infer = false;
+    qDebug() << "Finished Clas-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Mask_Image()
+{
+
+    Mat image = imread(image_path.toLocal8Bit().toStdString());  //BGR
+
+    if (image.cols > 512 || image.rows > 512)
+    {
+        float ratio = min(image.cols, image.rows) / 512.;
+        int new_h = image.cols / ratio;
+        int new_w = image.rows / ratio;
+
+        cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+    }
+
+    // Predict output result
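+    // Note: the variable-length stack array used for out_image below is a compiler extension in C++;
+    // std::vector<unsigned char> would be the portable alternative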
+    float bboxs[600];
+    int bbox_num[1];
+    char labellist[1000];
+    unsigned char out_image[image.cols * image.rows];
+    memset(out_image, 0, sizeof (out_image));  // zero-initialize the mask buffer, as Mask_Images()/Mask_Video() do
+
+    doing_Infer = true;
+    try {
+        clock_t start_infer_time = clock();
+
+        qDebug() << "Doing Mask-Infer." << "\n";
+        mask_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, bboxs, out_image, bbox_num, labellist);
+        double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+        emit SetCostTime(cost_time);
+    } catch (QException &e) {
+
+        doing_Infer = false;
+        qDebug() << "Finished Mask-Infer, but it is raise a exception." << "\n";
+
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+        return;
+    }
+
+    Mat out3c_image = Mat(image.clone());
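+    // out_image is a per-pixel class-id map returned by mask_ModelPredict; each id is mapped through
+    // color_map to build the colored overlay (id 0 is treated as background and kept black)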
+    for (int i = 0; i < out3c_image.rows; i++)   // height
+    {
+        for (int j = 0; j < out3c_image.cols; j++)  // width
+        {
+            int indexSrc = i*out3c_image.cols + j;
+
+            unsigned char color_id = (int)out_image[indexSrc] % 256;
+
+            if (color_id == 0)
+                out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+            else
+                out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+        }
+    }
+
+
+    cvtColor(image, image, COLOR_BGR2RGB);
+    if (image1 == nullptr)
+    {
+        image1 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image1;
+        image1 = new Mat(image.clone());
+    }
+    if (label1_image == nullptr)
+    {
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label1_image;
+        label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                  image1->step, QImage::Format_RGB888);
+    }
+
+
+    addWeighted(image, 0.5, out3c_image, 0.5, 0, image);
+
+    QString labels(labellist);
+    QStringList label_list = labels.split(' ');  // Get Label
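+    // Each detection occupies 6 consecutive floats in bboxs: [category_id, score, left_top_x, left_top_y, width, height]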
+    for (int i = 0; i < bbox_num[0]; i++)
+    {
+        int categry_id = (int)bboxs[i*6];
+        float score = bboxs[i*6 + 1];
+        int left_topx = (int)bboxs[i*6 + 2];
+        int left_topy = (int)bboxs[i*6 + 3];
+        int right_downx = left_topx + (int)bboxs[i*6 + 4];
+        int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+        if (score >= det_Threshold)
+        {
+            int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                              (int)(color_map[(categry_id % 256) * 3 + 1]),
+                              (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+            QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+            int baseline[1];
+            auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                              1.0, 2, baseline);
+            int text_left_downx = left_topx;
+            int text_left_downy = left_topy + text_size.height;
+
+            rectangle(image, Point(left_topx, left_topy),
+                      Point(right_downx, right_downy),
+                      Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                    FONT_HERSHEY_SIMPLEX, 1.0,
+                    Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+        }
+    }
+
+    if (image2 == nullptr)
+    {
+        image2 = new Mat(image.clone());
+    }
+    else
+    {
+        delete image2;
+        image2 = new Mat(image.clone());
+    }
+    if (label2_image == nullptr)
+    {
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+    else
+    {
+        delete label2_image;
+        label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                  image2->step, QImage::Format_RGB888);
+    }
+
+    emit InferFinished(label1_image, label2_image);
+
+
+    doing_Infer = false;
+    qDebug() << "Finished Mask-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Mask_Images()
+{
+
+    doing_Infer = true;
+
+    for (int j = 0; j < images_path.count(); j++)
+    {
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Mask-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        QString img_file = images_path[j];
+        Mat image = imread(img_file.toLocal8Bit().toStdString());
+
+        if (image.cols > 512 || image.rows > 512)
+        {
+            float ratio = min(image.cols, image.rows) / 512.;
+            int new_h = image.cols / ratio;
+            int new_w = image.rows / ratio;
+
+            cv::resize(image, image, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        float bboxs[600];
+        int bbox_num[1];
+        char labellist[1000];
+        unsigned char out_image[image.cols * image.rows];
+        memset(out_image, 0, sizeof (out_image));
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Mask-Infer." << "\n";
+            mask_ModelPredict((const uchar*)image.data, image.cols, image.rows, 3, bboxs, out_image, bbox_num, labellist);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Mask-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        Mat out3c_image = Mat(image.clone());
+        for (int i = 0; i < out3c_image.rows; i++)   // height
+        {
+            for (int j = 0; j < out3c_image.cols; j++)  // width
+            {
+                int indexSrc = i*out3c_image.cols + j;
+
+                unsigned char color_id = (int)out_image[indexSrc] % 256;
+
+                if (color_id == 0)
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+                else
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+            }
+        }
+
+
+        cvtColor(image, image, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(image.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        addWeighted(image, 0.5, out3c_image, 0.5, 0, image);
+
+        QString labels(labellist);
+        QStringList label_list = labels.split(' ');  // Get labels
+        for (int i = 0; i < bbox_num[0]; i++)
+        {
+            int categry_id = (int)bboxs[i*6];
+            float score = bboxs[i*6 + 1];
+            int left_topx = (int)bboxs[i*6 + 2];
+            int left_topy = (int)bboxs[i*6 + 3];
+            int right_downx = left_topx + (int)bboxs[i*6 + 4];
+            int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+            if (score >= det_Threshold)
+            {
+                int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 1]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+                QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+                int baseline[1];
+                auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                                  1.0, 2, baseline);
+                int text_left_downx = left_topx;
+                int text_left_downy = left_topy + text_size.height;
+
+                rectangle(image, Point(left_topx, left_topy),
+                          Point(right_downx, right_downy),
+                          Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+                putText(image, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                        FONT_HERSHEY_SIMPLEX, 1.0,
+                        Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            }
+        }
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(image.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(image.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        this->msleep(infer_Delay);
+    }
+
+    doing_Infer = false;
+    qDebug() << "Finished Mask-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}
+
+void InferThread::Mask_Video()
+{
+    doing_Infer = true;
+
+    VideoCapture cap = VideoCapture(video_path.toLocal8Bit().toStdString());
+    if(!cap.isOpened())
+    {
+        doing_Infer = false;  // reset state so the UI is not left stuck if the video cannot be opened
+        emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+        return;
+    }
+
+    Mat frame;
+    cap >> frame;
+    while(!frame.empty())
+    {
+        if (frame.cols > 512 || frame.rows > 512)
+        {
+            float ratio = min(frame.cols, frame.rows) / 512.;
+            int new_h = frame.cols / ratio;
+            int new_w = frame.rows / ratio;
+
+            cv::resize(frame, frame, cv::Size(new_h/4*4,new_w/4*4));
+        }
+
+        if (break_Infer)
+        {
+            doing_Infer = false;
+            break_Infer = false;
+
+            qDebug() << "Mask-Infer has Break." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+
+        float bboxs[600];
+        int bbox_num[1];
+        char labellist[1000];
+        unsigned char out_image[frame.cols * frame.rows];
+        memset(out_image, 0, sizeof (out_image));
+
+        try {
+            clock_t start_infer_time = clock();
+
+            qDebug() << "Doing Mask-Infer." << "\n";
+            mask_ModelPredict((const uchar*)frame.data, frame.cols, frame.rows, 3, bboxs, out_image, bbox_num, labellist);
+            double cost_time = 1000 * (clock() - start_infer_time) / (double)CLOCKS_PER_SEC;
+            emit SetCostTime(cost_time);
+        } catch (QException &e) {
+
+            doing_Infer = false;
+            qDebug() << "Finished Mask-Infer, but it is raise a exception." << "\n";
+
+            emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+            return;
+        }
+
+        Mat out3c_image = Mat(frame.clone());
+        for (int i = 0; i < out3c_image.rows; i++)   // height
+        {
+            for (int j = 0; j < out3c_image.cols; j++)  // width
+            {
+                int indexSrc = i*out3c_image.cols + j;
+
+                unsigned char color_id = (int)out_image[indexSrc] % 256;
+
+                if (color_id == 0)
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(0, 0, 0);
+                else
+                    out3c_image.at<Vec3b>(i, j) = Vec3b(color_map[color_id * 3], color_map[color_id * 3 + 1], color_map[color_id * 3 + 2]);
+            }
+        }
+
+        cvtColor(frame, frame, COLOR_BGR2RGB);
+        if (image1 == nullptr)
+        {
+            image1 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image1;
+            image1 = new Mat(frame.clone());
+        }
+        if (label1_image == nullptr)
+        {
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label1_image;
+            label1_image = new QImage((const uchar*)image1->data, image1->cols, image1->rows,
+                                      image1->step, QImage::Format_RGB888);
+        }
+
+        addWeighted(frame, 0.5, out3c_image, 0.5, 0, frame);
+
+        QString labels(labellist);
+        QStringList label_list = labels.split(' ');
+        for (int i = 0; i < bbox_num[0]; i++)
+        {
+            int categry_id = (int)bboxs[i*6];
+            float score = bboxs[i*6 + 1];
+            int left_topx = (int)bboxs[i*6 + 2];
+            int left_topy = (int)bboxs[i*6 + 3];
+            int right_downx = left_topx + (int)bboxs[i*6 + 4];
+            int right_downy = left_topy + (int)bboxs[i*6 + 5];
+
+            if (score >= det_Threshold)
+            {
+                int color_[3] = { (int)(color_map[(categry_id % 256) * 3]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 1]),
+                                  (int)(color_map[(categry_id % 256) * 3 + 2]) };
+
+                QString disscribe_str = makeLabelInfo(label_list[i], categry_id, score);
+                int baseline[1];
+                auto text_size = getTextSize(disscribe_str.toStdString(), FONT_HERSHEY_SIMPLEX,
+                                  1.0, 2, baseline);
+                int text_left_downx = left_topx;
+                int text_left_downy = left_topy + text_size.height;
+
+                rectangle(frame, Point(left_topx, left_topy),
+                          Point(right_downx, right_downy),
+                          Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+                putText(frame, disscribe_str.toStdString(), Point(text_left_downx, text_left_downy),
+                        FONT_HERSHEY_SIMPLEX, 1.0,
+                        Scalar(color_[0], color_[1], color_[2]), 2, LINE_8);
+            }
+        }
+
+        if (image2 == nullptr)
+        {
+            image2 = new Mat(frame.clone());
+        }
+        else
+        {
+            delete image2;
+            image2 = new Mat(frame.clone());
+        }
+        if (label2_image == nullptr)
+        {
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+        else
+        {
+            delete label2_image;
+            label2_image = new QImage((const uchar*)image2->data, image2->cols, image2->rows,
+                                      image2->step, QImage::Format_RGB888);
+        }
+
+        emit InferFinished(label1_image, label2_image);
+
+        cap >> frame;
+    }
+
+    doing_Infer = false;
+    qDebug() << "Finished Mask-Infer." << "\n";
+
+    emit SetState_Btn_StopAndInfer(false, true);  // first is stop, second is infer
+
+}

+ 135 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/inferthread.h

@@ -0,0 +1,135 @@
+#ifndef INFERTHREAD_H
+#define INFERTHREAD_H
+
+#include <QObject>
+#include <QString>
+#include <QException>
+#include <QDebug>
+#include <QPixmap>
+#include <QtWidgets/QPushButton>
+#include <QtWidgets/QLabel>
+#include <QThread>
+
+#include "opencv2/opencv.hpp"
+#include "opencv2/core/core.hpp"
+#include "opencv2/highgui/highgui.hpp"
+#include "opencv2/imgproc/imgproc.hpp"
+using namespace cv;
+
+// Model inference API: det, seg, clas, mask
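+// Each interface takes (image_data, width, height, channels) followed by caller-allocated output buffers;
+// see the corresponding *_Image()/*_Video() methods in inferthread.cpp for how the outputs are laid out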
+typedef void (*Det_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 float* , int* , char* );
+typedef void (*Seg_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 unsigned char* );
+typedef void (*Cls_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 float* , char* , int* );
+typedef void (*Mask_ModelPredict)(const unsigned char* ,
+                                  int , int , int ,
+                                  float* , unsigned char* ,
+                                  int* , char* );
+
+class InferThread : public QThread
+{
+    Q_OBJECT
+private:
+    QString model_Type;
+    uchar *color_map;   // visual color table/map
+
+    QString image_path;
+    QStringList images_path;
+    QString video_path;
+
+public:
+    float det_Threshold;  // Target detection threshold
+    int infer_Delay;    // Continuous inference interval
+
+    bool doing_Infer;  // Flag: inference is currently running
+    bool break_Infer;  // Flag: request to stop folder/video inference
+
+    bool dataLoaded;  // Whether data has been loaded
+
+    QImage* label1_image;
+    QImage* label2_image;
+
+    Mat* image1;
+    Mat* image2;
+
+
+private:
+    // Model inference API
+    Det_ModelPredict det_ModelPredict;
+    Seg_ModelPredict seg_ModelPredict;
+    Cls_ModelPredict cls_ModelPredict;
+    Mask_ModelPredict mask_ModelPredict;
+
+private:
+    // Held for reference only; not used directly by the thread
+    QPushButton *btnStop;  // Points to the external stop button
+    QPushButton *btnInfer;  // Points to the external inference button
+    QLabel *labelImage1;   // Image display area label -- left
+    QLabel *labelImage2;   // Image display area label -- right
+
+public:
+    void setStopBtn(QPushButton *btn);
+    void setInferBtn(QPushButton *btn);
+
+    void setDetThreshold(float threshold);
+    void setInferDelay(int delay);
+    uchar* get_color_map_list(int num_classes=256);
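+    // The color map packs 3 bytes per class id; entries are looked up as color_map[(class_id % 256) * 3 + channel]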
+
+public:
+    explicit InferThread(QObject *parent = nullptr);
+    void setModelType(QString & model_type);  // Set the model type so the correct inference interface is used
+    void setInputImage(QString & image_path);
+    void setInputImages(QStringList & images_path);
+    void setInputVideo(QString & video_path);
+    void setInferFuncs(Det_ModelPredict det_Inferfunc, Seg_ModelPredict seg_Inferfunc,
+                       Cls_ModelPredict cls_Inferfunc, Mask_ModelPredict mask_Inferfunc);
+    void runInferDet();
+    void runInferSeg();
+    void runInferCls();
+    void runInferMask();
+    void run() override; // Execute this thread
+
+private:
+    bool is_InferImage();
+    bool is_InferImages();
+    bool is_InferVideo();
+    QString makeLabelInfo(QString label, int id, float score);
+
+// Object detection inference interface
+public:
+    void Det_Image();
+    void Det_Images();
+    void Det_Video();
+
+// Semantic segmentation inference interface
+public:
+    void Seg_Image();
+    void Seg_Images();
+    void Seg_Video();
+
+// Classification inference interface
+public:
+    void Cls_Image();
+    void Cls_Images();
+    void Cls_Video();
+
+// Instance segmentation inference interface
+public:
+    void Mask_Image();
+    void Mask_Images();
+    void Mask_Video();
+
+signals:
+    void InferFinished(QImage* label1, QImage* label2);
+    void SetState_Btn_StopAndInfer(bool stop_state, bool infer_state);
+    void SetCostTime(double cost_time);
+
+public slots:
+};
+
+#endif // INFERTHREAD_H

+ 11 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/main.cpp

@@ -0,0 +1,11 @@
+#include "mainwindow.h"
+#include <QApplication>
+
+int main(int argc, char *argv[])
+{
+    QApplication a(argc, argv);
+    MainWindow w;
+    w.show();
+
+    return a.exec();
+}

+ 577 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.cpp

@@ -0,0 +1,577 @@
+#include <QMessageBox>
+#include <QException>
+#include <QFileDialog>
+#include <QPixmap>
+#include <QDebug>
+
+#include "mainwindow.h"
+#include "ui_mainwindow.h"
+
+#include "opencv2/opencv.hpp"
+#include "opencv2/core/core.hpp"
+#include "opencv2/highgui/highgui.hpp"
+#include "opencv2/imgproc/imgproc.hpp"
+using namespace cv;
+
+void MainWindow::Init_SystemState()
+{
+    qDebug() << "Now Using Opencv Version: " << CV_VERSION << "\n";
+//    qDebug() << cv::getBuildInformation().c_str() << "\n"; // print build information
+
+    has_Init = false;
+    doing_Infer = false;
+    is_paddlex = false;
+    is_mask = false;
+
+    model_Envs[0] = "cpu";
+    model_Envs[1] = "gpu";
+    model_Kinds[0] = "det";
+    model_Kinds[1] = "seg";
+    model_Kinds[2] = "clas";
+    model_Kinds[3] = "mask";
+    model_Kinds[4] = "paddlex";
+
+    model_Env = "cpu";
+    model_Kind = "det";
+    gpu_Id = 0;
+
+    det_threshold = 0.5;
+    infer_Delay = 50;
+
+    ui->cBoxEnv->setCurrentIndex(0); // Initialize the environment to CPU
+    ui->cBoxKind->setCurrentIndex(0); // Initialize the model type to det
+    ui->labelImage1->setStyleSheet("background-color:white;"); // Initialize the image display areas with a white background
+    ui->labelImage2->setStyleSheet("background-color:white;");
+
+    // Load the dynamic link library
+    inferLibrary = new QLibrary("/home/nvidia/Desktop/Deploy_infer/infer_lib/libmodel_infer");
+    if(!inferLibrary->load()){
+        //Loading failed
+        qDebug() << "Load libmodel_infer.so is failed!";
+        qDebug() << inferLibrary->errorString();
+    }
+    else
+    {
+        qDebug() << "Load libmodel_infer.so is OK!";
+    }
+
+    // Resolve the model init/destroy interfaces exported by the dynamic library
+    initModel = (InitModel)inferLibrary->resolve("InitModel");
+    destructModel = (DestructModel)inferLibrary->resolve("DestructModel");
+    // Resolve the model inference interfaces exported by the dynamic library
+    det_ModelPredict = (Det_ModelPredict)inferLibrary->resolve("Det_ModelPredict");
+    seg_ModelPredict = (Seg_ModelPredict)inferLibrary->resolve("Seg_ModelPredict");
+    cls_ModelPredict = (Cls_ModelPredict)inferLibrary->resolve("Cls_ModelPredict");
+    mask_ModelPredict = (Mask_ModelPredict)inferLibrary->resolve("Mask_ModelPredict");
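+    // Note: QLibrary::resolve() returns nullptr when a symbol is missing; these pointers are assumed to have resolved successfully here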
+
+    // Thread initialization - wire up the inference functions
+    inferThread = new InferThread(this);
+    inferThread->setInferFuncs(det_ModelPredict, seg_ModelPredict, cls_ModelPredict, mask_ModelPredict);
+    inferThread->setStopBtn(ui->btnStop);
+    inferThread->setInferBtn(ui->btnInfer);
+    inferThread->setDetThreshold(det_threshold);
+    inferThread->setInferDelay(infer_Delay);
+
+    // Configure signals and slots
+    connect(inferThread, SIGNAL(InferFinished(QImage*, QImage*)),
+            this, SLOT(ImageUpdate(QImage*, QImage*)),
+            Qt::BlockingQueuedConnection);
+    connect(inferThread, SIGNAL(SetState_Btn_StopAndInfer(bool , bool )),
+            this, SLOT(Btn_StopAndInfer_StateUpdate(bool , bool )),
+            Qt::BlockingQueuedConnection);
+    connect(inferThread, SIGNAL(SetCostTime(double )),
+            this, SLOT(CostTimeUpdate(double )),
+            Qt::BlockingQueuedConnection);
+}
+
+void MainWindow::Init_SystemShow()
+{
+    ui->btnDistory->setEnabled(false);  // The model is not initialized yet, so destroy/infer/stop are disabled
+    ui->btnInfer->setEnabled(false);
+    ui->btnStop->setEnabled(false);
+}
+
+MainWindow::MainWindow(QWidget *parent) :
+    QMainWindow(parent),
+    ui(new Ui::MainWindow)
+{
+    ui->setupUi(this);
+
+    this->Init_SystemState(); // Initialization state at startup
+    this->Init_SystemShow();  // Initialize button state at startup
+}
+
+MainWindow::~MainWindow()
+{
+    // Wait for thread to end
+    if (inferThread->doing_Infer)
+    {
+        inferThread->break_Infer=true;  // request the inference loop to stop
+        while(inferThread->doing_Infer==true); // Wait for thread to stop
+    }
+    // Destruction of the model
+    if (has_Init == true)
+    {
+        destructModel();
+    }
+    // Unload library
+    inferLibrary->unload();
+
+    delete ui;  // Finally close the screen
+}
+
+// Model initialization
+void MainWindow::on_btnInit_clicked()
+{
+    QString dialog_title = "Load model";
+    QStringList filters = {"*.pdmodel", "*.pdiparams", "*.yml", "*.yaml"};
+    QDir model_dir(QFileDialog::getExistingDirectory(this,
+                                                     dialog_title));
+    QFileInfoList model_files = model_dir.entryInfoList(filters);  // All valid file names obtained by filtering
+
+    switch (model_files.count()) {
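+    // 3 files: a single-model export (*.pdmodel, *.pdiparams, one yml/yaml config);
+    // 4 files: PaddleX / Mask exports that ship an extra yml, of which only model.yml/model.yaml is used as the config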
+    case 0:
+        return;
+    case 3:  // det,seg,clas--Load
+        for (int i = 0; i < 3; i++)
+        {
+            QString tag = model_files[i].fileName().split('.')[1];
+            if (tag == "pdmodel") model_path = model_files[i].filePath();
+            else if (tag == "pdiparams") param_path = model_files[i].filePath();
+            else if (tag == "yml" || tag == "yaml") config_path = model_files[i].filePath();
+        }
+        break;
+    case 4: // paddlex and mask--Load
+        for (int i = 0; i < 4; i++)
+        {
+            QString tag = model_files[i].fileName().split('.')[1];
+            if (tag == "pdmodel") model_path = model_files[i].filePath();
+            else if (tag == "pdiparams") param_path = model_files[i].filePath();
+            else if (tag == "yml" || tag == "yaml")
+            {
+                if (model_files[i].fileName() == "model.yml" || model_files[i].fileName() == "model.yaml")
+                    config_path = model_files[i].filePath();
+            }
+        }
+        break;
+    default: // Other undefined situations
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Make sure the following files are correctly included in the models folder:\n*.pdmodel, *.pdiparams, *.yml/*.yaml."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+        return;
+    }
+
+    char paddlex_model_type[10]="";
+    try
+    {
+        if (has_Init == true)
+        {
+            destructModel();  // Destroy the existing model before re-initializing
+            if (model_Kind == "paddlex")  // Special handling for the PaddleX model
+            {
+                is_paddlex = true;
+            }
+            if (model_Kind == "mask")  // MASKRCNN from Paddlex
+            {
+                model_Kind = "paddlex";
+                is_mask = true;
+            }
+
+            // input string to char*/[]
+            initModel(model_Kind.toLocal8Bit().data(),
+                      model_path.toLocal8Bit().data(),
+                      param_path.toLocal8Bit().data(),
+                      config_path.toLocal8Bit().data(),
+                      model_Env=="gpu" ? true: false,
+                      gpu_Id, paddlex_model_type);
+
+            if (is_paddlex && is_mask==false)  // Replace with the actual type reported by the PaddleX model
+            {
+                model_Kind = QString::fromLocal8Bit(paddlex_model_type);  // to real type
+                is_paddlex = false;
+            }
+            if (is_paddlex==false && is_mask)  // Revert to MASKRCNN type
+            {
+                model_Kind = "mask";  // to mask type
+                is_mask = false;
+            }
+        }
+        else
+        {
+            if (model_Kind == "paddlex")
+            {
+                is_paddlex = true;
+            }
+            if (model_Kind == "mask")
+            {
+                model_Kind = "paddlex";
+                is_mask = true;
+            }
+
+            // input string to char*/[]
+            initModel(model_Kind.toLocal8Bit().data(),
+                      model_path.toLocal8Bit().data(),
+                      param_path.toLocal8Bit().data(),
+                      config_path.toLocal8Bit().data(),
+                      model_Env=="gpu" ? true: false,
+                      gpu_Id, paddlex_model_type);
+
+            if (is_paddlex && is_mask==false)
+            {
+                model_Kind = QString::fromLocal8Bit(paddlex_model_type);  // to real type
+                is_paddlex = false;
+            }
+            if (is_paddlex==false && is_mask)
+            {
+                model_Kind = "mask";  // to mask type
+                is_mask = false;
+            }
+
+            has_Init = true; // Initialization is complete
+        }
+    }
+    catch (QException &e) // Show a message if initialization failed
+    {
+        QMessageBox::information(this,
+            tr("Initialization failed"),
+            tr("1.Please ensure that the model folder correctly contains the following files:\n*.pdmodel, *.pdiparams, *.yml/*.yaml.\n2.Please ensure that the model type is consistent with the loading model."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+
+        if (is_paddlex)  // Paddlex is not initialized, restore type
+        {
+            model_Kind = "paddlex";
+            is_paddlex = false;
+        }
+        if (is_mask)  // Mask is not initialized, restore type
+        {
+            model_Kind = "mask";
+            is_mask = false;
+        }
+
+        return;
+    }
+
+    inferThread->setModelType(model_Kind); // Set the inference interface type
+    // Show a success message
+    QMessageBox::information(this,
+        tr("Initialization successful"),
+        QString("Model type: ")+model_Kind+QString(", Runtime environment: ")+model_Env+QString("."),
+        QMessageBox::Ok,
+        QMessageBox::Ok);
+    ui->btnInit->setText("模型已初始化");
+
+    ui->btnInfer->setEnabled(true);  // Enable inference
+    ui->btnDistory->setEnabled(true); // Enable model destruction
+    ui->cBoxEnv->setEnabled(false);   // Disable runtime environment selection
+    ui->cBoxKind->setEnabled(false);   // Disable model type selection
+    ui->lEditGpuId->setEnabled(false);  // Disable GPU id input
+}
+
+// Model destruction
+void MainWindow::on_btnDistory_clicked()
+{
+    if (inferThread->doing_Infer)  // Cannot destroy while inference is running; show a prompt
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("The model is being reasoning, can't be destroyed!\n(after reasoning the completion / termination, the model is destroyed."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+        return;
+    }
+
+    if (has_Init == true)
+    {
+        destructModel();
+        has_Init = false;
+
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("The model has been destroyed."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+        ui->btnInit->setText("初始化模型");
+
+        ui->btnInfer->setEnabled(false);
+        ui->btnDistory->setEnabled(false);
+        ui->cBoxEnv->setEnabled(true);
+        ui->cBoxKind->setEnabled(true);
+        ui->lEditGpuId->setEnabled(true);
+    }
+    else
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Not initialized, no need to destroy."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+    }
+}
+
+// Loading pictures
+void MainWindow::on_btnLoadImg_clicked()
+{
+    QString dialog_title = "Loading pictures";
+    QString filters = "Loading pictures(*.jpg *.jpeg *.png *.JPEG);;";
+    QUrl img_path = QFileDialog::getOpenFileUrl(this,
+                                                   dialog_title, QUrl(), filters);
+    if (img_path.isEmpty())
+    {
+        return;
+    }
+
+    img_file = img_path.url().split("//")[1];
+    qDebug() << "Input Video Path:" << img_file << "\n";
+
+    // Picture reading
+    cv::Mat image = cv::imread(img_file.toLocal8Bit().toStdString());  //BGR
+    cv::cvtColor(image, image, COLOR_BGR2RGB);  // BGR --> RGB
+    cv::resize(image, image, cv::Size(image.cols/4*4,image.rows/4*4));
+    QImage image_from_mat((const uchar*)image.data, image.cols, image.rows, QImage::Format_RGB888);
+    QPixmap pixmap(QPixmap::fromImage(image_from_mat));
+
+    // Display images
+    ui->labelImage1->setPixmap(pixmap);
+    ui->labelImage1->setScaledContents(true);  // Full Label
+
+    inferThread->setInputImage(img_file);  // Pass the inference input to the thread
+    img_files = QStringList();
+    video_file = "";
+
+    ui->btnLoadImg->setText("图片已加载");
+    ui->btnLoadImgs->setText("加载文件夹");
+    ui->btnLoadVideo->setText("加载视频");
+}
+
+// Loading images folder
+void MainWindow::on_btnLoadImgs_clicked()
+{
+    QString dialog_title = "Loading images folder";
+    QStringList filters = {"*.jpg", "*.jpeg", "*.png", "*.JPEG"};
+    QDir img_dir(QFileDialog::getExistingDirectory(this,
+                                                     dialog_title));
+    QFileInfoList img_paths = img_dir.entryInfoList(filters);
+
+    if (img_paths.isEmpty())
+    {
+        return;
+    }
+
+    img_files.clear();
+    for (int i = 0; i < img_paths.count(); i++)
+    {
+        img_files.append(img_paths[i].filePath());
+    }
+
+    qDebug() << img_files[0] << "\n";
+
+    // Display the first image
+    cv::Mat image = cv::imread(img_files[0].toLocal8Bit().toStdString());  //BGR
+    cv::cvtColor(image, image, COLOR_BGR2RGB);  // BGR --> RGB
+    cv::resize(image, image, cv::Size(image.cols/4*4,image.rows/4*4));
+    QImage image_from_mat((const uchar*)image.data, image.cols, image.rows, QImage::Format_RGB888);
+    QPixmap pixmap(QPixmap::fromImage(image_from_mat));
+
+
+    ui->labelImage1->setPixmap(pixmap);
+    ui->labelImage1->setScaledContents(true);
+
+    inferThread->setInputImages(img_files);
+    img_file = "";
+    video_file = "";
+
+    ui->btnLoadImg->setText("加载图片");
+    ui->btnLoadImgs->setText("文件夹已加载");
+    ui->btnLoadVideo->setText("加载视频");
+}
+
+// Load the video stream
+void MainWindow::on_btnLoadVideo_clicked()
+{
+    QString dialog_title = "Load video stream";
+    QString filters = "video(*.mp4 *.MP4);;";
+    QUrl video_path = QFileDialog::getOpenFileUrl(this,
+                                                   dialog_title, QUrl(), filters);
+    if (video_path.isEmpty())
+    {
+        return;
+    }
+
+    video_file = video_path.url().split("//")[1];
+    qDebug() << "Input Video Path:" << video_file.toStdString().c_str() << "\n";
+
+    // Display the first frame of the video
+    VideoCapture capture;
+    Mat frame;
+    capture.open(video_file.toLocal8Bit().toStdString()); // Read video
+    if(!capture.isOpened())
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("1.Video read failed, please check if the video is complete, is it MP4.\n2.(Maybe)Opencv doesn't support the video!"),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+        return;
+    }
+    capture >> frame;  // Get the first frame
+    cvtColor(frame, frame, COLOR_BGR2RGB);  // BGR --> RGB
+    QImage image((const uchar*)frame.data, frame.cols, frame.rows, QImage::Format_RGB888);
+    ui->labelImage1->setPixmap(QPixmap::fromImage(image));
+    ui->labelImage1->setScaledContents(true);
+    capture.release();
+
+    inferThread->setInputVideo(video_file);
+    img_file = "";
+    img_files = QStringList();
+
+    ui->btnLoadImg->setText("加载图片");
+    ui->btnLoadImgs->setText("加载文件夹");
+    ui->btnLoadVideo->setText("视频已加载");
+}
+
+// Model inference - runs in a separate thread
+void MainWindow::on_btnInfer_clicked()
+{
+    if (inferThread->dataLoaded != true)  // Data is not loaded
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Please load the data, then reinforce."),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+        return;
+    }
+    if (has_Init == true && inferThread->doing_Infer == false)
+    {
+        ui->btnStop->setEnabled(true);     // Enable the stop button
+        ui->btnInfer->setEnabled(false);     // Disable the inference button
+
+        // Perform the corresponding type of reasoning
+        inferThread->start();
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Model reasoning"),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+    }
+}
+
+// Stop inference
+void MainWindow::on_btnStop_clicked()
+{
+    if (inferThread->doing_Infer == true)
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Stop model reasoning"),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+
+        inferThread->break_Infer = true;  // Request termination; the thread will set doing_Infer to false
+    }
+    else
+    {
+        QMessageBox::information(this,
+            tr("Prompt"),
+            tr("Not reasoning, no need to terminate the model reasoning"),
+            QMessageBox::Ok,
+            QMessageBox::Ok);
+    }
+}
+
+// Select the model running environment
+void MainWindow::on_cBoxEnv_currentIndexChanged(int index)
+{
+    if (has_Init) // Already initialized; ignore the change
+    {
+        ui->cBoxEnv->setCurrentIndex(old_model_Env);
+        return;
+    }
+    model_Env = model_Envs[index];
+    old_model_Env = index;  // Retain this result
+}
+
+// Set the model type
+void MainWindow::on_cBoxKind_currentIndexChanged(int index)
+{
+    if (has_Init) // Already initialized; ignore the change
+    {
+        ui->cBoxKind->setCurrentIndex(old_model_Kind);
+        return;
+    }
+    model_Kind = model_Kinds[index];
+    old_model_Kind = index;
+}
+
+// Set detection threshold
+void MainWindow::on_sBoxThreshold_valueChanged(double arg1)
+{
+    if (inferThread->doing_Infer)
+    {
+        ui->sBoxThreshold->setValue(old_det_threshold);
+        return;
+    }
+    det_threshold = (float)arg1;
+    inferThread->setDetThreshold(det_threshold);
+
+    old_det_threshold = det_threshold;
+}
+
+// set gpu_id
+void MainWindow::on_lEditGpuId_textChanged(const QString &arg1)
+{
+    if (has_Init) // Already initialized; ignore the change
+    {
+        ui->lEditGpuId->setText(QString::number(old_gpu_Id));
+        return;
+    }
+    gpu_Id = arg1.toInt(); // toInt() returns the parsed number, or 0 if the input is not a valid integer
+    old_gpu_Id = gpu_Id;
+}
+
+// Set the interval between consecutive inferences
+void MainWindow::on_sBoxDelay_valueChanged(const QString &arg1)
+{
+    if (inferThread->doing_Infer)
+    {
+        ui->sBoxDelay->setValue(old_infer_Delay);
+        return;
+    }
+    infer_Delay = arg1.toInt(); // toInt() returns the parsed number, or 0 if the input is not a valid integer
+    inferThread->setInferDelay(infer_Delay);
+
+    old_infer_Delay = infer_Delay;
+}
+
+
+/*  slot funcs blob  */
+// update image show
+void MainWindow::ImageUpdate(QImage* label1, QImage* label2)
+{
+    if (inferThread->doing_Infer)
+    {
+        ui->labelImage1->clear();
+        ui->labelImage1->setPixmap(QPixmap::fromImage(*label1));
+        ui->labelImage1->setScaledContents(true); // Full Label
+
+        ui->labelImage2->clear();
+        ui->labelImage2->setPixmap(QPixmap::fromImage(*label2));
+        ui->labelImage2->setScaledContents(true); // Full Label
+    }
+}
+
+// update btn Enable_state
+void MainWindow::Btn_StopAndInfer_StateUpdate(bool stop_state, bool infer_state)
+{
+    ui->btnStop->setEnabled(stop_state);
+    ui->btnInfer->setEnabled(infer_state);
+}
+
+// update label value to show cost time
+void MainWindow::CostTimeUpdate(double cost_time)
+{
+    ui->labelCostTime->setText(QString::number(cost_time));
+}

+ 126 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.h

@@ -0,0 +1,126 @@
+#ifndef MAINWINDOW_H
+#define MAINWINDOW_H
+
+#include <QMainWindow>
+#include <QString>
+#include <QLibrary>
+
+#include "inferthread.h"
+
+// Model initialization and destruction API
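+// InitModel arguments, in call order: model_type, model_path, param_path, config_path, use_gpu, gpu_id, out paddlex_model_type (see on_btnInit_clicked)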
+typedef void (*InitModel)(const char*,
+                          const char*,
+                          const char*,
+                          const char*,
+                          bool , int , char* );
+typedef void (*DestructModel)();
+// Model inference API: det, seg, clas, mask
+typedef void (*Det_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 float* , int* , char* );
+typedef void (*Seg_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 unsigned char* );
+typedef void (*Cls_ModelPredict)(const unsigned char* ,
+                                 int , int , int ,
+                                 float* , char* , int* );
+typedef void (*Mask_ModelPredict)(const unsigned char* ,
+                                  int , int , int ,
+                                  float* , unsigned char* ,
+                                  int* , char* );
+
+namespace Ui {
+class MainWindow;
+}
+
+class MainWindow : public QMainWindow
+{
+    Q_OBJECT
+
+private: // Status flags and data variables
+    bool has_Init;  // Whether the model has been initialized
+    bool doing_Infer; // Whether inference is running
+    QString model_Envs[2]; // Available runtime environments
+    QString model_Kinds[5]; // Available model types
+    bool is_paddlex;  // Flags a PaddleX model
+    bool is_mask;     // Flags a Mask model exported from PaddleX
+    QString model_Env;  // Current runtime environment
+    int old_model_Env; // Previous runtime environment
+    QString model_Kind; // Current model type
+    int old_model_Kind; // Previous model type
+    int gpu_Id;  // Current GPU id
+    int old_gpu_Id;  // Previous GPU id
+
+    float det_threshold; // Object detection threshold
+    float old_det_threshold; // Previous detection threshold
+    int infer_Delay; // Interval between consecutive inferences
+    int old_infer_Delay; // Previous inference interval
+
+    // Model file path
+    QString model_path;
+    QString param_path;
+    QString config_path;
+
+    // Inference data paths
+    QString img_file;
+    QStringList img_files;
+    QString video_file;
+
+    // Handle to the dynamic library
+    QLibrary *inferLibrary;
+    // Model initialization and destruction API
+    InitModel initModel;
+    DestructModel destructModel;
+    // Model inference API
+    Det_ModelPredict det_ModelPredict;
+    Seg_ModelPredict seg_ModelPredict;
+    Cls_ModelPredict cls_ModelPredict;
+    Mask_ModelPredict mask_ModelPredict;
+
+private:
+    InferThread *inferThread;
+
+public: // Some basic methods
+    void Init_SystemState();  // Initialize state and data variables
+    void Init_SystemShow();  // Initialize the enabled/disabled state of the UI buttons
+
+public:
+    explicit MainWindow(QWidget *parent = 0);
+    ~MainWindow();
+
+private slots:
+    void on_btnInit_clicked();
+
+    void on_btnDistory_clicked();
+
+    void on_btnLoadImg_clicked();
+
+    void on_btnLoadImgs_clicked();
+
+    void on_btnLoadVideo_clicked();
+
+    void on_btnInfer_clicked();
+
+    void on_btnStop_clicked();
+
+    void on_cBoxEnv_currentIndexChanged(int index);
+
+    void on_cBoxKind_currentIndexChanged(int index);
+
+    void on_sBoxThreshold_valueChanged(double arg1);
+
+    void on_lEditGpuId_textChanged(const QString &arg1);
+
+    void on_sBoxDelay_valueChanged(const QString &arg1);
+
+
+private slots:
+    void ImageUpdate(QImage* label1, QImage* label2);
+    void Btn_StopAndInfer_StateUpdate(bool stop_state, bool infer_state);
+    void CostTimeUpdate(double cost_time);
+
+private:
+    Ui::MainWindow *ui;
+};
+
+#endif // MAINWINDOW_H

+ 829 - 0
deploy/cpp/docs/jetson-deploy/Deploy_infer/mainwindow.ui

@@ -0,0 +1,829 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<ui version="4.0">
+ <class>MainWindow</class>
+ <widget class="QMainWindow" name="MainWindow">
+  <property name="geometry">
+   <rect>
+    <x>0</x>
+    <y>0</y>
+    <width>880</width>
+    <height>640</height>
+   </rect>
+  </property>
+  <property name="minimumSize">
+   <size>
+    <width>880</width>
+    <height>640</height>
+   </size>
+  </property>
+  <property name="maximumSize">
+   <size>
+    <width>880</width>
+    <height>640</height>
+   </size>
+  </property>
+  <property name="contextMenuPolicy">
+   <enum>Qt::NoContextMenu</enum>
+  </property>
+  <property name="windowTitle">
+   <string>PaddleX可视化推理界面</string>
+  </property>
+  <widget class="QWidget" name="centralWidget">
+   <widget class="QWidget" name="verticalLayoutWidget_4">
+    <property name="geometry">
+     <rect>
+      <x>40</x>
+      <y>10</y>
+      <width>794</width>
+      <height>644</height>
+     </rect>
+    </property>
+    <layout class="QVBoxLayout" name="verticalLayout_7">
+     <item>
+      <layout class="QVBoxLayout" name="verticalLayout_6">
+       <item>
+        <widget class="Line" name="line_2">
+         <property name="orientation">
+          <enum>Qt::Horizontal</enum>
+         </property>
+        </widget>
+       </item>
+       <item>
+        <layout class="QHBoxLayout" name="horizontalLayout_10">
+         <item>
+          <layout class="QVBoxLayout" name="verticalLayout_4">
+           <item>
+            <spacer name="verticalSpacer_3">
+             <property name="orientation">
+              <enum>Qt::Vertical</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>20</width>
+               <height>40</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <layout class="QHBoxLayout" name="horizontalLayout_6">
+             <item>
+              <widget class="QLabel" name="label_3">
+               <property name="maximumSize">
+                <size>
+                 <width>100</width>
+                 <height>40</height>
+                </size>
+               </property>
+               <property name="lineWidth">
+                <number>1</number>
+               </property>
+               <property name="text">
+                <string>实时推理耗时</string>
+               </property>
+              </widget>
+             </item>
+            </layout>
+           </item>
+           <item>
+            <layout class="QHBoxLayout" name="horizontalLayout_7">
+             <item>
+              <spacer name="horizontalSpacer_38">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+               <property name="sizeType">
+                <enum>QSizePolicy::Maximum</enum>
+               </property>
+               <property name="sizeHint" stdset="0">
+                <size>
+                 <width>25</width>
+                 <height>20</height>
+                </size>
+               </property>
+              </spacer>
+             </item>
+             <item>
+              <widget class="QLabel" name="labelCostTime">
+               <property name="maximumSize">
+                <size>
+                 <width>100</width>
+                 <height>40</height>
+                </size>
+               </property>
+               <property name="text">
+                <string>0.00</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="QLabel" name="label_5">
+               <property name="maximumSize">
+                <size>
+                 <width>40</width>
+                 <height>40</height>
+                </size>
+               </property>
+               <property name="text">
+                <string>ms</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <spacer name="horizontalSpacer_39">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+               <property name="sizeType">
+                <enum>QSizePolicy::Maximum</enum>
+               </property>
+               <property name="sizeHint" stdset="0">
+                <size>
+                 <width>5</width>
+                 <height>20</height>
+                </size>
+               </property>
+              </spacer>
+             </item>
+            </layout>
+           </item>
+          </layout>
+         </item>
+         <item>
+          <layout class="QHBoxLayout" name="horizontalLayout_4">
+           <item>
+            <widget class="Line" name="line_8">
+             <property name="orientation">
+              <enum>Qt::Vertical</enum>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_5">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <layout class="QVBoxLayout" name="verticalLayout_2">
+             <item>
+              <widget class="QPushButton" name="btnInit">
+               <property name="text">
+                <string>初始化模型</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="Line" name="line_9">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="QPushButton" name="btnDistory">
+               <property name="text">
+                <string>销毁模型</string>
+               </property>
+              </widget>
+             </item>
+            </layout>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_3">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <layout class="QHBoxLayout" name="horizontalLayout_3">
+             <item>
+              <widget class="Line" name="line_5">
+               <property name="orientation">
+                <enum>Qt::Vertical</enum>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="QPushButton" name="btnLoadImg">
+               <property name="text">
+                <string>加载图片</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <spacer name="horizontalSpacer_2">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+               <property name="sizeHint" stdset="0">
+                <size>
+                 <width>40</width>
+                 <height>20</height>
+                </size>
+               </property>
+              </spacer>
+             </item>
+             <item>
+              <widget class="QPushButton" name="btnLoadImgs">
+               <property name="text">
+                <string>加载文件夹</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <spacer name="horizontalSpacer_19">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+               <property name="sizeHint" stdset="0">
+                <size>
+                 <width>40</width>
+                 <height>20</height>
+                </size>
+               </property>
+              </spacer>
+             </item>
+             <item>
+              <widget class="QPushButton" name="btnLoadVideo">
+               <property name="text">
+                <string>加载视频流</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="Line" name="line_6">
+               <property name="orientation">
+                <enum>Qt::Vertical</enum>
+               </property>
+              </widget>
+             </item>
+            </layout>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_4">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <layout class="QVBoxLayout" name="verticalLayout">
+             <item>
+              <widget class="QPushButton" name="btnInfer">
+               <property name="text">
+                <string>模型推理</string>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="Line" name="line_10">
+               <property name="orientation">
+                <enum>Qt::Horizontal</enum>
+               </property>
+              </widget>
+             </item>
+             <item>
+              <widget class="QPushButton" name="btnStop">
+               <property name="text">
+                <string>推理终止</string>
+               </property>
+              </widget>
+             </item>
+            </layout>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_6">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <widget class="Line" name="line_7">
+             <property name="orientation">
+              <enum>Qt::Vertical</enum>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <layout class="QVBoxLayout" name="verticalLayout_5">
+             <item>
+              <spacer name="verticalSpacer_4">
+               <property name="orientation">
+                <enum>Qt::Vertical</enum>
+               </property>
+               <property name="sizeHint" stdset="0">
+                <size>
+                 <width>20</width>
+                 <height>40</height>
+                </size>
+               </property>
+              </spacer>
+             </item>
+             <item>
+              <layout class="QHBoxLayout" name="horizontalLayout_8">
+               <item>
+                <widget class="QLabel" name="label_6">
+                 <property name="maximumSize">
+                  <size>
+                   <width>100</width>
+                   <height>40</height>
+                  </size>
+                 </property>
+                 <property name="text">
+                  <string>连续推理间隔</string>
+                 </property>
+                </widget>
+               </item>
+              </layout>
+             </item>
+             <item>
+              <layout class="QHBoxLayout" name="horizontalLayout_9">
+               <item>
+                <spacer name="horizontalSpacer_40">
+                 <property name="orientation">
+                  <enum>Qt::Horizontal</enum>
+                 </property>
+                 <property name="sizeType">
+                  <enum>QSizePolicy::Maximum</enum>
+                 </property>
+                 <property name="sizeHint" stdset="0">
+                  <size>
+                   <width>20</width>
+                   <height>20</height>
+                  </size>
+                 </property>
+                </spacer>
+               </item>
+               <item>
+                <widget class="QSpinBox" name="sBoxDelay">
+                 <property name="minimumSize">
+                  <size>
+                   <width>42</width>
+                   <height>0</height>
+                  </size>
+                 </property>
+                 <property name="maximumSize">
+                  <size>
+                   <width>50</width>
+                   <height>20</height>
+                  </size>
+                 </property>
+                 <property name="minimum">
+                  <number>50</number>
+                 </property>
+                 <property name="maximum">
+                  <number>1000</number>
+                 </property>
+                 <property name="singleStep">
+                  <number>50</number>
+                 </property>
+                </widget>
+               </item>
+               <item>
+                <widget class="QLabel" name="label_7">
+                 <property name="maximumSize">
+                  <size>
+                   <width>20</width>
+                   <height>20</height>
+                  </size>
+                 </property>
+                 <property name="text">
+                  <string>ms</string>
+                 </property>
+                </widget>
+               </item>
+               <item>
+                <spacer name="horizontalSpacer_41">
+                 <property name="orientation">
+                  <enum>Qt::Horizontal</enum>
+                 </property>
+                 <property name="sizeType">
+                  <enum>QSizePolicy::Maximum</enum>
+                 </property>
+                 <property name="sizeHint" stdset="0">
+                  <size>
+                   <width>10</width>
+                   <height>20</height>
+                  </size>
+                 </property>
+                </spacer>
+               </item>
+              </layout>
+             </item>
+            </layout>
+           </item>
+          </layout>
+         </item>
+        </layout>
+       </item>
+       <item>
+        <widget class="Line" name="line_4">
+         <property name="orientation">
+          <enum>Qt::Horizontal</enum>
+         </property>
+        </widget>
+       </item>
+       <item>
+        <layout class="QHBoxLayout" name="horizontalLayout_11">
+         <item>
+          <spacer name="horizontalSpacer_9">
+           <property name="orientation">
+            <enum>Qt::Horizontal</enum>
+           </property>
+           <property name="sizeHint" stdset="0">
+            <size>
+             <width>40</width>
+             <height>20</height>
+            </size>
+           </property>
+          </spacer>
+         </item>
+         <item>
+          <layout class="QHBoxLayout" name="horizontalLayout_5">
+           <item>
+            <widget class="QLabel" name="label_10">
+             <property name="text">
+              <string>运行环境:</string>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <widget class="QComboBox" name="cBoxEnv">
+             <item>
+              <property name="text">
+               <string>CPU</string>
+              </property>
+             </item>
+             <item>
+              <property name="text">
+               <string>GPU</string>
+              </property>
+             </item>
+            </widget>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_12">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <widget class="Line" name="line_11">
+             <property name="orientation">
+              <enum>Qt::Vertical</enum>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <widget class="QLabel" name="label_9">
+             <property name="text">
+              <string>模型类型:</string>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <widget class="QComboBox" name="cBoxKind">
+             <item>
+              <property name="text">
+               <string>det</string>
+              </property>
+             </item>
+             <item>
+              <property name="text">
+               <string>seg</string>
+              </property>
+             </item>
+             <item>
+              <property name="text">
+               <string>clas</string>
+              </property>
+             </item>
+             <item>
+              <property name="text">
+               <string>mask</string>
+              </property>
+             </item>
+             <item>
+              <property name="text">
+               <string>paddlex</string>
+              </property>
+             </item>
+            </widget>
+           </item>
+           <item>
+            <widget class="Line" name="line_12">
+             <property name="orientation">
+              <enum>Qt::Vertical</enum>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_8">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+           <item>
+            <widget class="QLabel" name="label_8">
+             <property name="maximumSize">
+              <size>
+               <width>55</width>
+               <height>55</height>
+              </size>
+             </property>
+             <property name="text">
+              <string>检测阈值:</string>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <widget class="QDoubleSpinBox" name="sBoxThreshold">
+             <property name="maximumSize">
+              <size>
+               <width>50</width>
+               <height>40</height>
+              </size>
+             </property>
+             <property name="decimals">
+              <number>1</number>
+             </property>
+             <property name="maximum">
+              <double>0.900000000000000</double>
+             </property>
+             <property name="singleStep">
+              <double>0.100000000000000</double>
+             </property>
+             <property name="value">
+              <double>0.500000000000000</double>
+             </property>
+            </widget>
+           </item>
+           <item>
+            <spacer name="horizontalSpacer_7">
+             <property name="orientation">
+              <enum>Qt::Horizontal</enum>
+             </property>
+             <property name="sizeHint" stdset="0">
+              <size>
+               <width>40</width>
+               <height>20</height>
+              </size>
+             </property>
+            </spacer>
+           </item>
+          </layout>
+         </item>
+         <item>
+          <spacer name="horizontalSpacer_10">
+           <property name="orientation">
+            <enum>Qt::Horizontal</enum>
+           </property>
+           <property name="sizeHint" stdset="0">
+            <size>
+             <width>40</width>
+             <height>20</height>
+            </size>
+           </property>
+          </spacer>
+         </item>
+        </layout>
+       </item>
+      </layout>
+     </item>
+     <item>
+      <widget class="Line" name="line_3">
+       <property name="orientation">
+        <enum>Qt::Horizontal</enum>
+       </property>
+      </widget>
+     </item>
+     <item>
+      <layout class="QHBoxLayout" name="horizontalLayout">
+       <item>
+        <spacer name="horizontalSpacer_14">
+         <property name="orientation">
+          <enum>Qt::Horizontal</enum>
+         </property>
+         <property name="sizeHint" stdset="0">
+          <size>
+           <width>40</width>
+           <height>20</height>
+          </size>
+         </property>
+        </spacer>
+       </item>
+       <item>
+        <widget class="QLabel" name="label">
+         <property name="text">
+          <string>GPU_id</string>
+         </property>
+        </widget>
+       </item>
+       <item>
+        <widget class="QLineEdit" name="lEditGpuId">
+         <property name="maximumSize">
+          <size>
+           <width>30</width>
+           <height>16777215</height>
+          </size>
+         </property>
+         <property name="text">
+          <string>0</string>
+         </property>
+         <property name="maxLength">
+          <number>30</number>
+         </property>
+        </widget>
+       </item>
+       <item>
+        <spacer name="horizontalSpacer_15">
+         <property name="orientation">
+          <enum>Qt::Horizontal</enum>
+         </property>
+         <property name="sizeHint" stdset="0">
+          <size>
+           <width>40</width>
+           <height>20</height>
+          </size>
+         </property>
+        </spacer>
+       </item>
+      </layout>
+     </item>
+     <item>
+      <widget class="Line" name="line">
+       <property name="orientation">
+        <enum>Qt::Horizontal</enum>
+       </property>
+      </widget>
+     </item>
+     <item>
+      <spacer name="verticalSpacer">
+       <property name="orientation">
+        <enum>Qt::Vertical</enum>
+       </property>
+       <property name="sizeHint" stdset="0">
+        <size>
+         <width>20</width>
+         <height>40</height>
+        </size>
+       </property>
+      </spacer>
+     </item>
+     <item>
+      <spacer name="horizontalSpacer_11">
+       <property name="orientation">
+        <enum>Qt::Horizontal</enum>
+       </property>
+       <property name="sizeHint" stdset="0">
+        <size>
+         <width>40</width>
+         <height>20</height>
+        </size>
+       </property>
+      </spacer>
+     </item>
+     <item>
+      <layout class="QHBoxLayout" name="horizontalLayout_2">
+       <item>
+        <widget class="QLabel" name="labelImage1">
+         <property name="minimumSize">
+          <size>
+           <width>380</width>
+           <height>380</height>
+          </size>
+         </property>
+         <property name="maximumSize">
+          <size>
+           <width>380</width>
+           <height>380</height>
+          </size>
+         </property>
+         <property name="autoFillBackground">
+          <bool>true</bool>
+         </property>
+         <property name="text">
+          <string>ImageLabel</string>
+         </property>
+         <property name="alignment">
+          <set>Qt::AlignCenter</set>
+         </property>
+        </widget>
+       </item>
+       <item>
+        <spacer name="horizontalSpacer">
+         <property name="orientation">
+          <enum>Qt::Horizontal</enum>
+         </property>
+         <property name="sizeHint" stdset="0">
+          <size>
+           <width>40</width>
+           <height>20</height>
+          </size>
+         </property>
+        </spacer>
+       </item>
+       <item>
+        <widget class="QLabel" name="labelImage2">
+         <property name="minimumSize">
+          <size>
+           <width>380</width>
+           <height>380</height>
+          </size>
+         </property>
+         <property name="maximumSize">
+          <size>
+           <width>380</width>
+           <height>380</height>
+          </size>
+         </property>
+         <property name="autoFillBackground">
+          <bool>true</bool>
+         </property>
+         <property name="text">
+          <string>ImageLabel</string>
+         </property>
+         <property name="alignment">
+          <set>Qt::AlignCenter</set>
+         </property>
+        </widget>
+       </item>
+      </layout>
+     </item>
+     <item>
+      <spacer name="horizontalSpacer_13">
+       <property name="orientation">
+        <enum>Qt::Horizontal</enum>
+       </property>
+       <property name="sizeHint" stdset="0">
+        <size>
+         <width>40</width>
+         <height>20</height>
+        </size>
+       </property>
+      </spacer>
+     </item>
+     <item>
+      <spacer name="verticalSpacer_2">
+       <property name="orientation">
+        <enum>Qt::Vertical</enum>
+       </property>
+       <property name="sizeHint" stdset="0">
+        <size>
+         <width>20</width>
+         <height>40</height>
+        </size>
+       </property>
+      </spacer>
+     </item>
+    </layout>
+   </widget>
+  </widget>
+  <widget class="QStatusBar" name="statusBar"/>
+ </widget>
+ <layoutdefault spacing="6" margin="11"/>
+ <resources/>
+ <connections/>
+</ui>

+ 475 - 0
deploy/cpp/docs/jetson-deploy/README.md

@@ -0,0 +1,475 @@
+# 基于QT的Jetson Xavier部署Demo
+
+> 该项目中的QT设计代码,也支持其它平台下QT的运行——但需要保证各平台下拥有opencv的编译库(动态链接库与头文件(include)库),以及正常编译得到的动态链接库(*.a 或 *.so).
+
+项目文档目录:
+
+- <a href="### 1 环境准备">1 环境准备</a>
+- <a href="### 2 配置编译脚本">2 配置编译脚本</a>
+  - <a href="#### 2.1 修改`jetson_build.sh`编译参数">2.1 修改`jetson_build.sh`编译参数</a>
+  - <a href="#### 2.2 修改`CMakeList.txt`参数">2.2 修改`CMakeList.txt`参数</a>
+  - <a href="#### 2.3 修改`yaml.cmake`参数">2.3 修改`yaml.cmake`参数</a>
+- <a href="### 3 代码编译(生成模型预测的动态链接库)">3 代码编译(生成模型预测的动态链接库)</a>
+  - <a href="#### 3.1 修改`model_infer.cpp`文件">3.1 修改`model_infer.cpp`文件</a>
+  - <a href="#### 3.2 修改`CMakeList.txt`文件">3.2 修改`CMakeList.txt`文件</a>
+  - <a href="#### 3.3 执行`jetson_build.sh`编译">3.3 执行`jetson_build.sh`编译</a>
+- <a href="### 4 启动并配置QT工程(移植流程)">4 启动并配置QT工程(移植流程)</a>
+  - <a href="#### 4.1 启动QT工程项目">4.1 启动QT工程项目</a>
+  - <a href="#### 4.2 载入动态链接库">4.2 载入动态链接库</a>
+  - <a href="#### 4.3 配置QT的Opencv路径">4.3 配置QT的Opencv路径</a>
+- <a href="### 5 启动QT可视化界面">5 启动QT可视化界面</a>
+  - <a href="#### 5.1 功能介绍">5.1 功能介绍</a>
+  - <a href="#### 5.2 使用说明">5.2 使用说明</a>
+  - <a href="#### 5.3 使用效果展示">5.3 使用效果展示</a>
+- <a href="### 6 QT开发注解">6 QT开发注解</a>
+  - <a href="#### 6.1 利用`QDebug`实现运行日志输出">6.1 利用`QDebug`实现运行日志输出</a>
+  - <a href="#### 6.2 本项目的组织结构">6.2 本项目的组织结构</a>
+  - <a href="#### 6.3 控制读取图片的推理大小">6.3 控制读取图片的推理大小</a>
+  - <a href="#### 6.4 子线程控制主线程控件的建议">6.4 子线程控制主线程控件的建议</a>
+  - <a href="#### 6.5 修改model_infer.cpp函数后的动态链接库导入指导">6.5 修改model_infer.cpp函数后的动态链接库导入指导</a>
+  - <a href="#### 6.6 QT动态链接库的导入说明">6.6 QT动态链接库的导入说明</a>
+  - <a href="#### 6.7 移植小贴士">6.7 移植小贴士</a>
+- <a href="### 本项目的Demo的GUI也同时支持Linux、Windows上进行使用,但请自行编译好opencv,安装QT,移植流程区别如下。">本项目的Demo的GUI也同时支持Linux、Windows上进行使用,但请自行编译好opencv,安装QT,移植流程区别如下。</a>
+
+
+在新版本的PaddleX中,对于CPP的部署代码方式做了非常大的变化:
+* 支持用户将PaddleDetection PaddleSeg PaddleClas训练出来的模型通过一套部署代码实现快速部署,实现了多个模型的打通。
+* 对于一个视觉任务里面既有检测,又有分割来说,极大的简化了使用的流程。
+
+下面我们具体以**Jetson Xavier**系统为例,基于PaddleX的这套CPP(deploy/),说明一下如何实现QT的部署
+
+项目使用基本环境说明:
+
+* CUDA10.2  Cudnn 8
+* Jetson原生opencv4.1.1 / opencv3.4.6等(原生应该都支持,自己编译的opencv则建议为3.4.6)
+* Jetpack4.4: nv-jetson-cuda10.2-cudnn8-trt7(xavier)——PaddleInference 10.2的预测库(2.1.1版本)
+* Cmake 3.10.2
+* QT5.9.5
+* QMake3.1
+
+> 查看Jetpack版本: `cat /usr/local/cuda/include/cudnn.h | grep CUDNN_MAJOR -A 2`
+> (4.3可以使用4.4版本的预测库)
+> 
+> 查看QT版本: `qmake -v`
+> 
+> 查看CUDA版本: `cat /usr/local/cuda/version.txt`
+> 
+> 查看Cudnn版本: `cat /usr/include/cudnn_version.h | grep CUDNN_MAJOR -A 2`
+
+## 1 环境准备<a id="## 1 环境准备"/>
+
+* 下载好PaddleX代码和PaddleInference预测库
+* 下载QT以及Qt Creator
+
+1. 下载QT:
+`sudo apt-get install qt5-default qtcreator -y`
+
+2. 下载PaddleX+PaddleInference预测库:
+> 可查看文档: 
+
+- [基于PaddleInference的推理-Jetson环境编译](../compile/paddle/jetson.md)
+
+3. 查看cmake版本: `cmake -version`
+
+> 保证版本大于3.5即可,如小于,可尝试安装`cmake3.10.2`。
+
+**Qt Creator的启动,可以通过`应用搜索`,输入`QT`,在出现的应用图标中选中Qt Creator,点击即可启动!**
+
+--------
+
+## 2 配置编译脚本<a id="## 2 配置编译脚本"/>
+
+> 如果已经查看过`Jetson环境编译`文档,并已成功编译出可执行程序,可跳过该部分关于`deploy/cpp/scripts/jetson_build.sh`、`deploy/cpp/CMakeLists.txt`以及`deploy/cpp/cmake/yaml.cmake`的修改说明.
+
+> 以下3部分的修改完全同[基于PaddleInference的推理-Jetson环境编译](../compile/paddle/jetson.md)一样,可前往参阅。
+
+### 2.1 修改`jetson_build.sh`编译参数<a id="### 2.1 修改`jetson_build.sh`编译参数"/>
+
+根据自己的系统环境,修改`PaddleX/deploy/cpp/scripts/jetson_build.sh`脚本中的参数,主要修改的参数为以下几个
+| 参数          | 说明                                                                                 |
+| :------------ | :----------------------------------------------------------------------------------- |
+| WITH_GPU      | ON或OFF,表示是否使用GPU,当下载的为CPU预测库时,设为OFF                             |
+| PADDLE_DIR    | 预测库所在路径,默认为`PaddleX/deploy/cpp/paddle_inference`目录下                    |
+| CUDA_LIB      | cuda相关lib文件所在的目录路径 -- 请注意jetson预装的cuda所在路径(如:/usr/local/cuda/lib64) |
+| CUDNN_LIB     | cudnn相关lib文件所在的目录路径 -- 请注意jetson预装的cudnn所在路径(如:/usr/lib/aarch64-linux-gnu)    |
+| WITH_PADDLE_TENSORRT | ON或OFF,表示是否开启TensorRT                                                    |
+| TENSORRT_DIR  | TensorRT 的路径,如果开启TensorRT开关WITH_PADDLE_TENSORRT,需修改为您实际安装的TensorRT路径     |
+| WITH_ENCRYPTION      | ON或OFF,表示是否开启加密模块                             |
+| OPENSSL_DIR    | OPENSSL所在路径,解密所需。默认为`PaddleX/deploy/cpp/deps/openssl-1.1.0k`目录下        |
+
+> **要注意相关参数路径不要有误——特别是CUDA_LIB以及CUDNN_LIB,如果需要开启TensorRT,也需指定正确的TENSORRT_DIR路径。**
+
+<div>
+  <img src="./images/deploy_build_sh.png">
+  </div>
+
+> 不需要添加opencv路径,在jetson中编译可直接使用环境本身预装的opencv进行deploy编译——具体配置在Step4中。
+
+### 2.2 修改`CMakeList.txt`参数<a id="### 2.2 修改`CMakeList.txt`参数"/>
+
+> 该修改仅适合Jetson系统的部署编译。
+
+根据自己的系统环境,修改`PaddleX/deploy/cpp/CMakeLists.txt`脚本中的参数,主要修改的参数为以下几个:位于其中注释`#OPENCV`之后的部分
+| 参数          | 说明                                                                                 |
+| :------------ | :----------------------------------------------------------------------------------- |
+| set(OpenCV_INCLUDE_DIRS "/usr/include/opencv")      | 配置Jetson预置opencv的include路径    |
+| file(GLOB OpenCV_LIBS /usr/lib/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX})    | 配置opencv动态链接库*.so    |
+
+替换具体如下:(xavier为例)
+
+1. /usr/include/opencv --> /usr/include/opencv4
+  > 具体路径,以部署环境中opencv的include路径为准。
+  > opencv4 中包含: opencv, opencv2
+
+2. /usr/lib/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX} --> <opencv动态库所在目录>/libopencv_*${CMAKE_SHARED_LIBRARY_SUFFIX}
+  > 具体路径,以部署环境中opencv的*.so所在路径为准,主要修改`libopencv_`前的目录(如xavier上常见为`/usr/lib/aarch64-linux-gnu/`)。
+
+<div>
+  <img src="./images/cmakelist_set.png">
+  </div>
+  
+### 2.3 修改`yaml.cmake`参数<a id="### 2.3 修改`yaml.cmake`参数"/>
+
+由于Jetson环境下编译还需要yaml,所以这里需要手动下载yaml包,保证编译的正常运行。
+
+> 1. 点击[下载yaml依赖包](https://bj.bcebos.com/paddlex/deploy/deps/yaml-cpp.zip),无需解压
+> 2. 修改`PaddleX/deploy/cpp/cmake/yaml.cmake`文件,将`URL https://bj.bcebos.com/paddlex/deploy/deps/yaml-cpp.zip`中网址替换为第1步中下载的路径,如改为`URL /Users/Download/yaml-cpp.zip`
+
+**为确保正常使用,yaml包的存放路径最好保证为全英文路径**
+eg:
+
+<div>
+  <img src="./images/yaml_cmakelist.png">
+  </div>
+  
+> 其它支持的加密操作以及TensorRT,可参考[Linux环境编译指南](../compile/paddle/linux.md).
+
+-------
+
+## 3 代码编译(生成模型预测的动态链接库)<a id="## 3 代码编译(生成模型预测的动态链接库)"/>
+
+该部分需要修改两个地方,以保证动态链接库的正常生成。
+
+> 接下来的操作,请在执行以下命令正常生成可执行程序后再往下继续配置,以确保修改前的工作是正确可执行的。
+> 
+```
+sh scripts/jetson_build.sh
+```
+
+> 编译时,如果存在cmake多线程问题——请前往`jetson_build.sh`末尾,将`make -j8`改为`make`或者小于8.
+>
+> 编译后会在`PaddleX/deploy/cpp/build/demo`目录下生成`model_infer`、`multi_gpu_model_infer`和`batch_infer`等几个可执行二进制文件.
+
+### 3.1 修改`model_infer.cpp`文件<a id="### 3.1 修改`model_infer.cpp`文件"/>
+
+接下来我们需要在QT中调用动态链接库,而原有的`deploy/cpp/demo/model_infer.cpp`生成的是可执行文件,无法被QT调用,因此需要对其进行一定的修改。
+
+现已将修改后满足需求的文件放于本文档所在目录,即——**用本目录下的`model_infer.cpp`替换`deploy/cpp/demo/model_infer.cpp`即可。**
+
+- 其中主要开辟了多个共享接口函数,分别对应:模型初始化,模型推理(用于推理PaddleDetection、PaddleSeg、PaddleClas、PaddleX产出的部署模型),模型销毁。
+
+> 该model_infer.cpp仅支持使用一个模型——用于单线程调用
+>
+> 如需多线程推理,同时创建多个模型用于预测推理,可以参考该项目下cpp的实现: [PaddleDeploy在C#端的多线程推理demo](https://github.com/cjh3020889729/PaddleDeploy-CSharp-ManyThreadInfer-Demo)
+
+- 模型初始化接口: `InitModel(const char* model_type, const char* model_filename, const char* params_filename, const char* cfg_file, bool use_gpu, int gpu_id, char* paddlex_model_type)`
+
+- 目标检测推理接口: `Det_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* output, int* nBoxesNum, char* LabelList)`
+
+- 语义分割推理接口: `Seg_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, unsigned char* output)`
+
+- 图像识别推理接口: `Cls_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* score, char* category, int* category_id)`
+
+- 实例分割推理接口: `Mask_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* box_output, unsigned char* mask_output, int* nBoxesNum, char* LabelList)`
+
+- 模型销毁接口: `DestructModel()`
+
+> 详细说明,请查看[`model_infer.cpp`](./model_infer.cpp)的实现.
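+
+下面给出一个直接调用这几个导出接口的最小示意(非QT环境;模型路径、图片路径与缓冲区大小均为示例,`Det_ModelPredict`的输出按每个框6个float排列:类别id、置信度、x1、y1、x2、y2):
+
+```cpp
+#include <cstdio>
+#include <vector>
+#include <opencv2/opencv.hpp>
+
+// 与model_infer.cpp中导出的C接口保持一致的函数声明
+extern "C" void InitModel(const char* model_type, const char* model_filename,
+                          const char* params_filename, const char* cfg_file,
+                          bool use_gpu, int gpu_id, char* paddlex_model_type);
+extern "C" void Det_ModelPredict(const unsigned char* img, int nWidth, int nHeight,
+                                 int nChannel, float* output, int* nBoxesNum, char* LabelList);
+extern "C" void DestructModel();
+
+int main() {
+    char paddlex_type[10] = "";
+    // 初始化检测模型(模型路径仅为示例)
+    InitModel("det", "inference_model/model.pdmodel", "inference_model/model.pdiparams",
+              "inference_model/infer_cfg.yml", true, 0, paddlex_type);
+
+    cv::Mat img = cv::imread("test.jpg");      // BGR三通道图片(路径仅为示例)
+    std::vector<float> boxes(6 * 100, 0.f);    // 每个框6个float: 类别id, 置信度, x1, y1, x2, y2
+    char labels[1024] = {0};                   // 空格分隔的类别名(缓冲区大小仅为示例)
+    int box_num = 0;
+    Det_ModelPredict(img.data, img.cols, img.rows, img.channels(),
+                     boxes.data(), &box_num, labels);
+
+    for (int i = 0; i < box_num; ++i) {
+        std::printf("box %d: class=%d score=%.3f rect=[%.1f, %.1f, %.1f, %.1f]\n",
+                    i, static_cast<int>(boxes[i * 6 + 0]), boxes[i * 6 + 1],
+                    boxes[i * 6 + 2], boxes[i * 6 + 3], boxes[i * 6 + 4], boxes[i * 6 + 5]);
+    }
+    DestructModel();  // 使用完毕后销毁模型
+    return 0;
+}
+```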
+
+
+### 3.2 修改`CMakeList.txt`文件<a id="### 3.2 修改`CMakeList.txt`文件"/>
+
+完成以上`model_infer.cpp`的修改后,需要修改当前目录下的`CMakeLists.txt`文件,位于: `deploy/cpp/demo/CMakeLists.txt`.
+
+需要修改的内容,如图所示:
+
+<div>
+  <img src="./images/infer_demo_cmakelist.png">
+  </div>
+
+用`ADD_library`那一行替换图中下划线标出的那一行即可,也可直接使用本项目提供的`CMakeLists.txt`替换`deploy/cpp/demo/CMakeLists.txt`.
+
+### 3.3 执行`jetson_build.sh`编译<a id="### 3.3 执行`jetson_build.sh`编译"/>
+
+运行以下命令行, 编译完成即可获得所需推理接口的动态链接库(libmodel_infer.so)
+
+```
+sh scripts/jetson_build.sh
+```
+
+运行完成,会在`PaddleX/deploy/cpp/build/lib`中生成`libmodel_infer.so`动态链接库。
+
+## 4 启动并配置QT工程(移植流程)<a id="## 4 启动并配置QT工程(移植流程)"/>
+
+以下步骤请确保QT安装完成,且可以正常启动QT桌面项目。
+
+### 4.1 启动QT工程项目<a id="### 4.1 启动QT工程项目"/>
+
+1. 首先打开`Qt Creator`,新建项目,取名为`Deploy_infer`,选择项目路径为自己所熟悉的路径即可,然后一路保持默认设置继续往下,直到`finished`。<div>
+  <img src="./images/qt_start_new_pro.png">
+  </div><div>
+  <img src="./images/qt_create_guipro.png">
+  </div><div>
+  <img src="./images/qt_set_proname.png">
+  </div>
+  
+> 如果存在多个编译环境供QT使用,请确保QT使用的编译环境与前边生成动态链接库的编译环境一致——或者至少保证生成的所有库都允许相互调用,即避免出现32位与64位不兼容等情况。
+
+2. 进入QT崭新的工程项目后,工程项目中会存在`*.pro`,`Headers`,`Sources`,`Forms`四个主要组件,其中后三个为项目分支(目录)<div>
+  <img src="./images/project_list.png">
+  </div>
+  
+3. 右键点击项目,选择`Add New`, 进入子界面选择`C++`,选中`Class`, 点击`Choose`.<div>
+  <img src="./images/qt_add_newfile.png">
+  </div>
+  
+4. 在新出来的子界面中,输入`InferThread`作为**Class name**, 然后一直往下生成即可.<div>
+  <img src="./images/qt_create_class.png">
+  </div>
+  
+5. 将本项目`Deploy_infer`中的`inferthread.cpp`与`inferthread.h`中的内容分别复制过去即可.
+6. 然后,再将本项目`Deploy_infer`中的`mainwindow.cpp`与`mainwindow.h`中的内容也复制过去.
+7. 最后,将本项目的`mainwindow.ui`替换新建的QT-GUI项目的ui文件.
+
+> 此时,QT项目的移植就完成了——之所以新建项目,看起来比较复杂,是为了避免直接移植导致的QT版本不匹配,发生一些意料之外的问题。
+> 
+> 此时QT项目中,会出现标红的错误,原因可能如下:
+- 1. 还未导入动态链接库
+- 2. 还未导入opencv的编译好的库
+- 因此,现在暂时不用担心标红的问题
+
+### 4.2 载入动态链接库<a id="### 4.2 载入动态链接库"/>
+
+在创建的QT项目文件夹下新建`infer_lib`文件夹,将生成的`libmodel_infer.so`文件移入其中即可。
+<div>
+  <img src="./images/qt_project_inferlib.png">
+  </div>
+  
+**然后打开`Qt Creator`,打开并选择本QT项目启动**,可观察到QT工程目录结构如图所示:
+<div>
+  <img src="./images/project_list.png">
+  </div>
+
+双击`mainwindow.cpp`进入,**确保第一个函数中,导入的动态链接库绝对路径无误!**
+<div>
+  <img src="./images/dong_tai_lianjieku.png">
+  </div>
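+
+这一步载入动态链接库的简化示意如下(路径与函数名均为示例,实际以本项目`mainwindow.cpp`中的实现和你本机`libmodel_infer.so`的绝对路径为准):
+
+```cpp
+#include <QLibrary>
+#include <QDebug>
+
+// 示意: 按绝对路径载入libmodel_infer.so(路径需替换为自己的实际路径)
+bool loadInferLibrary()
+{
+    static QLibrary inferLib("/home/nvidia/Deploy_infer/infer_lib/libmodel_infer.so");
+    if (!inferLib.load()) {
+        // 路径有误或缺少依赖库时会进入这里
+        qDebug() << "load libmodel_infer.so failed:" << inferLib.errorString();
+        return false;
+    }
+    qDebug() << "load libmodel_infer.so success";
+    return true;
+}
+```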
+
+> 此时QT项目中,会出现标红的错误,原因可能如下:
+- 1. 还未导入opencv的编译好的库
+- 因此,现在暂时不用担心标红的问题
+
+### 4.3 配置QT的Opencv路径<a id="### 4.3 配置QT的Opencv路径"/>
+
+> 请保证编译opencv的编译环境与当前QT使用的编译环境一致,否则可能无法导入opencv中的函数.
+
+该部分主要配置两项内容: `Include Path` 和 `Lib Path`.
+
+双击`Deploy_infer.pro`进入,按照如图所示写入当前环境下系统原装预编译的`Opencv路径`:
+
+<div>
+  <img src="./images/pro_set_libpath.png">
+  </div>
+
+- `INCLUDEPATH` : 表示`opencv`的头文件所在路径,包含头文件根目录,`opencv`子目录以及`opencv2`子目录
+- `LIBS` : 表示`opencv`的动态链接库`so文件`所在路径,这里使用`正则匹配`,自动匹配路径下的所有`opencv.so`文件
+
+> 在本测试Jetson环境上,**预编译opencv由于没有同ffmpeg一同编译**,因此不支持视频流的处理与预测(无法打开视频文件,如mp4,avi等)
+> 
+> 如有需要可在此时另外编译opencv,不覆盖原预编译的opencv版本,仅用于QT程序进行图像/视频的读取和简单处理。
+> 
+> 此时编译的新opencv,在编译时要选择编译参数使其支持QT、GL以及ffmpeg.(ffmpeg可能需要自行编译)
+> 
+> 该方案的编译指导,可参考网上的`linux下opencv与ffmpeg联合编译`资料。
+>
+> 因此,本QT的Demo在`Jetson Xavier`上,使用`原生opencv`,仅支持图片以及连续图片的预测,视频预测需要自行编译新的opencv,以支持视频读取——只要opencv编译成功,以上编译动态链接库的`opencv路径`仅需相应修改即可(**记得更改后重新编译生成**),同时修改QT导入的`opencv库路径`就可以正常使用该可视化Demo了。
+
+> 此时QT项目中,标红的错误应该会消失——可以稍微等一下QT项目更新项目缓存,记得保存修改!
+
+**PS:**
+
+在运行项目时,请注意,如果报`cv::xxx`未定义,可能为以下情况:
+- 1. opencv头文件引入有误,当前采用导入的形式,而非使用系统路径,因此,应该保证引入符号为`""`,而非`<>`.
+
+<div>
+  <img src="./images/import_opencv.png">
+  </div>
+
+- 2. 如果出现导入pro中的`LIBS`有误,显示没有该文件(No Such File or Path)等报错,请查看opencv动态链接库是否配置正确.
+- 3. 当前测试中,所有路径均为`英文路径`,因此在使用`中文路径`时如遇报错,请换成`英文路径`重新导入.
+- 4. 以上情况外,还可能是QT编译环境与opencv编译的环境不同所导致,QT无法引用opencv的动态链接库中的内容.
+
+> `Jetson`下仅测试了英文路径,`Windows`上中英文路径均无误。
+
+
+## 5 启动QT可视化界面<a id="## 5 启动QT可视化界面"/>
+
+启动后界面如下:
+<div>
+  <img src="./images/show_menu.png">
+  </div>
+
+### 5.1 功能介绍<a id="### 5.1 功能介绍"/>
+
+- 1.可加载PaddleSeg, PaddleClas, PaddleDetection以及PaddleX导出的部署模型, 分别对应模型选择中的: seg, clas, det, paddlex
+- 2.目前也支持`GPU`下加载MaskRCNN进行实例分割可视化推理,需选择模型: mask
+- 3.支持CPU与GPU推理,同时支持指定GPU运行 —— 当前在单卡上测试默认为0运行正常,非法指定不存在的id无法初始化模型;且可能引发异常导致程序崩溃
+- 4.支持单张图片(png, jpg)、图片文件夹、**视频流(mp4)推理(Jetson原装opencv下不支持,Windows测试无误)**
+- 5.支持目标检测时,设定检测结果显示阈值
+- 6.支持图片文件夹推理(即连续图片推理)时,设定连续推理间隔,方便观察预测效果
+- 7.支持推理中断:图片文件夹推理过程+视频流推理过程
+
+### 5.2 使用说明<a id="### 5.2 使用说明"/>
+
+- 1.选择模型类型:det、seg、clas、mask、paddlex
+- 2.选择运行环境:CPU、GPU
+- 3.点击初始化模型,选择模型文件夹即可 —— 文件夹格式如下
+   - inference_model
+       - *.yml
+       - *.pdmodel
+       - *.pdiparams
+       - paddlex的模型含有两个yml,其余套件导出只有一个yml/yaml
+- 4.加载图片/图片文件夹/视频流
+- 5.模型推理
+- 6.(非单张推理时支持)执行提前推理中断
+- 7.加载新模型,需要先点击销毁模型,然后再设置模型类型以及运行环境,最后重新初始化新模型
+- 8.在目标检测过程中,可设置检测阈值
+- 9.在文件夹推理过程中,可设置连续推理间隔时间
+- 10.可通过左上角的实时推理耗时,查看模型预处理、推理与后处理的耗时**(Jetson上,CPU推理过慢,建议直接使用GPU推理)**
+- 11.可编辑GPU_id,设置初始化时模型运行在指定GPU上——请根据实际硬件设置,默认为0
+
+### 5.3 使用效果展示<a id="### 5.3 使用效果展示"/>
+
+**CPU推理**
+<div>
+  <img src="./images/cpu_infer.png">
+  </div>
+
+**GPU推理**
+<div>
+  <img src="./images/gpu_infer.png">
+  </div>
+  
+-----
+  
+## 6 QT开发注解<a id="## 6 QT开发注解"/>
+
+> 一些方便大家修改Demo界面源码,以实现一些适配工作。
+
+### 6.1 利用`QDebug`实现运行日志输出<a id="### 6.1 利用`QDebug`实现运行日志输出"/>
+
+首先在mainwindow.cpp中加入`#include <QDebug>`,以便在QT程序运行过程中输出人为添加的日志。
+
+实例代码:
+<div>
+  <img src="./images/debug_incode.png">
+  </div>
+
+输出如下:
+<div>
+  <img src="./images/output_info.png">
+  </div>
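+
+一个qDebug日志输出的最小示意如下(函数名与日志内容均为示例):
+
+```cpp
+#include <QDebug>
+
+// 在需要打日志的位置直接使用qDebug()即可
+void logModelInit(bool use_gpu, int gpu_id)
+{
+    qDebug() << "start init model, use_gpu =" << use_gpu << ", gpu_id =" << gpu_id;
+    // ... 这里执行InitModel等实际逻辑 ...
+    qDebug() << "init model done";
+}
+```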
+
+### 6.2 本项目的组织结构<a id="### 6.2 本项目的组织结构"/>
+
+工程目录:
+<div>
+  <img src="./images/project_list.png">
+  </div>
+
+- inferthread.cpp/inferthread.h :`推理子线程` —— 包含具体的推理执行函数的实现,以及运行控件状态信号的发出(比如,更新图片显示,与控件的使能与关闭;其最小结构可参考下方的示意代码)
+- mainwindow.cpp/mainwindow.h :`主线程` —— 包含界面布局控制的实现,推理文件加载(也包含动态链接库)的实现,以及推理线程的启动与中止信号的发出(比如,启动推理线程)
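+
+推理子线程的一个极简结构示意如下(类成员与信号名均为示例,实际请以本项目的`inferthread.h`为准):
+
+```cpp
+#include <QThread>
+#include <QImage>
+
+// 极简示意: 子线程只负责执行推理, 通过信号把结果与控件状态交给主线程处理
+class InferThread : public QThread
+{
+    Q_OBJECT
+public:
+    explicit InferThread(QObject *parent = nullptr) : QThread(parent) {}
+
+signals:
+    void showResultImage(const QImage &img);   // 请求主线程更新结果图片
+    void setButtonsEnabled(bool enabled);      // 请求主线程切换控件使能状态
+
+protected:
+    void run() override
+    {
+        emit setButtonsEnabled(false);
+        // ... 在这里执行图片/文件夹/视频流的推理循环 ...
+        emit setButtonsEnabled(true);
+    }
+};
+```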
+
+### 6.3 控制读取图片的推理大小<a id="### 6.3 控制读取图片的推理大小"/>
+
+由于QT可视化界面可能用于不同的系统,可分配的运行内存也有所不同,因此在推理以及显示前对读取的图片进行一定的缩放,能够减少内存的消耗。
+
+> 不过,显示时要保证图像宽高均可被4整除,否则控件图片显示可能有误。
+
+每一个具体推理的函数里,都有以下操作,对推理前的图片进行缩放:
+<div>
+  <img src="./images/tupian_suofang.png">
+  </div>
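+
+上述缩放的一个示意写法如下(最长边阈值与函数名均为示例,实际以各推理函数中的实现为准):
+
+```cpp
+#include <algorithm>
+#include <opencv2/opencv.hpp>
+
+// 示意: 将过大的图片按比例缩小, 并把宽高对齐到4的整数倍, 便于后续控件显示
+cv::Mat resizeForInfer(const cv::Mat &src, int max_side = 1024)
+{
+    cv::Mat dst = src;
+    int long_side = std::max(src.cols, src.rows);
+    if (long_side > max_side) {
+        double scale = static_cast<double>(max_side) / long_side;
+        cv::resize(src, dst, cv::Size(), scale, scale);
+    }
+    int w = std::max(4, dst.cols - dst.cols % 4);  // 宽高均向下取到4的整数倍
+    int h = std::max(4, dst.rows - dst.rows % 4);
+    cv::resize(dst, dst, cv::Size(w, h));
+    return dst;
+}
+```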
+
+### 6.4 子线程控制主线程控件的建议<a id="### 6.4 子线程控制主线程控件的建议"/>
+
+不要直接使用子线程对主线程控件进行控制,避免导致线程报错,线程问题不易debug——因此,多用信号与槽来实现交互。
+
+实现思路: 
+1. 在子线程需要对主线程中控件进行控制时,发送一个信号
+2. 主线程在消息循环机制中持续运行时,接收信号,执行对应的槽,实现控件的控制
+
+具体实现流程:
+1. 在子线程的.h文件中,编写函数声明即可——不用具体实现。
+2. 在主线程的.h文件中,先编写槽函数的声明,然后在.cpp中去实现它,完成信号的传递。
+3. 在编写好槽函数定义后,进行connect完成槽函数与信号的链接。
+
+**PS:**
+子线程信号声明如下:
+<div>
+  <img src="./images/thread_signal.png">
+  </div>
+  
+主线程槽声明如下:
+<div>
+  <img src="./images/main_slot.png">
+  </div>
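+
+对应的connect写法示意如下(线程成员、信号与槽的名字均为示例,实际以`mainwindow.cpp`中的connect为准):
+
+```cpp
+// 示意: 在MainWindow构造函数中连接子线程信号与主线程槽
+MainWindow::MainWindow(QWidget *parent)
+    : QMainWindow(parent), infer_thread(new InferThread(this))
+{
+    // 子线程请求更新结果图片 -> 主线程槽刷新labelImage控件
+    connect(infer_thread, &InferThread::showResultImage,
+            this, &MainWindow::onShowResultImage);
+    // 子线程请求切换控件使能 -> 主线程槽统一设置按钮状态
+    connect(infer_thread, &InferThread::setButtonsEnabled,
+            this, &MainWindow::onSetButtonsEnabled);
+}
+```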
+
+### 6.5 修改model_infer.cpp函数后的动态链接库导入指导<a id="### 6.5 修改model_infer.cpp函数后的动态链接库导入指导"/>
+
+1.进入`inferthread.h`与`mainwindow.h`中,使用`typedef定义函数指针`,按照`函数参数格式`进行定义。
+
+即:model_infer.cpp中函数怎么定义的,这边就构建一个相同定义支持的函数指针。
+
+2.`inferthread.h`中的定义展示:
+<div>
+  <img src="./images/qt_inferthreadcpp_set.png">
+  </div>
+
+3.`mainwindow.h`中的定义展示:
+<div>
+  <img src="./images/qt_mainwindowset.png">
+  </div>
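+
+以`InitModel`、`Det_ModelPredict`与`DestructModel`为例,typedef的写法示意如下(类型别名为示例,参数需与model_infer.cpp中的导出函数完全一致):
+
+```cpp
+// 与model_infer.cpp中导出函数签名一致的函数指针类型
+typedef void (*InitModelFunc)(const char* model_type, const char* model_filename,
+                              const char* params_filename, const char* cfg_file,
+                              bool use_gpu, int gpu_id, char* paddlex_model_type);
+typedef void (*DetPredictFunc)(const unsigned char* img, int nWidth, int nHeight,
+                               int nChannel, float* output, int* nBoxesNum, char* LabelList);
+typedef void (*DestructModelFunc)();
+```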
+
+### 6.6 QT动态链接库的导入说明<a id="### 6.6 QT动态链接库的导入说明"/>
+
+本项目只需要导入一个自己定义的动态链接库,因此使用QLibrary进行导入;该方法不太适合导入多个动态库,否则代码量增多,容易混淆。
+
+该项目的导入如下:
+<div>
+  <img src="./images/qt_importppipeline.png">
+  </div>
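+
+配合上一节的typedef,用QLibrary解析导出函数的示意写法如下(函数名与变量名均为示例):
+
+```cpp
+#include <QLibrary>
+#include <QDebug>
+
+// 示意: 从libmodel_infer.so中解析出各导出函数(类型别名沿用上一节示例)
+bool resolveInferFunctions(QLibrary &lib,
+                           InitModelFunc &initModel,
+                           DetPredictFunc &detPredict,
+                           DestructModelFunc &destructModel)
+{
+    initModel     = (InitModelFunc)lib.resolve("InitModel");
+    detPredict    = (DetPredictFunc)lib.resolve("Det_ModelPredict");
+    destructModel = (DestructModelFunc)lib.resolve("DestructModel");
+    if (!initModel || !detPredict || !destructModel) {
+        qDebug() << "resolve functions failed:" << lib.errorString();
+        return false;
+    }
+    return true;
+}
+```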
+
+### 6.7 移植小贴士<a id="### 6.7 移植小贴士"/>
+
+1. 运行环境改变时,注意运行环境对编译库的影响,配置正确的opencv路径以及编译器版本
+
+   - 对于linux,QT默认使用系统gcc进行编译,不用选择编译器
+   - 对于windows,QT使用MinGW或者MSVC等,此时需要注意QT使用的编译器与opencv、model_infer动态链接库等的编译器是否一致,或者是否可以兼容.
+
+2. 为避免QT版本移植问题,先创建一个新的项目,然后拷贝已有的旧项目中的ui、cpp与h文件到新项目,同时将pro文件中的配置修改一致,即可完成跨版本跨平台的移植。
+
+
+## 本项目的Demo的GUI也同时支持Linux、Windows上进行使用,但请自行编译好opencv,安装QT,移植流程区别如下。<a id="## 本项目的Demo的GUI也同时支持Linux、Windows上进行使用,但请自行编译好opencv,安装QT,移植流程区别如下。"/>
+
+> 注意deploy编译所需的流程,可参考PaddleX模型推理部署deploy在Linux以及Windows上的[编译指南](https://github.com/cjh3020889729/PaddleX/tree/develop/deploy/cpp).
+
+> 注意区分不同平台上动态链接库的命名区别: windows下通常为 *.dll(MinGW编译时还会生成 *.a 导入库), linux/jetson下为 *.so
+
+- Windows平台移植测试完成 -- 自行编译opencv3.4.6/4.1.1,需保证支持QT,GL
+  - windows上编译opencv可参考: [为qt编译opencv](http://159.138.37.243/article/z634863434/89950961)
+  - 如cmake的configure中出现红字,说找不到ffmpeg相关包,属于网络问题,无法下载该相关dll,需要自行下载后进行相关处理,可参考: [ffmpeg下载失败处理方法](https://www.cxyzjd.com/article/pyt1234567890/106525475)
+- Jetson Xavier平台移植测试完成 -- 预编译opencv4.1.1,已支持QT和GL
+- Linux平台移植界面测试完成 -- opencv以及模型推理所需的动态链接库,可按照该项目的`CMakeLists.txt`与`model_infer.cpp`替换原文件,然后按照[linux编译方法](../compile/paddle/linux.md)进行编译自行生成。
+
+

BIN
deploy/cpp/docs/jetson-deploy/images/cmakelist_set.png


BIN
deploy/cpp/docs/jetson-deploy/images/cpu_infer.png


BIN
deploy/cpp/docs/jetson-deploy/images/debug_incode.png


BIN
deploy/cpp/docs/jetson-deploy/images/deploy_build_sh.png


BIN
deploy/cpp/docs/jetson-deploy/images/dong_tai_lianjieku.png


BIN
deploy/cpp/docs/jetson-deploy/images/gpu_infer.png


BIN
deploy/cpp/docs/jetson-deploy/images/import_opencv.png


BIN
deploy/cpp/docs/jetson-deploy/images/infer_demo_cmakelist.png


BIN
deploy/cpp/docs/jetson-deploy/images/main_slot.png


BIN
deploy/cpp/docs/jetson-deploy/images/output_info.png


BIN
deploy/cpp/docs/jetson-deploy/images/pro_set_libpath.png


BIN
deploy/cpp/docs/jetson-deploy/images/project_list.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_add_newfile.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_create_class.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_create_guipro.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_importppipeline.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_inferthreadcpp_set.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_mainwindowset.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_project_inferlib.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_set_proname.png


BIN
deploy/cpp/docs/jetson-deploy/images/qt_start_new_pro.png


BIN
deploy/cpp/docs/jetson-deploy/images/show_menu.png


BIN
deploy/cpp/docs/jetson-deploy/images/thread_import_opencv.png


BIN
deploy/cpp/docs/jetson-deploy/images/thread_signal.png


BIN
deploy/cpp/docs/jetson-deploy/images/tupian_suofang.png


BIN
deploy/cpp/docs/jetson-deploy/images/yaml_cmakelist.png


+ 312 - 0
deploy/cpp/docs/jetson-deploy/model_infer.cpp

@@ -0,0 +1,312 @@
+#include <string>
+#include <vector>
+
+#include "model_deploy/common/include/paddle_deploy.h"
+
+// Global model pointer
+PaddleDeploy::Model* model;
+
+/*
+* Model initialization / registration API
+* 
+* model_type: det,seg,clas,paddlex
+* 
+* model_filename: Model file path
+* 
+* params_filename: Parameter file path
+* 
+* cfg_file: Configuration file path
+* 
+* use_gpu: Whether to use GPU
+* 
+* gpu_id: Specify GPU x
+* 
+* paddlex_model_type: When model_type is paddlex, returns the actual PaddleX model type - det, seg, clas
+* 
+*/
+extern "C" void InitModel(const char* model_type, const char* model_filename, const char* params_filename, const char* cfg_file, bool use_gpu, int gpu_id, char* paddlex_model_type)
+{
+	// create model
+	model = PaddleDeploy::CreateModel(model_type);  //FLAGS_model_type
+
+	// model init
+	model->Init(cfg_file);
+
+	// inference engine init
+	PaddleDeploy::PaddleEngineConfig engine_config;
+	engine_config.model_filename = model_filename;
+	engine_config.params_filename = params_filename;
+	engine_config.use_gpu = use_gpu;
+	engine_config.gpu_id = gpu_id;
+	bool init = model->PaddleEngineInit(engine_config);
+	if (init)
+	{
+		std::cout << "init model success" << std::endl;
+	}
+
+	// det, seg, clas, paddlex
+	if (strcmp(model_type, "paddlex") == 0) // If it is a PADDLEX model, return the specifically supported model type: det, seg, clas
+	{
+		// detector
+		if (model->yaml_config_["model_type"].as<std::string>() == std::string("detector"))
+		{
+			strcpy(paddlex_model_type, "det");
+		}
+		else if (model->yaml_config_["model_type"].as<std::string>() == std::string("segmenter"))
+		{
+			strcpy(paddlex_model_type, "seg");
+		}
+		else if (model->yaml_config_["model_type"].as<std::string>() == std::string("classifier"))
+		{
+			strcpy(paddlex_model_type, "clas");
+		}
+	}
+} 
+
+
+/*
+* Detection inference API
+* 
+* img: input for predicting.
+*
+* nWidth: width of img.
+*
+* nHeight: height of img.
+*
+* nChannel: channel of img.
+*
+* output: result of predict, includes category_id, score, coordinate.
+*
+* nBoxesNum: number of boxes
+*
+* LabelList: label list of result
+* 
+* extern "C"
+*/
+extern "C" void Det_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* output, int* nBoxesNum, char* LabelList)
+{
+	// prepare data
+	std::vector<cv::Mat> imgs;
+
+	int nType = 0;
+	if (nChannel == 3)
+	{
+		nType = CV_8UC3;
+	}
+	else
+	{
+		std::cout << "Only support 3 channel image." << std::endl;
+		return;
+	}
+
+	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
+	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
+	imgs.push_back(std::move(input));
+
+	// predict
+	std::vector<PaddleDeploy::Result> results;
+	model->Predict(imgs, &results, 1);
+
+	// nBoxesNum[0] = results.size();  // results.size() is returning batch_size
+	nBoxesNum[0] = results[0].det_result->boxes.size();  // Get the predicted Bounding Box number of a single image
+	std::string label = "";
+	//std::cout << "res: " << results[num] << std::endl;
+	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // Get the data for all the boxes
+	{
+		label = label + results[0].det_result->boxes[i].category + " ";
+		// labelindex
+		output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // Category ID
+		// score
+		output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // Score
+		//// box
+		output[i * 6 + 2] = results[0].det_result->boxes[i].coordinate[0]; // x1, y1, x2, y2
+		output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // Upper left and lower right vertices
+		output[i * 6 + 4] = results[0].det_result->boxes[i].coordinate[2];
+		output[i * 6 + 5] = results[0].det_result->boxes[i].coordinate[3];
+	}
+	memcpy(LabelList, label.c_str(), strlen(label.c_str()));
+}
+
+
+/*
+* Segmentation inference API
+* 
+* img: input for predicting.
+*
+* nWidth: width of img.
+*
+* nHeight: height of img.
+*
+* nChannel: channel of img.
+*
+* output: result of predict, includes label_map
+* 
+* extern "C"
+*/
+extern "C" void Seg_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, unsigned char* output)
+{
+	// prepare data
+	std::vector<cv::Mat> imgs;
+
+	int nType = 0;
+	if (nChannel == 3)
+	{
+		nType = CV_8UC3;
+	}
+	else
+	{
+		std::cout << "Only support 3 channel image." << std::endl;
+		return;
+	}
+
+	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
+	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
+	imgs.push_back(std::move(input));
+
+	// predict
+	std::vector<PaddleDeploy::Result> results;
+	model->Predict(imgs, &results, 1);
+
+	std::vector<uint8_t> result_map = results[0].seg_result->label_map.data; // vector<uint8_t> -- Result Map
+	// Copy output result to the output back -- from vector<uint8_t> to unsigned char *
+	memcpy(output, &result_map[0], result_map.size() * sizeof(uchar));
+}
+
+
+/*
+* Recognition inference API
+* 
+* img: input for predicting.
+*
+* nWidth: width of img.
+*
+* nHeight: height of img.
+*
+* nChannel: channel of img.
+*
+* score: result of predict, includes score
+*
+* category: result of predict, includes category string
+*
+* category_id: result of predict, includes category_id
+* 
+* extern "C" 
+*/
+extern "C" void Cls_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* score, char* category, int* category_id)
+{
+	// prepare data
+	std::vector<cv::Mat> imgs;
+
+	int nType = 0;
+	if (nChannel == 3)
+	{
+		nType = CV_8UC3;
+	}
+	else
+	{
+		std::cout << "Only support 3 channel image." << std::endl;
+		return;
+	}
+
+	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
+	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
+	imgs.push_back(std::move(input));
+
+	// predict
+	std::vector<PaddleDeploy::Result> results;
+	model->Predict(imgs, &results, 1);
+
+	*category_id = results[0].clas_result->category_id;
+	// Copy output category result to output -- string --> char* 
+	memcpy(category, results[0].clas_result->category.c_str(), strlen(results[0].clas_result->category.c_str()));
+	// Copy output probability value
+	*score = results[0].clas_result->score;
+}	
+
+
+/*
+* MaskRCNN inference API
+* 
+* img: input for predicting.
+*
+* nWidth: width of img.
+*
+* nHeight: height of img.
+*
+* nChannel: channel of img.
+*
+* box_output: result of predict, includes label + score + bbox
+*
+* mask_output: result of predict, includes label_map
+*
+* nBoxesNum: result of predict, includes the number of boxes
+*
+* LabelList: result of predict, includes the label list
+* 
+* extern "C"
+*/
+extern "C" void Mask_ModelPredict(const unsigned char* img, int nWidth, int nHeight, int nChannel, float* box_output, unsigned char* mask_output, int* nBoxesNum, char* LabelList)
+{
+	// prepare data
+	std::vector<cv::Mat> imgs;
+
+	int nType = 0;
+	if (nChannel == 3)
+	{
+		nType = CV_8UC3;
+	}
+	else
+	{
+		std::cout << "Only support 3 channel image." << std::endl;
+		return;
+	}
+
+	cv::Mat input = cv::Mat::zeros(cv::Size(nWidth, nHeight), nType);
+	memcpy(input.data, img, nHeight * nWidth * nChannel * sizeof(uchar));
+	imgs.push_back(std::move(input));
+
+	// predict
+	std::vector<PaddleDeploy::Result> results;
+	model->Predict(imgs, &results, 1);
+
+	nBoxesNum[0] = results[0].det_result->boxes.size();  // Get the predicted Bounding Box number of a single image
+	std::string label = "";
+
+	for (int i = 0; i < results[0].det_result->boxes.size(); i++)  // Get the data for all the boxes
+	{
+		// prediction results
+		label = label + results[0].det_result->boxes[i].category + " ";
+		// labelindex
+		box_output[i * 6 + 0] = results[0].det_result->boxes[i].category_id; // Category ID
+		// score
+		box_output[i * 6 + 1] = results[0].det_result->boxes[i].score;  // Score
+		//// box
+		box_output[i * 6 + 2] = results[0].det_result->boxes[i].coordinate[0]; // x1, y1, x2, y2
+		box_output[i * 6 + 3] = results[0].det_result->boxes[i].coordinate[1]; // Upper left and lower right vertices
+		box_output[i * 6 + 4] = results[0].det_result->boxes[i].coordinate[2];
+		box_output[i * 6 + 5] = results[0].det_result->boxes[i].coordinate[3];
+		
+		// Mask prediction results
+		for (int j = 0; j < results[0].det_result->boxes[i].mask.data.size(); j++)
+		{
+			if (mask_output[j] == 0)
+			{
+				mask_output[j] = results[0].det_result->boxes[i].mask.data[j];
+			}
+		}
+
+	}
+	memcpy(LabelList, label.c_str(), strlen(label.c_str()));
+}
+
+
+/*
+* Model destruction API
+* 
+* extern "C" 
+*/
+extern "C" void DestructModel()
+{
+	delete model;
+	std::cout << "destruct model success" << std::endl;
+}

+ 3 - 3
deploy/cpp/docs/manufacture_sdk/README.md

@@ -1,6 +1,6 @@
 # PaddleClas模型部署
 
-当前支持PaddleClas release/2.1分支导出的模型进行部署。本文档以ResNet50模型为例,讲述从release-2.1分支导出模型并用PaddleX 进行cpp部署整个流程。 PaddleClas相关详细文档可以查看[官网文档](https://github.com/PaddlePaddle/PaddleClas/blob/release/2.1/README_cn.md)
+当前支持PaddleClas release/2.2分支导出的模型进行部署。本文档以ResNet50模型为例,讲述从release/2.2分支导出模型并用PaddleX 进行cpp部署整个流程。 PaddleClas相关详细文档可以查看[官网文档](https://github.com/PaddlePaddle/PaddleClas/blob/release/2.2/README_cn.md)
 
 
 
@@ -57,8 +57,8 @@ ResNet50
 参考编译文档
 
 - [Linux系统上编译指南](../compile/paddle/linux.md)
-- [Windows系统上编译指南](../compile/paddle/windows.md)
-
+- [Windows系统上编译指南(生成exe)](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成dll供C#调用)](../../../../examples/C%23_deploy/)
 
 
 ## 步骤三 模型预测

+ 3 - 2
deploy/cpp/docs/models/paddledetection.md

@@ -1,6 +1,6 @@
 # PaddleDetection模型部署
 
-当前支持PaddleDetection release/0.5和release/2.1分支导出的模型进行部署(仅支持FasterRCNN/MaskRCNN/PPYOLO/PPYOLOv2/YOLOv3)。PaddleDetection相关详细文档可以查看[官网文档](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.1)。
+当前支持PaddleDetection release/0.5和release/2.2分支导出的模型进行部署(仅支持FasterRCNN/MaskRCNN/PPYOLO/PPYOLOv2/YOLOv3)。PaddleDetection相关详细文档可以查看[官网文档](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.2)。
 
 下面主要以YoloV3为例,讲解从模型导出到部署的整个流程。
 
@@ -50,7 +50,8 @@ yolov3_darknet
 参考编译文档
 
 - [Linux系统上编译指南](../compile/paddle/linux.md)
-- [Windows系统上编译指南](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成exe)](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成dll供C#调用)](../../../../examples/C%23_deploy/)
 
 
 

+ 3 - 2
deploy/cpp/docs/models/paddleseg.md

@@ -1,6 +1,6 @@
 # PaddleSeg模型部署
 
-当前支持PaddleSeg release/2.1分支训练的模型进行导出及部署。本文档以[Deeplabv3P](https://github.com/PaddlePaddle/PaddleSeg/blob/release/v2.0/configs/deeplabv3p)模型为例,讲述从release-2.1版本导出模型并进行cpp部署整个流程。 PaddleSeg相关详细文档查看[官网文档](https://github.com/PaddlePaddle/PaddleSeg/blob/release/2.1/README_CN.md)
+当前支持PaddleSeg release/2.2分支训练的模型进行导出及部署。本文档以[Deeplabv3P](https://github.com/PaddlePaddle/PaddleSeg/blob/release/v2.2/configs/deeplabv3p)模型为例,讲述从release/2.2版本导出模型并进行cpp部署整个流程。 PaddleSeg相关详细文档查看[官网文档](https://github.com/PaddlePaddle/PaddleSeg/blob/release/2.2/README_CN.md)
 
 ## 步骤一 部署模型导出
 
@@ -39,7 +39,8 @@ output
 参考编译文档
 
 - [Linux系统上编译指南](../compile/paddle/linux.md)
-- [Windows系统上编译指南](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成exe)](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成dll供C#调用)](../../../../examples/C%23_deploy/)
 
 ## 步骤三 模型预测
 

+ 4 - 5
deploy/cpp/docs/models/paddlex.md

@@ -5,7 +5,7 @@
 
 ## 步骤一 部署模型导出
 
-请参考[PaddlX模型导出文档](https://github.com/PaddlePaddle/PaddleX/blob/develop/docs/apis/export_model.md)
+请参考[PaddleX模型导出文档](https://github.com/PaddlePaddle/PaddleX/blob/develop/docs/apis/export_model.md)
 
 
 ## 步骤二 编译
@@ -13,8 +13,8 @@
 参考编译文档
 
 - [Linux系统上编译指南](../compile/paddle/linux.md)
-- [Windows系统上编译指南](../compile/paddle/windows.md)
-
+- [Windows系统上编译指南(生成exe)](../compile/paddle/windows.md)
+- [Windows系统上编译指南(生成dll供C#调用)](../../../../examples/C%23_deploy/)
 
 ## 步骤三 模型预测
 
@@ -54,5 +54,4 @@ Classify(809    sunscreen   0.939211)
 
 - [单卡加载模型预测示例](../demo/model_infer.md)
 - [多卡加载模型预测示例](../demo/multi_gpu_model_infer.md)
-- [PaddleInference集成TensorRT加载模型预测示例](../../demo/tensorrt_infer.md)
-- [模型加密预测示例](./docs/demo/decrypt_infer.md)
+- [PaddleInference集成TensorRT加载模型预测示例](../demo/tensorrt_infer.md)

+ 39 - 0
deploy/cpp/scripts/jetson_build.sh

@@ -0,0 +1,39 @@
+# 是否使用GPU(即是否使用 CUDA)
+WITH_GPU=ON
+# 使用MKL or openblas
+WITH_MKL=OFF
+# 是否集成 TensorRT(仅WITH_GPU=ON 有效)
+WITH_PADDLE_TENSORRT=OFF
+# TensorRT 的路径,如果需要集成TensorRT,需修改为您实际安装的TensorRT路径
+TENSORRT_DIR=$(pwd)/TensorRT/
+# Paddle 预测库路径, 请修改为您实际安装的预测库路径
+PADDLE_DIR=$(pwd)/paddle_inference
+# Paddle 的预测库是否使用静态库来编译
+# 使用TensorRT时,Paddle的预测库通常为动态库
+WITH_STATIC_LIB=OFF
+# CUDA 的 lib 路径
+CUDA_LIB=/usr/local/cuda/lib64
+# CUDNN 的 lib 路径
+CUDNN_LIB=/usr/lib/aarch64-linux-gnu
+# 是否加密
+WITH_ENCRYPTION=OFF
+# OPENSSL 路径
+OPENSSL_DIR=$(pwd)/deps/openssl-1.1.0k
+
+
+# 以下无需改动
+rm -rf build
+mkdir -p build
+cd build
+sudo cmake .. \
+    -DWITH_GPU=${WITH_GPU} \
+    -DWITH_MKL=${WITH_MKL} \
+    -DWITH_PADDLE_TENSORRT=${WITH_PADDLE_TENSORRT} \
+    -DTENSORRT_DIR=${TENSORRT_DIR} \
+    -DPADDLE_DIR=${PADDLE_DIR} \
+    -DWITH_STATIC_LIB=${WITH_STATIC_LIB} \
+    -DCUDA_LIB=${CUDA_LIB} \
+    -DCUDNN_LIB=${CUDNN_LIB} \
+    -DWITH_ENCRYPTION=${WITH_ENCRYPTION} \
+    -DOPENSSL_DIR=${OPENSSL_DIR}
+make -j8

+ 0 - 1026
deploy/resources/resnet50_imagenet.yml

@@ -1,1026 +0,0 @@
-model_format: Paddle
-toolkit: PaddleClas
-toolkit_version: unknown
-input_tensor_name: inputs
-transforms:
-  BGR2RGB: ~
-  ResizeByShort:
-    target_size: 256
-    interp: 1
-    use_scale: false
-  CenterCrop:
-    width: 224
-    height: 224
-  Convert:
-    dtype: float
-  Normalize:
-    mean:
-      - 0.485
-      - 0.456
-      - 0.406
-    std:
-      - 0.229
-      - 0.224
-      - 0.225
-  Permute: ~
-labels:
-  - kit_fox
-  - English_setter
-  - Siberian_husky
-  - Australian_terrier
-  - English_springer
-  - grey_whale
-  - lesser_panda
-  - Egyptian_cat
-  - ibex
-  - Persian_cat
-  - cougar
-  - gazelle
-  - porcupine
-  - sea_lion
-  - malamute
-  - badger
-  - Great_Dane
-  - Walker_hound
-  - Welsh_springer_spaniel
-  - whippet
-  - Scottish_deerhound
-  - killer_whale
-  - mink
-  - African_elephant
-  - Weimaraner
-  - soft-coated_wheaten_terrier
-  - Dandie_Dinmont
-  - red_wolf
-  - Old_English_sheepdog
-  - jaguar
-  - otterhound
-  - bloodhound
-  - Airedale
-  - hyena
-  - meerkat
-  - giant_schnauzer
-  - titi
-  - three-toed_sloth
-  - sorrel
-  - black-footed_ferret
-  - dalmatian
-  - black-and-tan_coonhound
-  - papillon
-  - skunk
-  - Staffordshire_bullterrier
-  - Mexican_hairless
-  - Bouvier_des_Flandres
-  - weasel
-  - miniature_poodle
-  - Cardigan
-  - malinois
-  - bighorn
-  - fox_squirrel
-  - colobus
-  - tiger_cat
-  - Lhasa
-  - impala
-  - coyote
-  - Yorkshire_terrier
-  - Newfoundland
-  - brown_bear
-  - red_fox
-  - Norwegian_elkhound
-  - Rottweiler
-  - hartebeest
-  - Saluki
-  - grey_fox
-  - schipperke
-  - Pekinese
-  - Brabancon_griffon
-  - West_Highland_white_terrier
-  - Sealyham_terrier
-  - guenon
-  - mongoose
-  - indri
-  - tiger
-  - Irish_wolfhound
-  - wild_boar
-  - EntleBucher
-  - zebra
-  - ram
-  - French_bulldog
-  - orangutan
-  - basenji
-  - leopard
-  - Bernese_mountain_dog
-  - Maltese_dog
-  - Norfolk_terrier
-  - toy_terrier
-  - vizsla
-  - cairn
-  - squirrel_monkey
-  - groenendael
-  - clumber
-  - Siamese_cat
-  - chimpanzee
-  - komondor
-  - Afghan_hound
-  - Japanese_spaniel
-  - proboscis_monkey
-  - guinea_pig
-  - white_wolf
-  - ice_bear
-  - gorilla
-  - borzoi
-  - toy_poodle
-  - Kerry_blue_terrier
-  - ox
-  - Scotch_terrier
-  - Tibetan_mastiff
-  - spider_monkey
-  - Doberman
-  - Boston_bull
-  - Greater_Swiss_Mountain_dog
-  - Appenzeller
-  - Shih-Tzu
-  - Irish_water_spaniel
-  - Pomeranian
-  - Bedlington_terrier
-  - warthog
-  - Arabian_camel
-  - siamang
-  - miniature_schnauzer
-  - collie
-  - golden_retriever
-  - Irish_terrier
-  - affenpinscher
-  - Border_collie
-  - hare
-  - boxer
-  - silky_terrier
-  - beagle
-  - Leonberg
-  - German_short-haired_pointer
-  - patas
-  - dhole
-  - baboon
-  - macaque
-  - Chesapeake_Bay_retriever
-  - bull_mastiff
-  - kuvasz
-  - capuchin
-  - pug
-  - curly-coated_retriever
-  - Norwich_terrier
-  - flat-coated_retriever
-  - hog
-  - keeshond
-  - Eskimo_dog
-  - Brittany_spaniel
-  - standard_poodle
-  - Lakeland_terrier
-  - snow_leopard
-  - Gordon_setter
-  - dingo
-  - standard_schnauzer
-  - hamster
-  - Tibetan_terrier
-  - Arctic_fox
-  - wire-haired_fox_terrier
-  - basset
-  - water_buffalo
-  - American_black_bear
-  - Angora
-  - bison
-  - howler_monkey
-  - hippopotamus
-  - chow
-  - giant_panda
-  - American_Staffordshire_terrier
-  - Shetland_sheepdog
-  - Great_Pyrenees
-  - Chihuahua
-  - tabby
-  - marmoset
-  - Labrador_retriever
-  - Saint_Bernard
-  - armadillo
-  - Samoyed
-  - bluetick
-  - redbone
-  - polecat
-  - marmot
-  - kelpie
-  - gibbon
-  - llama
-  - miniature_pinscher
-  - wood_rabbit
-  - Italian_greyhound
-  - lion
-  - cocker_spaniel
-  - Irish_setter
-  - dugong
-  - Indian_elephant
-  - beaver
-  - Sussex_spaniel
-  - Pembroke
-  - Blenheim_spaniel
-  - Madagascar_cat
-  - Rhodesian_ridgeback
-  - lynx
-  - African_hunting_dog
-  - langur
-  - Ibizan_hound
-  - timber_wolf
-  - cheetah
-  - English_foxhound
-  - briard
-  - sloth_bear
-  - Border_terrier
-  - German_shepherd
-  - otter
-  - koala
-  - tusker
-  - echidna
-  - wallaby
-  - platypus
-  - wombat
-  - revolver
-  - umbrella
-  - schooner
-  - soccer_ball
-  - accordion
-  - ant
-  - starfish
-  - chambered_nautilus
-  - grand_piano
-  - laptop
-  - strawberry
-  - airliner
-  - warplane
-  - airship
-  - balloon
-  - space_shuttle
-  - fireboat
-  - gondola
-  - speedboat
-  - lifeboat
-  - canoe
-  - yawl
-  - catamaran
-  - trimaran
-  - container_ship
-  - liner
-  - pirate
-  - aircraft_carrier
-  - submarine
-  - wreck
-  - half_track
-  - tank
-  - missile
-  - bobsled
-  - dogsled
-  - bicycle-built-for-two
-  - mountain_bike
-  - freight_car
-  - passenger_car
-  - barrow
-  - shopping_cart
-  - motor_scooter
-  - forklift
-  - electric_locomotive
-  - steam_locomotive
-  - amphibian
-  - ambulance
-  - beach_wagon
-  - cab
-  - convertible
-  - jeep
-  - limousine
-  - minivan
-  - Model_T
-  - racer
-  - sports_car
-  - go-kart
-  - golfcart
-  - moped
-  - snowplow
-  - fire_engine
-  - garbage_truck
-  - pickup
-  - tow_truck
-  - trailer_truck
-  - moving_van
-  - police_van
-  - recreational_vehicle
-  - streetcar
-  - snowmobile
-  - tractor
-  - mobile_home
-  - tricycle
-  - unicycle
-  - horse_cart
-  - jinrikisha
-  - oxcart
-  - bassinet
-  - cradle
-  - crib
-  - four-poster
-  - bookcase
-  - china_cabinet
-  - medicine_chest
-  - chiffonier
-  - table_lamp
-  - file
-  - park_bench
-  - barber_chair
-  - throne
-  - folding_chair
-  - rocking_chair
-  - studio_couch
-  - toilet_seat
-  - desk
-  - pool_table
-  - dining_table
-  - entertainment_center
-  - wardrobe
-  - Granny_Smith
-  - orange
-  - lemon
-  - fig
-  - pineapple
-  - banana
-  - jackfruit
-  - custard_apple
-  - pomegranate
-  - acorn
-  - hip
-  - ear
-  - rapeseed
-  - corn
-  - buckeye
-  - organ
-  - upright
-  - chime
-  - drum
-  - gong
-  - maraca
-  - marimba
-  - steel_drum
-  - banjo
-  - cello
-  - violin
-  - harp
-  - acoustic_guitar
-  - electric_guitar
-  - cornet
-  - French_horn
-  - trombone
-  - harmonica
-  - ocarina
-  - panpipe
-  - bassoon
-  - oboe
-  - sax
-  - flute
-  - daisy
-  - yellow_lady's_slipper
-  - cliff
-  - valley
-  - alp
-  - volcano
-  - promontory
-  - sandbar
-  - coral_reef
-  - lakeside
-  - seashore
-  - geyser
-  - hatchet
-  - cleaver
-  - letter_opener
-  - plane
-  - power_drill
-  - lawn_mower
-  - hammer
-  - corkscrew
-  - can_opener
-  - plunger
-  - screwdriver
-  - shovel
-  - plow
-  - chain_saw
-  - cock
-  - hen
-  - ostrich
-  - brambling
-  - goldfinch
-  - house_finch
-  - junco
-  - indigo_bunting
-  - robin
-  - bulbul
-  - jay
-  - magpie
-  - chickadee
-  - water_ouzel
-  - kite
-  - bald_eagle
-  - vulture
-  - great_grey_owl
-  - black_grouse
-  - ptarmigan
-  - ruffed_grouse
-  - prairie_chicken
-  - peacock
-  - quail
-  - partridge
-  - African_grey
-  - macaw
-  - sulphur-crested_cockatoo
-  - lorikeet
-  - coucal
-  - bee_eater
-  - hornbill
-  - hummingbird
-  - jacamar
-  - toucan
-  - drake
-  - red-breasted_merganser
-  - goose
-  - black_swan
-  - white_stork
-  - black_stork
-  - spoonbill
-  - flamingo
-  - American_egret
-  - little_blue_heron
-  - bittern
-  - crane
-  - limpkin
-  - American_coot
-  - bustard
-  - ruddy_turnstone
-  - red-backed_sandpiper
-  - redshank
-  - dowitcher
-  - oystercatcher
-  - European_gallinule
-  - pelican
-  - king_penguin
-  - albatross
-  - great_white_shark
-  - tiger_shark
-  - hammerhead
-  - electric_ray
-  - stingray
-  - barracouta
-  - coho
-  - tench
-  - goldfish
-  - eel
-  - rock_beauty
-  - anemone_fish
-  - lionfish
-  - puffer
-  - sturgeon
-  - gar
-  - loggerhead
-  - leatherback_turtle
-  - mud_turtle
-  - terrapin
-  - box_turtle
-  - banded_gecko
-  - common_iguana
-  - American_chameleon
-  - whiptail
-  - agama
-  - frilled_lizard
-  - alligator_lizard
-  - Gila_monster
-  - green_lizard
-  - African_chameleon
-  - Komodo_dragon
-  - triceratops
-  - African_crocodile
-  - American_alligator
-  - thunder_snake
-  - ringneck_snake
-  - hognose_snake
-  - green_snake
-  - king_snake
-  - garter_snake
-  - water_snake
-  - vine_snake
-  - night_snake
-  - boa_constrictor
-  - rock_python
-  - Indian_cobra
-  - green_mamba
-  - sea_snake
-  - horned_viper
-  - diamondback
-  - sidewinder
-  - European_fire_salamander
-  - common_newt
-  - eft
-  - spotted_salamander
-  - axolotl
-  - bullfrog
-  - tree_frog
-  - tailed_frog
-  - whistle
-  - wing
-  - paintbrush
-  - hand_blower
-  - oxygen_mask
-  - snorkel
-  - loudspeaker
-  - microphone
-  - screen
-  - mouse
-  - electric_fan
-  - oil_filter
-  - strainer
-  - space_heater
-  - stove
-  - guillotine
-  - barometer
-  - rule
-  - odometer
-  - scale
-  - analog_clock
-  - digital_clock
-  - wall_clock
-  - hourglass
-  - sundial
-  - parking_meter
-  - stopwatch
-  - digital_watch
-  - stethoscope
-  - syringe
-  - magnetic_compass
-  - binoculars
-  - projector
-  - sunglasses
-  - loupe
-  - radio_telescope
-  - bow
-  - cannon
-  - assault_rifle
-  - rifle
-  - projectile
-  - computer_keyboard
-  - typewriter_keyboard
-  - crane
-  - lighter
-  - abacus
-  - cash_machine
-  - slide_rule
-  - desktop_computer
-  - hand-held_computer
-  - notebook
-  - web_site
-  - harvester
-  - thresher
-  - printer
-  - slot
-  - vending_machine
-  - sewing_machine
-  - joystick
-  - switch
-  - hook
-  - car_wheel
-  - paddlewheel
-  - pinwheel
-  - potter's_wheel
-  - gas_pump
-  - carousel
-  - swing
-  - reel
-  - radiator
-  - puck
-  - hard_disc
-  - sunglass
-  - pick
-  - car_mirror
-  - solar_dish
-  - remote_control
-  - disk_brake
-  - buckle
-  - hair_slide
-  - knot
-  - combination_lock
-  - padlock
-  - nail
-  - safety_pin
-  - screw
-  - muzzle
-  - seat_belt
-  - ski
-  - candle
-  - jack-o'-lantern
-  - spotlight
-  - torch
-  - neck_brace
-  - pier
-  - tripod
-  - maypole
-  - mousetrap
-  - spider_web
-  - trilobite
-  - harvestman
-  - scorpion
-  - black_and_gold_garden_spider
-  - barn_spider
-  - garden_spider
-  - black_widow
-  - tarantula
-  - wolf_spider
-  - tick
-  - centipede
-  - isopod
-  - Dungeness_crab
-  - rock_crab
-  - fiddler_crab
-  - king_crab
-  - American_lobster
-  - spiny_lobster
-  - crayfish
-  - hermit_crab
-  - tiger_beetle
-  - ladybug
-  - ground_beetle
-  - long-horned_beetle
-  - leaf_beetle
-  - dung_beetle
-  - rhinoceros_beetle
-  - weevil
-  - fly
-  - bee
-  - grasshopper
-  - cricket
-  - walking_stick
-  - cockroach
-  - mantis
-  - cicada
-  - leafhopper
-  - lacewing
-  - dragonfly
-  - damselfly
-  - admiral
-  - ringlet
-  - monarch
-  - cabbage_butterfly
-  - sulphur_butterfly
-  - lycaenid
-  - jellyfish
-  - sea_anemone
-  - brain_coral
-  - flatworm
-  - nematode
-  - conch
-  - snail
-  - slug
-  - sea_slug
-  - chiton
-  - sea_urchin
-  - sea_cucumber
-  - iron
-  - espresso_maker
-  - microwave
-  - Dutch_oven
-  - rotisserie
-  - toaster
-  - waffle_iron
-  - vacuum
-  - dishwasher
-  - refrigerator
-  - washer
-  - Crock_Pot
-  - frying_pan
-  - wok
-  - caldron
-  - coffeepot
-  - teapot
-  - spatula
-  - altar
-  - triumphal_arch
-  - patio
-  - steel_arch_bridge
-  - suspension_bridge
-  - viaduct
-  - barn
-  - greenhouse
-  - palace
-  - monastery
-  - library
-  - apiary
-  - boathouse
-  - church
-  - mosque
-  - stupa
-  - planetarium
-  - restaurant
-  - cinema
-  - home_theater
-  - lumbermill
-  - coil
-  - obelisk
-  - totem_pole
-  - castle
-  - prison
-  - grocery_store
-  - bakery
-  - barbershop
-  - bookshop
-  - butcher_shop
-  - confectionery
-  - shoe_shop
-  - tobacco_shop
-  - toyshop
-  - fountain
-  - cliff_dwelling
-  - yurt
-  - dock
-  - brass
-  - megalith
-  - bannister
-  - breakwater
-  - dam
-  - chainlink_fence
-  - picket_fence
-  - worm_fence
-  - stone_wall
-  - grille
-  - sliding_door
-  - turnstile
-  - mountain_tent
-  - scoreboard
-  - honeycomb
-  - plate_rack
-  - pedestal
-  - beacon
-  - mashed_potato
-  - bell_pepper
-  - head_cabbage
-  - broccoli
-  - cauliflower
-  - zucchini
-  - spaghetti_squash
-  - acorn_squash
-  - butternut_squash
-  - cucumber
-  - artichoke
-  - cardoon
-  - mushroom
-  - shower_curtain
-  - jean
-  - carton
-  - handkerchief
-  - sandal
-  - ashcan
-  - safe
-  - plate
-  - necklace
-  - croquet_ball
-  - fur_coat
-  - thimble
-  - pajama
-  - running_shoe
-  - cocktail_shaker
-  - chest
-  - manhole_cover
-  - modem
-  - tub
-  - tray
-  - balance_beam
-  - bagel
-  - prayer_rug
-  - kimono
-  - hot_pot
-  - whiskey_jug
-  - knee_pad
-  - book_jacket
-  - spindle
-  - ski_mask
-  - beer_bottle
-  - crash_helmet
-  - bottlecap
-  - tile_roof
-  - mask
-  - maillot
-  - Petri_dish
-  - football_helmet
-  - bathing_cap
-  - teddy
-  - holster
-  - pop_bottle
-  - photocopier
-  - vestment
-  - crossword_puzzle
-  - golf_ball
-  - trifle
-  - suit
-  - water_tower
-  - feather_boa
-  - cloak
-  - red_wine
-  - drumstick
-  - shield
-  - Christmas_stocking
-  - hoopskirt
-  - menu
-  - stage
-  - bonnet
-  - meat_loaf
-  - baseball
-  - face_powder
-  - scabbard
-  - sunscreen
-  - beer_glass
-  - hen-of-the-woods
-  - guacamole
-  - lampshade
-  - wool
-  - hay
-  - bow_tie
-  - mailbag
-  - water_jug
-  - bucket
-  - dishrag
-  - soup_bowl
-  - eggnog
-  - mortar
-  - trench_coat
-  - paddle
-  - chain
-  - swab
-  - mixing_bowl
-  - potpie
-  - wine_bottle
-  - shoji
-  - bulletproof_vest
-  - drilling_platform
-  - binder
-  - cardigan
-  - sweatshirt
-  - pot
-  - birdhouse
-  - hamper
-  - ping-pong_ball
-  - pencil_box
-  - pay-phone
-  - consomme
-  - apron
-  - punching_bag
-  - backpack
-  - groom
-  - bearskin
-  - pencil_sharpener
-  - broom
-  - mosquito_net
-  - abaya
-  - mortarboard
-  - poncho
-  - crutch
-  - Polaroid_camera
-  - space_bar
-  - cup
-  - racket
-  - traffic_light
-  - quill
-  - radio
-  - dough
-  - cuirass
-  - military_uniform
-  - lipstick
-  - shower_cap
-  - monitor
-  - oscilloscope
-  - mitten
-  - brassiere
-  - French_loaf
-  - vase
-  - milk_can
-  - rugby_ball
-  - paper_towel
-  - earthstar
-  - envelope
-  - miniskirt
-  - cowboy_hat
-  - trolleybus
-  - perfume
-  - bathtub
-  - hotdog
-  - coral_fungus
-  - bullet_train
-  - pillow
-  - toilet_tissue
-  - cassette
-  - carpenter's_kit
-  - ladle
-  - stinkhorn
-  - lotion
-  - hair_spray
-  - academic_gown
-  - dome
-  - crate
-  - wig
-  - burrito
-  - pill_bottle
-  - chain_mail
-  - theater_curtain
-  - window_shade
-  - barrel
-  - washbasin
-  - ballpoint
-  - basketball
-  - bath_towel
-  - cowboy_boot
-  - gown
-  - window_screen
-  - agaric
-  - cellular_telephone
-  - nipple
-  - barbell
-  - mailbox
-  - lab_coat
-  - fire_screen
-  - minibus
-  - packet
-  - maze
-  - pole
-  - horizontal_bar
-  - sombrero
-  - pickelhaube
-  - rain_barrel
-  - wallet
-  - cassette_player
-  - comic_book
-  - piggy_bank
-  - street_sign
-  - bell_cote
-  - fountain_pen
-  - Windsor_tie
-  - volleyball
-  - overskirt
-  - sarong
-  - purse
-  - bolo_tie
-  - bib
-  - parachute
-  - sleeping_bag
-  - television
-  - swimming_trunks
-  - measuring_cup
-  - espresso
-  - pizza
-  - breastplate
-  - shopping_basket
-  - wooden_spoon
-  - saltshaker
-  - chocolate_sauce
-  - ballplayer
-  - goblet
-  - gyromitra
-  - stretcher
-  - water_bottle
-  - dial_telephone
-  - soap_dispenser
-  - jersey
-  - school_bus
-  - jigsaw_puzzle
-  - plastic_bag
-  - reflex_camera
-  - diaper
-  - Band_Aid
-  - ice_lolly
-  - velvet
-  - tennis_ball
-  - gasmask
-  - doormat
-  - Loafer
-  - ice_cream
-  - pretzel
-  - quilt
-  - maillot
-  - tape_player
-  - clog
-  - iPod
-  - bolete
-  - scuba_diver
-  - pitcher
-  - matchstick
-  - bikini
-  - sock
-  - CD_player
-  - lens_cap
-  - thatch
-  - vault
-  - beaker
-  - bubble
-  - cheeseburger
-  - parallel_bars
-  - flagpole
-  - coffee_mug
-  - rubber_eraser
-  - stole
-  - carbonara
-  - dumbbell

+ 30 - 173
docs/CHANGELOG.md

@@ -5,200 +5,57 @@
 - [Inference and Deployment Issues](#推理部署问题)
 
 ## GUI-related Issues
+**Q:**  What should I do if the GUI freezes?
 
-<details>
-  <summary>Q1:  What should I do if the GUI freezes during use?</summary>
+**A:**  If the GUI freezes, click this button to restore normal operation.
+<p align="center">
+  <img src="./images/FAQ1.png"  alt="QR" align="middle" />
+</p>
 
-> **A:** If the GUI freezes, click this button to restore normal operation
->  <p align="center">
->     <img src="./images/gui_FAQ1.png"  alt="QR" align="middle" />
->   </p>
-</details>
 
+**Q:**  What should I do if training fails with an error in the GUI?
 
-<details>
-  <summary>Q2:  What should I do if training fails with an error in the GUI?</summary>
+**A:**  First, open the log files of the current project and check the error message.
 
-> **A:** First, open the log files of the current project and check the error message.
->
-> For example, if the PaddleX GUI workspace was set to `D:/work_space`, use the project ID and task ID shown in the GUI to locate the current task's log files, e.g. `D:/work_space/projects/P0001/T0001/err.log/err.log` and `D:/work_space/projects/P0001/T0001/err.log/out.log`
->
-> If that does not pinpoint the problem, also check the PaddleX GUI system log, e.g. `C:/User/User_name/.paddlex/logs/paddlex.log`
->
-> These three log files are usually enough to tell whether the problem is, for example, insufficient GPU memory or a wrong data path. If GPU memory is insufficient, lower the batch_size (and scale down the learning rate and related parameters proportionally). For anything you cannot solve, you can [open an issue](https://github.com/PaddlePaddle/PaddleX/issues) on GitHub; describe the problem clearly and an engineer will reply promptly.
-</details>
 For example, if the PaddleX GUI workspace was set to `D:/work_space`, use the project ID and task ID shown in the GUI to locate the current task's log files, e.g. `D:/work_space/projects/P0001/T0001/err.log/err.log` and `D:/work_space/projects/P0001/T0001/err.log/out.log`
 
-<details>
-<summary>Q3: The GUI hangs at the startup screen although it used to work</summary>
 If that does not pinpoint the problem, also check the PaddleX GUI system log, e.g. `C:/User/User_name/.paddlex/logs/paddlex.log`
 
-> The GUI used to work normally, but this time it stays on the startup screen and never reaches the main window, as shown below:
-> <p align="center">
->   <img src="./images/gui_FAQ3.png" width = "400" alt="QR" align="middle" />
-> </p>  
->
-> **A:** Try deleting the PaddleX GUI log directory first, e.g. `C:/User/User_name/.paddlex`, then reopen the GUI.  
-</details>
 These three log files are usually enough to tell whether the problem is, for example, insufficient GPU memory or a wrong data path. If GPU memory is insufficient, lower the batch_size (and scale down the learning rate and related parameters proportionally). For anything you cannot solve, you can [open an issue](https://github.com/PaddlePaddle/PaddleX/issues) on GitHub; describe the problem clearly and an engineer will reply promptly.
 
-<details>
-  <summary>Q4: How do I keep the PaddleX API bundled with PaddleX GUI 2.0 up to date?</summary>
+## API Training Issues
+**Q:**  What should I do when the loss is nan?
 
-> **A:** In the same directory as the PaddleX GUI 2.0 executable `PaddleX.exe` there is a folder named `paddlex`. Replace that folder with the paddlex package from the PaddleX GitHub develop branch, i.e. https://github.com/PaddlePaddle/PaddleX/tree/develop/paddlex
-</details>
+**A:**  A nan loss means the gradients have exploded and the loss has blown up. In this case, reduce the learning rate or increase the batch size (batch_size).
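
As a minimal illustration of this advice (a sketch, not part of the original FAQ), assuming the PaddleX 2.x Python API where `learning_rate` and `train_batch_size` are passed to `train()`; the dataset paths and the model choice are placeholders:

```python
import paddlex as pdx
from paddlex import transforms as T

# Placeholder dataset; adjust the paths to your own data.
train_transforms = T.Compose([T.RandomCrop(crop_size=224), T.Normalize()])
train_dataset = pdx.datasets.ImageNet(
    data_dir="dataset",
    file_list="dataset/train_list.txt",
    label_list="dataset/labels.txt",
    transforms=train_transforms)

model = pdx.cls.MobileNetV3_small(num_classes=len(train_dataset.labels))
model.train(
    num_epochs=10,
    train_dataset=train_dataset,
    train_batch_size=64,      # if the loss turns nan, try a larger batch...
    learning_rate=0.025 / 4,  # ...and/or scale the learning rate down
    save_dir="output/mobilenetv3_small")
```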
 
-<details>
-<summary>Q5: How do I seamlessly switch a PaddleX GUI 2.0 project to PaddleX API training?</summary>
 
-> **A:** For a project already created in PaddleX GUI 2.0, you can switch to training with the PaddleX API as follows:
->
-> 1. Locate the project's workspace, e.g. `D:/work_space/projects/P0001/T0001/`. It contains a training script named `script.py` that holds the project's dataset paths and model parameter settings
-> 2. Install the PaddleX API, see the [installation guide](../install.md#1-paddlex-api开发模式安装)
-> 3. Open a terminal, change to the directory containing `script.py` (e.g. `D:/work_space/projects/P0001/T0001/`), and run the training script:
-> ```
-> python script.py
-> ```
-</details>
+**Q:**  Why do the YOLO series models need to train for so long?
 
-<details>
-<summary>Q6: Exception: A space is defined as the separator, but it exists in image or label name ...</summary>
+**A:**  The YOLO series uses a lot of data augmentation, so it needs more training epochs, and the training parameters have to be tuned for each dataset. For example, the ppyolo and ppyolov2 training parameters in our earlier examples were the COCO configurations converted to a single GPU, and they did not work well on the insect dataset; we adjusted them later. You can adjust your own parameters by analogy with ours; the exact changes are in our earlier [pr](https://github.com/PaddlePaddle/PaddleX/pull/853/files).
 
-> **A:** There is a space in the image path. Because a space is used as the separator between the image path and the label path, you need to remove the spaces from the paths. Also note that the paths must not contain Chinese characters
-</details>
 
-## API Training Issues
+**Q:**  Running `.\paddlex_inference\detector.exe` from the command line shows no prompt and produces no output. What is going on?
 
-<details>
- <summary>Q1: What should I do when the loss is nan?</summary>
+**A:**  A dll is probably missing. Double-click detector.exe or model_infer.exe in the out directory and a message will be shown.
 
-> **A:** A nan loss means the gradients have exploded and the loss has blown up. In this case, reduce the learning rate or increase the batch size (batch_size).
-</details>
 
-<details>
- <summary>Q2:  Why do the YOLO series models need to train for so long?</summary>
+## Inference and Deployment Issues
+**Q:**  How do I manually release the inference model and the GPU memory it occupies in my program?
 
-> **A:** The YOLO series uses a lot of data augmentation, so it needs more training epochs, and the training parameters have to be tuned for each dataset. For example, the ppyolo and ppyolov2 training parameters in our earlier examples were the COCO configurations converted to a single GPU, and they did not work well on the insect dataset; we adjusted them later. You can adjust your own parameters by analogy with ours; the exact changes are in our earlier [pr](https://github.com/PaddlePaddle/PaddleX/pull/853/files).
-</details>
+**A:**  Initializing the predictor in the main process and running predictions in a thread works fine. The GPU memory is not released when the thread exits; it is only released when the main process exits. After the thread exits the memory can still be reused, though, so usage does not keep growing.
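
One way to guarantee that the memory is actually freed is to keep both initialization and prediction inside a subprocess, so everything it holds is released when that process exits. A minimal sketch (not from the original FAQ), assuming the `paddlex.deploy.Predictor` Python API and placeholder model/image paths:

```python
import multiprocessing as mp

def infer(model_dir, image_path, queue):
    # Import and create the predictor inside the subprocess so the
    # inference model and its GPU memory belong to this process only.
    import paddlex as pdx
    predictor = pdx.deploy.Predictor(model_dir)
    queue.put(predictor.predict(image_path))

if __name__ == "__main__":
    q = mp.Queue()
    p = mp.Process(target=infer, args=("./inference_model", "test.jpg", q))
    p.start()
    result = q.get()
    p.join()   # the subprocess exits here and its GPU memory is released
    print(result)
```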
 
-<details>
-  <summary>Q3: shapely error on Windows: lgeos = CDLL(os.path.join(sys.prefix, 'Library', 'bin', 'geos_c.dll')) OSError: [WinError 126] The specified module could not be found</summary>
 
->  **A:** On Windows this happens when shapely is installed with pip inside a conda environment; see shapely issue [Toblerity/Shapely#1032](https://github.com/Toblerity/Shapely/issues/1032) for details. Fix:
->
-> 1. Uninstall the pip-installed shapely
->
-> ```
-> pip uninstall shapely
-> ```
->
-> 2. Then install it with conda
->
-> ```
-> conda install shapely==1.7.1
->
-> ```
-</details>
+**Q:**  What strategies are there for speeding up prediction?
 
-<details><summary>Q4: pycocotools error when training RCNN on Windows: Expected 88 from C header, got 80 from PyObject</summary>
+**A:**  1. Consider a more lightweight backbone; check whether the image preprocessing and result post-processing leave room for optimization; C++ inference is faster than Python; predict on batches of images; try acceleration libraries, e.g. enable MKL-DNN or use the OpenVINO inference engine when deploying on CPU, and use TensorRT when deploying on an Nvidia GPU;
+2. When measuring performance, remember to warm the model up first, e.g. run about 100 predictions before you start timing and recording; only then are the measured numbers accurate.
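
A minimal benchmarking sketch of the warm-up advice in point 2 (not from the original FAQ), assuming the `paddlex.deploy.Predictor` Python API and placeholder paths:

```python
import time
import paddlex as pdx

predictor = pdx.deploy.Predictor("./inference_model")  # placeholder model dir

# Warm-up: the first runs pay one-off costs (memory allocation, kernel
# selection, TensorRT engine build, ...), so keep them out of the timing.
for _ in range(100):
    predictor.predict("test.jpg")

# Only the runs after warm-up are timed.
runs = 200
start = time.time()
for _ in range(runs):
    predictor.predict("test.jpg")
print("average latency: %.2f ms" % ((time.time() - start) / runs * 1000))
```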
 
-> **A:** Updating numpy to 1.20.0+ solves this problem
-</details>
 
-## Inference and Deployment Issues
+**Q:**  How do I visualize prediction results?
+
+**A:**  Use `pdx.det.visualize` for detection results and `pdx.seg.visualize` for segmentation results; the API is described in the [documentation](https://github.com/PaddlePaddle/PaddleX/blob/release/2.0.0/docs/apis/prediction.md)
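
A minimal usage sketch (not from the original FAQ), assuming a trained detection model under `output/best_model` and the visualize signatures described in the linked prediction docs; check the parameter names against your PaddleX version:

```python
import paddlex as pdx

# Load a trained model and predict on one image (paths are placeholders).
model = pdx.load_model("output/best_model")
image = "test.jpg"
result = model.predict(image)

# Draw boxes with score >= 0.5 and save the rendered image to ./visualized.
pdx.det.visualize(image, result, threshold=0.5, save_dir="./visualized")

# For a segmentation model the call is analogous:
# pdx.seg.visualize(image, result, weight=0.6, save_dir="./visualized")
```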
+
+
+**Q:**  How do I deploy a new model with the old deployment code?
 
-<details><summary>Q1:  How do I manually release the inference model and the GPU memory it occupies in my program?</summary>
-
-> **A:** **As soon as the process that initialized the model exits, the GPU memory it occupies is released automatically**.
-> If the memory does not appear to be released, the likely cause is: the model was initialized in the main process and predictions are run in a thread. That usage is fine, but the memory is not released when the thread exits; it is only released when the main process exits. Even so, although the memory is not released after the thread exits, it can still be reused afterwards, so usage does not keep growing.
-> We therefore recommend doing both model initialization and prediction in a subprocess, so the memory is released automatically when the subprocess ends.
-</details>
-
-<details><summary>Q2:  What strategies are there for speeding up prediction?</summary>
-
-> **A:** 1. Consider a more lightweight backbone; check whether the image preprocessing and result post-processing leave room for optimization; C++ inference is faster than Python; predict on batches of images; try acceleration libraries, e.g. enable MKL-DNN or use the OpenVINO inference engine when deploying on CPU, and use TensorRT when deploying on an Nvidia GPU;
-> 2. When measuring performance, remember to warm the model up first, e.g. run about 100 predictions before you start timing and recording; only then are the measured numbers accurate.
-</details>
-
-<details><summary>Q3:  How do I visualize prediction results when deploying with Python?</summary>
-
-> **A:** Use the `pdx.det.visualize` API for detection model results and the `pdx.seg.visualize` API for segmentation model results; see [Python deployment](../python_deploy.md) for prediction visualization and the [documentation](../apis/prediction.md) for the visualization API.
-</details>
-
-<details>
-<summary>Q4: How do I visualize prediction results when deploying with C++?</summary>
-
-> **A:** For C++ deployment we also provide a simple visualization interface to help with debugging. Because visualizing prediction results is not a required step for production deployment, the demos we provide do not link the visualization code by default; if you want visualization to assist debugging, link the visualization interface with the steps below. This example is meant to help you quickly understand the structure of PaddleX Deploy prediction results; for more complex visualization you will need to develop further according to your own requirements.
->
-> **step1: Add the visualization source path in CMakeLists.txt**
->
-> After https://github.com/PaddlePaddle/PaddleX/blob/develop/deploy/cpp/CMakeLists.txt#L52 add:
->
-> ```
-> aux_source_directory(${PROJECT_SOURCE_DIR}/model_deploy/utils/src DETECTOR_SRC)
-> ```
->
-> **step2: Add the Visualize interface to the object detection and semantic segmentation prediction demos**
->
->  * Taking insect detection ([API tutorial](../../tutorials/train/object_detection/faster_rcnn_r50_fpn.py), the corresponding sample project in the GUI) as an example:
->
->  > After https://github.com/PaddlePaddle/PaddleX/blob/develop/deploy/cpp/demo/model_infer.cpp#L19 add the header for the visualization code:
->  > ```
->  > #include "model_deploy/utils/include/visualize.h"
->  > ```
->  > [Optional] If you also want to filter out low-score boxes, add the header for the box-filtering code as well:
->  > ```
->  > #include "model_deploy/utils/include/bbox_utils.h"
->  > ```
->  > After https://github.com/PaddlePaddle/PaddleX/blob/develop/deploy/cpp/demo/model_infer.cpp#L57 add the visualization code:
->  > ```
->  > cv::Mat vis_det;
->  > Visualize(imgs[0], *(results[0].det_result), &vis_det, 6);
->  > cv::imwrite("vis_det.jpg", vis_det);
->  > ```
->  > > 6 is the number of classes in the current dataset; set it according to your own dataset
->
->  > [Optional] To filter out low-score boxes, add the filtering and visualization code:
->  > ```
->  > cv::Mat vis_det;
->  > std::vector<PaddleDeploy::Result> filter_results;
->  > float score_thresh = 0.5;
->  > FilterBbox(results, score_thresh, &filter_results);
->  > Visualize(imgs[0], *(filter_results[0].det_result), &vis_det, 6);
->  > cv::imwrite("vis_det.jpg", vis_det);
->  > ```
->  > > 6 is the number of classes in the current dataset; set it according to your own dataset
->  >
->  > Our visualization of the prediction for 0205.jpg from the insect dataset looks like this:
->  > <p align="center">
->  >   <img src="./images/deploy_FAQ4_det.jpg"  width = "300" alt="QR" align="middle" />
->  > </p>
->
->  * Taking optic disc segmentation ([API tutorial](../../tutorials/train/semantic_segmentation/deeplabv3p_resnet50_vd.py), the corresponding sample project in the GUI) as an example:
->  > After https://github.com/PaddlePaddle/PaddleX/blob/develop/deploy/cpp/demo/model_infer.cpp#L19 add the header for the visualization code:
->  > ```
->  > #include "model_deploy/utils/include/visualize.h"
->  > ```
->  > After https://github.com/PaddlePaddle/PaddleX/blob/develop/deploy/cpp/demo/model_infer.cpp#L57 add the visualization code:
->
->  > ```
->  > cv::Mat vis_seg;
->  > Visualize(imgs[0], *(results[0].seg_result), &vis_seg, 2);
->  > cv::imwrite("vis_seg.jpg", vis_seg);
->  > ```
->  > > 2 is the number of classes in the current dataset; set it according to your own dataset
->
->  > Our visualization of the prediction for P0176.jpg from the optic disc dataset looks like this:
->  > <p align="center">
->  >   <img src="./images/deploy_FAQ4_seg.jpg"  width = "300" alt="QR" align="middle" />
->  > </p>
-
-</details>
-
-<details>
-<summary>Q5:  How do I deploy 2.0 models with the 1.x deployment code?</summary>
-
-> **A:** The 2.0 C++ deployment supports models exported by both old and new versions of paddlex/gui, but the 2.0 Python deployment is currently not compatible with 1.x models. The training features of PaddleX GUI/API 2.0 are also incompatible with 1.x; 1.x training scripts and models only run correctly with PaddleX 1.x installed.
-</details>
-
-<details><summary>Q6:  Running `.\paddlex_inference\detector.exe` from the command line shows no prompt and produces no output. What is going on?</summary>
-
-> **A:** A dll is probably missing. Double-click detector.exe or model_infer.exe in the out directory and a message will be shown.
-</details>
+**A:**  The 2.0 C++ deployment supports models exported by both old and new versions of paddlex/gui, but the Python deployment is not compatible with old models. The old and new GUI versions are also incompatible; the new GUI can only load models trained with the new version.
