Mirror of https://github.com/PaddlePaddle/PaddleOCR.git (synced 2025-09-25 16:15:12 +00:00)
docs: update paddle2onnx documentations (#14144) — commit d3d7e85883, parent d1bc41661f

Added binary files: `docs/ppocr/infer_deploy/images/img_12.jpg` (722 KiB) and `docs/ppocr/infer_deploy/images/img_12_result.jpg` (561 KiB).

---
typora-copy-images-to: images
comments: true
---

# Paddle2ONNX Model Conversion and Prediction

This chapter introduces how to convert PaddleOCR models to ONNX models and perform predictions based on the ONNXRuntime engine.

## 1. Environment Setup

You need to prepare the environments for PaddleOCR, Paddle2ONNX model conversion, and ONNXRuntime prediction.

### PaddleOCR

Clone the PaddleOCR repository, use the main branch, and install it. Since the PaddleOCR repository is relatively large and cloning via `git clone` can be slow, this tutorial has already downloaded it.

```bash linenums="1"
git clone -b main https://github.com/PaddlePaddle/PaddleOCR.git
cd PaddleOCR && python3 -m pip install -e .
```

### Paddle2ONNX

Paddle2ONNX supports converting models in the PaddlePaddle format to the ONNX format. Operators currently stably support exporting ONNX Opset versions 9~18, and some Paddle operators support conversion to lower ONNX Opsets. For more details, please refer to [Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX/blob/develop/README_en.md).

- Install Paddle2ONNX

```bash linenums="1"
python3 -m pip install paddle2onnx
```

- Install ONNXRuntime

```bash linenums="1"
python3 -m pip install onnxruntime
```

## 2. Model Conversion

### Download Paddle Models

There are two ways to obtain Paddle static graph models: download the prediction models provided by PaddleOCR in the [model list](../model_list.en.md), or refer to the [Model Export Instructions](https://paddlepaddle.github.io/PaddleOCR/latest/ppocr/infer_deploy/python_infer.html#inference) to convert trained weights into inference models.

Using the PP-OCR series English detection, recognition, and classification models as examples:

=== "PP-OCRv3"

    ```bash linenums="1"
    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/english/en_PP-OCRv3_det_infer.tar
    cd ./inference && tar xf en_PP-OCRv3_det_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/english/en_PP-OCRv3_rec_infer.tar
    cd ./inference && tar xf en_PP-OCRv3_rec_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_cls_infer.tar
    cd ./inference && tar xf ch_ppocr_mobile_v2.0_cls_infer.tar && cd ..
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/english/en_PP-OCRv3_det_infer.tar
    cd ./inference && tar xf en_PP-OCRv3_det_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv4/english/en_PP-OCRv4_rec_infer.tar
    cd ./inference && tar xf en_PP-OCRv4_rec_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_cls_infer.tar
    cd ./inference && tar xf ch_ppocr_mobile_v2.0_cls_infer.tar && cd ..
    ```

### Model Conversion

Use Paddle2ONNX to convert Paddle static graph models to the ONNX model format:

=== "PP-OCRv3"

    ```bash linenums="1"
    paddle2onnx --model_dir ./inference/en_PP-OCRv3_det_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/det_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/en_PP-OCRv3_rec_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/rec_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/cls_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    paddle2onnx --model_dir ./inference/en_PP-OCRv3_det_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/det_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/en_PP-OCRv4_rec_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/rec_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/cls_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True
    ```

After execution, the ONNX models will be saved respectively under `./inference/det_onnx/`, `./inference/rec_onnx/`, and `./inference/cls_onnx/`.

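As a quick sanity check, you can confirm that all three exported files exist. This is a minimal stdlib-only sketch, not part of the PaddleOCR tooling; the paths assume the commands were run from the repository root:

```python
import os

# Expected ONNX output locations produced by the paddle2onnx commands above.
expected = [
    "./inference/det_onnx/model.onnx",
    "./inference/rec_onnx/model.onnx",
    "./inference/cls_onnx/model.onnx",
]

# Collect any paths that were not produced, so a failed conversion is obvious.
missing = [path for path in expected if not os.path.isfile(path)]
if missing:
    print("Conversion incomplete, missing:", missing)
else:
    print("All three ONNX models are in place.")
```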
- **Note**: For OCR models, dynamic shapes must be used during conversion; otherwise, the prediction results may slightly differ from directly using Paddle for prediction. Additionally, the following models currently do not support conversion to ONNX models: NRTR, SAR, RARE, SRN.

- **Note**: Since [Paddle2ONNX v1.2.3](https://github.com/PaddlePaddle/Paddle2ONNX/releases/tag/v1.2.3), dynamic shapes are supported by default, i.e., `float32[p2o.DynamicDimension.0,3,p2o.DynamicDimension.1,p2o.DynamicDimension.2]`, and the option `--input_shape_dict` has been deprecated. If you need to adjust shapes, you can use the following command to adjust the input shape of the Paddle model.

```bash linenums="1"
python3 -m paddle2onnx.optimize --input_model inference/det_onnx/model.onnx \
    --output_model inference/det_onnx/model.onnx \
    --input_shape_dict "{'x': [-1,3,-1,-1]}"
```

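The `--input_shape_dict` value is a Python dict literal mapping each input name to an NCHW shape, with `-1` marking a dynamic axis. For illustration only (this is not Paddle2ONNX's internal code), such a string can be parsed and interpreted like this:

```python
import ast

# The shape string exactly as passed on the command line above.
shape_arg = "{'x': [-1,3,-1,-1]}"

# ast.literal_eval safely parses the dict literal without executing code.
shape_dict = ast.literal_eval(shape_arg)
for name, dims in shape_dict.items():
    # -1 entries are the dynamic axes: here batch, height, and width.
    dynamic_axes = [i for i, d in enumerate(dims) if d == -1]
    print(f"input {name!r}: shape {dims}, dynamic axes {dynamic_axes}")
```

For the detection model this prints dynamic axes `[0, 2, 3]`, i.e. batch size, image height, and image width all vary, while the channel axis is fixed at 3.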
If you have optimization requirements for the exported ONNX model, it is recommended to use [OnnxSlim](https://github.com/inisis/OnnxSlim) to optimize the model:

```bash linenums="1"
pip install onnxslim
onnxslim model.onnx slim.onnx
```

## 3. Inference and Prediction

Using the English OCR model as an example, you can perform prediction using ONNXRuntime by executing the following command:

```bash linenums="1"
python3 tools/infer/predict_system.py --use_gpu=False --use_onnx=True \
--det_model_dir=./inference/det_onnx/model.onnx \
--rec_model_dir=./inference/rec_onnx/model.onnx \
--cls_model_dir=./inference/cls_onnx/model.onnx \
--image_dir=./docs/ppocr/infer_deploy/images/img_12.jpg \
--rec_char_dict_path=ppocr/utils/en_dict.txt
```

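Internally, `predict_system.py` feeds the detector an NCHW float32 tensor whose batch, height, and width vary per image, which is why those axes must stay dynamic in the exported ONNX model. The following numpy-only sketch illustrates the general shape of DB detection preprocessing (resize so height and width are multiples of 32, ImageNet mean/std normalization, HWC to NCHW). The resize policy and constants are the commonly used defaults, not an exact copy of PaddleOCR's implementation, and the nearest-neighbor index sampling merely stands in for `cv2.resize`:

```python
import numpy as np

def preprocess_for_det(img, limit_side_len=960):
    """Sketch of DB detection preprocessing: resize so H and W are
    multiples of 32, normalize, and lay out as NCHW float32."""
    h, w = img.shape[:2]
    # Scale the longer side down to limit_side_len if needed.
    ratio = min(1.0, limit_side_len / max(h, w))
    resize_h = max(int(round(h * ratio / 32) * 32), 32)
    resize_w = max(int(round(w * ratio / 32) * 32), 32)
    # Nearest-neighbor resize via index sampling (stand-in for cv2.resize).
    ys = (np.arange(resize_h) * h / resize_h).astype(int)
    xs = (np.arange(resize_w) * w / resize_w).astype(int)
    resized = img[ys][:, xs].astype("float32") / 255.0
    # ImageNet normalization constants, as commonly used in OCR pipelines.
    mean = np.array([0.485, 0.456, 0.406], dtype="float32")
    std = np.array([0.229, 0.224, 0.225], dtype="float32")
    normalized = (resized - mean) / std
    return normalized.transpose(2, 0, 1)[None]  # HWC -> NCHW with batch dim

# A dummy 700x1300 RGB image: height and width end up as multiples of 32.
dummy = np.random.randint(0, 256, (700, 1300, 3), dtype=np.uint8)
x = preprocess_for_det(dummy)
print(x.shape, x.dtype)  # -> (1, 3, 512, 960) float32
```

Because each input image produces a different `(H, W)` here, exporting with fixed spatial dimensions would force resizing to one shape and can change the detector's output, which is the discrepancy the dynamic-shape note above warns about.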
Taking the English OCR model as an example, you can perform prediction using Paddle Inference by executing the following command:

=== "PP-OCRv3"

    ```bash linenums="1"
    python3 tools/infer/predict_system.py --use_gpu=False \
    --cls_model_dir=./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --rec_model_dir=./inference/en_PP-OCRv3_rec_infer \
    --det_model_dir=./inference/en_PP-OCRv3_det_infer \
    --image_dir=./docs/ppocr/infer_deploy/images/img_12.jpg \
    --rec_char_dict_path=ppocr/utils/en_dict.txt
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    python3 tools/infer/predict_system.py --use_gpu=False \
    --cls_model_dir=./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --rec_model_dir=./inference/en_PP-OCRv4_rec_infer \
    --det_model_dir=./inference/en_PP-OCRv3_det_infer \
    --image_dir=./docs/ppocr/infer_deploy/images/img_12.jpg \
    --rec_char_dict_path=ppocr/utils/en_dict.txt
    ```

After executing the command, the terminal will print out the predicted recognition information, and the visualization results will be saved under `./inference_results/`.

**ONNXRuntime Execution Result:**

![](./images/img_12_result.jpg)

**Paddle Inference Execution Result:**

![](./images/img_12_result.jpg)

Using ONNXRuntime for prediction, terminal output:

```bash linenums="1"
[2022/10/10 12:06:28] ppocr DEBUG: dt_boxes num : 11, elapse : 0.3568880558013916
...
[2022/10/10 12:06:31] ppocr INFO: The predict total time is 3.2482550144195557
```

Using Paddle Inference for prediction, terminal output:

```bash linenums="1"
[2022/10/10 12:06:28] ppocr DEBUG: dt_boxes num : 11, elapse : 0.3568880558013916
...
```

### Download Paddle Models

There are two ways to obtain Paddle static graph models: download the prediction models provided by PaddleOCR in the [model_list](../model_list.md), or refer to the [Model Export Instructions](https://paddlepaddle.github.io/PaddleOCR/latest/ppocr/infer_deploy/python_infer.html#inference) to convert trained weights into inference models.

Using the PP-OCR series Chinese detection, recognition, and classification models as examples:

=== "PP-OCRv3"

    ```bash linenums="1"
    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar
    cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar
    cd ./inference && tar xf ch_PP-OCRv3_rec_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_cls_infer.tar
    cd ./inference && tar xf ch_ppocr_mobile_v2.0_cls_infer.tar && cd ..
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv4/chinese/ch_PP-OCRv4_det_infer.tar
    cd ./inference && tar xf ch_PP-OCRv4_det_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv4/chinese/ch_PP-OCRv4_rec_infer.tar
    cd ./inference && tar xf ch_PP-OCRv4_rec_infer.tar && cd ..

    wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_cls_infer.tar
    cd ./inference && tar xf ch_ppocr_mobile_v2.0_cls_infer.tar && cd ..
    ```

### Model Conversion

Use Paddle2ONNX to convert Paddle static graph models to the ONNX model format:

=== "PP-OCRv3"

    ```bash linenums="1"
    paddle2onnx --model_dir ./inference/ch_PP-OCRv3_det_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/det_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_PP-OCRv3_rec_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/rec_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/cls_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    paddle2onnx --model_dir ./inference/ch_PP-OCRv4_det_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/det_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_PP-OCRv4_rec_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/rec_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True

    paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --model_filename inference.pdmodel \
    --params_filename inference.pdiparams \
    --save_file ./inference/cls_onnx/model.onnx \
    --opset_version 11 \
    --enable_onnx_checker True
    ```

After execution, the ONNX models will be saved respectively under `./inference/det_onnx/`, `./inference/rec_onnx/`, and `./inference/cls_onnx/`.

In addition, the following models currently do not support conversion to ONNX models:
NRTR, SAR, RARE, SRN

- **Note**: Since [Paddle2ONNX v1.2.3](https://github.com/PaddlePaddle/Paddle2ONNX/releases/tag/v1.2.3), dynamic shapes are supported by default, i.e., `float32[p2o.DynamicDimension.0,3,p2o.DynamicDimension.1,p2o.DynamicDimension.2]`, and the option `--input_shape_dict` has been deprecated. If you need to adjust shapes, you can use the following command to adjust the input shape of the Paddle model.

```bash linenums="1"
python3 -m paddle2onnx.optimize --input_model inference/det_onnx/model.onnx \
    --output_model inference/det_onnx/model.onnx \
    --input_shape_dict "{'x': [-1,3,-1,-1]}"
```

If you have optimization requirements for the exported ONNX model, it is recommended to use [onnxslim](https://github.com/inisis/OnnxSlim) to optimize the model:

```bash linenums="1"
pip install onnxslim
onnxslim model.onnx slim.onnx
```

## 3. Inference and Prediction

Using the Chinese OCR model as an example, you can perform prediction using ONNXRuntime by executing the following command:

```bash linenums="1"
python3 tools/infer/predict_system.py --use_gpu=False --use_onnx=True \
--det_model_dir=./inference/det_onnx/model.onnx \
--rec_model_dir=./inference/rec_onnx/model.onnx \
--cls_model_dir=./inference/cls_onnx/model.onnx \
--image_dir=./docs/ppocr/infer_deploy/images/lite_demo.png
```

Using the Chinese OCR model as an example, you can perform prediction using Paddle Inference by executing the following command:

=== "PP-OCRv3"

    ```bash linenums="1"
    python3 tools/infer/predict_system.py --use_gpu=False \
    --cls_model_dir=./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --rec_model_dir=./inference/ch_PP-OCRv3_rec_infer \
    --det_model_dir=./inference/ch_PP-OCRv3_det_infer \
    --image_dir=./docs/ppocr/infer_deploy/images/lite_demo.png
    ```

=== "PP-OCRv4"

    ```bash linenums="1"
    python3 tools/infer/predict_system.py --use_gpu=False \
    --cls_model_dir=./inference/ch_ppocr_mobile_v2.0_cls_infer \
    --rec_model_dir=./inference/ch_PP-OCRv4_rec_infer \
    --det_model_dir=./inference/ch_PP-OCRv4_det_infer \
    --image_dir=./docs/ppocr/infer_deploy/images/lite_demo.png
    ```

After executing the command, the terminal will print out the predicted recognition information, and the visualization results will be saved under `./inference_results/`.