The three steps below are my hands-on solution for converting TF SSD models to TF Lite models for object detection. Here is what this post covers:
- Download the model .tar file from GitHub.
- Convert the graph to a TensorFlow Lite compatible graph.
- Generate the .tflite file from the graph.
- Troubleshooting - some common issues that you might face and their solutions.
Assumptions.
- Tensorflow is already installed on your machine.
- You know which object detection model you will use.
- The steps below are done for the SSD Inception V2 model.
- You can find information about the arguments used in the commands below in the official documentation.
Step1. Download the model .tar file from GitHub.
Pick a model from the TensorFlow detection model zoo:
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md
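If you prefer scripting this step, the downloaded archive can be unpacked with a short Python helper. This is a sketch; the archive name ssd_inception_v2_coco_2018_01_28.tar.gz matches the model used in this walkthrough, so substitute whichever model you downloaded:

```python
import tarfile


def extract_model(archive_path, dest_dir="."):
    """Unpack a model .tar.gz from the detection model zoo.

    Returns the list of member names that were extracted, so you can
    locate model.ckpt and pipeline.config afterwards.
    """
    with tarfile.open(archive_path, "r:gz") as tar:
        names = tar.getnames()
        tar.extractall(dest_dir)
    return names


# Example (assumes the archive was already downloaded):
# extract_model("ssd_inception_v2_coco_2018_01_28.tar.gz")
```

After extraction you should see a folder containing model.ckpt.* files and a pipeline.config, which Step2 below needs.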
Step2. Convert the TF graph to a TF Lite compatible graph.
When you install the latest version of tensorflow, you get a utility in the tensorflow -> models -> research -> object_detection folder. The name of the utility is export_tflite_ssd_graph.py.
- You can use the following command to generate the TF Lite compatible graph in your output folder (in the example below, I made a folder named "my_tflite_mobile_inceptionv2"):
- python export_tflite_ssd_graph.py --pipeline_config_path ./ssd_inception_v2_coco_2018_01_28/pipeline.config --trained_checkpoint_prefix ./ssd_inception_v2_coco_2018_01_28/model.ckpt --output_directory ./ssd_inception_v2_coco_2018_01_28/my_tflite_mobile_inceptionv2
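The same Step2 command can also be driven from Python, which makes it easy to reuse for several checkpoints. This is only a sketch: the paths are placeholders for the ones used in this walkthrough, and it assumes export_tflite_ssd_graph.py is in the current working directory:

```python
import subprocess


def build_export_cmd(pipeline_config, checkpoint_prefix, output_dir):
    """Assemble the export_tflite_ssd_graph.py invocation from Step2."""
    return [
        "python", "export_tflite_ssd_graph.py",
        "--pipeline_config_path", pipeline_config,
        "--trained_checkpoint_prefix", checkpoint_prefix,
        "--output_directory", output_dir,
    ]


def export(pipeline_config, checkpoint_prefix, output_dir):
    # Runs the exporter; on success, output_dir contains tflite_graph.pb
    # (plus a tflite_graph.pbtxt), which Step3 converts to a .tflite file.
    subprocess.run(
        build_export_cmd(pipeline_config, checkpoint_prefix, output_dir),
        check=True)
```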
Step3. Generate the tflite file from the graph.
This can also be done using the TOCO converter, which comes with the tensorflow installation. You can use the following command to do so:
- toco --graph_def_file="C:\Program Files\Python\Lib\site-packages\tensorflow\models\research\object_detection\ssd_inception_v2_coco_2018_01_28\my_tflite_mobile_inceptionv2\tflite_graph.pb" --output_file="C:\Program Files\Python\Lib\site-packages\tensorflow\models\research\object_detection\ssd_inception_v2_coco_2018_01_28\my_tflite_mobile_inceptionv2\tflite_mobile.tflite" --input_shapes=1,300,300,3 --input_arrays=normalized_input_image_tensor --output_arrays=TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3 --inference_type=FLOAT --allow_custom_ops
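An alternative to the toco CLI is the TF 1.x Python API, tf.lite.TFLiteConverter.from_frozen_graph, which takes the same graph file, input/output arrays, and shapes as the command above. A sketch, assuming the relative paths from this walkthrough (adjust them to wherever your tflite_graph.pb lives):

```python
# Same inputs/outputs as the toco command above.
GRAPH_DEF = "my_tflite_mobile_inceptionv2/tflite_graph.pb"
TFLITE_OUT = "my_tflite_mobile_inceptionv2/tflite_mobile.tflite"
INPUT_ARRAYS = ["normalized_input_image_tensor"]
OUTPUT_ARRAYS = [
    "TFLite_Detection_PostProcess",
    "TFLite_Detection_PostProcess:1",
    "TFLite_Detection_PostProcess:2",
    "TFLite_Detection_PostProcess:3",
]
INPUT_SHAPES = {"normalized_input_image_tensor": [1, 300, 300, 3]}


def convert(graph_def_file=GRAPH_DEF, output_file=TFLITE_OUT):
    import tensorflow as tf  # TF 1.x API

    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file, INPUT_ARRAYS, OUTPUT_ARRAYS,
        input_shapes=INPUT_SHAPES)
    # TFLite_Detection_PostProcess is a custom op, hence the equivalent
    # of the --allow_custom_ops flag.
    converter.allow_custom_ops = True
    with open(output_file, "wb") as f:
        f.write(converter.convert())
```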
Troubleshooting some issues
Error:
raise ValueError('SSD Inception V2 feature extractor always uses'
ValueError: SSD Inception V2 feature extractor always uses scope returned ...
Solution:
Go to your pipeline.config and, under feature_extractor, right after the conv_hyperparams closing bracket, paste the line override_base_feature_extractor_hyperparams: true. So the block looks like this:
    activation: RELU_6
    batch_norm {
      decay: 0.999700009823
      center: true
      scale: true
      epsilon: 0.0010000000475
      train: true
    }
  }
  override_base_feature_extractor_hyperparams: true
}
box_coder {
  faster_rcnn_box_coder {
    y_scale: 10.0
    x_scale: 10.0
    height_scale: 5.0
    width_scale: 5.0
  }
}