
tinypose: error when exporting the inference model after QAT training #9455

@topview4301

Description


Search before asking

  • I have searched the issues and found no similar bug report.

Bug Component

Export

Describe the Bug

After finishing TinyPose QAT training, eval runs fine, but exporting the inference model fails. The command used is:

(ppdet) $ make qat_export_infer 
export FLAGS_enable_pir_api=0; \
rm -rf output/qat_infer_model; \
python ../.PaddleDetection-2.8.1/tools/export_model.py \
        --config ./tinypose_128x96_plate.yml \
        --slim_config ./tinypose_qat.yml \
        -o weights=./qat_best_model/qat_best_model.pdparams \
        --output_dir=./qat_infer_model

The error output is as follows:

(ppdet) $ make qat_export_infer 
export FLAGS_enable_pir_api=0; \
rm -rf output/qat_infer_model; \
python ../.PaddleDetection-2.8.1/tools/export_model.py \
        --config ./tinypose_128x96_plate.yml \
        --slim_config ./tinypose_qat.yml \
        -o weights=./qat_best_model/qat_best_model.pdparams \
        --output_dir=./qat_infer_model
[12/02 08:10:59] ppdet.utils.checkpoint INFO: Skipping import of the encryption module.
[12/02 08:11:41] ppdet.utils.checkpoint INFO: Finish loading model weights: ./qat_best_model/qat_best_model.pdparams


[12/02 08:14:56] ppdet.engine INFO: Export inference config file to ./qat_infer_model/tinypose_qat/infer_cfg.yml
Traceback (most recent call last):
  File "PaddleDetection-2.8.1/tools/export_model.py", line 122, in <module>
    main()
  File "PaddleDetection-2.8.1/tools/export_model.py", line 118, in main
    run(FLAGS, cfg)
  File "PaddleDetection-2.8.1/tools/export_model.py", line 80, in run
    trainer.export(FLAGS.output_dir, for_fd=FLAGS.for_fd)
  File "PaddleDetection-2.8.1/ppdet/engine/trainer.py", line 1318, in export
    self.cfg.slim.save_quantized_model(
  File "PaddleDetection-2.8.1/ppdet/slim/quant.py", line 55, in save_quantized_model
    self.quanter.save_quantized_model(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddleslim/dygraph/quant/qat.py", line 288, in save_quantized_model
    self.imperative_qat.save_quantized_model(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/quantization/imperative/qat.py", line 297, in save_quantized_model
    self._quantize_outputs.save_quantized_model(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/quantization/imperative/qat.py", line 538, in save_quantized_model
    paddle.jit.save(layer=model, path=path, input_spec=input_spec, **config)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/decorator.py", line 235, in fun
    return caller(func, *(extras + args), **kw)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/base/wrapped_decorator.py", line 26, in __impl__
    return wrapped_func(*args, **kwargs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/api.py", line 809, in wrapper
    func(layer, path, input_spec, **configs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/decorator.py", line 235, in fun
    return caller(func, *(extras + args), **kw)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/base/wrapped_decorator.py", line 26, in __impl__
    return wrapped_func(*args, **kwargs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/base/dygraph/base.py", line 68, in __impl__
    return func(*args, **kwargs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/api.py", line 1104, in save
    static_func.concrete_program_specify_input_spec(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/program_translator.py", line 986, in concrete_program_specify_input_spec
    concrete_program, _ = self.get_concrete_program(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/program_translator.py", line 875, in get_concrete_program
    concrete_program, partial_program_layer = self._program_cache[
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/program_translator.py", line 1648, in __getitem__
    self._caches[item_id] = self._build_once(item)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/program_translator.py", line 1575, in _build_once
    concrete_program = ConcreteProgram.from_func_spec(
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/decorator.py", line 235, in fun
    return caller(func, *(extras + args), **kw)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/base/wrapped_decorator.py", line 26, in __impl__
    return wrapped_func(*args, **kwargs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/base/dygraph/base.py", line 68, in __impl__
    return func(*args, **kwargs)
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/program_translator.py", line 1346, in from_func_spec
    error_data.raise_new_exception()
  File "miniconda3/envs/ppdet/lib/python3.9/site-packages/paddle/jit/dy2static/error.py", line 452, in raise_new_exception
    raise new_exception from None
AttributeError: In transformed code:

    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/meta_arch.py", line 59, in forward
        if self.training:
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/meta_arch.py", line 69, in forward
        for inp in inputs_list:
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/meta_arch.py", line 76, in forward
        outs.append(self.get_pred())
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/keypoint_hrnet.py", line 111, in get_pred
        res_lst = self._forward()
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/keypoint_hrnet.py", line 77, in _forward
        if self.training:
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/keypoint_hrnet.py", line 79, in _forward
        elif self.deploy:
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/keypoint_hrnet.py", line 87, in _forward
        if self.flip:
    File "PaddleDetection-2.8.1/ppdet/modeling/architectures/keypoint_hrnet.py", line 93, in _forward
        output_flipped = self.flip_back(output_flipped.numpy(),
                                        self.flip_perm)
        output_flipped = paddle.to_tensor(output_flipped.copy())
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
        if self.shift_heatmap:
            output_flipped[:, :, :, 1:] = output_flipped.clone(


    AttributeError: 'Variable' object has no attribute 'copy'

How should I modify the code so that the inference model can be exported? Thanks!
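For reference, here is a rough sketch (my own code, not PaddleDetection's) of the kind of change I imagine for the failing flip-back step in keypoint_hrnet.py: doing the horizontal flip and left/right channel swap with paddle ops instead of the .numpy()/.copy() round-trip, so paddle.jit.save can trace it. I am assuming flip_perm is the list of (left, right) keypoint index pairs from the TinyPose config; the helper name flip_back_tensor is made up.

    import paddle

    def flip_back_tensor(output_flipped, flip_pairs):
        # Sketch only: mirror the heatmaps along the width axis with tensor ops.
        output_flipped = paddle.flip(output_flipped, axis=[3])
        # Build a channel permutation that swaps each (left, right) keypoint pair.
        num_joints = output_flipped.shape[1]
        perm = list(range(num_joints))
        for left, right in flip_pairs:
            perm[left], perm[right] = perm[right], perm[left]
        index = paddle.to_tensor(perm, dtype='int64')
        return paddle.index_select(output_flipped, index, axis=1)

Would something along these lines be the right direction, or is there a recommended way to export a QAT TinyPose model with flip enabled?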

Environment

- PaddlePaddle-gpu: 2.6.2
- PaddleDetection: 2.8.1
- PaddleSlim: 2.6.0
- Python: 3.9.0

Bug description confirmation

  • I confirm that the bug replication steps, code change instructions, and environment information have been provided, and the problem can be reproduced.

Are you willing to submit a PR?

  • I'd like to help by submitting a PR!
