Hello, I am trying to run inference on the quantized SD2_1 model using QNN 2.28 on a Samsung S24 phone.
The same inference flow that works for SDV1-5 gets stuck when running the UNet model released on qai-hub for SDV2_1.
The command I used is as follows:
# Build the adb command: everything after 'shell' is joined by adb into a single
# remote command line, so the '&&'-separated pieces run as one shell invocation on the phone.
cmd_exec_on_device = [PLATFORM_TOOLS_BIN_PATH + '/adb', '-H', rh, '-s', device_id,
                      'shell', f'cd {target_device_dir} &&',
                      f'export LD_LIBRARY_PATH={target_device_dir} &&',
                      f'export ADSP_LIBRARY_PATH={target_device_dir} &&',
                      f'{target_device_dir}/qnn-net-run',
                      f'--retrieve_context {model_context}',
                      f'--backend {target_device_dir}/libQnnHtp.so',
                      f'--input_list {target_device_dir}/input_list.txt',
                      f'--output_dir {target_device_dir}',
                      f'--config_file {target_device_dir}/htp_backend_extensions.json',
                      # f'> {target_device_dir}/log.log'
                      ]
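For reference, the list is then handed to subprocess roughly as sketched below (a minimal sketch; it assumes the variables above, such as PLATFORM_TOOLS_BIN_PATH, rh, device_id, target_device_dir and model_context, are already defined by the surrounding script):

import subprocess

# Sketch only: run the assembled adb command and capture qnn-net-run's output on the host.
result = subprocess.run(cmd_exec_on_device, capture_output=True, text=True)
print(result.stdout)
print(result.stderr)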
If I comment out the --config_file line as shown below, inference does run, but the performance numbers obtained for the model are not the best, and a 'Context Free failure' error message is printed.
cmd_exec_on_device = [PLATFORM_TOOLS_BIN_PATH + '/adb', '-H', rh, '-s', device_id,
                      'shell', f'cd {target_device_dir} &&',
                      f'export LD_LIBRARY_PATH={target_device_dir} &&',
                      f'export ADSP_LIBRARY_PATH={target_device_dir} &&',
                      f'{target_device_dir}/qnn-net-run',
                      f'--retrieve_context {model_context}',
                      f'--backend {target_device_dir}/libQnnHtp.so',
                      f'--input_list {target_device_dir}/input_list.txt',
                      f'--output_dir {target_device_dir}',
                      # f'--config_file {target_device_dir}/htp_backend_extensions.json',
                      # f'> {target_device_dir}/log.log'
                      ]
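For completeness, the raw outputs that qnn-net-run writes under --output_dir can then be copied back to the host, for example (a sketch; the Result_0 folder name follows the usual qnn-net-run output naming and may differ for this model):

import subprocess

# Sketch: pull one result folder produced by qnn-net-run back to the host machine.
pull_cmd = [PLATFORM_TOOLS_BIN_PATH + '/adb', '-H', rh, '-s', device_id,
            'pull', f'{target_device_dir}/Result_0', './outputs/']
subprocess.run(pull_cmd, check=True)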
What is causing this? Is there any solution to this problem?
the htp_backend_extensions.json is as follows:
{
    "backend_extensions": {
        "shared_library_path": "libQnnHtpNetRunExtensions.so",
        "config_file_path": "htp_config.json"
    }
}
the htp_config.json is as follows:
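(The contents of htp_config.json are missing here. Purely as a hypothetical sketch, not the actual file, the config referenced by config_file_path usually carries HTP graph and device performance settings along these lines; exact field names and structure vary between QNN releases, so the SDK documentation for 2.28 is the reference.)

import json

# Hypothetical example only, not the file actually used: an HTP backend-extension
# config typically sets per-graph and per-device performance options.
# Field names and structure vary between QNN releases.
htp_config = {
    "graphs": [{
        "graph_names": ["unet"],   # graph name inside the context binary (assumed)
        "vtcm_mb": 8,              # VTCM budget in MB (illustrative value)
        "O": 3,                    # graph optimization level (illustrative value)
    }],
    "devices": [{
        "dsp_arch": "v75",         # Hexagon arch of the Snapdragon S24 variant (assumed)
        "cores": [{
            "core_id": 0,
            "perf_profile": "burst",   # high-performance profile (illustrative)
        }],
    }],
}

with open("htp_config.json", "w") as f:
    json.dump(htp_config, f, indent=4)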
By the way, I can successfully run inference on the text_encoder model with the above configuration.