How can I improve PiDog's latency with ChatGPT?

The PiDog project is fantastic, and the build went smoothly.

However, I'm disappointed with the latency when using ChatGPT: when I asked, "Who are you?", the response took 25 seconds.

There's no noticeable difference in response time between GPT-4o-mini and GPT-3.5-turbo.

I’m using a Raspberry Pi 3 on LAN, with no firewall or VPN.

I’m wondering what steps I can take to improve the latency.
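One way to see where the 25 seconds go is to time each stage of the pipeline (the API call, speech synthesis, playback) separately, since network latency and on-device processing need different fixes. A minimal timing helper, assuming the example code is Python (as the PiDog examples are) and `do_chat` stands in for whatever function makes the actual ChatGPT call:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Stand-in for the real ChatGPT call in the PiDog example;
# wrap each stage (API request, TTS, playback) the same way.
def do_chat():
    return "hello from the model"

reply, elapsed = timed(do_chat)
print(f"ChatGPT stage took {elapsed:.2f}s")
```

If the API call itself accounts for most of the 25 seconds, the bottleneck is the network or the API rather than the Pi 3, and a Pi 4 with an SSD would not help much; if local speech processing dominates, faster hardware could.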

What results can I expect with a Raspberry Pi 4 and a USB 3 SSD?

Please share your experience.

Thank you!

Please make sure your network is stable.
I'd suggest re-running the example to verify whether the response takes that long every time.