Last edited by modhello001 on 8/5/2025 09:43

Hi everyone,

First-time poster here. I'm working on a custom UI layer for an embedded vision product using an RK3568 running OpenHarmony 4.0. The interface is mostly gesture-driven and runs quite smoothly, until I add one specific bitmap-based animation that is triggered on touch.

The Problem: When this animation is active, we see a ~100 ms touch-response lag. Rendering goes through a standard EGL + GPU pipeline, and logs show no blocking calls. The lag only appears during the animation of one particular module, which loads fruit-style visuals (PNG sprites, ~80–100 KB each). Other similar-sized bitmaps work fine.

My Question: Has anyone else encountered this kind of behavior on embedded devices with vision-focused UI layers? I'm wondering whether the issue is asset-related or structural.
Any insight from someone who's worked with OpenHarmony UI or similar ARM-based boards would be super helpful!

🔧 Reference Module (with the animation demo): I've isolated the animation and touch layer into a test module, just to get feedback on whether the issue is asset-related or structural. It's here (hosted externally to keep this post clean):
Let me know if this feels like a familiar bottleneck or if you have tools you'd recommend for tracing input latency on OpenHarmony! Thanks in advance! 🙏