RNS-E in Audi A2

Post Reply
korni92
Posts: 6
Joined: Tue Nov 04, 2025 2:05 pm

RNS-E in Audi A2

Post by korni92 »

Hi,
I am running Hudiy with an RNS-E and wrote some scripts to control Hudiy via the RNS-E buttons. In TV mode some of the buttons are published on the CAN bus. I also use certain CAN messages together with the Hudiy API to switch between day/night mode or to shut down the Pi.
At the moment I am working on showing data from the API for music, navigation and phone in the instrument cluster of the car.
I am having some trouble with the DDP protocol: I can clear the whole middle section of the screen, but within that middle section I can only write to the lower half. No matter what position values I write, the text always ends up in the lower half. Maybe someone has experienced this before.

Image
a8ksh4
Posts: 26
Joined: Thu Oct 23, 2025 6:08 pm

Re: RNS-E in Audi A2

Post by a8ksh4 »

That's rad. You could show all kinds of useful stats there.
korni92
Posts: 6
Joined: Tue Nov 04, 2025 2:05 pm

Re: RNS-E in Audi A2

Post by korni92 »

It was a positioning problem.

Image
korni92
Posts: 6
Joined: Tue Nov 04, 2025 2:05 pm

Re: RNS-E in Audi A2

Post by korni92 »

Some optimizations and it's ready for installation. I am also making a mounting bracket for the unit.

https://youtu.be/cIaL97tjTQE?si=lWCV7qKvPU-zAuBd

Image

Image

Image
Hiphouser
Posts: 4
Joined: Thu Nov 20, 2025 12:42 am

Re: RNS-E in Audi A2

Post by Hiphouser »

Do you have a detailed picture of your custom Pi box?
I have found your 3D print files, but which components did you use?

Nice project!
korni92
Posts: 6
Joined: Tue Nov 04, 2025 2:05 pm

Re: RNS-E in Audi A2

Post by korni92 »

My components:

- Pi 4
- Hifiberry DAC+
- CarPiHat
- The PCB from a cheap AliExpress composite-to-RGBS RNS 810 camera converter.
hudiy
Site Admin
Posts: 373
Joined: Mon Jul 14, 2025 7:42 pm

Re: RNS-E in Audi A2

Post by hudiy »

Hello korni92,
Thank you for your feedback on Instagram. To follow up, here is a brief overview of how key events are handled in Hudiy:

Regular key events are delivered by labwc (window compositor in Raspberry Pi OS) through the Wayland protocol (communication mechanism between window compositor and UI app). They are received by Hudiy in the UI thread and then, depending on the current input focus, they are either forwarded asynchronously to Android Auto / CarPlay or handled by the Hudiy UI (in the UI thread).

In the case of the API, the difference is that events are delivered to the UI thread through the TCP/WebSocket layer; from that point on, the processing path is exactly the same as for physical keyboard events.

In theory, injecting key events via the API should actually be faster, because there are fewer layers for the event to pass through before reaching the UI thread:

Keyboard path:
hardware -> driver (kernel) -> libinput -> labwc -> Wayland protocol -> Hudiy (UI thread)

API path:
API client -> TCP stack (kernel) -> Hudiy (UI thread)

All connections to the API endpoints (TCP/WebSocket) are handled asynchronously using epoll, which is a high-efficiency I/O dispatching mechanism in the Linux kernel.

WebSocket itself (if used instead of plain TCP) doesn’t introduce significant overhead - it’s just an additional lightweight layer on top of raw TCP/IP (similar to SSL).
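As a minimal sketch of the API path described above: the port `44405`, the newline-delimited JSON framing, and the `key_event` message schema here are assumptions for illustration, not the official Hudiy API format; only the TCP transport itself is taken from the post.

```python
import json

def build_key_event(key: str, pressed: bool) -> bytes:
    """Serialize a hypothetical key-event message (schema assumed, not official Hudiy API)."""
    msg = {"type": "key_event", "key": key, "pressed": pressed}
    return (json.dumps(msg) + "\n").encode()

# Sending it over plain TCP (host and port are placeholders):
# import socket
# with socket.create_connection(("127.0.0.1", 44405)) as s:
#     s.sendall(build_key_event("ENTER", True))   # key down
#     s.sendall(build_key_event("ENTER", False))  # key up
```

Once received, such a message would follow the same UI-thread path as a Wayland keyboard event, just without the libinput/labwc hops.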

If you can share the code you use to inject key events via the API, we can run it on our side and help analyze what the possible bottleneck might be.
korni92
Posts: 6
Joined: Tue Nov 04, 2025 2:05 pm

Re: RNS-E in Audi A2

Post by korni92 »

hudiy wrote: Thu Nov 27, 2025 3:36 pm If you can share the code you use to inject key events via the API, we can run it on our side and help analyze what the possible bottleneck might be.
I will have a look. You changed a lot in the Hudiy API, which is awesome; no virtual environment anymore, for example. It's so much faster and more reliable than it was before.
When changing tracks, there used to be a noticeable delay before the new information was delivered and shown in the cluster. Now it's so fast that I had to add a delay for the white DIS clusters so as not to overwhelm the processor of the old dash.

I dropped the code for control via the API because it felt so much slower, especially when scrolling through playlists. There was a noticeable delay, and sometimes it felt like some scroll steps were missed, which I have not noticed with uinput.