Keyboard volume control
- alex124.rh
- Posts: 125
- Joined: Thu Jul 31, 2025 9:18 am
Re: Keyboard volume control
Does this mean that if I'm making a script to decode the CAN bus buttons via a car Pi HAT, I can't map them to keyboard inputs and I'm limited to the API actions? Thank you.
Re: Keyboard volume control
alex124.rh wrote: Thu Aug 07, 2025 5:47 pm Does this mean if i’m making a script to decode the canbus buttons via car pi hat, i can’t map it to the keyboard inputs. and im limited to the api actions? thank you
Unfortunately, Wayland is very limited when it comes to keyboard emulation, for security reasons.
The most reliable approach is to directly translate CAN bus signals into actions in Hudiy using its API.
Hudiy’s API is defined using Protocol Buffers, so you can use many different programming languages to interact with it.
According to the Protocol Buffers documentation, it supports languages such as C++, C#, Dart, Go, Java, JavaScript, Kotlin, Objective-C, Python and Ruby.
https://protobuf.dev/
- alex124.rh
- Posts: 125
- Joined: Thu Jul 31, 2025 9:18 am
Re: Keyboard volume control
Thanks for the response. I'm just concerned that the API actions are limited compared to the plethora of actions available to the keyboard. Prior to this thread I did look around, and it was indicated that one option is to use uinput. Apparently it works on Wayland? Just curious.
- alex124.rh
- Posts: 125
- Joined: Thu Jul 31, 2025 9:18 am
Re: Keyboard volume control
Let me know what you think/know of uinput, and whether the actions available via the API offer the same control as the keyboard.
Re: Keyboard volume control
Using the API, you can inject key events into Hudiy.
You can also directly trigger actions. The API offers much more flexibility and control over Hudiy compared to standard keyboard input.
Plus, by using the API instead of keyboard emulation, you're interacting only with Hudiy - not with the entire system.
Please find some related documentation on the API below:
Re: Keyboard volume control
Thanks for the suggestion. I will need a bit of guidance for interfacing with the Hudiy API. At the moment I have a "commander" Python script that executes a keypress function (e.g. h for home, etc.) after receiving i2c messages from an Arduino/Teensy. All the keypresses that Hudiy understands work; what's left are volume +, volume - and mute, which are associated with specific i2c messages. All of this worked with OpenAuto Pro over the i2c protocol. Hence, can I ask for an example of how to integrate it with the Hudiy API using this https://github.com/wiboma/hudiy/blob/ma ... hAction.py ? Should I extract DispatchAction.py and insert it into my existing Python script?
Re: Keyboard volume control
If you already have a working script that communicates with Hudiy via the API, then you just need to call the following (depending on the signal you expect from the device), just like in the example:
Code: Select all
def trigger_action(self, client, action):
    dispatch_action = hudiy_api.DispatchAction()
    dispatch_action.action = action
    client.send(hudiy_api.MESSAGE_DISPATCH_ACTION, 0,
                dispatch_action.SerializeToString())

self.trigger_action(client, "output_volume_up")
self.trigger_action(client, "output_volume_down")
self.trigger_action(client, "toggle_output_muted")
For your use case, action can be:
- output_volume_up
- output_volume_down
- toggle_output_muted
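As a rough illustration of how the three actions above might be driven from a rotary volume knob, here is a pure-Python sketch. The tick-counting logic and the `trigger` callable are assumptions for illustration (in practice `trigger` would wrap the `trigger_action` API call); only the action names come from this thread:

```python
def ticks_to_actions(ticks, trigger):
    """Translate a signed encoder tick count into repeated volume actions.

    ticks > 0 -> volume up that many steps, ticks < 0 -> volume down.
    `trigger` is any callable taking a Hudiy action name.
    """
    action = "output_volume_up" if ticks > 0 else "output_volume_down"
    for _ in range(abs(ticks)):
        trigger(action)

# Usage with a recording stub in place of a real Hudiy client:
sent = []
ticks_to_actions(3, sent.append)
ticks_to_actions(-1, sent.append)
# sent == ["output_volume_up"] * 3 + ["output_volume_down"]
```

Injecting `trigger` as a parameter keeps the encoder logic testable without a live Hudiy connection.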
You can share your current script, and we’ll try to modify/refactor it if you’d like.
Re: Keyboard volume control
Thank you for offering to look at my Python code, below. It listens for the i2c messages and triggers the corresponding keypress for most of them. However, if this code can be rewritten to use the Hudiy API for every action needed, I would prefer to move away from keypresses. On top of this code, I have a handshake between the Teensy and the Pi to perform a graceful shutdown. If you can help me integrate this with the Hudiy API, I can learn from the example for other customizations, like CAN bus-triggered reverse camera display, day/night settings and so on. Thanks in advance!!
Code: Select all
import smbus
from pynput.keyboard import Key, Controller
import lgpio
import time
import os
import threading

# ==================================================
# I2C constants
# ==================================================
I2C_BUS = 1
I2C_ADDRESS = 0x08

# Messages
ALIVE_MSG = 0x4C       # Pi -> Teensy
COUNTDOWN_MSG = 0x43   # Pi -> Teensy (countdown active)
SHUTDOWN_REQ = 0x3B    # Teensy -> Pi (shutdown request)
SHUTDOWN_ACK = 0x53    # Pi -> Teensy
SHUTTING_DOWN = 0x44   # Pi -> Teensy ("going down")

# Key control signals
NO_KEY = 0x00
HARDBOOT_SIGNAL = 0x52
MCU_RESET_SIGNAL = 0x54
GPIO13 = 13

# HDMI control
HDMI_OFF_DELAY = 10  # seconds before HDMI goes off in countdown

# ==================================================
# Key mappings
# ==================================================
KEYS = {
    0x00: None,
    0xCC: Key.f11,   # Mute key
    0xC9: Key.f8,    # VolCW
    0xC8: Key.f7,    # VolCCW
    0xDA: Key.up,
    0xD9: Key.down,
    0xD8: Key.left,
    0xD7: Key.right,
    0xB0: Key.enter,
    0xB1: Key.esc,
    0x48: 'H',
    0x4A: 'J',
    0x46: 'F',
    0xC4: Key.f3,    # Fav (with Ctrl pressed)
    0x31: '1',
    0x32: '2',
    0x52: 'R',
    0x54: 'T',
    0x4C: 'L',
    0x4D: 'M',
    0x70: 'P',
    0x6F: 'O',
    0x6E: 'N',
    0x76: 'V',
}

# ==================================================
# Globals
# ==================================================
keyboard = Controller()
bus = smbus.SMBus(I2C_BUS)

# GPIO setup (lgpio)
CHIP = 0
chip = lgpio.gpiochip_open(CHIP)
lgpio.gpio_claim_output(chip, GPIO13)
lgpio.gpio_write(chip, GPIO13, 1)  # set HIGH initially

running = True
shutdown_initiated = False
in_countdown = False

# HDMI state
hdmi_active = True
hdmi_timer = 0
hdmi_lock = threading.Lock()

# ==================================================
# Helpers
# ==================================================
def press_key_with_delay(key, delay=0.05):
    keyboard.press(key)
    time.sleep(delay)
    keyboard.release(key)

def execute_keystroke(key_code):
    key = KEYS.get(key_code)
    if key is None:
        return
    if key_code == 0xC4:  # Ctrl + F3
        with keyboard.pressed(Key.ctrl):
            press_key_with_delay(Key.f3)
    elif isinstance(key, Key):
        press_key_with_delay(key)
    elif isinstance(key, str):
        keyboard.type(key)

# ==================================================
# HDMI Control (Bookworm / Pi 5)
# ==================================================
def hdmi_on():
    global hdmi_active
    with hdmi_lock:
        if not hdmi_active:
            print("[HDMI] Screen will restore on keypress")
            hdmi_active = True

def hdmi_off():
    global hdmi_active
    with hdmi_lock:
        if hdmi_active:
            print("[HDMI] Blanking screen (any key will restore)")
            os.system("kmsblank")
            hdmi_active = False

def hdmi_countdown_worker():
    global hdmi_timer
    while running:
        if in_countdown:
            if hdmi_timer > 0:
                print(f"[HDMI] Countdown: {hdmi_timer}s")
                time.sleep(1)
                hdmi_timer -= 1
            elif hdmi_active:
                hdmi_off()
                time.sleep(1)
        else:
            time.sleep(1)

# ==================================================
# Key / Control handling
# ==================================================
def process_key_event(key_byte):
    if key_byte == HARDBOOT_SIGNAL:
        print("[Pi] Hardboot signal received, rebooting.")
        os.system("sudo reboot")
    elif key_byte == MCU_RESET_SIGNAL:
        print("[Pi] MCU reset signal received, toggling GPIO13.")
        lgpio.gpio_write(chip, GPIO13, 0)
        time.sleep(0.2)
        lgpio.gpio_write(chip, GPIO13, 1)
    else:
        print(f"[Pi] Key Received: {hex(key_byte)}")
        execute_keystroke(key_byte)

# ==================================================
# Heartbeat Task
# ==================================================
def heartbeat_task():
    global running, shutdown_initiated, in_countdown
    while running and not shutdown_initiated:
        try:
            if in_countdown:
                bus.write_byte(I2C_ADDRESS, COUNTDOWN_MSG)
                print("[Pi] Sent COUNTDOWN (still alive, preparing).")
            else:
                bus.write_byte(I2C_ADDRESS, ALIVE_MSG)
                print("[Pi] Sent ALIVE.")
        except Exception as e:
            print(f"[Pi] I2C write error: {e}")
        time.sleep(5)

# ==================================================
# Listen for Teensy messages
# ==================================================
def listen_for_teensy():
    global shutdown_initiated, in_countdown, hdmi_timer
    while running and not shutdown_initiated:
        try:
            msg = bus.read_byte(I2C_ADDRESS)
            if msg == SHUTDOWN_REQ:
                print("[Pi] Teensy requested shutdown.")
                try:
                    bus.write_byte(I2C_ADDRESS, SHUTDOWN_ACK)
                    print("[Pi] Sent SHUTDOWN ACK to Teensy.")
                except Exception as e:
                    print(f"[Pi] I2C write error (ACK): {e}")
                try:
                    bus.write_byte(I2C_ADDRESS, SHUTTING_DOWN)
                    print("[Pi] Sent SHUTTING DOWN signal.")
                except Exception as e:
                    print(f"[Pi] I2C write error (shutdown msg): {e}")
                shutdown_initiated = True
                print("[Pi] Executing shutdown now.")
                os.system("sudo shutdown now")
            elif msg == COUNTDOWN_MSG:
                if not in_countdown:
                    print("[Pi] Teensy entered COUNTDOWN mode.")
                    hdmi_timer = HDMI_OFF_DELAY
                    in_countdown = True
            elif msg == NO_KEY:
                if in_countdown:
                    print("[Pi] Countdown cancelled, back to ALIVE mode.")
                    hdmi_on()
                    in_countdown = False
            else:
                process_key_event(msg)
        except Exception:
            time.sleep(1)

# ==================================================
# Main
# ==================================================
if __name__ == "__main__":
    t1 = threading.Thread(target=heartbeat_task, daemon=True)
    t2 = threading.Thread(target=listen_for_teensy, daemon=True)
    t3 = threading.Thread(target=hdmi_countdown_worker, daemon=True)
    t1.start()
    t2.start()
    t3.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        running = False
        print("[Pi] Stopped by user.")
    finally:
        # Cleanup GPIO
        lgpio.gpiochip_close(chip)
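One way to move the volume/mute codes off the keypress path and onto the Hudiy API is a small dispatch layer in front of `execute_keystroke`. This is only a sketch: the code-to-action table is inferred from the `KEYS` comments (0xCC mute, 0xC9/0xC8 volume), `trigger_action` stands for an API call like the `DispatchAction` example earlier in the thread, and both callables are injected so the rest of the script stays unchanged:

```python
# Hypothetical mapping from i2c key codes to Hudiy action names;
# the byte values mirror the Mute/VolCW/VolCCW entries in KEYS above.
API_ACTIONS = {
    0xCC: "toggle_output_muted",
    0xC9: "output_volume_up",
    0xC8: "output_volume_down",
}

def dispatch(key_code, trigger_action, execute_keystroke):
    """Prefer a Hudiy API action when one exists, otherwise fall back
    to the existing keypress path. Returns which path was taken."""
    action = API_ACTIONS.get(key_code)
    if action is not None:
        trigger_action(action)   # e.g. the DispatchAction call via the API
        return "api"
    execute_keystroke(key_code)  # existing pynput-based path
    return "keyboard"
```

In `process_key_event`, the final `else` branch would then call `dispatch(key_byte, ...)` instead of `execute_keystroke(key_byte)` directly, and further codes could migrate to `API_ACTIONS` one at a time.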
Re: Keyboard volume control
I leveraged ChatGPT to help me understand the API and examples, and then I was able to incorporate the API and trigger actions into my commander code. I also learned that some navigation is triggered using key events and some uses dispatch actions.
- noobychris
- Posts: 25
- Joined: Fri Aug 15, 2025 5:06 pm
- Location: Germany
- Contact:
Re: Keyboard volume control
hudiy wrote: Thu Aug 07, 2025 3:13 pm
wkl3968 wrote: Thu Aug 07, 2025 2:39 pm basically I have decoded the LIN BUS from the car controller and I'm using an Arduino to emulate a USB keyboard. Is there a keyboard keystroke for commanding Hudiy like OpenAuto Pro? How should I approach this?
On Wayland (the default window system in Bookworm), keyboard emulation is very limited.
Compared to X11, there’s no straightforward way to achieve this.
The best approach would be to move your LIN BUS decoder program to the Raspberry Pi and use the API to send the appropriate commands to Hudiy.
Alternatively, you could send the decoded signals from the Arduino to the Raspberry Pi via the serial port.
Then, on the Raspberry Pi, you could read these signals from the serial interface and map them to the appropriate actions using the Hudiy API.

Keyboard emulation with the latest Raspberry Pi OS 64-bit (Bookworm) on my Raspberry Pi 4B (4 GB), using python-uinput 1.0.1 in my python3 script, is working fine.
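The serial-port route suggested in this thread (Arduino decodes the bus, Pi maps lines to actions) could be sketched roughly like this. Everything here is an assumption for illustration: the newline-delimited protocol (`VOL_UP`, `VOL_DOWN`, `MUTE`) and the injected `trigger` callable are hypothetical; only the three action names come from earlier in the thread:

```python
# Hypothetical text protocol the Arduino might send, one event per line.
SERIAL_TO_ACTION = {
    "VOL_UP": "output_volume_up",
    "VOL_DOWN": "output_volume_down",
    "MUTE": "toggle_output_muted",
}

def handle_serial_line(line, trigger):
    """Map one decoded bus event (one text line) to a Hudiy action.

    Returns the action name that was triggered, or None for unknown input.
    """
    action = SERIAL_TO_ACTION.get(line.strip().upper())
    if action is not None:
        trigger(action)
    return action
```

On the Pi this would sit in a loop around a serial `readline()`, with `trigger` wrapping the API call.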
Code: Select all
async def import_uinput():
    global events, device, uinput
    try:
        events = (
            uinput.KEY_1, uinput.KEY_2, uinput.KEY_UP, uinput.KEY_DOWN, uinput.KEY_LEFT, uinput.KEY_RIGHT,
            uinput.KEY_ENTER, uinput.KEY_ESC, uinput.KEY_F2, uinput.KEY_B, uinput.KEY_N, uinput.KEY_V,
            uinput.KEY_F12, uinput.KEY_M, uinput.KEY_X, uinput.KEY_C, uinput.KEY_LEFTCTRL, uinput.KEY_H, uinput.KEY_T
        )
        device = uinput.Device(events)
    except Exception as e:
        # NOTE: the original snippet is truncated here; a handler is
        # needed to close the try block. Error handling is a placeholder.
        print(f"uinput setup failed: {e}")
Code: Select all
elif msg == '373004000200' and back > 0:  # RNS-E: return button released
    if back <= 4:  # short press -> send ESC (press, then release)
        device.emit(uinput.KEY_ESC, 1)
        device.emit(uinput.KEY_ESC, 0)
    back = 0  # reset the press-duration counter
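The `back` counter pattern above (incremented while the button is held, checked on release) generalizes to a small short/long press classifier. A hedged sketch: the threshold of 4 polls mirrors the snippet, everything else is illustrative:

```python
def classify_press(held_polls, short_threshold=4):
    """Classify a button release by how many poll cycles it was held.

    Mirrors the RNS-E snippet: a release after <= short_threshold polls
    counts as a short press (ESC in the snippet); anything longer is a
    long press, which could be mapped to a different action.
    """
    if held_polls <= 0:
        return None  # release without a registered press
    if held_polls <= short_threshold:
        return "short"
    return "long"
```

Separating classification from the `device.emit` calls makes the timing logic testable without a `/dev/uinput` device.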