Kivy on Raspberry Pi with a Touchscreen

This took way too long to get right 😢 All I really want is to write a simple graphical fullscreen UI in Python that runs on the small touchscreen attached to the Pi.

Fundamental Problems:

  • The touchscreen is connected through the GPIO port and not through the GPU. This means access is through the kernel FB driver only and there is no hardware acceleration of any kind.
  • SDL on Raspbian has not been compiled with proper touch-device support; specifically, tslib is not enabled. This breaks pygame in interesting ways (touch coordinates end up all over the place) and requires a custom-built SDL.
  • SDL also has not been compiled with the kmsdrm backend, which apparently breaks Kivy when run on the console.
  • As far as I can tell Kivy requires GL support of some kind, which is only provided by the GPU and not by the FB device anyway – in other words, Kivy cannot output directly to the touchscreen.
  • I’ve played with Kivy before and found it somewhat frustrating – there’s so much magic going on in the background, which I found very difficult to grok. Also, when things go wrong, the error messages are not helpful. (For example, during console testing I had set some SDL environment vars. They were still set when I tried running Kivy under x11 and caused fascinating but completely misleading errors that had me looking in all sorts of wrong places for hours… 😒 ) Still, Kivy seems to be the best of breed if I don’t want to hack basic fundamental UI things by hand.
  • My first thought was to run the UI straight on the console because it’s “simpler”, but I eventually realized that because of all the SDL and framebuffer snafu it would require recompiling a bunch of things. (Which is something I don’t like – I prefer my automation hosts to have rather pristine system setups, with customization limited to the apps as much as possible.) There’s also another issue – I can’t reasonably test console apps on my other systems. So running the apps under X11 would at least have the advantage of allowing me to VNC into the machine for remote testing.
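Given how much time the leftover SDL environment variables cost me, it’s worth checking for them before launching anything. A minimal sketch (the helper name is mine, not from any library):

```python
import os

def stale_sdl_vars(environ=None):
    """Return any SDL_* variables still set in the environment.

    Leftovers like SDL_VIDEODRIVER from console experiments can silently
    point SDL (and therefore Kivy) at the wrong backend under X11.
    """
    if environ is None:
        environ = os.environ
    return {k: v for k, v in environ.items() if k.startswith("SDL_")}
```

Running `python3 -c 'from check_sdl import stale_sdl_vars; print(stale_sdl_vars())'` before a test session would have saved me hours.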

Approach Taken:

  • Basic enablement of touchscreen through /boot/config.txt:
    dtoverlay=rpi-display,speed=32000000,rotate=270
  • Use the builtin GPU to get full hardware acceleration support (whatever little the Raspberry Pi really has…) and render video output without a HDMI device connected. For this force HDMI resolution to the same as the touchscreen by adding this to /boot/config.txt:
    hdmi_force_hotplug=1
    hdmi_group=2
    hdmi_mode=87
    hdmi_cvt=320 240 60 1 0 0 0
  • Run raspi2fb as a service under systemd to constantly copy the output of the GPU onto the touchscreen FB device.
    • I also experimented with rpi-fbcp. The code is much simpler, but by default it runs at an update rate of ~40 Hz, which eats a lot of CPU time. It has fewer options and no systemd file either.
    • I looked at fbcp-ili9341 too, which is incredibly impressive, with a lot of clever tricks to speed up the copy process, achieving very high FPS and apparently even allowing fluid games. But the speed comes at the cost of missing touch support – that’s a no-go for me.
  • Configure X input for the touchscreen: in the (new) directory /etc/X11/xorg.conf.d/, create the file 99-ads7846-cal.conf. (This is the “270° rotation” setup from here.)
Section "InputClass"
    Identifier "calibration"
    MatchProduct "ADS7846 Touchscreen"
    Option "EmulateThirdButton" "1"
    Option "EmulateThirdButtonButton" "3"
    Option "EmulateThirdButtonTimeout" "1500"
    Option "EmulateThirdButtonMoveThreshold" "30"
    Option "InvertX" "0"
    Option "InvertY" "1"
    Option "SwapAxes" "1"
    Option "TransformationMatrix" "0 1 0 -1 0 1 0 0 1"
EndSection
  • Configure system using raspi-config:
    • System -> Boot -> Desktop autologin
    • Interface -> VNC -> enable
    • Performance -> GPU Memory -> 64M (or something, just not the smallest amount or X won’t start)
    • Advanced -> GL Driver -> GL with Fake KMS
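For reference, the raspi2fb service from the steps above needs nothing exotic under systemd. A sketch of a unit file – the install path is an assumption, and raspi2fb ships its own unit file, so prefer that one:

```ini
[Unit]
Description=Copy GPU framebuffer to the touchscreen framebuffer
After=multi-user.target

[Service]
ExecStart=/usr/local/bin/raspi2fb
Restart=always

[Install]
WantedBy=multi-user.target
```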

Now after rebooting the touchscreen shows a (tiny) X11 desktop and it should be possible to connect through VNC and see the same. Next step is installing Kivy.
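To make sense of the TransformationMatrix in the xorg snippet above, here is a small sketch of how X applies a 3×3 matrix (given as 9 row-major values) to touch coordinates normalized to [0, 1]. The function name is mine; note that InvertY and SwapAxes also factor into the final orientation:

```python
def apply_matrix(x, y, m):
    """Apply an X11 TransformationMatrix (9 row-major values) to a
    normalized touch coordinate (x, y) in [0, 1]."""
    nx = m[0] * x + m[1] * y + m[2]
    ny = m[3] * x + m[4] * y + m[5]
    return nx, ny

# The matrix from the xorg.conf.d snippet:
ROT = [0, 1, 0, -1, 0, 1, 0, 0, 1]
```

For example, `apply_matrix(0, 0, ROT)` maps the corner (0, 0) to (0, 1) – the matrix rotates the touch plane to match the rotated display.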

  • Confusingly, some Kivy dependencies that need to be installed are only listed after the rest of the process is described, making them easy to miss. 😒 Look them up first; for Kivy2 they’re here right now. Install the big list for Buster first, then the small list for the “Desktop environment”.
  • Now back to the normal Kivy installation. No need to install and use virtualenv; the Python 3 built-in venv works fine.
  • Testing with e.g. share/kivy-examples/demo/touchtracer/main.py should work now, again both on the actual touchscreen and over VNC. Here are the essential bits of the Kivy log:
[INFO ] [Image ] Providers: img_tex, img_dds, img_sdl2, img_pil (img_ffpyplayer ignored)
[INFO ] [Text ] Provider: sdl2(['text_pango'] ignored)
[INFO ] [Window ] Provider: sdl2(['window_egl_rpi'] ignored)
[INFO ] [GL ] Using the "OpenGL" graphics system
[INFO ] [GL ] Backend used
[INFO ] [GL ] OpenGL version
[INFO ] [GL ] OpenGL vendor
[INFO ] [GL ] OpenGL renderer
[INFO ] [GL ] OpenGL parsed version: 3, 1
[INFO ] [GL ] Shading version
[INFO ] [GL ] Texture max size <8192>
[INFO ] [GL ] Texture max units <32>
[INFO ] [Window ] auto add sdl2 input provider
[INFO ] [Window ] virtual keyboard not allowed, single mode, not docked
[INFO ] [GL ] NPOT texture support is available
[INFO ] [ProbeSysfs ] device match: /dev/input/event0
[INFO ] [HIDInput ] Read event from
[INFO ] [Base ] Start application main loop
[INFO ] [HIDMotionEvent] using
[INFO ] [HIDMotionEvent] range ABS X position is 0 - 4095
[INFO ] [HIDMotionEvent] range ABS Y position is 0 - 4095
[INFO ] [HIDMotionEvent] range ABS pressure is 0 - 255

It’s important to check that the sdl2 providers are being used and that touch input is properly detected.
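Those two checks can be automated when testing remotely. A rough sketch (the function is mine; it just greps the log lines shown above):

```python
def kivy_log_ok(lines):
    """Check a Kivy startup log for the two things that matter here:
    the sdl2 window provider and a detected HID touch device."""
    has_sdl2_window = any("[Window" in line and "sdl2" in line for line in lines)
    has_touch = any("HIDMotionEvent" in line for line in lines)
    return has_sdl2_window and has_touch
```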

Followup: setting up kiosk mode
