Part of a series: #squeak-phone
Here’s a quick demo video of the status of my quixotic project to get Squeak running as a kind of standalone userland on a modern cellphone:
(The sound is bad! I recorded it on my other cellphone…)
Currently, I develop by coding in Squeak on my Linux desktop, using my
graphics tablet as a kind of proxy for a touchscreen. I use the FFI
and AIO/OSProcess support in Squeak to read events from
/dev/input/event.... For event sources that present absolute axes, I
create instances of
HandMorph in the
World and animate them
according to the incoming events.
Every now and then, to test on the real hardware, I copy the changes
and image files up to the cellphone, and then log in to the phone over
ssh to restart the Cog VM.
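The copy-and-restart cycle is a couple of shell commands. A sketch, with the hostname, directory, and VM invocation all placeholder assumptions rather than the real setup (the commands are echoed rather than executed so the sketch is safe to run anywhere; drop the leading `echo`s to do it for real):

```shell
#!/bin/sh
# Sketch of the test-on-hardware cycle; every name here is an assumption.
set -eu

PHONE="phone.local"            # assumed ssh alias for the handset
IMAGE_DIR="/home/user/squeak"  # assumed image location on the phone

deploy() {
    # Push the image and changes files produced on the desktop.
    echo scp squeak.image squeak.changes "$PHONE:$IMAGE_DIR/"
    # Restart the Cog VM on the phone under the new image.
    echo ssh "$PHONE" "pkill -f cogvm; cd $IMAGE_DIR && exec ./cogvm squeak.image"
}

deploy
```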
At the moment, the image has enough smarts to figure out how to read the touchscreen and offer basic touchscreen click support. This lets me do simple things like open, move and close windows, and lets me save and/or quit the image.
Next steps are to make it harder to misclick – perhaps by increasing the size of some of the touch targets – and to think about coding up a simple onscreen keyboard.
Alternatively, on a parallel path, I’ve been reverse-engineering
(really nothing more sophisticated than …)
the Samsung protocols for booting and operating the cellular modem.
The code is short and simple. Perhaps instead of an onscreen keyboard
I’ll code up a quick dialer
Morph and get Squeak making phone calls.