NLnet grant announcement: https://github.com/Helium314/HeliBoard/issues/2226
NLnet project description: https://nlnet.nl/project/GestureTyping/
Swipe-o-Scope repository: https://codeberg.org/eclexic/swipe-o-scope
CC BY-SA 4.0 license: https://creativecommons.org/licenses/by-sa/4.0/
Article on data anonymization: https://www.science.org/doi/10.1126/sciadv.adn7053
Contact me:
My Mastodon: https://mstdn.social/@theeclecticdyslexic
My Matrix: https://matrix.to/#/@eclexic:matrix.org
Text-based tutorial: https://github.com/Helium314/HeliBoard/wiki/Tutorial:-How-to-Contribute-Gesture-Data
It is recommended that you install HeliBoard through the F-Droid app unless you know what you are doing!
How to install F-Droid: https://f-droid.org/en/docs/Get_F-Droid/
HeliBoard on F-Droid: https://f-droid.org/en/packages/helium314.keyboard/
HeliBoard on GitHub: https://github.com/Helium314/HeliBoard/releases
Gesture typing library links from MindTheGApps:
ARM64: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/arm64/proprietary/product/lib64/libjni_latinimegoogle.so
ARM: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/arm/proprietary/product/lib/libjni_latinimegoogle.so
X86_64: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/x86_64/proprietary/product/lib64/libjni_latinimegoogle.so
Other ways to contribute:
Providing packaging scripts for Swipe-o-Scope:
- for Windows
- for macOS
- for Flatpak. This is a big task: Swipe-o-Scope uses Qt modules (the QtGraphs module) that are not currently supported by the KDE SDK, and it is written using the PySide6 Python library rather than in C++. To build Swipe-o-Scope for Flatpak, you will probably have to talk with KDE developers and the PySide6 baseapp maintainer, and get them to update the SDK and the baseapp to support PyQtGraph. This is all in addition to needing to know a little about building Flatpaks.
- for Linux via means other than Flatpak (e.g. the AUR)
Providing input or code if you are knowledgeable about any of the following:
- gesture typing using hand-designed algorithms... (bonus points if you have worked on a paper or product that you could help us make an open implementation of WITHOUT violating anyone's intellectual property)
- gesture typing using neural nets and constrained compute, such as on mobile devices without TPUs... (unfortunately you may not be able to contribute here effectively until we have the data collected, organised, and released at the end of the collection period)
- the JNI in Android... (bonus points if you have a working knowledge of the AOSP Latin IME JNI library)
- natural language processing for next word prediction, specifically comparing the suitability of a set of candidate words against one another... (either by ngrams or any other low-compute method available to a mobile device with no internet connection)
- building a diverse small-to-medium sized multi-lingual corpus of natural language text we could legally use to simulate context... (not stealing copyrighted content in bulk, like certain companies.)
- making desktop apps more user-friendly... (Swipe-o-Scope doesn't currently give user feedback that would be helpful to anyone who doesn't feel comfortable in a terminal)
- being patient while performing thorough code reviews and audits of Rust code... (the gesture recognition library will most likely be written in Rust, despite me being more experienced in other languages, as I am betting on it still being popular in a decade or three)
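To make the ngram point above concrete, here is a toy sketch of low-compute candidate ranking: score each candidate word by a smoothed bigram probability given the previous word. Everything here (the corpus, function names, smoothing choice) is illustrative, not project code.

```python
from collections import Counter

# Toy corpus standing in for whatever context corpus the project ends up with.
corpus = "i want to type this i want to swipe this i want to go home".split()

# Count bigrams (previous word, next word) and unigrams from the corpus.
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def score(prev_word: str, candidate: str) -> float:
    """Estimate P(candidate | prev_word) with add-one smoothing."""
    vocab = len(unigrams)
    return (bigrams[(prev_word, candidate)] + 1) / (unigrams[prev_word] + vocab)

def rank(prev_word: str, candidates: list[str]) -> list[str]:
    """Order a set of candidate words from most to least likely."""
    return sorted(candidates, key=lambda w: score(prev_word, w), reverse=True)

print(rank("want", ["home", "to", "swipe"]))  # "to" ranks first
```

A real implementation would need a much larger corpus, pruned count tables to fit in a keyboard's memory budget, and probably backoff to unigrams, but the candidate-comparison shape is the same.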
Full disclosure: out of concern for copyright issues and code quality, we will not be accepting ANY LLM-generated contributions to this project, neither code nor corpus text. Thank you!
If you work on another on-screen keyboard, have the ability to collect data from it, and want to add to the data set, contact me about the particulars of the file format. Not all of its requirements are obvious, and they should not be assumed!
We are still in the early stages, and this project is likely to continue for quite a while, possibly well past the end date of my NLnet funding. Don't hesitate to reach out if you think you can help in some other way! We can use all the help we can get; gesture typing is a hard problem with a very high ceiling. Every little improvement matters!
Remember, even sharing this project around will be helpful at the moment. I can't be boosting this all the places I maybe should be; I have to be working on code, and this video took long enough!
I thought it might get some reach when cross-posted here
https://github.com/Helium314/HeliBoard/discussions/1161
I'm pretty sure it was also mentioned in the video
Isn't that only to enable Swipe? What does that have to do with the awful autocorrect?
I understood the context here to be about the keyboard figuring out which word you meant. So that would be Swipe, no?
I'm pretty sure @AndrewZabar was talking about the normal autocorrect, as in, the suggestions you get while you type
I was referring to when my typing is not perfect: how it handles deciding what I meant to type. So, like, if I wanted to type this sentence, my fingers might have hit "si likr uf I wsntrd ti tyow tuos semtebcd, my fingerd night have git". Whereas my iPhone, or Android with Gboard, would have corrected all of it perfectly, the HeliBoard app gets maybe 20% of it properly corrected, if I'm lucky.
If I could have the privacy and customizability of HeliBoard with the accuracy of Gboard, I'd be happy happy.
Hm. My bad, then
Maybe this needs some data gathering too
All good