Enriching your App's UI with SimpleFingerGestures
Submitted by Arnav Gupta (@championswimmer) on Sunday, 12 October 2014
UI - full talks
Making your app stand out from competitors can be tough. Often the UI makes the difference, especially how users interact with the app. Users want intuitive and easy control over the app, and you want to cram in as many options as possible while keeping things simple.
One of the most effective ways to improve a UI is to implement touch gestures: pinch and unpinch, 1/2/3/4-finger drags, swipes, flicks, single/double/triple taps, and so on.
The open-source, free-to-use SimpleFingerGestures library lets you implement all such multi-touch gestures in your app's UI with only a couple of lines of code.
I initially made SimpleFingerGestures just as a helper for some of my apps. I put it up on GitHub, as I do by default with most of my hobby projects. Lo and behold, six months later, around 50 apps on the Play Store use this library.
SimpleFingerGestures lets you implement gestures such as:
1. Pinch and unpinch (with 2 to 4 fingers)
2. 1-finger, 2-finger, 3-finger, 4-finger swipes (up, right, left, down)
3. Double and triple taps
The library can be extended to add support for other gestures. Its code is also easy to understand, building on Android's simple OnTouchListener API.
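Extending to a new gesture mostly means recognizing a pattern in the stream of touch timestamps the OnTouchListener receives. A hedged sketch (plain Java with the Android event plumbing stripped out; the 300 ms gap is an assumed value) of counting double and triple taps:

```java
// Sketch: count consecutive taps by comparing their timestamps.
// Illustrative only; in the real library this would be driven by
// MotionEvent callbacks inside an OnTouchListener.
public class TapCounter {
    // Taps closer together than this (ms) belong to the same sequence.
    private static final long MAX_GAP_MS = 300;

    private long lastTapTime = -1;
    private int tapCount = 0;

    // Feed the timestamp of each completed tap; returns the running
    // count (1 = single, 2 = double, 3 = triple, ...).
    public int onTap(long timeMs) {
        if (lastTapTime >= 0 && timeMs - lastTapTime <= MAX_GAP_MS) {
            tapCount++;
        } else {
            tapCount = 1; // gap too long: start a new sequence
        }
        lastTapTime = timeMs;
        return tapCount;
    }

    public static void main(String[] args) {
        TapCounter tc = new TapCounter();
        System.out.println(tc.onTap(0));    // 1
        System.out.println(tc.onTap(200));  // 2 (double tap)
        System.out.println(tc.onTap(350));  // 3 (triple tap)
        System.out.println(tc.onTap(2000)); // 1 (new sequence)
    }
}
```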
Most popular apps, including those from Google, Facebook, and Twitter, use gestures like pinch-to-zoom and swipe-down-to-refresh. Now yours can too: adding support for each such gesture takes under 5 lines of code.
I have a flexible structure in mind. In a 15-minute slot for a crisp talk (which I am not averse to), I can cover:
a) How to use my library
b) How to extend the functionality of my library
If I am allowed a 40-minute slot (which I would absolutely love to get), I would also cover:
c) Where to use gestures, and where not to
d) Which gestures fit where, based on what users find intuitive
e) Small details in UI flow that help you direct your users without letting them know they are being steered (dark patterns, and their converse)
An Android development environment of your choice, and a device/emulator of your choice running any version of Android above 2.1 (Eclair).
Currently pursuing a Bachelor’s degree in Electrical and Electronics Engineering at Delhi Technological University, while also working at Cube26 as an Android Framework Engineer.
I have been a developer and device maintainer at CyanogenMod and AOKP, making the latest Android source work on Sony Xperia devices while adding awesome usability features that make users fall in love. A couple of features I have written have also made their way into Google's Android Open Source Project, and can be found in KitKat and Android L.
I have been an open-source community partner with Sony Mobile for the last two years, which means I get the latest Xperias to hack around with as soon as they are launched.
At Cube26, I have been part of the team that made many contextually smart UI/UX enhancements for the Micromax Canvas A290, A310, and A315 series of phones.
I am also an open source enthusiast with contributions to Linux, GNOME, Arduino, Android and other open source projects.
I was invited as a speaker at Mobile Developer Summit 2014 hosted by Saltmarch Media.