Building Context-Aware Apps
Submitted by N Vinay Shetty (@nvinayshetty) on Monday, 25 July 2016
Full talk (40 minutes)
With more than a lakh of new apps entering the store every month, mobile apps have risen like a rocket ship, but how many of them actually end up in the hands of users? With the enormous amount of content and the swarm of notifications these apps generate, people are experiencing app overload!
Today's smartphones have more computing power than the rockets NASA used to send man to the moon. How can we effectively leverage this computing power, and these sensors, to provide a better app experience?
In this talk, I will show how to better understand users and delight them by leveraging the device's processing power and built-in sensors to create context-aware, assistive apps.
I will dive deep into what it takes to build context-aware Android applications and how to make the transition from the information age to the experience age using the sensors built into smartphones. The talk will cover topics including, but not limited to, using the Awareness, Snapshot, and Fence APIs to understand the user's location, time, weather, physical activity, nearby beacons, etc., and collectively derive the intelligence needed to deliver a great user experience.
Vinaya Prasad is a full-time Android developer. He has worked in various fields, including healthcare, workforce management systems, home automation, and travel. He currently works as a software engineer at redbus.
He is an active open source contributor; the tools he has open sourced are currently used by thousands of Android developers. He is also a Raspberry Pi hacker and IoT enthusiast with experience combining connected devices and sensors with smartphones.
Right from his college days he has been enthusiastic about technology and loves sharing it with the community. He has won many awards for the technical presentations and projects he has presented.