Android

Team Name :

SHA

Team Members :

  • Aiman Abdullah Anees
  • Hrishikesh Hiraskar
  • Salman Shah

About :

  • Notes::Nearby is a location-based Augmented Reality (AR) app.
  • It's all about attaching notes to locations.
  • Add a note to your current location, and view nearby notes on the map as well as in AR.

Minimum Android Version required

Android Jelly Bean 4.2 (API level 17)

Directions to use

To add a note: Click + on the Map screen or select 'Post Note' from the navigation menu, enter the details, and click 'Add Note to my location'. Your note is posted.

To view a note: On the Map screen, click a marker to see the details of the note (title, description, location). Alternatively, select 'View in Air' from the navigation menu to view notes in Augmented Reality; it uses your device camera to show the notes relative to your position. In the AR view, click a note to see its description.

What was used to make the app

  • Azure Mobile App Service : For connecting the app to the database (see the sketch after this list)
  • Azure SQL Database : The backend of the app, used to store the notes
  • Google Maps : To visualize the user's location and nearby notes
  • Google Sign-In : To authenticate users
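
As a rough illustration of how these pieces fit together, here is a minimal sketch of posting a note through the Azure Mobile App Service backend. It assumes the Azure Mobile Apps Android SDK; the class names, backend URL, and table columns below are placeholders for illustration, not the app's actual code.

```java
import android.content.Context;
import com.microsoft.windowsazure.mobileservices.MobileServiceClient;
import com.microsoft.windowsazure.mobileservices.table.MobileServiceTable;

public class NotesBackend {

    private final MobileServiceClient client;
    private final MobileServiceTable<Note> noteTable;

    public NotesBackend(Context context) throws Exception {
        // Connect to the Azure Mobile App Service backend (placeholder URL).
        client = new MobileServiceClient("https://notesnearby.azurewebsites.net", context);
        // The table is backed by the Azure SQL Database that stores the notes.
        noteTable = client.getTable(Note.class);
    }

    // Persist a note (title, description, coordinates) to the backend.
    public Note postNote(Note note) throws Exception {
        return noteTable.insert(note).get();
    }

    // Hypothetical model class mirroring the notes table columns.
    public static class Note {
        public String id;
        public String title;
        public String description;
        public double latitude;
        public double longitude;
    }
}
```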

About Augmented Reality

The most notable feature of the Notes::Nearby app is 'View in Air'. It is an Augmented Reality-based feature that overlays nearby notes on the live view from the device camera.

Implementing AR was a huge task for us. Thankfully, NetGuru had published a basic AR implementation for Android that uses the device sensors, location, and camera. However, that implementation could only tell when an object should be shown; the other features, such as animating objects on screen, showing objects at the correct direction/bearing, and calculating the distance of objects from the device, still had to be built.

The working of AR is as follows:

  • Calculate the angle between the object, the device, and north (say alpha) using trigonometry
  • Calculate the angle between north and the device screen (say beta) using the device sensors
  • If beta is within some specified range of alpha, calculate the position of the object on screen and show the object at that position (a minimal sketch of this math follows the list).
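
The sketch below illustrates that math; it is not the app's actual code, and the class name NoteArMath and the 60° field-of-view constant are assumptions. Alpha is the bearing from the device to the note, beta is the device azimuth (which would typically come from SensorManager.getRotationMatrix() and SensorManager.getOrientation()), and the horizontal screen position is interpolated from the angular offset between the two.

```java
public final class NoteArMath {

    // Assumed horizontal field of view of the camera, in degrees.
    private static final double CAMERA_FOV_DEG = 60.0;

    /**
     * Bearing (alpha) from the device to the note, measured clockwise
     * from north, in degrees [0, 360).
     */
    public static double bearingToNote(double devLat, double devLng,
                                       double noteLat, double noteLng) {
        double phi1 = Math.toRadians(devLat);
        double phi2 = Math.toRadians(noteLat);
        double dLambda = Math.toRadians(noteLng - devLng);
        double y = Math.sin(dLambda) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    /**
     * If the note's bearing (alpha) falls inside the camera's field of view
     * around the device azimuth (beta), return its horizontal screen
     * x-coordinate in pixels; otherwise return -1 (off-screen).
     */
    public static float screenX(double alpha, double beta, int screenWidthPx) {
        // Signed angular offset of the note from the screen centre, in [-180, 180).
        double offset = ((alpha - beta + 540.0) % 360.0) - 180.0;
        double halfFov = CAMERA_FOV_DEG / 2.0;
        if (Math.abs(offset) > halfFov) {
            return -1f; // note is outside the camera's field of view
        }
        // Map [-halfFov, +halfFov] linearly onto [0, screenWidthPx].
        return (float) ((offset + halfFov) / CAMERA_FOV_DEG * screenWidthPx);
    }
}
```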

What we couldn’t do

We wanted to integrate the Bing Speech API into our app, but couldn't because we ran out of time.

A Vote of Thanks to Microsoft

We would really like to thank Microsoft Academia Accelerator for giving us the opportunity to showcase our skills in app development. Microsoft showed us how easy it is to develop an app and introduced us to the services provided by Microsoft and Azure, which are really useful for building an app and really easy to use. We thoroughly enjoyed the Hackathon.

Thank you :smile: