Learning Programming

Makers Academy – My Final Project

For the last two weeks at Makers Academy, I was very fortunate to work on a really exciting project, which I would love to carry on developing in the future. My team and I developed an iOS app aimed at helping users with varying degrees of visual impairment navigate an underground station. We used Bluetooth beacon technology to send notifications to the user once the app detects the closest beacon with an ID that it recognises.

How it works (in its simplest form):

[Screenshot: diagram showing how the app works]
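At its core, the detection side comes down to ranging iBeacons with CoreLocation. Here's a minimal sketch of that idea – the class name, beacon UUID and print statement are illustrative, not taken from our actual code:

```swift
import CoreLocation

// Sketch only: listen for beacons in a known region and react to the nearest one.
class BeaconDetector: NSObject, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()
    // Hypothetical UUID shared by all the beacons installed in the station
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!,
        identifier: "station-beacons")

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
        locationManager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Beacons arrive sorted by proximity, so the first one is the closest
        guard let nearest = beacons.first else { return }
        print("Nearest beacon: major \(nearest.major), minor \(nearest.minor)")
        // ...look up the instruction for this beacon ID and update the on-screen text
    }
}
```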

We built this app in Xcode using Swift – a new language and toolset for all of us, which wasn't taught on the course. I'm super proud of what we were able to achieve in just two weeks, especially as we didn't know how to code at all 13 weeks earlier!

We spent the first couple of days getting to grips with the basics of Swift by each working through a FizzBuzz tutorial, which can be found here: https://medium.com/@ynzc/getting-started-with-tdd-in-swift-2fab3e07204b

This was great because it let us keep learning the way we had been all along: through Test-Driven Development, one of the practices we had followed from the very beginning at MA.
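As a rough illustration of what that looked like, here's a FizzBuzz written test-first with XCTest – a sketch along the lines of the tutorial, not its exact code:

```swift
import XCTest

// Write a failing test first, then the simplest implementation that makes it pass.
class FizzBuzzTests: XCTestCase {
    func testThreeReturnsFizz() {
        XCTAssertEqual(fizzBuzz(3), "Fizz")
    }

    func testFiveReturnsBuzz() {
        XCTAssertEqual(fizzBuzz(5), "Buzz")
    }

    func testFifteenReturnsFizzBuzz() {
        XCTAssertEqual(fizzBuzz(15), "FizzBuzz")
    }
}

func fizzBuzz(_ number: Int) -> String {
    if number % 15 == 0 { return "FizzBuzz" }
    if number % 3 == 0 { return "Fizz" }
    if number % 5 == 0 { return "Buzz" }
    return String(number)
}
```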

After those first couple of days, we spent the rest of the project pair programming, with daily morning stand-ups and end-of-day retros where we would discuss our ideas, any blockers we had, and what to do the next day. These stand-ups and retros were really important for our productivity.

Once we had our Minimum Viable Product (simply the app being able to pick up one Bluetooth beacon and update the on-screen text), we implemented extra features to make the app as user-friendly as possible: for example, speech-to-text functionality, so that the user can tell the app which station they would like to go to, as well as text-to-speech for the notifications. We also incorporated the TfL API so that when the user is on the platform, they get a notification for the next train. If we had more time, we would combine the speech-to-text functionality with the API so that the user could be guided to the correct platform for their destination.
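For the text-to-speech side, iOS provides AVSpeechSynthesizer. Here's a minimal sketch of how a notification could be read aloud – the message and voice settings are just examples, not our exact implementation:

```swift
import AVFoundation

// Sketch only: speak a notification out loud
let synthesizer = AVSpeechSynthesizer()

func speak(_ message: String) {
    let utterance = AVSpeechUtterance(string: message)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
    synthesizer.speak(utterance)
}

// e.g. triggered when the nearest beacon changes
speak("You are at the ticket barriers. The escalators are ahead of you.")
```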

Other features include: increasing and decreasing the font size so that the user can adjust it to their level of visual impairment; vibrating three times for left and two times for right as extra guidance; the ability to turn the voice-over on and off, if they don't need the text-to-speech functionality and would rather listen to something else; and a tap-to-repeat function in case they miss a verbal notification.
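To give an idea of the vibration cue, here's a rough sketch that spaces the system vibration into countable pulses – the function name, timing and use of the system vibration sound are assumptions for illustration, not our exact code:

```swift
import Foundation
import AudioToolbox

// Sketch only: three pulses mean turn left, two mean turn right
enum Direction { case left, right }

func vibrate(for direction: Direction) {
    let pulses = (direction == .left) ? 3 : 2
    for i in 0..<pulses {
        // Space the pulses out so the user can count them
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(i) * 0.6) {
            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
        }
    }
}
```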

Our simple interface, designed with advice from the Blind People Society on colour scheme, font size, and font style:

[Screenshots: the app's interface]

Our code is available on GitHub, if you would like to check it out!

https://github.com/aabolade/GuideMe

This post was originally published on my blog, http://louisaspicer.wordpress.com, where you can find more posts about my experience at Makers Academy.

 

