Inspiration

The struggle of navigating public transportation without relying on a visually based app.

How it works

Apollo.AR is a mobile app plugin we are creating to support GPS-based auditory augmented reality. To demonstrate its use, we've mocked up a redesign of the MTA Bus Time app to create an experience that doesn't require the user to hold the phone (if they use headphones with a microphone) or look at the screen. The user opens the app and answers a set of auditory questions to determine the start and end points of a trip. The app then directs them to the bus stop and warns them of the bus's arrival. While on the bus, the user hears an announcement for each street being passed and one when their destination is the next stop.
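The core interaction described above — playing an announcement when the rider's GPS position enters a waypoint's radius — can be sketched in a few lines. This is a minimal illustration, not the actual prototype code: the waypoint names, coordinates, and trigger radius are all made-up placeholders, and in the real app the trigger would play an audio clip rather than return a name.

```python
import math

# Hypothetical waypoints along a bus route: (name, lat, lon).
# Coordinates are illustrative, not real MTA stop locations.
WAYPOINTS = [
    ("Main St", 40.7500, -73.9900),
    ("2nd Ave", 40.7520, -73.9880),
    ("Destination stop", 40.7540, -73.9860),
]

TRIGGER_RADIUS_M = 30.0  # assumed: announce when within ~30 meters


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def check_announcements(lat, lon, announced):
    """Return waypoints entered at this GPS fix that haven't been announced yet."""
    due = []
    for name, wlat, wlon in WAYPOINTS:
        if name not in announced and haversine_m(lat, lon, wlat, wlon) <= TRIGGER_RADIUS_M:
            announced.add(name)  # announce each waypoint only once per trip
            due.append(name)     # in the app, this would play the waypoint's audio clip
    return due
```

Calling `check_announcements` on each GPS update with a shared `announced` set gives one announcement per street as the bus passes it, matching the behavior described above.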

Challenges I ran into

We are not well versed in mobile development, so we are prototyping the app in the Unity 3D game engine; upon successful user testing, we will add a mobile developer to the team.

Accomplishments that I'm proud of

Designing and developing something that follows universal design principles.

What I learned

How to create a sound interaction based on GPS location, and how to design an app for universal usability.

What's next for Apollo.AR

Finishing the prototype (adding GPS points along a bus line with their accompanying sounds), testing the prototype, developing the mobile app, and brainstorming other areas that could use the technology.
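The first item above — pairing GPS points along a bus line with their accompanying sounds — could live in a simple route file. The sketch below is one hypothetical shape for that data, with a loader that validates each waypoint; the route name, coordinates, and audio file names are all placeholders, not real data.

```python
import json

# Hypothetical route file pairing GPS waypoints with audio clip paths.
ROUTE_JSON = """
{
  "route": "Example Line",
  "waypoints": [
    {"name": "First stop",   "lat": 40.7500, "lon": -73.9900, "sound": "first_stop.ogg"},
    {"name": "Cross street", "lat": 40.7520, "lon": -73.9880, "sound": "cross_street.ogg"},
    {"name": "Last stop",    "lat": 40.7540, "lon": -73.9860, "sound": "last_stop.ogg"}
  ]
}
"""


def load_route(text):
    """Parse a route definition and check each waypoint has the fields the app needs."""
    route = json.loads(text)
    for wp in route["waypoints"]:
        for field in ("name", "lat", "lon", "sound"):
            if field not in wp:
                raise ValueError(f"waypoint missing {field!r}")
    return route
```

Keeping routes as data rather than code would let new bus lines be added without rebuilding the prototype.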
