
Hackathon Project #5: Augmented reality park feedback experience


This project provides actionable information about a public park (its features and amenities) and an accessible, easy means of reporting maintenance issues within the park to the responsible parties.

Use cases

  • The pavement is broken. The user can pull out a phone with this AR app to see which agency is responsible, then call or text to report the problem.
  • There is construction in the park. Someone who is blind or partially sighted encounters it while walking down a path. They pull out their phone and hold it up; they hear a message about who is in charge of the construction, along with a prompt asking whether they want to report it as inaccessible. They make the call and then go about their day. If they want to request additional help, there is an option for that as well.

This prototype must include

  • Visual and audio components: you should hear an indicator when something reportable is in view
  • Interaction for further information (“shake to hear more”)
  • A call to action: send an email, make a call, notify responsible persons
  • Awareness of the user's location, included in the action / report
  • We assume the user has a phone or tablet.
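The location-aware call to action above could be sketched roughly as follows. This is a minimal illustration in Python rather than AR code; the agency table, coordinates, and the `build_report` helper are all hypothetical placeholders, not part of the project spec.

```python
import math

# Hypothetical catalog mapping park zones (center + radius in meters)
# to the party responsible for maintenance in that zone.
AGENCY_ZONES = [
    {"name": "Parks Dept. Pathways", "phone": "555-0101",
     "lat": 40.7812, "lon": -73.9665, "radius_m": 150},
    {"name": "Construction Contractor", "phone": "555-0102",
     "lat": 40.7830, "lon": -73.9650, "radius_m": 80},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance via an equirectangular projection
    (accurate enough at park scale)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6_371_000 * math.hypot(dx, dy)

def responsible_agency(lat, lon):
    """Return the nearest zone whose radius covers the user, or None."""
    hits = [z for z in AGENCY_ZONES
            if distance_m(lat, lon, z["lat"], z["lon"]) <= z["radius_m"]]
    return min(hits, default=None,
               key=lambda z: distance_m(lat, lon, z["lat"], z["lon"]))

def build_report(lat, lon, issue):
    """Attach the user's location and the responsible party to the report."""
    agency = responsible_agency(lat, lon)
    return {
        "issue": issue,
        "location": {"lat": lat, "lon": lon},
        "agency": agency["name"] if agency else "unknown",
        "contact": agency["phone"] if agency else None,
    }
```

In a real build, the device's GPS fix would feed `build_report`, and the returned contact would drive the call/text/email action.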

This prototype could optionally include

  • The ability to associate a targeted object (found by the user) with a known object (cataloged previously in a database), and to link the two in the action / report: a computer-vision comparison step for the case where you don't have a pre-located list of objects in AR space
  • Ability to request instantaneous help with something that’s broken (e.g. elevator)
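The object-association step above could work as a nearest-match lookup against precomputed image embeddings. The sketch below assumes each cataloged asset has a feature vector; the catalog contents, vector values, and threshold are invented for illustration.

```python
import math

# Hypothetical catalog: object id -> feature vector (e.g. an image
# embedding computed offline for each cataloged park asset).
KNOWN_OBJECTS = {
    "bench-12":   [0.9, 0.1, 0.0],
    "fountain-3": [0.1, 0.8, 0.3],
    "elevator-1": [0.0, 0.2, 0.95],
}

def cosine(a, b):
    """Cosine similarity between two vectors; 0.0 if either is zero."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_known_object(target_vec, threshold=0.9):
    """Link a user-targeted object to the most similar cataloged object,
    or return None if nothing clears the similarity threshold."""
    best_id, best_sim = None, 0.0
    for obj_id, vec in KNOWN_OBJECTS.items():
        sim = cosine(target_vec, vec)
        if sim > best_sim:
            best_id, best_sim = obj_id, sim
    return best_id if best_sim >= threshold else None
```

A successful match would let the report reference a known asset id instead of just raw coordinates; a failed match falls back to the location-only report.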

Major risks to work on / What we hope to learn

  • Does this create too much visual or auditory clutter?
  • Should you have to plug in headphones to use this? Do you hold the device to your ear?
  • What does it feel like to hear sounds as part of the AR experience?
  • Is it annoying to have to report something by talking to your phone (auditory experience)?
  • How accurate are the positioning and orientation? Do you actually feel like you're focusing on the thing you mean to be?
  • How many things should have report mechanisms on them?
  • How long does it take to go through a full reporting process?
  • Where should the indicators appear visually within the scene? Above the object? Embedded in the object? With a symbol?

Hardware/materials needed

  • iPads
  • Computers
  • Earphones
  • Mic (or computer mic)

Special skills needed

  • Experience with AR programming
