This is Elsa’s Coding Exercise. It allows Elsa to assess candidates’ ability to develop a simple app with quality code. Since it’s just a coding exercise, we value code quality above UI/UX: the UI needs to be functional, but it does not need to be fancy.
You can start this any time you like. Please take no more than 1 day to code. You can turn it in at any time. As you think about this exercise, please feel free to ask questions!
Please fork this repo and submit your solution in the fork.
Happy coding!
Create a Voice Note application that records audio and transcribes it to text. A location is captured for each note.
- A user should be able to create a note by starting to record audio
- Notes should be created with the recorded audio and transcribed using an ASR/voice-to-text service (e.g. the Google ASR API or similar; see the recording/transcription sketch after this list)
- Users should be able to edit the transcribed text
- Users should be able to see a list of their notes
- Users should be able to select a note and view the transcribed/edited text, play back the voice note, and view any metadata the note might have.
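As a minimal sketch of the record-and-transcribe flow, the snippet below uses Apple's AVFoundation and Speech frameworks; a cloud ASR service such as the suggested Google ASR API would work just as well. The `VoiceRecorder` type, method names, and file naming are illustrative, not part of the requirements, and the permission prompts (microphone, speech recognition) plus the matching Info.plist keys are omitted for brevity.

```swift
import AVFoundation
import Speech

/// Records a voice note to a file, then transcribes the file with Apple's Speech framework.
/// Permission requests and Info.plist usage descriptions are omitted here.
final class VoiceRecorder {
    private var recorder: AVAudioRecorder?
    private var recognitionTask: SFSpeechRecognitionTask?
    private let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("note-\(UUID().uuidString).m4a")

    func startRecording() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)

        // AAC mono at 44.1 kHz keeps the files small while staying easy to share.
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        recorder = try AVAudioRecorder(url: fileURL, settings: settings)
        recorder?.record()
    }

    /// Stops recording and asks the recognizer for a transcript of the saved audio file.
    func stopAndTranscribe(completion: @escaping (Result<String, Error>) -> Void) {
        recorder?.stop()
        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        recognitionTask = SFSpeechRecognizer()?.recognitionTask(with: request) { result, error in
            if let error = error {
                completion(.failure(error))
            } else if let result = result, result.isFinal {
                completion(.success(result.bestTranscription.formattedString))
            }
        }
    }
}
```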
(These are ideas; you are not obligated to implement any of them. We prefer to see an app with a limited feature set and quality code, as opposed to an app with more features that is hacked together.)
- Location automatically captured and saved as metadata for the note
- Implement persistence of the notes across app launches (see the note model/persistence sketch after this list)
- Allow the user to attach a photo to a note
- Allow the user to attach multiple photos to a note
- Allow the user to share the audio + text via iMessage and other services that support those media types (see the share-sheet sketch after this list)
- Allow the user to share the audio + text + attached image(s) via iMessage and other services that support those media types
- Map view of all the notes, with tap to select a note and view it (see the map sketch after this list)
- Create an iMessage app that, when activated in iMessage, shows a list of the notes taken and allows sharing them
- Create an Apple Watch app extension that lists the notes with their content
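For the location-metadata and persistence ideas above, here is one possible shape: a `Codable` note model whose latitude/longitude would come from Core Location, saved to a JSON file in the app's Documents directory. The `Note` and `NoteStore` names and the `notes.json` file are illustrative; Core Data or SQLite would be equally valid choices.

```swift
import Foundation

/// One possible note model: transcript, a reference to the audio file, and location metadata.
struct Note: Codable, Identifiable {
    let id: UUID
    var transcript: String
    let audioFileName: String
    let createdAt: Date
    let latitude: Double?   // captured from Core Location when the note is created
    let longitude: Double?
}

/// Persists notes across app launches as a JSON file in the Documents directory.
final class NoteStore {
    private let storeURL = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("notes.json")

    private(set) var notes: [Note] = []

    init() {
        // Load whatever was saved on a previous launch, if anything.
        if let data = try? Data(contentsOf: storeURL),
           let saved = try? JSONDecoder().decode([Note].self, from: data) {
            notes = saved
        }
    }

    func add(_ note: Note) {
        notes.append(note)
        save()
    }

    private func save() {
        if let data = try? JSONEncoder().encode(notes) {
            try? data.write(to: storeURL, options: .atomic)
        }
    }
}
```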
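For the sharing items, the system share sheet (`UIActivityViewController`) already surfaces iMessage, Mail, and any other service that accepts the given item types. A minimal sketch, assuming a UIKit presentation context; `shareNote` is a hypothetical helper, and attached photos could simply be appended to the `items` array.

```swift
import UIKit

/// Presents the system share sheet with a note's transcript and audio file.
/// Services that accept text + audio (iMessage, Mail, etc.) appear automatically.
func shareNote(transcript: String, audioURL: URL, from presenter: UIViewController) {
    let items: [Any] = [transcript, audioURL]
    let shareSheet = UIActivityViewController(activityItems: items, applicationActivities: nil)
    presenter.present(shareSheet, animated: true)
}
```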
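For the map-view idea, one possible MapKit approach: each note with coordinates becomes an annotation, and tapping a pin hands the note's identifier back to whoever presents the detail screen. `NoteAnnotation`, `noteID`, `addPin`, and `onSelectNote` are illustrative names, not a required design.

```swift
import CoreLocation
import MapKit
import UIKit

/// Annotation that remembers which note it represents so a tap can open that note.
final class NoteAnnotation: NSObject, MKAnnotation {
    let noteID: UUID
    let coordinate: CLLocationCoordinate2D
    let title: String?

    init(noteID: UUID, coordinate: CLLocationCoordinate2D, title: String) {
        self.noteID = noteID
        self.coordinate = coordinate
        self.title = title
    }
}

/// Full-screen map of all notes that have a location; tapping a pin reports the note's id.
final class NotesMapViewController: UIViewController, MKMapViewDelegate {
    private let mapView = MKMapView()
    /// Called with the selected note's id; the caller decides how to show the detail screen.
    var onSelectNote: ((UUID) -> Void)?

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        mapView.delegate = self
        view.addSubview(mapView)
    }

    /// Adds one pin per note; `title` shows in the pin's callout.
    func addPin(noteID: UUID, latitude: Double, longitude: Double, title: String) {
        let coordinate = CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
        mapView.addAnnotation(NoteAnnotation(noteID: noteID, coordinate: coordinate, title: title))
    }

    func mapView(_ mapView: MKMapView, didSelect view: MKAnnotationView) {
        guard let annotation = view.annotation as? NoteAnnotation else { return }
        onSelectNote?(annotation.noteID)
    }
}
```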