Leaves. And Collaborating.

A weekend creating something cool in AR with friends.  

May 22 6:30 PM
It all started with Spence Lindsay (Lindsay Digital), Chris Miller (Cloud Brigade and Launch Brigade) and me grabbing a drink at a local hangout called Reds Restaurant & Bar. We work out of NextSpace Santa Cruz, a fabulous collaborative coworking community. I’d collaborated with Chris in 2015, designing the UX for Bicycle Blue Book’s new marketplace, but hadn’t had the chance to work with my friend Spence. Tech Raising 2018, a local hackathon weekend, was approaching, and Spence had an idea. As we discussed the inspiration for it, our enthusiasm grew. By the time we parted ways an hour later, we’d formed a team, including David Fierstein and Mike Muldoon, and had a rough plan to build an iOS Augmented Reality (AR) app with Unity and Placenote on the front end and an AWS framework for the serverless backend.

My experience in AR? Zilch. Zero. Nada. But… I love figuring stuff out. I love learning. And I love the collaboration of designers and developers. I am a UX design geek. So I downloaded and tested a few AR apps, researched Unity and Placenote, and readied myself for a 48-hour adventure.

Day 1: Friday, June 1 6 PM - MIDNIGHT
48 hours to deliver a Minimum Viable Product (MVP) and pitch it to the judges and fellow hackathon participants. Our team met up around 6 PM and grabbed a space to work in. After the Tech Raising opening introductions, initial idea pitches, and requests for team members, our team got to work.

We were all iterating on various pieces at once. I use MindMup, a free, collaborative Chrome extension, to plan out user flows, map website flows, and so on, and it came in very handy for a hackathon. It let me share a live map of the user flows and modify it as we discussed the backend, frontend, and interactions based on our original concept. This was a living, breathing doc through the end of the weekend, and it let us plan out the features for the MVP, our stretch goals, and the full feature set for the app. By the end of the evening, we had a pretty good idea of the ideal experience for both user roles - sender and recipient.

Yeah. I blurred it. Shh. We don't want to give it all away.

Day 2: Saturday, June 2 8:30 AM - MIDNIGHT
Our Unity developers, David and Mike, worked on building the core functionality; Spence researched and designed the particle system tool for drawing in 3D space; Chris worked on setting up the serverless backend structure in Amazon Web Services (AWS); and I got started on the UX/UI product design. Much like designing a UI for a voice-controlled app, I wanted the focus to be on the drawing, so the UI needed to stay out of the way as much as possible. Complexity made simple and intuitive.

Having never designed within the constraints of Unity, I did a lot of research and worked with Spence to discover just what we could do in this space. I knew the functions we needed, but it took some team brainstorming to come up with a solution that would keep the UI clutter-free while still giving our users artistic control over brush size and color. Teamwork.
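To make the brush-size and color controls concrete, here’s a minimal sketch of a world-anchored particle brush in Unity. It’s my rough reconstruction of the idea rather than Spence’s actual tool: the component, field names, and values are placeholders, and it assumes the particle system simulates in world space with a generous max-particle count.

```csharp
using UnityEngine;

// Rough sketch of a "paint in space" brush: while the screen is touched, emit
// long-lived particles just in front of the AR camera so the stroke stays
// anchored in world space. Illustrative only - not the Leaves implementation.
// Assumes the assigned ParticleSystem uses World simulation space and a large
// Max Particles setting.
public class ParticleBrush : MonoBehaviour
{
    public ParticleSystem brushParticles;   // drag in a world-space particle system
    public Color brushColor = Color.green;  // wired to the color picker UI
    public float brushSize = 0.02f;         // wired to the brush-size slider
    public float drawDistance = 0.3f;       // metres in front of the camera (placeholder)

    void Update()
    {
        // Draw only while the user is touching the screen.
        if (Input.touchCount == 0) return;

        // Place the next dab of "paint" a fixed distance in front of the device.
        Vector3 tip = Camera.main.transform.position +
                      Camera.main.transform.forward * drawDistance;

        var dab = new ParticleSystem.EmitParams
        {
            position = tip,
            startColor = brushColor,
            startSize = brushSize,
            startLifetime = 1e6f,        // effectively forever for a demo
            velocity = Vector3.zero
        };
        brushParticles.Emit(dab, 1);
    }
}
```

Spence’s real tool was far more polished, but the effect the demo showed - particles staying put in space as you move around them - comes from emitting particles in world coordinates like this.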

By mid-morning I had the UI for the main screen roughly mocked up on paper, plus a plan for the rest of the interactions, while the rest of the team worked on their parts. We kept an ongoing scrum so we could develop and design concurrently, and no one lost time building something that wouldn’t work with what we were trying to do. Now keep in mind that we were at a hackathon, so there were plenty of interruptions from other people either needing help or advice, or just curious about what we were doing and what tools we were using. Plenty of craziness, but an amazing amount of collaboration.

Low-fidelity design

Once I had buy-in on the product design, I took it into SketchApp and started laying out the elements and screens needed for an interactive prototype. By evening I was ready to load them into InVision and set up a working interactive prototype for the team to view. This was an iterative process that involved the whole team as questions arose and functionality was discovered and discussed.

Prototype screen with UI tools showing.

Saving your "leaf" to share with a friend - who must then go to the same location to "find" it.

As the app was pretty complex on both the front end and the back end, we decided to make the InVision prototype part of our demo, and as Tech Raising participants and mentors came through we were able to show them what we had prototyped so far.
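Under the hood, the "leave it here, find it there" idea boils down to a hand-off between the phone and the backend. Here’s a very rough sender-side sketch of how I understood that flow; everything in it is illustrative: SaveLocationMap stands in for Placenote’s map-saving step (not an actual SDK call), and the endpoint and field names are hypothetical placeholders for the AWS API Chris was building.

```csharp
using System;
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Very rough sketch of the sender-side "leave a leaf" flow - illustrative only,
// not the real Leaves code. SaveLocationMap paraphrases the map-saving step
// (not an actual Placenote SDK call), and the endpoint/field names are made-up
// stand-ins for the AWS serverless API.
public class LeafSender : MonoBehaviour
{
    const string kUploadUrl = "https://example.com/leaves"; // hypothetical endpoint

    public void ShareLeaf(string drawingJson, string recipientId)
    {
        // 1. Save a map of the physical location; the callback hands back a map ID
        //    the recipient's phone can later use to relocalize in the same spot.
        SaveLocationMap(mapId => StartCoroutine(UploadLeaf(mapId, drawingJson, recipientId)));
    }

    // Placeholder for the SDK call that scans and stores the location map.
    void SaveLocationMap(Action<string> onSaved) => onSaved("demo-map-id");

    // 2. Send the map ID plus the drawing to the backend, addressed to the recipient.
    IEnumerator UploadLeaf(string mapId, string drawingJson, string recipientId)
    {
        var form = new WWWForm();
        form.AddField("mapId", mapId);
        form.AddField("drawing", drawingJson);
        form.AddField("recipient", recipientId);

        using (var request = UnityWebRequest.Post(kUploadUrl, form))
        {
            yield return request.SendWebRequest();
            Debug.Log("Leaf shared, HTTP " + request.responseCode);
        }
    }
}
```

The recipient side mirrors this: look up the leaf, load the saved location map by its ID, relocalize at the same physical spot, and replay the drawing.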

10 PM. As the UX and UI came along, the particle system drawing tool Spence was creating gave me inspiration for our logo and our first-time-use explainer screen. Leaves uses Placenote to permanently place our augmented reality drawings in precise locations, indoors and outdoors, so it made sense that our logo would play off the map icon using a leaf. Since our users would be hand drawing their 3D leaves, I drew a rough leaf, found a handwritten font with the same feel, and quickly put together a logo. Sometimes your first idea works: after a few quick iterations, we had our logo.

Hand drawn look.

This is our landing screen for first-time users. Keeping it simple.

We struggled with a tagline, so I pestered Tech Raising team members Matthew Swinteron and Margaret Rosa for help. Margaret came up with the winner: “What leaves me, you find”. Collaboration for the win!

Day 3: Sunday, June 3 8 AM - 6 PM
Demo day, and the pressure was on. The Unity-Placenote dev team was fighting bugs with the Placenote integration, so we decided to scale back the app demo to a proof of concept showing the coolest part of the app, with our interactive InVision prototype as the first part of the demo. Spence, Chris and I then worked on the elements to highlight in our pitch, and I pulled it together in a short Keynote presentation to open the pitch.

4:30 PM – Time to demo! Leaves was the 7th of 14 pitches – and it went well. Our team had such an amazing time collaborating, and it came through in our pitch. Spence started off talking about why we wanted to do this app (so we could all finally work together), followed by how we built it, including a cool mapping of our AWS backend (read Chris’s blog), then I demoed the interactive prototype, and David finished up by showing a live demo via Apple TV on the projector screen. He painted with his iPhone 8, with the room as the live canvas. As he moved around you could see the existing stream of particles staying in place, creating a fixed 3D image in space. He could move into and all around what he’d already drawn. We got a lot of “oohs” and “ahhs”.

We also got a lot of great feedback from the judges (thank you!), including suggestions for additional market ideas. Tech Raising was not a competition but a collaborative experience for the Santa Cruz tech community – a community I am proud to be a part of. And at the end of the weekend our Leaves team decided to keep building this app.

We leave you now, and hope you’ll find us soon in the app store!