Snapbar

A rising star that made the Inc. 500 list in 2019, Snapbar was hit hard by the dramatic changes of the COVID-19 pandemic. Almost overnight, it shifted from in-person, staffed photo booth experiences to a new photo booth software product that catered to virtual events. The initial version was built in-house by three developers and saw instant demand. Velvet Software came on in late 2020 to define an architecture that would carry Snapbar through the coming years, lead new overseas engineers, and build out new, critical features.

Hub

Snapbar was eager to create not just a hit product but an ecosystem of products that drew on their strengths in photography-based experiences. We started discussing architecture and recognized that several products would tie into a central hub for storing, organizing, moderating, and serving the photos, videos, and animations created through those products. I designed a RESTful API that brought these concepts together. Snapbar's CTO was familiar with Django from past startups, so I used it together with Django REST Framework to build out the first version of Hub.
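
To make the shape of the API concrete, here is a minimal sketch of the kind of resource Hub exposes through Django REST Framework. The model and field names (Event, Asset, moderation_status) are illustrative placeholders rather than the actual Hub schema:

```python
# Illustrative sketch of a Hub-style resource; not the real schema.
from django.conf import settings
from django.db import models
from rest_framework import serializers, viewsets


class Event(models.Model):
    """An event (e.g. a virtual conference) whose media is stored in Hub."""
    owner = models.ForeignKey(settings.AUTH_USER_MODEL, related_name="events", on_delete=models.CASCADE)
    name = models.CharField(max_length=255)


class Asset(models.Model):
    """A photo, video, or animation captured by one of the products."""
    MEDIA_TYPES = [("photo", "Photo"), ("video", "Video"), ("animation", "Animation")]
    MODERATION_STATES = [("pending", "Pending"), ("approved", "Approved"), ("rejected", "Rejected")]

    event = models.ForeignKey(Event, related_name="assets", on_delete=models.CASCADE)
    media_type = models.CharField(max_length=20, choices=MEDIA_TYPES)
    file = models.FileField(upload_to="assets/")
    moderation_status = models.CharField(max_length=20, choices=MODERATION_STATES, default="pending")
    created_at = models.DateTimeField(auto_now_add=True)


class AssetSerializer(serializers.ModelSerializer):
    class Meta:
        model = Asset
        fields = ["id", "event", "media_type", "file", "moderation_status", "created_at"]


class AssetViewSet(viewsets.ModelViewSet):
    """CRUD endpoints for assets, scoped to the account making the request."""
    serializer_class = AssetSerializer

    def get_queryset(self):
        # Each account only sees assets belonging to its own events.
        return Asset.objects.filter(event__owner=self.request.user)
```

Keeping the asset concept generic across photos, videos, and animations is what lets later products plug into the same endpoints rather than each growing its own storage and moderation layer.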

This project had several interesting challenges. The first was discovering what should be built. We had lots of collaborative discussion with Snapbar and a partner company that was also consuming the API. I made sure that the domain models reflected immediate needs but were also generic and loosely coupled enough to serve the planned future products. We had to extend Django substantially to support the per-account granular permissions that allow clients and employees to interact with the API. I wrote over 1,000 unit and integration tests to ensure that permissions were enforced properly and that the API performed as designed. I worked closely with Ukrainian developers to help them write front-ends that authenticated through our auth provider and consumed the API. Finally, I managed infrastructure and deploys with Terraform and GitHub Actions. This new system has handled over 1,000 events and managed over 150k photos from attendees.
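
As one illustration of the permissions work, the granular checks live most naturally in custom Django REST Framework permission classes. The sketch below is heavily simplified, and the Membership model and role names are hypothetical:

```python
# Simplified sketch of a per-account permission check; Membership and the role
# names are hypothetical stand-ins for the real, more granular system.
from rest_framework import permissions


class HasAccountRole(permissions.BasePermission):
    """Allow reads to any member of the owning account; writes need an elevated role."""

    required_roles = {"employee", "admin"}  # hypothetical role names

    def has_object_permission(self, request, view, obj):
        # Membership is a hypothetical join model between users and accounts.
        membership = request.user.memberships.filter(account=obj.account).first()
        if membership is None:
            return False
        if request.method in permissions.SAFE_METHODS:
            return True
        return membership.role in self.required_roles
```

A viewset opts in by listing a class like this in its permission_classes; the real system layers several such checks per resource, which is what the bulk of the test suite exercises.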

Snapshot

Snapshot (previously known as Virtual Booth) was Snapbar's initial hit software product. It had seen rapid development in 2020, but that development left a lot of technical debt too. Customers were asking for animation/boomerang creation capabilities, and I set out to figure out how we could build this. I began by working with the Chief Product Officer to understand the feature, then started building basic prototypes to explore it. This yielded a quick, tangible proof of concept: we could build the feature, and we knew what the risks were. I showed it to the team and then started looking at how we might integrate it into the core product.

This process yielded some clear refactoring efforts. We could switch from a home-grown Jekyll + Babel + Browserify build system to Next.js, which offered better bundling via Webpack and top-to-bottom support for React (the whole codebase was React except the wiring to Jekyll). This yielded a 10-second (37%) page load speedup on slow, 3G-like connections: perfect for attendees of events with spotty WiFi. The logic that controlled a user's flow through the product was encoded as a lot of conditional statements spread across different React components. We could switch to a state machine-based approach to guiding the user through the product, where states represented the different screens they viewed and next/back buttons could lean on the state machine to determine where to go (see the sketch below). Finally, the process of composing the user's captured image with stickers/frame/text was a homegrown <canvas>-based approach. I evaluated different drawing libraries and recommended Fabric.js as the preferred way of composing these images. I pitched the team on all of this, laying out the tradeoffs and the benefits of evolving the current product toward these goals as opposed to starting a new product from scratch. They agreed, and I addressed each of these as multi-week refactoring efforts. The animation feature was easy to implement on top of these changes, adapting what I had learned from the early prototypes to the core product.
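
The state-machine idea is easiest to see in code. The toy sketch below is written in Python for consistency with the other examples here; the real implementation lives in the React codebase, and the screen names are invented:

```python
# Toy sketch of a screen-flow state machine: states are screens, transitions
# are the events a next/back button can fire. Screen names are invented.
FLOW = {
    "welcome":   {"next": "capture"},
    "capture":   {"next": "customize", "back": "welcome"},
    "customize": {"next": "share", "back": "capture"},
    "share":     {"back": "customize"},
}


class ScreenFlow:
    def __init__(self, start="welcome"):
        self.current = start

    def transition(self, event):
        """Move to the screen reached by 'next' or 'back', if that transition exists."""
        target = FLOW[self.current].get(event)
        if target is None:
            raise ValueError(f"No '{event}' transition from '{self.current}'")
        self.current = target
        return self.current
```

A button simply fires "next" or "back" and renders whichever screen the machine lands on, instead of re-deriving the flow from conditionals scattered across components.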

This product is customized on a per-event basis in a clever way using Git branching and subdomain hosting provided by Netlify. A talented operations team takes a client's requests and assets provided by design and creates a new branch for that client's event. Sometimes this involves customizing code to add new steps or custom integrations. This allows Snapbar to meet interesting client requests with a quick turnaround. This team didn't come from an engineering background, though, and was trained to make these customizations with occasional input from engineering. I recognized that the developer setup process for operations was time-consuming and confusing, so I worked to simplify and document it. Ultimately, I wrote a CLI tool that new team members use to get set up and to ensure that dependencies such as Node are installed, at the right version, and configured correctly. This tool has brought the setup time down from days to less than 2 hours.
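
To give a flavor of what the tool does, here is a simplified sketch of one of its checks, shown in Python purely for illustration. The required version number is a placeholder, and the real tool covers many more dependencies and walks the user through fixing each problem:

```python
# Simplified sketch of a single setup check; the real CLI verifies many
# dependencies and guides the user through remediation.
import shutil
import subprocess
import sys

REQUIRED_NODE_MAJOR = 16  # placeholder; the real tool pins the project's version


def check_node():
    if shutil.which("node") is None:
        sys.exit("Node is not installed. Install it from https://nodejs.org and re-run.")
    version = subprocess.run(["node", "--version"], capture_output=True, text=True).stdout.strip()
    major = int(version.lstrip("v").split(".")[0])
    if major != REQUIRED_NODE_MAJOR:
        sys.exit(f"Node {version} found, but this project expects v{REQUIRED_NODE_MAJOR}.x.")
    print(f"Node {version} looks good.")


if __name__ == "__main__":
    check_node()
```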

Helium Development

Helium Development started in 2015 by developing and customizing Shopify websites for clients, but the steady growth of some side-project apps on the Shopify App Store shifted the focus of the company primarily to apps in 2019. Their flagship app, Customer Fields, grew in popularity and shops were asking for more features. Velvet Software has been involved with Helium from the start, acting as a technical mentor and investor through the years.

Customer Fields: Integrations

Velvet Software was most recently approached about a specific architectural goal: how do we build third-party integrations, possibly having them built by an outside vendor or contractor? And more importantly, how do we do this without compromising our principles and practices around data security and privacy? As one of Shopify's top 500 apps, Customer Fields was managing tens of millions of customers' data across thousands of shops.

Up to this point, Customer Fields was a Rails monolith with data stored in MongoDB, all running on Google Cloud Platform. The lead developer and I discussed options, and Google Cloud Pub/Sub seemed like a good fit as a messaging system between the third-party integrations and the main app. It was established early on that data flow would be unidirectional: integrations would be sent messages about changes to customer data. I evaluated different designs for Pub/Sub topics, subscriptions, messages, and push-versus-pull delivery, and laid out the advantages and disadvantages of each. After explaining these options, I prototyped a Node.js-based integration and provided working demo repositories for both the Rails side and the integration side. We discussed some of the pitfalls of messaging: accidental infinite loops, the need for idempotence, and how to preserve message ordering. Shops would need to configure these third-party integrations for their shop, providing API keys, changing settings, etc., and that configuration needed to live somewhere. We discussed a handful of options, including pushing the configuration via Pub/Sub, but landed on having each integration provide a configuration web page that Customer Fields would redirect to or embed, passing along a JSON Web Token to prove that the request was authenticated. Finally, I designed a way to do a full sync of customers to an integration, not just incremental updates as changes happen. This would ensure that data was present shortly after the integration was configured, or could be refreshed on demand as necessary.
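
As a rough sketch of the message flow, publishing a customer-change event looks something like the snippet below. It is shown with the Python Pub/Sub client purely for illustration (the main app is Rails and the prototype integration was Node.js), and the project, topic, attribute, and field names are hypothetical:

```python
# Illustrative sketch of publishing a customer-change message to Pub/Sub.
# Project, topic, and field names are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient(
    # Ordering keys let Pub/Sub preserve per-customer message order.
    publisher_options=pubsub_v1.types.PublisherOptions(enable_message_ordering=True)
)
topic_path = publisher.topic_path("example-project", "customer-changes")


def publish_customer_change(shop_id: str, customer: dict, change: str) -> None:
    payload = json.dumps({"shop_id": shop_id, "change": change, "customer": customer})
    publisher.publish(
        topic_path,
        data=payload.encode("utf-8"),
        ordering_key=f"{shop_id}:{customer['id']}",  # keep one customer's changes in order
        change=change,  # message attributes let subscribers filter cheaply
    )
```

In this sketch, the ordering key keeps a single customer's changes in sequence, and carrying the full customer payload in each message lets an integration process redeliveries idempotently.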

This project was turned around in two billable weeks and gave the team a clear direction for the implementation. Velvet's familiarity with GCP products and with prototyping answered a lot of architectural questions in a concrete way.

Humming

Humming is an ad tech startup based in Tacoma, WA that is trying to simplify how advertisers interact with demand-side platforms (DSPs). I was Humming's CTO in 2019 and 2020 and led a team of four engineers. We navigated a lot of change during this time period, and my goal was to drive the product forward incrementally and create stability for my team. We turned an unreliable, undocumented MVP that kept slipping deadlines into a product that saw features delivered weekly with excellent uptime.

We then started looking toward future product ideas, and I led an effort to take the time/location/keyword/site analytics data from all of the different DSPs we worked with and present easy-to-understand aggregated stats to customers. I used Airflow to orchestrate this, with processing work in Python and SQL to clean and normalize the data. The pipeline organized several gigabytes of data per day and had over 100 steps to produce the final reports.
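
A condensed sketch of the pipeline's shape in Airflow might look like the following; the task names, DSP names, and schedule are illustrative, and the real DAG had over 100 tasks:

```python
# Condensed, illustrative sketch of the reporting DAG's shape.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_dsp_stats(dsp: str, **context):
    """Pull the day's time/location/keyword/site stats from one DSP."""
    ...


def normalize(**context):
    """Clean and reconcile the raw extracts into a common schema."""
    ...


def build_reports(**context):
    """Aggregate the normalized data into customer-facing reports."""
    ...


with DAG("dsp_reporting", start_date=datetime(2020, 1, 1), schedule_interval="@daily", catchup=False) as dag:
    extracts = [
        PythonOperator(task_id=f"extract_{dsp}", python_callable=extract_dsp_stats, op_kwargs={"dsp": dsp})
        for dsp in ["dsp_a", "dsp_b", "dsp_c"]  # placeholder DSP names
    ]
    normalize_task = PythonOperator(task_id="normalize", python_callable=normalize)
    reports_task = PythonOperator(task_id="build_reports", python_callable=build_reports)

    extracts >> normalize_task >> reports_task
```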