Here are some projects I've worked on (with some details elided for confidentiality).

Global E-Commerce API

I was the principal developer of the administrative backend for a global e-commerce system that offered different SKUs, prices, and discounts based on location, date, local time, and information from third-party customer loyalty programs.

I implemented and documented a GraphQL API that gave the frontend team a flexible interface to work with. By generating the API from the database schema, I was able to focus on designing and testing the data model, avoiding a great deal of tedious and error-prone coding.
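The schema-driven idea can be illustrated with a minimal sketch: derive GraphQL type definitions mechanically from a description of the database schema, so the API never drifts out of sync with the data model. The table layout and type mapping below are hypothetical, not the actual system.

```python
# Minimal sketch: render a table's columns as a GraphQL SDL object type.
# The column-type mapping and table definition here are illustrative only.

SQL_TO_GRAPHQL = {
    "integer": "Int",
    "text": "String",
    "numeric": "Float",
    "boolean": "Boolean",
}

def table_to_sdl(table_name, columns):
    """Render one table (name -> column-type dict) as a GraphQL type."""
    fields = "\n".join(
        f"  {name}: {SQL_TO_GRAPHQL[col_type]}"
        for name, col_type in columns.items()
    )
    # Capitalize to follow GraphQL type-name conventions.
    return f"type {table_name.capitalize()} {{\n{fields}\n}}"

sdl = table_to_sdl("product", {"id": "integer", "sku": "text", "price": "numeric"})
print(sdl)
# type Product {
#   id: Int
#   sku: String
#   price: Float
# }
```

In practice a generator like this also needs relationships, nullability, and resolvers, but the payoff is the same: one source of truth for the data model.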

The data layer was solid, performant, and included built-in features to prevent logic errors from exposing data between customer accounts. Providing a flexible API and thorough documentation to the frontend team let me support multiple frontend developers.
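One way to build that kind of safeguard into a data layer is to route every read through a helper that injects the account filter itself, so application code never has the chance to forget it. This is a simplified sketch with an invented schema, not the production design.

```python
# Sketch of account scoping at the data layer: the WHERE clause is
# applied in one place rather than trusted to every caller.
# Table and column names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, account_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 100, 9.99), (2, 200, 5.00), (3, 100, 12.50)],
)

def orders_for_account(account_id):
    # Scoping happens here, once; callers cannot query across accounts.
    cur = conn.execute(
        "SELECT id, total FROM orders WHERE account_id = ?", (account_id,)
    )
    return cur.fetchall()

print(orders_for_account(100))  # rows for account 100 only
```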

Collaborative Clinical History Database

I was part of a collaboration to pool clinical data from labs around the world, enabling novel analyses by creating a dataset with more statistical power than any lab could get on their own.

I built an online portal for easy, secure data submission and then worked with domain experts to clean the data (mapping fields between datasets, normalizing treatment information, etc.) and combine them into a single database.
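The field-mapping and normalization steps can be sketched as a small harmonization pass: rename each source field to its canonical name, then collapse synonymous treatment labels. The field names and synonym table below are hypothetical stand-ins for the real, expert-curated mappings.

```python
# Illustrative cleaning step: map heterogeneous source fields onto a
# shared schema and normalize treatment labels. All mappings are
# invented for the example; the real ones came from domain experts.

FIELD_MAP = {
    "pt_age": "age",
    "age_years": "age",
    "tx": "treatment",
    "therapy": "treatment",
}

TREATMENT_SYNONYMS = {
    "chemo": "chemotherapy",
    "chemotherapy": "chemotherapy",
    "rt": "radiotherapy",
}

def harmonize(record):
    """Return a record keyed by canonical field names, labels normalized."""
    out = {}
    for key, value in record.items():
        canonical = FIELD_MAP.get(key, key)  # pass unknown fields through
        if canonical == "treatment":
            value = TREATMENT_SYNONYMS.get(str(value).strip().lower(), value)
        out[canonical] = value
    return out

print(harmonize({"pt_age": 54, "tx": "Chemo"}))
# {'age': 54, 'treatment': 'chemotherapy'}
```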

The result was a reproducible data artifact with thorough data provenance records, version controlled history, and a data dictionary that our collaborators could use to pursue their own analyses.

Data Quality Monitoring

I worked for an organization adopting a cloud data warehouse. A major concern was data quality as they migrated data streams and complex database logic.

Working with the analysts responsible for the data, I built a data quality tool with a simple UI for creating standard checks (e.g., "this number should always be in a particular range"). I also provided a web-based editor for creating custom checks using SQL. Checks could be run ad hoc or on a schedule, notifying analysts when problems arose.
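The two kinds of checks can be sketched with a shared convention: a check is a query, and it fails when the query returns any rows (the offending ones). The table, column names, and data below are made up for illustration.

```python
# Sketch of the two check types: a parameterized range check built from
# UI input, and a free-form SQL check written by an analyst.
# A check "fails" when its query returns rows. Example data is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)", [("a", 10.0), ("b", 999.0)])

def range_check(table, column, lo, hi):
    """Standard check: rows whose value falls outside [lo, hi].

    A real tool must validate table/column identifiers before
    interpolating them; only the bounds are bindable parameters.
    """
    sql = f"SELECT * FROM {table} WHERE {column} NOT BETWEEN ? AND ?"
    return conn.execute(sql, (lo, hi)).fetchall()

def custom_check(sql):
    """Custom check: analyst-supplied SQL; offending rows come back."""
    return conn.execute(sql).fetchall()

bad = range_check("readings", "value", 0, 100)
print(bad)  # [('b', 999.0)] -- the out-of-range row
```

Keeping both check types behind the same "rows returned means failure" contract is what lets the scheduler and notifier treat them uniformly.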

The tool was self-contained and simple to deploy, and giving analysts an option to create checks in a familiar language limited the complexity of the required built-in checks, letting the project deliver within its time and complexity budget.

Batch Sync Optimization

I was brought in to help a startup onboard a new client with a substantially larger dataset. Their nightly synchronization batch jobs were projected to run longer than 24 hours--an obviously unacceptable situation.

After reviewing the code and finding multiple avenues for optimization, we profiled the data synchronization procedure to find the highest-impact work, and extracted a representative subset of the data so that we could iterate quickly under realistic conditions.
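The measure-first workflow looks roughly like this in Python: sample a representative slice, run the sync under a profiler, and rank functions by cumulative time before touching anything. The sync function here is a trivial stand-in for the real procedure.

```python
# Sketch of profile-before-optimizing: run the sync on a representative
# sample under cProfile and inspect the hot spots. sync_records is a
# stand-in for the real per-record work.
import cProfile
import io
import pstats
import random

random.seed(0)
full_data = list(range(100_000))
sample = random.sample(full_data, 1_000)  # small but representative slice

def sync_records(records):
    # Placeholder for the real synchronization work.
    return sum(r * r for r in records)

profiler = cProfile.Profile()
profiler.enable()
sync_records(sample)
profiler.disable()

report = io.StringIO()
stats = pstats.Stats(profiler, stream=report).sort_stats("cumulative")
stats.print_stats(5)  # top five entries by cumulative time
print(report.getvalue().splitlines()[0])
```

The point of the sample is iteration speed: each profile-fix-reprofile loop takes seconds instead of hours, while still exercising realistic data shapes.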

After fixing the worst bottlenecks and applying some judicious caching, we delivered a >95% reduction in run time, on schedule and on budget.
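The caching idea, in miniature: memoize a lookup that the batch job repeats for the same key many times per run. `functools.lru_cache` stands in for whatever cache the real system used, and the exchange-rate lookup is an invented example.

```python
# Sketch of memoizing a repeated per-key lookup inside a batch job.
# The lookup and its data are invented; the call counter just
# demonstrates the effect of the cache.
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def fetch_exchange_rate(currency):
    # Stand-in for a slow remote lookup.
    global calls
    calls += 1
    return {"USD": 1.0, "EUR": 1.1}[currency]

total = sum(fetch_exchange_rate(c) for c in ["USD", "EUR", "USD", "USD", "EUR"])
print(calls)  # 2 -- each currency fetched once despite five uses
```

The usual caveat applies: this only works because the looked-up values are stable for the duration of the batch run.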

AI-Powered Wellness App

I was engaged to build the backend for a wellness app that used LLMs and text-to-speech to create guided visualizations based on user input. The client wanted to add visuals to the app's output.

The progressive web app we delivered let us: