My process, my proof of work.

While my professional work is bound by confidentiality, I've created the following examples to demonstrate my process and analytical capabilities. These projects replicate the frameworks I use to tackle research challenges related to onboarding, early user experience, and behavior over time—using myself as the primary case study.

We'll start with a look at my approach to longitudinal research, then dive into some projects focused on video games. While I have extensive experience in video game research, I also chose games because they are well suited to self-reported, small-scale case studies.

Longitudinal Research: Because Behavior Over Time Tells a Different Story.

While I’ve run many types of longitudinal studies over the years—varying in length, format, and visualization style—my most recent approach was inspired by the classic Napoleon’s March, the piece of information design that first drew me into this work back in grad school.

The chart shown here is a generic recreation of a visualization I developed at Meta to represent a specific type of longitudinal study I’ve designed and led multiple times.

The foundation is a 30-day daily survey completed by 100+ participants. They’re compensated regardless of whether they use the product each day—the only requirement is that they complete the survey. (The first three days include mandatory use so participants can form an informed opinion.)

The survey is lightweight (under 4 minutes) and asks what they did in the app or game and how they felt about it. If they didn’t use it, they’re asked why not and what they did instead. Throughout the study, I layer in 1:1 interviews with selected participants.
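To make the diary design concrete, here is a minimal sketch of how that daily data can be tallied. The field names and values are hypothetical, invented for illustration; the real instrument and analysis are not shown here. The key structural point is one row per participant per day, whether or not they used the product, with non-use days capturing a reason.

```python
# Illustrative shape of daily-diary data (names/values are hypothetical):
# one row per participant per day, including days with no product use.
from collections import Counter

diary = [
    {"pid": "p01", "day": 1, "used": True,  "why_not": None},
    {"pid": "p01", "day": 4, "used": False, "why_not": "no time"},
    {"pid": "p02", "day": 4, "used": False, "why_not": "played something else"},
    {"pid": "p02", "day": 5, "used": True,  "why_not": None},
]

# Overall usage rate across participant-days.
used_days = sum(row["used"] for row in diary)
print(f"usage rate: {used_days / len(diary):.0%}")

# What participants did instead on non-use days.
print(Counter(row["why_not"] for row in diary if not row["used"]))
```

Because non-use days are still compensated and still logged, the "why not" column becomes data rather than attrition, which is what makes this design different from a usage-gated diary study.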

Impact:

In this particular instance, I conducted a dozen studies across similar live-service games and apps. While each product gained value from its individual insights, the greater impact came from aggregating the data—revealing broader patterns and cross-product truths that directly informed and improved future live-service design strategies.

From User Expectations to Product Decisions.

I initiated a foundational research study to understand how user expectations were solidifying for nascent VR technology. Starting with the very popular shooter genre, we surveyed hundreds of players to benchmark their mental models for button mapping and physical interactions (e.g., item pickup). The resulting insights proved highly valuable to our internal development teams and were subsequently published on the Quest developer forum to drive broader design impact.

Impact:

This research gave teams an actionable framework for intuitive controls, which they reported saved development cycles by eliminating design ambiguity. The result was a more accessible, user-friendly experience, with the design now aligned with established player mental models.


Fortnite: What 15 Matches Reveal About the New Player Experience.

15 matches. Multiple metrics. One question: what does it actually feel like to be a new player in one of the most influential games ever made?

I analyzed the data as I would any full-scale study — and the findings were clear: the strongest predictors of fun were feeling like I had a fighting chance, match length, and high-but-not-frustrating challenge. I died constantly. I still had a blast. That gap between outcome and experience is exactly what good UX research is designed to surface.

I crunched the numbers in Excel and then used, among other methods, Claude Opus 4 to dig deeper. At a company, I'd collaborate with a data scientist to pressure-test the numbers and compare them to telemetry.


Conclusion:

The data suggests that increasing match times in the initial rounds could boost players’ feeling that they had more of a fighting chance, which in turn could encourage longer play sessions. This effect may be driven by increased awareness of weapon placement, enemy movement/skirmishes, and/or overall map awareness.


More video game work examples live here.