
Lego Fortnite

Studio: Epic Games
Role: Senior QA Engineer / Tech Art Specialist
Dates: July 2023 – March 2026
Engine: Unreal Engine 5
  • Unreal Engine 5
  • QA Systems Design
  • FX Validation
  • Asset Audit Tools
  • Claude Code
  • Python
  • API / MCP
  • Jira
  • Pipeline
Screenshots and screen captures from Lego Fortnite development — pending clearance for portfolio use

At Epic Games I was embedded across multiple Tech Art and content teams on Unreal Engine 5 and Lego Fortnite, working as both a QA engineer and a tech art specialist. The role covered pipeline integrity across FX, animation, content optimization, world/terrain, and procedural content — serving as the animation domain liaison and coordinating test coverage across a large-scale live-service project with distributed teams.

Two pieces of work from this role are worth detailing: an FX validation framework that addressed a systemic gap in test coverage, and an automation triage tool built with Claude Code that changed how the team handled daily test results.

FX Coverage Framework

The problem surfaced when I started tracing escaped defects — bugs that reached production without being caught in testing. A pattern emerged: a significant portion traced back to FX that had never been tested across platforms or scalability settings. FX that looked correct on the highest PC settings would produce visual artifacts, disappear entirely, or behave differently on console or low-end hardware. Nobody had a systematic process for catching this.

The response to most coverage gaps is to add more centralized QA work. That wasn't the right answer here. The FX touched every content team's work — adding a centralized FX testing pass would have required constant coordination overhead and would have created a bottleneck every time new content shipped. The problem needed to be distributed.

I developed a suite of test cases specifically for FX validation across platforms and scalability settings, then packaged them as team-specific guides rather than centralized procedures. Each content team received a version scoped to the FX types they produced, with clear steps they could run independently as part of their own review process. The result: FX coverage scaled with content volume without scaling the centralized QA load.
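The guides themselves were manual checklists, but the coverage model behind them is easy to sketch: a platform × scalability matrix expanded per FX type, then scoped down to what each team actually produces. A minimal illustration in Python (all names and axes are hypothetical, not the actual framework):

```python
from itertools import product

# Hypothetical coverage axes -- the real matrix was larger and team-specific.
PLATFORMS = ["PC", "PS5", "Xbox", "Switch"]
SCALABILITY = ["Low", "Medium", "High", "Epic"]
FX_TYPES = ["Niagara particles", "Decals", "Post-process FX"]

def coverage_matrix(fx_types, platforms, settings):
    """Expand FX types into per-platform, per-setting test cases."""
    return [
        {"fx": fx, "platform": p, "setting": s, "status": "untested"}
        for fx, p, s in product(fx_types, platforms, settings)
    ]

def team_guide(matrix, team_fx_types):
    """Scope the full matrix down to the FX types a given team produces."""
    return [case for case in matrix if case["fx"] in team_fx_types]

matrix = coverage_matrix(FX_TYPES, PLATFORMS, SCALABILITY)
vfx_team = team_guide(matrix, ["Niagara particles"])
print(len(matrix), len(vfx_team))  # 48 total cases, 16 scoped to one team
```

The scoping step is what made the distribution work: each team runs only the slice of the matrix it owns, so total coverage grows with content volume while no single team's checklist does.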

The distributed model also required being credible with the content teams about what they needed to test and why. That's where the tech art background mattered — it's easier to explain FX validation criteria to an FX artist when you can describe what the scalability system is actually doing to their particle system.

Automation Triage Tool

Daily automation runs produced test results across multiple platforms — PC, console, Switch — and those results needed to be reviewed, compared, and turned into actionable failure reports with Jira tickets attached. The manual process was slow and error-prone: results were in raw formats, comparison across platforms required context-switching between multiple files, and writing Jira-linked notes for each failure ate time that could go toward actual triage.

I built a solution with Claude Code in a single day. The tool is a Python script that pulls test results via API, processes them across platforms, and generates a self-contained HTML report. The report includes:

  • Side-by-side comparison of results across platforms, with failures highlighted
  • Trend tracking — whether a failure is new, recurring, or recently resolved
  • Jira-linked failure notes pulled from the relevant ticket via MCP integration
  • A clean, scannable layout that makes daily triage a reading task rather than an assembly task
Triage tool HTML report screenshot — pending
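The core of the tool is the trend classification plus the cross-platform table. A simplified sketch of that logic (the result shapes and test names here are hypothetical stand-ins; the real tool pulls results via API and layers on Jira-linked notes):

```python
import html

# Hypothetical result shape -- the real tool fetches these via the results API.
today = {
    "PC":     {"test_fx_spawn": "pass", "test_lod_swap": "fail"},
    "Switch": {"test_fx_spawn": "fail", "test_lod_swap": "fail"},
}
yesterday = {
    "PC":     {"test_fx_spawn": "pass", "test_lod_swap": "pass"},
    "Switch": {"test_fx_spawn": "fail", "test_lod_swap": "pass"},
}

def classify(test, platform, today, yesterday):
    """Trend-tag a result: new failure, recurring failure, resolved, or pass."""
    now = today[platform].get(test, "pass")
    prev = yesterday.get(platform, {}).get(test, "pass")
    if now == "fail":
        return "recurring" if prev == "fail" else "new"
    return "resolved" if prev == "fail" else "pass"

def report(today, yesterday):
    """Render a side-by-side HTML table: one row per test, one column per platform."""
    platforms = sorted(today)
    tests = sorted({t for results in today.values() for t in results})
    rows = []
    for test in tests:
        cells = "".join(
            f'<td class="{classify(test, p, today, yesterday)}">'
            f"{html.escape(today[p].get(test, '-'))}</td>"
            for p in platforms
        )
        rows.append(f"<tr><th>{html.escape(test)}</th>{cells}</tr>")
    head = "".join(f"<th>{p}</th>" for p in platforms)
    return f"<table><tr><th>Test</th>{head}</tr>{''.join(rows)}</table>"

print(report(today, yesterday))
```

The trend tag is what turns a raw pass/fail grid into a triage priority: a "new" failure gets attention first, a "recurring" one links back to its existing Jira ticket, and a "resolved" one closes the loop.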

The tool itself is a concrete example of how the Claude Code collaboration workflow functions in practice. The problem was well-understood, the output format was clear, and the integration points (APIs, MCPs, Jira) were documented. Within that framing, Claude Code could handle the implementation work — API calls, HTML generation, data transformation — while I focused on what the report needed to communicate and how it would be used. The result was a working tool, not a prototype, built and deployed in a day.

For more on how that collaboration actually works, see the How I Work page.

Asset Audit Tools

Separate from the triage work, I built asset audit tools using Unreal Editor Utility Widgets and extended an existing C++ submit validator to enforce mesh LOD and triangle count standards at check-in. These tools ran inside the editor, so artists could audit their own work before submitting rather than waiting for a downstream QA catch.
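The actual validator was C++ inside the submit pipeline, but the check logic is simple to sketch. A minimal Python stand-in, with all budgets and the reduction ratio as hypothetical placeholders (the real standards were project-specific):

```python
# Hypothetical budgets -- the real standards were project-specific.
MAX_TRIANGLES_LOD0 = 20_000
MIN_LOD_COUNT = 3
LOD_REDUCTION = 0.5  # each LOD should carry at most half the previous LOD's tris

def validate_mesh(name, lod_triangle_counts):
    """Return human-readable violations for one mesh (empty list = pass)."""
    errors = []
    if len(lod_triangle_counts) < MIN_LOD_COUNT:
        errors.append(
            f"{name}: only {len(lod_triangle_counts)} LODs, need {MIN_LOD_COUNT}"
        )
    if lod_triangle_counts and lod_triangle_counts[0] > MAX_TRIANGLES_LOD0:
        errors.append(
            f"{name}: LOD0 has {lod_triangle_counts[0]} tris, "
            f"budget is {MAX_TRIANGLES_LOD0}"
        )
    for i in range(1, len(lod_triangle_counts)):
        if lod_triangle_counts[i] > lod_triangle_counts[i - 1] * LOD_REDUCTION:
            errors.append(f"{name}: LOD{i} is not reduced enough vs LOD{i - 1}")
    return errors

print(validate_mesh("SM_Rock", [18_000, 9_000, 4_200]))  # [] -- passes
print(validate_mesh("SM_Tree", [25_000, 20_000]))        # three violations
```

Running this at check-in rather than in a downstream QA pass is the same distribution principle as the FX framework: the person who can fix the asset sees the failure immediately, with an actionable message instead of a bounced submission.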