HMDA Redesign Project Overview


Timeline: 8 weeks (September - October 2025)


The Problem: Loan officers were consistently skipping required HMDA demographic collection during in-branch applications due to a cumbersome 15+ click workflow with no validation or enforcement. This created significant compliance exposure during CFPB examinations and burdened back-office teams with hours of manual data cleanup.


Key Metrics: Eliminate 15+ unnecessary clicks, achieve a sub-3-minute completion time, prevent common HMDA violations through built-in validations, target a 4+ banker confidence rating (1-5 scale), and achieve a 100% completion rate for HMDA-reportable loans.


Success: Loan officers complete HMDA collection confidently without skipping steps, real-time validation prevents compliance errors before submission, back-office teams spend less time on data cleanup, and nCino establishes scalable patterns for other regulatory flows like 1071 business demographics.


TLDR

  • Led end-to-end redesign of HMDA demographic collection to reduce skip paths and compliance risk

  • Advocated for designing within the new application architecture, influencing product roadmap priorities

  • Partnered with compliance, dev, and PMs to define validation logic and separation of regulatory vs operational data

  • Used Claude AI to translate dense regulatory language into plain-English requirements and generate test scenarios for edge cases like proxy collection

  • Created a single-page, guided experience that enforces rules in real time through disabling and contextual guidance

  • Established scalable patterns now used for other regulatory collection flows (like 1071 business demographics)

Our Team

Systems audit

  1. Irrelevant buttons

  2. Badges that no one used

  3. Tables with broken links

  4. Blank objects

From Compliance Burden to Design Opportunity

In our previous workflow, bankers were frequently skipping required HMDA collection during in-branch applications. The Entity Compliance form required too many clicks, lacked clear validation, and didn’t distinguish between individual and business applicants. Even when bankers wanted to do the right thing, the system made it difficult to know when and how to collect demographic data.


The result was a costly loop of missing data, manual cleanup by back-office teams, and compliance risk for banks during CFPB examinations. From a business standpoint, this also slowed loan processing and left nCino lagging behind competitor systems that offered more modern validation and enforcement.


Redesigning HMDA was both a compliance necessity and a strategic opportunity. If we could simplify the flow, prevent errors in real time, and educate bankers at the moment of decision, we could reduce compliance exposure and set a new design standard for other regulatory experiences across the platform.

Evaluating the Legacy HMDA Experience for Usability and Compliance

Lack of Guidance Led to Skipped HMDA Data Collection


There were no clear indicators that HMDA was required, and no error states for incompletion. Bankers were skipping demographic collection entirely to save time during applications.


Inefficient, Inconsistent, and Confusing Form Design


The legacy form was jargon-heavy, inconsistent, and inefficient. Irrelevant fields and unclear categories increased collection time and confusion, while poorly framed dropdowns and missing validations led to frequent data errors and incomplete records.

















  1. The form was filled with jargon, which added to bankers' desire to simply "move on".





  2. Fields that were completely irrelevant to HMDA applications were listed, increasing collection time unnecessarily.








  3. Bankers found the dropdown menus difficult to understand and navigate, which increased cognitive load.


  4. Categories and sub-categories were not given clear mutual exclusivity, causing confusion during selection and creating HMDA recording issues downstream.


  5. Dropdown selections lacked plain language and positive framing.



















  6. "Other" fields were separated from their checkbox counterparts, causing bankers to skip required fields.



On the whole, no validation errors were enforced anywhere in the form.

Minimal Post-Collection Visibility for Bankers


After HMDA data was collected, the rendered data table surfaced very little useful information for bankers, particularly when closers completed reporting or when loans were audited during regular compliance reviews.


Understanding the Rules Behind the Work

To ground the redesign in both user behavior and regulation, I reviewed everything available across our ecosystem. Using Dovetail, I analyzed recordings of bankers navigating the current HMDA flow and saw frequent hesitation, skipped fields, and “N/A” workarounds.

Using Federal Guidelines to Inform UX Decisions


I read through the official CFPB documentation and physical branch forms to understand exactly how the questions were worded and what rules governed each step.


Reviewing the official HMDA form clarified why bankers found the existing experience confusing — it lacked the structure and contextual guidance seen in the regulation itself.


Leveraging Existing Patterns from Small Business Workflows


I connected with our Small Business Design Lead, who had recently designed the 1071 Business Demographic Collection flow, to compare approaches and identify shared compliance patterns. The Small Business modal flow offered a proven pattern for demographic intake, which informed our direction for HMDA.




Finally, I partnered with Armstrong Bank’s compliance team to walk through my early wireframes and confirm that our interpretations of federal rules matched their audit expectations. I used Claude AI to analyze not only our conversations with Armstrong, but also multiple reverse demo recordings collected in a shared research repository.

Chatting with my bestie.

This research made one insight unmistakable: the collection method drives everything. Face-to-face applications follow a different set of rules than remote or online applications, especially when it comes to visual observation. That single discovery reshaped our entire validation architecture.


It also surfaced two key behavioral insights: bankers needed the system to prevent errors in real time, not just tell them they made a mistake after submission; and they needed in-context education, not rigid system blocks. Compliance enforcement couldn’t come at the cost of user trust.

Choosing to Design for the Future, Not the Past

Our product manager initially proposed updating HMDA within the old interface. I advocated for designing it directly in the new application experience — even if that meant a slightly longer build cycle.


Designing for the future state first allowed us to create the right data model, reduce technical debt, and establish patterns that could later be back-ported to older builds. This approach aligned with nCino’s larger platform vision and helped secure buy-in across design, compliance, and development.


I advocated to build for the refreshed experience first. :)


I also promised I'd design a solution that worked in both experiences. ;)

The Redesign - Contextual and Conditional Display

HMDA Data Table – Conditional Display Rules
To reduce visual noise and improve workflow focus, the HMDA data table is now conditionally rendered. It only appears when all regulatory requirements apply, ensuring relevance without overwhelming users.


Display logic:

  • Product type requires HMDA collection

  • Collateral type requires HMDA collection

  • Applicants are HMDA-eligible individuals


When these conditions are met, the table automatically displays rows for the primary applicant and the first co-applicant, keeping data entry streamlined and context-aware.
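To make that logic concrete, here is a minimal TypeScript sketch of the display rules described above. All type and field names (LoanApplication, productRequiresHmda, and so on) are illustrative assumptions, not nCino's actual data model.

```typescript
// Minimal sketch of the conditional display rules (hypothetical names, not nCino's data model).
type ApplicantType = "individual" | "business";

interface Applicant {
  type: ApplicantType;
  isPrimary: boolean;
}

interface LoanApplication {
  productRequiresHmda: boolean;
  collateralRequiresHmda: boolean;
  applicants: Applicant[];
}

// The table renders only when every regulatory condition applies.
function shouldRenderHmdaTable(loan: LoanApplication): boolean {
  const hasEligibleIndividuals = loan.applicants.some((a) => a.type === "individual");
  return loan.productRequiresHmda && loan.collateralRequiresHmda && hasEligibleIndividuals;
}

// When rendered, the table shows rows for the primary applicant and the first co-applicant.
function hmdaTableRows(loan: LoanApplication): Applicant[] {
  const individuals = loan.applicants.filter((a) => a.type === "individual");
  const primary = individuals.find((a) => a.isPrimary);
  const firstCoApplicant = individuals.find((a) => !a.isPrimary);
  return [primary, firstCoApplicant].filter((a): a is Applicant => a !== undefined);
}
```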

Data Table Indicators


Color-coded status chips clearly show whether HMDA data has been collected, helping bankers instantly recognize missing information.


Date Modified Field


Adding a “Last Date Modified” column provides transparency into when demographic data was last updated, reinforcing audit readiness and accountability.


Error State for Missing HMDA


If bankers attempt to continue without completing demographic data, an inline error message now clearly communicates the requirement and provides a direct action path to resolve it.


Designing for Reuse Within Legacy Constraints


I designed the refreshed experience to be fully compatible with the legacy system, enabling developers to reuse existing components without refactoring — a decision that saved time and made my dev partner, Monty, very happy.


Designing Guardrails, Not Roadblocks


As we refined the system, we landed on a principle that guided the entire design: “Disable, don’t error.” Instead of hiding options or showing errors after submission, we built real-time disabling rules that prevented invalid combinations. For example, selecting Not Hispanic or Latino would automatically disable all Hispanic sub-categories, enforcing mutual exclusivity without interrupting the banker’s flow.
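A minimal sketch of that "disable, don't error" behavior for the ethnicity section follows, assuming hypothetical category and sub-category names; the real reportable code set is defined by Regulation C, and this only illustrates the mutual-exclusivity rule.

```typescript
// Sketch of "disable, don't error" for the ethnicity section.
// Category and sub-category names are illustrative; the reportable code set comes from Regulation C.
type EthnicityCategory = "hispanicOrLatino" | "notHispanicOrLatino" | "notProvided";

const HISPANIC_SUBCATEGORIES = ["mexican", "puertoRican", "cuban", "otherHispanicOrLatino"] as const;
type HispanicSubcategory = (typeof HISPANIC_SUBCATEGORIES)[number];

interface EthnicityFormState {
  category: EthnicityCategory | null;
  subcategories: Set<HispanicSubcategory>;
}

// Instead of flagging an error after submission, compute which controls are disabled right now.
function disabledSubcategories(state: EthnicityFormState): Set<HispanicSubcategory> {
  // "Not Hispanic or Latino" (or a refusal) is mutually exclusive with every
  // Hispanic sub-category, so all of them are disabled in real time.
  if (state.category === "notHispanicOrLatino" || state.category === "notProvided") {
    return new Set(HISPANIC_SUBCATEGORIES);
  }
  return new Set();
}
```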


Another architectural improvement came from separating regulatory data (what’s reported to HMDA) from operational context (how the data was collected). A conditional checkbox for “Co-applicant responded on behalf of this applicant” allowed bankers to capture real-world context without affecting compliance codes. This distinction became a reusable pattern for other complex workflows across nCino’s platform.
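Here is a rough sketch of how that separation could be modeled, using hypothetical type names: only the regulatory half ever feeds reporting, while operational context stays with the workflow.

```typescript
// Hypothetical split between regulatory data (reported) and operational context (how it was gathered).
interface HmdaRegulatoryRecord {
  ethnicityCodes: string[]; // codes that appear on the reportable record
  raceCodes: string[];
  sexCode: string;
  collectedOnBasisOfVisualObservation: boolean;
}

interface HmdaOperationalContext {
  collectionMethod: "inPerson" | "remote";
  coApplicantRespondedOnBehalfOfApplicant: boolean; // the conditional checkbox from the form
  lastModified: string; // ISO date shown in the "Last Date Modified" column
}

interface HmdaCollectionEntry {
  regulatory: HmdaRegulatoryRecord;
  operational: HmdaOperationalContext;
}

// Reporting consumes only the regulatory half, so operational flags can change
// without ever touching the codes a bank submits.
function toReportingPayload(entry: HmdaCollectionEntry): HmdaRegulatoryRecord {
  return entry.regulatory;
}
```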


Bringing Clarity and Humanity to Compliance

Early prototypes focused on simplifying the layout and reducing clicks. However, compliance reviewers quickly pointed out that my first version still allowed bankers to skip demographic fields entirely. Their feedback helped me see the real challenge: designing enforcement that still felt humane.


Through several iterations, I rebuilt the flow as a single, guided page that only appeared when HMDA was required. Each field dynamically responded to user input — disabling invalid combinations, showing contextual warnings, and blocking progression only when necessary. Scoped notifications replaced pop-over errors to prevent unnecessary friction.


Partnering Across Teams to Get It Right

Can you see from our faces that we were completely lost in the sauce?

I met weekly with Jason, our senior compliance officer, to validate each rule against HMDA documentation and CFPB interpretations. I worked with development to map dependencies between form states and backend HMDA codes, ensuring that every condition matched reporting requirements. I also coordinated with the small business design team to align on shared patterns between 1071 and HMDA demographic collection.

To handle the complexity, my PM and I built a shared Claude AI project space that became our central repository for transcripts, notes, and rule documentation. I used it to translate regulatory language into plain-English requirements, validate edge-case logic, and generate usability test scripts. It acted like a compliance co-pilot, helping us move faster and more confidently through dense federal language.


A Clearer, More Confident HMDA Experience

The final design consolidates HMDA demographic collection into a single, intelligent page that automatically appears when required by the loan product or applicant type. It filters out business entities, reducing clutter and confusion for bankers.


Each section — ethnicity, race, and sex — includes clear choices, controlled sub-selections, and independent refusal options. Real-time validation disables conflicting inputs rather than hiding them, guiding users toward compliant entry. When bankers select an option that doesn’t match their collection method, contextual helper text appears right in the form.


"Applicant Provided Response"


When the applicant provides their own demographic information (e.g., in a remote application), they can optionally choose “I do not wish to provide this information.” The help text clarifies that this option applies only to remote submissions, not in-person ones.


"Co-Applicant Provided Response"


When a co-applicant responds on behalf of the applicant, the co-applicant can indicate “I do not wish to provide a response.” This reflects scenarios where joint applications are processed remotely, ensuring accurate compliance capture without requiring in-person observation.


"Banker Observation (In-Person Application)"


If the application is completed in person and the applicant declines to self-identify, bankers are required to select “Visual observation or surname.” In this case, the form disables sub-category selections to maintain compliance with HMDA guidance — only high-level demographic categories can be recorded.
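The three scenarios above can be read as a small rules table keyed by collection method. The sketch below uses hypothetical scenario and flag names to illustrate that mapping; it is not the actual implementation.

```typescript
// Illustrative rules table keyed by collection scenario (names are assumptions, not field API names).
type CollectionScenario =
  | "applicantProvidedResponse"    // remote application, applicant answers for themselves
  | "coApplicantProvidedResponse"  // co-applicant answers on the applicant's behalf
  | "bankerObservation";           // in-person application, applicant declines to self-identify

interface DemographicFieldRules {
  allowDoNotWishToProvide: boolean;  // refusal option shown to the respondent
  requireVisualObservation: boolean; // banker must record "visual observation or surname"
  subcategoriesEnabled: boolean;     // detailed sub-categories remain selectable
}

function rulesFor(scenario: CollectionScenario): DemographicFieldRules {
  switch (scenario) {
    case "applicantProvidedResponse":
    case "coApplicantProvidedResponse":
      // Remote submissions: refusal is allowed and sub-categories stay available.
      return { allowDoNotWishToProvide: true, requireVisualObservation: false, subcategoriesEnabled: true };
    case "bankerObservation":
      // In-person decline: only high-level categories, captured via visual observation or surname.
      return { allowDoNotWishToProvide: false, requireVisualObservation: true, subcategoriesEnabled: false };
  }
}
```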


Laying the Foundation for What Comes Next

The new architecture separates regulatory and operational data, enabling accurate reporting and flexible front-end workflows. It also lays the groundwork for future compliance-driven experiences in the in-branch banking space.


As of October 2025, the design is in final compliance validation and scheduled for usability testing with loan officers. Results will inform Q4 development and pilot testing with select bank clients.


Lessons in Clarity, Confidence, and Collaboration

This project required balancing precision with empathy — translating dense regulation into usable design patterns without losing the human element.


By separating regulatory data from operational context, we created a scalable foundation for future compliance workflows.
Using AI as both a design and validation partner, we accelerated decisions and strengthened accuracy across the experience.


Ultimately, this work turned a bureaucratic task into a clear, confident process that empowers both bankers and customers.


In 2024, our Consumer design team—three product designers and our manager—set out to understand the daily realities of in-branch bankers using nCino’s Loan Origination System (LOS). Our software supports bankers through the entire loan process—from capturing customer information and verifying documents to structuring products and booking loans. We focused on uncovering what was slowing them down, where communication was breaking down, and how we could improve their experience through smarter, more empathetic design.


Over several months, we conducted 34 in-depth interviews and manually tagged over 1,400 qualitative observations. What began as a focused research effort evolved into cross-org persona alignment, strategic feature proposals, and critical momentum for a future-facing, AI-powered product vision.


I contributed to every aspect of the research process and led point on synthesizing insights, authoring the final report, and presenting our findings to stakeholders across the organization.

TLDR

  • Partnered with a small design team to conduct 34 banker interviews and manually tag over 1,400 qualitative observations, uncovering key inefficiencies in collaboration, training, and system integration.

  • Developed four intent-based personas to represent key user motivations and behaviors

  • Proposed feature concepts to improve clarity, reduce rework, and support accuracy

  • Led insight development, wrote research report, and presented findings across the org

  • Research influenced cross-functional alignment on personas and facilitated momentum toward an AI-first product vision

Our Team

Our Research Goals

  • Understand the workflows, frustrations, and collaboration dynamics across key banking roles

  • Define user personas based on intent and motivation—not just job title

  • Identify systemic inefficiencies and gaps in the loan origination experience

  • Recommend actionable features and KPIs rooted in real user needs


Approach and Participant Overview

To build a complete picture of the loan origination journey, we spoke with 34 employees across the banking ecosystem—including relationship managers, loan officers, processors, underwriters, and other specialized roles we hadn’t encountered before.


Some worked in small-town branches where they knew every customer by name. Others were buried in high-volume, high-pressure queues in corporate settings. Our participants ranged from a few months of experience to over two decades in the field.


Using a semi-structured script of 25 open-ended questions, we explored their top tasks, pain points, tech stacks, collaboration routines, and visions for a better workflow. These conversations revealed deeper issues than just tool friction—they illuminated fragmented communication, unclear accountability, and mounting pressure to move faster with less support.

Collaborative Analysis and Thematic Synthesis

Following the interviews, we hosted a three-day design workshop to synthesize what we had learned. Using Dovetail AI, we clustered our 1,400+ tagged quotes into themes around user motivations, challenges, and opportunities.

On the final day, we created a journey map across every loan product our company supports—revealing that the loan lifecycle is less about individual roles and more about recurring behaviors and tasks. This became the foundation for our intent-based personas.

Intent-Based Personas

Instead of anchoring personas to job titles, we defined them around motivations, behaviors, and challenges that consistently showed up across roles. These personas reflect the real people we spoke to—and the tasks required to move loans from start to finish.


The Initiator


Initiators are the customer’s first point of contact. They identify needs, guide onboarding, and often juggle multiple responsibilities. They're motivated by building trust and helping clients make smart financial decisions.

The Verifier


Verifiers are the gatekeepers for loan approval. They make sure each request is accurate and error-free, and define what is needed to get the loan approved. They focus on accuracy and compliance, and work independently to solve complex issues and push tough loans across the finish line.

The Facilitator


Facilitators coordinate across roles, acting as a bridge between the customer and the bank. They gather documents, and ensure things keep moving. They thrive on variety, organization, and keeping the process smooth.

The Maintainer


Maintainers handle account updates, backend cleanup, and cross-selling. They’re driven by structure, consistency, and getting things right the first time.

Digging Deeper: From Patterns to Priorities

Once we defined these personas, we examined how they interacted with each other—and where things tended to break down. Our goal was to identify issues with the broadest cross-role impact and focus on solutions that could unlock efficiency at scale.


We asked ourselves: What pain points ripple through the entire system? Where can design have the greatest reach across roles and workflows?


That framing led us to three core themes—each pointing to deeper structural issues in the loan origination process.

Key Observations

Theme 1: Bankers Are Struggling with Time and Tools

Most bankers were overwhelmed—juggling nonstop tasks, jumping between systems, and losing time to fragmented communication. Burnout was common.

Theme 2: Collaboration and Ownership Are Unclear

Hand-offs were messy and undefined. Without a shared source of truth, work was duplicated or dropped, and no one knew who owned what.

Theme 3: Training and Knowledge Gaps Slow Everyone Down

New hires were left to figure things out alone, and even experienced employees lacked clear guidance. Small mistakes often snowballed into major delays.

My Role: Bringing the Story Into Focus

We surfaced hundreds of tagged quotes and behavioral patterns—but the full story hadn’t yet emerged. At the time, Dovetail AI was just beginning to take shape as a powerful AI research tool. I started scouring the depths of everything we'd done thus far.

I returned to the data, re-read interviews, and reexamined our personas to draw out the broader narrative. Slowly but surely, the underlying story of our bankers began to emerge.

Key Insights

Speed Over Accuracy Is Creating Risk

Bankers are pressured to move fast—but without the tools or training to do it right. Rushed files lead to rework, delays, and a vicious cycle of pressure.


Poor Handoffs Shift the Burden Downstream

Incomplete loan files pass from sales-facing roles to underwriters, who then spend time fixing errors or chasing missing information.


Processors Absorb the Fallout

Processors become the glue—clarifying statuses, managing expectations, and cleaning up mistakes—slowing fulfillment and degrading customer experience.


Role Misalignment and Distrust Cause Delays

Teams brace for failure rather than trust collaboration. Silos, poor tools, and unclear policies make cooperation harder than it needs to be.

Turning Insights Into Solution Proposals

I translated these insights into four strategic product proposals to support clarity, reduce mental load, and prevent errors before they start.


Role-Based Dashboards

A focused interface that surfaces only the most relevant tasks and data for each role.


Centralized Communication Hub

A single source of truth for task updates, messages, and progress tracking.


Real-Time Error Prevention

Smart validations and pre-checks to catch issues at the point of entry.


Context-Aware Guidance

Built-in support and prompts tailored to risky or unfamiliar workflows. Especially useful for onboarding.

Validating Design Concepts Already Gaining Traction

The Centralized Communication Hub feature proposal validated a concept already being designed and tested by our Consumer team.


This “Loan Hub” performed well in usability testing with bankers, and quickly advanced to leadership review. It was placed on the roadmap for a future overhaul of our legacy underwriting experience.

Defining Success: Metrics That Matter

To measure the value of these solutions, we proposed tracking KPIs directly tied to user pain points:


  • Loan Processing Time: Time from initiation to approval, and per-step breakdown

  • Error Rates: Frequency, type, and cost of corrections

  • Collaboration Response Time: Delays from unclear ownership or handoffs

  • User Satisfaction: NPS, CSAT, and feature-specific feedback

What Happened Next…

I authored the research report in Dovetail and presented the work to several audiences—including Consumer PMs and our global design team.


After I presented our findings, our persona work became a catalyst for cross-org alignment. We standardized personas across business lines and partnered with our Mortgage team to adopt their work around a customer persona. Together, we hosted collaborative working sessions to refine them.

Our Jam sesh across consumer, small business, commercial and mortgage teams to align on personas.

These personas became the foundation for a company-wide design summit, where over 30 designers collaborated on AI-first concepts tailored to each user type. Ideas ranged from role-specific dashboards to AI-powered document agents and automated verification workflows.


This research played a critical role in aligning teams around a common language and strategic vision—directly influencing how the organization approached AI integration at a key inflection point in product planning. It gave leadership and cross-functional teams a shared framework to imagine bold, future-facing solutions with clarity and confidence.

Bridging Role-Based and Intent-Based Thinking

When PMs brought Claude-generated personas into the fold, I helped clarify the distinction between role-based needs and foundational, intent-based personas—and how they each serve different stages of design.

After reviewing several of the persona PDFs shared by Sophie, one of our PMs, I mapped her role-based personas—like Branch Banker Betty and Processor Patricia—back to the foundational intent-based personas we had developed.


  • Betty? She aligned with The Initiator.

  • Patricia? A clear Facilitator.

  • Some of her personas were combinations, which gave us a great opportunity to talk about why we took an intent-based approach in the first place.


In banking, the number of unique roles is massive—and differs by product and institution. If we created a persona for each one, we’d quickly end up with dozens per business unit. That’s why we designed broader, behavior-based personas: to serve as foundational anchors, while allowing teams to layer on product- or role-specific detail as needed.


Sophie and I aligned on a shared strategy:
She would continue using her personas to describe product-specific needs, but we’d map them to the higher-level intent-based personas for consistency across teams.

What I Learned

Research takes a village
With no dedicated researcher, we split the work. We supported each other across PTO gaps, heavy loads, and shifting priorities—and got it done together.

Design can still feel intimate and alive
Our journey-mapping workshops were a return to form. Sticky notes. Whiteboards. Laughter. We felt like designers again—close to the work, close to our users.

Insights don’t announce themselves
They emerge from immersion—from reading transcripts twice, from re-tagging moments others missed, from holding the forest and the trees in mind at once.

Buy-in matters
I shared this work again and again, tailoring each version to its audience. That consistency built trust, inspired momentum, and helped carry the work forward.

The Beginning of a Bigger Story

This project was a turning point in how our team approached product design—from reactive problem-solving to insight-driven strategy. By staying close to users and aligning teams around shared personas and system-level breakdowns, we laid the foundation for deeper, more meaningful product decisions.


The work didn’t end here. These insights directly informed our next initiative: an end-to-end product vision for consumer banking at nCino—where we reimagined the entire origination experience through the lens of clarity, flexibility, and AI-assisted workflows.