Tara Jensen – UX Research & Design


M-SHARP, an Aviation Training and Readiness Reporting system for the Marine Corps, is one of many Department of Defense contract projects built by InnovaSystems. InnovaSystems serves all branches of the military, focusing on aviation.

When I started at InnovaSystems, the M-SHARP 2.0 redesign project had not yet begun. M-SHARP 1.0 had been in place for 12 years; after constant additions and adjustments, the system had become outdated, complicated, and difficult to maintain. The goal of M-SHARP 2.0 was to rebuild the software from scratch with new technology.

"So... what exactly do you guys do...?"

Starting at InnovaSystems as the first of two UX hires for the Marine Business Unit, I had to gain the confidence of a new team and incorporate user experience methods into both the project and the company. My first meeting with my supervisor started with the question, "So... what exactly do you guys do...?" That question was the start of defining my role on the project and in the company.

There were a few designers on other projects when I joined the company, typically focused on UI design and front-end development. On M-SHARP, we leveraged the processes those designers were following, then gradually added user research. Four years later, we have a regular cadence for UX research and are building user research into processes for other projects throughout the company.

Contact me to learn more about my role on M-SHARP.

Our Process


We start with a feature brief from the Product Owner and ask questions to understand the business need. We then schedule a meeting with Customer Support and ask for an end-to-end, storyline-style explanation of the user perspectives involved and how things work across different units and situations. At the end of this meeting we draw out interesting differentiators within the user population.


Based on what we learn from our internal team, we schedule contextual inquiries focused on primary roles and activities. Our Customer Support Reps schedule participants that align with the differentiator criteria we discussed earlier, then we make our way out into the field. Most of our research is conducted at MCAS Miramar and Camp Pendleton.


Returning from field research, we review the footage, photos, and documents we've collected and draft a UI flow model that matches the workflows we saw in the field. The UI flow models are reviewed with the Solution Architect and Lead Developer, compared against the potential data model, then revised as we learn about additional needs, constraints, and options.

Once the UI flow models are approved, we sketch and wireframe a variety of options for page layouts, sections, and elements, eventually wireflowing them together with planned interaction details. As appropriate, we host a brainstorming workshop to generate additional ideas, and run card sort studies, tree testing, paper prototype usability studies, or wireframe-based first-click testing to evaluate early concepts.


We prototype in HTML and CSS using a custom Bootstrap framework to bring approved designs to high fidelity. The CSS is built for production at this stage and is later picked up by the development team during implementation. Prototypes include jQuery interactions to catch anything missing in the flow and to communicate expectations to the development and test teams during implementation.
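As a rough sketch of what one of these coded prototypes might contain (the file names, IDs, and copy below are hypothetical illustrations, not from the actual M-SHARP codebase), a minimal HTML/jQuery fragment could look like:

```html
<!-- Hypothetical prototype fragment. Stylesheet name, selectors, and
     copy are illustrative only, not from the real M-SHARP project. -->
<link rel="stylesheet" href="msharp-bootstrap.css">

<div class="panel sortie-editor">
  <button id="save-sortie" class="btn btn-primary">Save Sortie</button>
  <div id="save-confirmation" class="alert alert-success" style="display: none">
    Sortie saved.
  </div>
</div>

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
  // Just enough jQuery to walk a participant through the flow --
  // the "save" is simulated; there is no real persistence behind it.
  $('#save-sortie').on('click', function () {
    $('#save-confirmation').fadeIn();
  });
</script>
```

Because the markup and CSS are production-quality while the behavior is simulated, a fragment like this can double as both a usability-test artifact and a reference for the implementing developers.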

Following prototype approval, the lead developer works with the product owner to break the designs down into digestible user stories for the development team. This process can start at the approved wireframe stage.


Once internal prototypes are approved, we duplicate them and tailor the copies for usability studies. In most cases these are coded prototypes (HTML, CSS, jQuery).

Before we tailor the prototypes for usability testing, we work closely with our Customer Support Reps to come up with a realistic scenario that will drive participants to the areas of the design we have the most questions about. These questions are derived from team discussions throughout the design process, heuristic evaluations of the final prototypes conducted by the UX designer and me, and general questions the team would like us to attempt to answer.

Once we have our scenario, we work closely with our business analysts and lead developer to collect the appropriate data to build into the prototypes. At the same time, our customer support representatives schedule participants based on the user population and differentiators we identified together. Then we head into the field to test the prototypes with one or two observers and a moderator (the junior UX designer and I rotate this role).


When we present the results of the usability study to the team, we agree on final changes, then adjust the internal prototypes.


The development team then picks up the user stories to implement the design. Whenever a user story involves UI work, the UX designer and/or I are invited to a "conversation" meeting where we discuss expectations, advocate for key interactions, and often negotiate on implementation.

Continuous Field Learning

On a bi-weekly cadence, we shadow a Marine in a specific role as they go about their day. This lets us observe when the application comes into play and why; which parts of it they're using and how; and which parts they aren't using, why not, and what they're doing instead. After three months we switch to a different role, documenting issues, things that work well, and interesting behaviors all the while.

When our backlog presents an opportunity to update or rework an existing area of the application, we review this list to incorporate enhancements based on what we've observed in these post-release research sessions.

For a company that wasn't doing UX research when I was hired, we have built an outstanding research process that far exceeds my expectations. It wasn't easy and it didn't happen overnight, but we've made a lot of progress!