High Density Vertiplex (HDV) Sub-Project Research

A breakdown of my research process at NASA Ames on a slice of the High Density Vertiplex (HDV) sub-project. Recognized with a Best of Session award at the Digital Avionics Systems Conference 2023 among 200+ submissions.

Role
Human Factors Engineer
For
SJSURF @ NASA Ames
Date
2022 – Onward
Project Slice Complete
Nov 2023
Presentation URL
ntrs.nasa.gov/citations/20230013246
Paper URL
ntrs.nasa.gov/citations/20230010285

Summary:


--Determined research questions for surveys and interviews.

--Mapped out flight paths and created 60+ simulation vehicles with MACS simulation software.

--Used the previous year's final report to design and deploy a flight rerouting preview feature that was well received.

--Designed and deployed an automated process, requiring final human approval, to replace a manual process, reducing task time from hours or days to minutes (a 99% improvement).

--Worked with the team to synthesize the results, write the final paper, and present our findings at DASC, winning Best in Session for the second year in a row.

While at NASA Ames, specifically on the HDV sub-project, my research process looked like this:

1. Determine research questions


The Advanced Air Mobility (AAM) project called for exploration into higher demand for passenger transport and the movement of products from warehouses to distribution centers through the airspace. These operations were expected to be a mixture of human operators and automation. Our sub-project, High Density Vertiplex (HDV), was tasked with answering how these systems would enable scalable, high-density operations in urban areas. The project also wanted to know how vertiports, which are similar to heliports but built for next-gen aerial vehicles, would function. Finally, we wanted to test traffic flow management capabilities for when vehicles needed to reroute to another vertiport or make an emergency landing; these contingencies consisted of missed approaches, speed changes, and diverts.

AAM Future Airspace Render

2. Prepare materials


We came up with survey questions to be administered via LimeSurvey, and interview questions focused on more in-depth, open-ended discussion.

I collaborated with NASA Ames and NASA Langley to map out the live test site flight paths in MACS (NASA Ames's vehicle simulation software) over the mapped area of the test site at Langley and to determine the needed vehicle characteristics. We also determined that the vertiport scenarios would have vehicles taking off and landing at a rate of 60 per hour at one vertiport, incorporating traffic from 10 other vertiports. I created ~80 fully simulated vehicles in MACS, and Langley created partially simulated vehicles with Measuring Performance for Autonomy Teaming with Humans (MPATH), as the MPATH vehicles would have human pilots using semi-autonomous features. The Vertiport Automation Software (VAS) managed the schedules of vehicles arriving at and departing the vertiports. It was also my responsibility to test this software and create the timeslots file the system used to correctly space vehicles arriving at and departing from the vertiports. All of these systems were interfaced with through NASA Ames's HDV (High Density Vertiplex) client.
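The actual timeslots file format used by VAS isn't described here, but generating evenly spaced slots at the stated rate of 60 operations per hour is simple arithmetic: one slot every 60 seconds. A minimal sketch, with a function name and layout of my own invention rather than the real VAS format:

```python
from datetime import datetime, timedelta

def generate_timeslots(start, ops_per_hour, duration_hours):
    """Generate evenly spaced timeslots for one vertiport.

    At 60 operations per hour, a slot opens every 60 seconds.
    """
    interval = timedelta(seconds=3600 / ops_per_hour)
    end = start + timedelta(hours=duration_hours)
    slots = []
    t = start
    while t < end:
        slots.append(t)
        t += interval
    return slots

# One hour of slots at the tested rate of 60 ops/hour.
slots = generate_timeslots(datetime(2023, 1, 1, 9, 0),
                           ops_per_hour=60, duration_hours=1)
print(len(slots))  # 60
```

At this density there is no slack between consecutive slots, which is exactly why, as noted later, sim vehicles sometimes had to be deleted to open a manual slot.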

The previous year, I had found in my interviews and surveys that users wanted to preview their reroute options on the map before committing; once they accepted a reroute, it was hard to undo. I took this data and designed a feature in our HDV software that showed previews of reroutes on the map, allowing for a more informed decision before committing.

3 Fleet Managers and 3 Vertiport Managers were recruited from available staff at Ames and Langley and had no prerequisites for participation. 9 GCSOs (ground control station operators) were also recruited from available staff, but were required to be authorized as small unmanned aerial system (sUAS) pilots.

NASA Ames

3. Pre-test


Many tests were run in the months leading up to the final field test at Langley. These involved both teams performing individual system checks to make sure the systems were communicating with each other, as well as changes to the vertiport locations and flight paths to match realistic limitations of the test site. I managed and coordinated many of these pre-tests with Langley.

A huge limitation of MACS was that, for scheduling to work with the VAS, another researcher and I had to manually place each vehicle in the scenario and hope everything lined up perfectly for conformance monitoring without intersecting other vehicles. In the past, at 20 or even 40 vehicles landing at a vertiport, it was very difficult but not impossible to get everything lined up through trial and error. This could take several hours to a day or two, and every change management sent down meant everything had to be redone. The project had previously been delayed by a few days because of this specific issue. This seemed like a problem better solved by automation, with a human validating the output.

So I led a developer in creating a subsystem in MACS that took every vehicle and the inputted schedule, then slightly shifted each vehicle's takeoff and landing times so that it would not intersect with another vehicle and would reach its timeslot at the vertiport on time. Additionally, we added a feature where MACS could pre-test spacing issues and alert you to them without running the whole simulation. It was a revelation: a process that had taken hours or days now took seconds or minutes, in most cases reducing task time by 99%. Any change from management could be inputted and tested almost instantly, allowing for downscaling or upscaling operations in the future.
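The core ideas above can be sketched as two small routines: a greedy pass that shifts each vehicle's time forward until minimum separation holds, and a pre-run check that flags spacing violations without simulating. All names and the 60-second separation value here are illustrative assumptions, not the actual MACS implementation:

```python
def deconflict_schedule(desired_times, min_separation):
    """Greedily shift each vehicle's time (in seconds) forward until it is
    at least `min_separation` seconds after the previous vehicle."""
    assigned = []
    for t in sorted(desired_times):
        if assigned and t < assigned[-1] + min_separation:
            t = assigned[-1] + min_separation
        assigned.append(t)
    return assigned

def check_spacing(times, min_separation):
    """Pre-run validation: report adjacent index pairs (in sorted order)
    that violate separation, without running the full simulation."""
    ordered = sorted(times)
    return [(i, i + 1) for i in range(len(ordered) - 1)
            if ordered[i + 1] - ordered[i] < min_separation]

# Four vehicles all requesting times inside the same minute.
desired = [0, 0, 30, 50]
print(check_spacing(desired, 60))        # [(0, 1), (1, 2), (2, 3)]
print(deconflict_schedule(desired, 60))  # [0, 60, 120, 180]
```

The point of the check-before-run step is the same as in the real subsystem: a schedule change from management can be validated in seconds rather than by re-running a full simulation.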

In addition, we refined the questions, added flags to the software to collect quantitative data, and tested repeatedly with both teams to shake out the bugs.

Researcher walking the test participant through the interface

4. Run Field Study


The field test happened at NASA Langley once management gave the green light. I started by briefing participants on the tools they would be using and giving them a basic rundown of how the day would go. They were given a demographics survey, and then the test began. At the beginning of each scenario, the PI assigned takeoff times to each GCSO. Each GCSO then used voice communication to request their takeoff times from the Fleet Manager, who responded and scheduled their flight plan in the HDV client. After all GCSOs had their timeslots reserved, the MACS-simulated background traffic was started, and the FM, GCSOs, and VM performed their scenario tasks. After each scenario, participants completed our survey, and after all of the scenario runs for the day, I interviewed the participants about their experience during the scenarios.

During testing, the MACS automatic scheduling feature I developed meant we were able to create scenarios on the fly, which came in handy when we finished ahead of schedule and management wanted to try 40- and 20-vehicle scenarios (something that hadn't been created or planned beforehand).

NASA Langley Test Site

5. Analyze data


It was confirmed that 60 vehicles per hour either took off or landed at the tested vertiport, meeting the criteria for density. Sim vehicles in the run were found to reach their closest point of approach (CPA) 14.4 times. Delay was important: piloted vehicles conducting missed approaches had an average delay to their schedule of 2.5 minutes, while those using diverts averaged 1.3 minutes. Activating a missed approach failed 3 out of 11 times, meaning that 3 times the fleet manager was unable to generate a missed approach path. The interface delay was 0 at 20 vehicles, 3:05 minutes at 40, and 3:16 minutes at 60. Additionally, we occasionally needed to delete sim vehicles to make a manual slot for re-entering manned vehicles, since the 60-vehicle scenario's timeslots had no room in between them for more vehicles.
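Computing the quantitative delay figures amounts to grouping logged delays by reroute type and averaging. A toy sketch, with a hypothetical log format and made-up event values chosen only to reproduce the averages reported above:

```python
from statistics import mean

# Hypothetical event log: (vehicle_id, reroute_type, delay_minutes).
# These values are illustrative, not the actual field-test data.
events = [
    ("P1", "missed_approach", 2.0),
    ("P2", "missed_approach", 3.0),
    ("P3", "divert", 1.1),
    ("P4", "divert", 1.5),
]

def mean_delay_by_type(events):
    """Average schedule delay per reroute type."""
    by_type = {}
    for _, kind, delay in events:
        by_type.setdefault(kind, []).append(delay)
    return {kind: mean(delays) for kind, delays in by_type.items()}

print(mean_delay_by_type(events))
# {'missed_approach': 2.5, 'divert': 1.3}
```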

I worked with the teams to synthesize these results for the final paper by coding survey and interview responses, as well as helping to write the paper itself.

Throughput data points for a 60 operations per hour scenario

6. Present final report


It was determined through this test that more research is needed to understand capacity limitations of the airspace in relation to vertiports and arrival flows. It is easy to put a vertiport into a state of over-demand with temporarily reduced capacity, and procedures need to be in place to handle the imbalance. Strategic ground delay, similar to how the FAA manages delay in traditional air traffic, needs to be considered in the future to help. Many of the gaps found were already known, as this is very early exploratory research on air traffic control for next-gen aerial vehicles.

I created the presentation I would give at the Digital Avionics Systems Conference (DASC) based on the submitted paper, with added recommendations for the next field test. I was able to win my team Best in Session for the second year in a row.

Lessons learned include:

-- Qualitative data gives you more nuanced information than quantitative data; you can really dig into why a participant thought or acted a certain way. This leads to better project outcomes, because you catch more of the smaller issues that might otherwise get passed over and cause trouble down the line.

-- The previous year, I learned that participants wanted a preview of their options before committing to a reroute in case one wasn't optimal. I created a design prototype, iterated with developers, and implemented that feature in this project. I saw a positive impact and heard from participants that they liked being able to preview their choices before committing, validating a design choice backed by previous research.

-- Don't be afraid to think outside the box and push for more efficient ways of doing things. Designing and implementing the new timeslot feature in MACS saved a lot of time (99% versus the previous way) and money, kept the project on schedule, and allowed the flexibility to test more variations later in the same time envelope.

Me presenting our results at DASC in Barcelona, Spain

Me accepting the Best of Session award for me and my team!