Category Archives: Science

A photo to get you to think about light and physics

iridis-rsmith

I am always proud of my hubby, Robert, and especially now, as he combined his love of photography (a special hobby of his) with astronomy and went one step further, creating something new to start conversations.

Did you know he won the Insight Astronomy Photographer of the Year 2016 award in the special category Robotic Scope?

He was trying to convey not just the beauty of spectra (spreading light into different colors), but how much they can tell us about the object we are observing (what it’s made of, what’s going on, etc.).

He used data from a public archive of the Liverpool Telescope, a 2-meter robotic telescope on the island of La Palma in the Canary Islands, and combined imagery and spectra to show off the beauty and the physics of two commonly photographed planetary nebulae (both readily seen from the Northern Hemisphere): the Cat’s Eye Nebula (NGC 6543 in Draco) and the Ring Nebula (M57 in Lyra).

The spectra are artistically arranged to show the colors of the rainbow, which helped inspire us to name the piece Iridis, Latin for rainbow.

A nice writeup at the Liverpool Telescope website can be found here.

The Many Faces of Pluto and Charon

Reposted from https://blogs.nasa.gov/pluto/2016/02/12/the-many-faces-of-pluto-and-charon/.

Today’s blog post is from Kimberly Ennico, a member of the New Horizons’ Composition Theme Team and one of the deputy project scientists. She works at NASA’s Ames Research Center in Moffett Field, California, and has been on detail to the Southwest Research Institute in Boulder, Colorado.

No one can doubt the beauty of Pluto and Charon—amazing worlds revealed by the images from NASA’s New Horizons mission. From Pluto’s mountains, glaciers, ice-volcanoes, blue skies, and layered colorings to Charon’s vast tectonic structures and enigmatic red-colored pole, these pictures and associated spectra are rich puzzles waiting to be solved.

The July 14, 2015 Pluto flyby gave us an initial look at one side of Pluto, with its iconic heart-shaped feature. But I’m interested in the full planetary perspective, finding the “other sides” of Pluto to be every bit as fascinating as the encounter hemisphere. We must remember that a flyby is a moment in time lasting a few hours. In contrast, Pluto and Charon each rotate about their axes every 6.4 Earth days. This means that when New Horizons flew through the Pluto system, it captured just one hemisphere of each body in incredible detail.

What do we know about the “other sides” of Pluto and its largest moon? In the three weeks before the flyby, the Long Range Reconnaissance Imager (LORRI) and Multispectral Visible Imaging Camera (MVIC) imaged Pluto and Charon every day, sometimes two or three times a day, to gather as much coverage of both bodies as possible as New Horizons closed in. LORRI is New Horizons’ primary camera, an 8-inch telescope outfitted with an unfiltered charge-coupled device (CCD) – like you’d find in your own digital camera – sensitive to visible light. MVIC is a separate instrument with multiple CCDs, several of which are outfitted with color filters. The highest-resolution images of the “other sides” of Pluto and Charon were taken 3.2 Earth days before closest approach, around July 10-11.

Working with a subset of the data (not all of these images have been sent to Earth from New Horizons yet), we’ve assembled our first glimpse of these “non-encounter” hemispheres, shown below.

nh-pluto_four_faces_withwhitebox

Four faces of Pluto in black-and-white and color. From left to right, the central sub-observer longitudes are ~180, 240, 360 and 60 degrees East Longitude. The Pluto “Encounter Hemisphere” (indicated by the white box) is most recognizable by the “heart” feature of the informally named Tombaugh Regio. This is also the hemisphere that today never faces Charon; Charon is “tidally locked” to Pluto, similar to how we only ever see one face of our Moon from Earth. Pluto’s “Charon-facing” side is the second column from the right. Pluto’s north pole is up in all these images. The top row contains LORRI grey-scale images taken on July 13, July 12, June 27 and July 3, when Pluto was 620, 189, 24 and 36 LORRI pixels across, respectively. The bottom row shows MVIC “enhanced-color” images made by combining the near-infrared, red and blue filters. They were taken on July 13, July 12, July 10 and July 9, when Pluto was 163, 56, 26 and 21 MVIC color pixels across, respectively. All these images surpass what we had previously seen from Hubble Space Telescope imagery, where Pluto’s disk was only about 12 pixels across. Of course, New Horizons was only millions of miles from Pluto—Hubble is over 3 billion miles away! Credits: NASA/JHUAPL/SwRI

nh-charon_six_faces_box

Six faces of Charon. Central sub-observer longitudes: top, from left to right, 350 (B&W), 2 (color), 32 (color); bottom, from left to right, 67 (color), 86 (B&W), and 180 (color) degrees East Longitude. The side that faces Pluto is highlighted by the inset box. From left to right, the top row images were taken July 14, 14 and 13, 2015, with Charon spanning 523 (LORRI), 81 (MVIC), and 43 (MVIC) pixels. The bottom row images were captured on July 12, 12 and 10, 2015, with Charon spanning 28 (MVIC), 96 (LORRI), and 13 (MVIC) pixels. Charon remains a mainly neutral greyish color all around, with a distinct red northern polar cap visible from all sides. Credits: NASA/JHUAPL/SwRI

What strikes me most about the new Pluto color images is that the latitudinal (horizontal) banding identified on the encounter hemisphere is evident all around Pluto. Specifically, the northern polar region has a color distinct from adjacent latitudes. The darkest region, which spans the equator, also appears to continue around Pluto, showing distinct variations on the side facing Charon that have yet to be understood.

Why is this interesting? Coloring on Pluto is thought to be the result of complex organic molecules called tholins that form in the atmosphere and have been “raining” down on Pluto’s surface over the millennia. We’re investigating whether Pluto’s colored terrains are primarily due to changes in or movements of its surface ices, specifically whether they have been undergoing seasonal effects – changing in temperature over time with the amount of cumulative sunlight – which could manifest as horizontal banding. The presence of the vast reservoir of methane, nitrogen and carbon monoxide ices in Pluto’s “heart” complicates the picture and could serve as a visible marker to trace changes.

Over the next few months, as more of this late-approach imagery gets downlinked from the spacecraft’s recorders, we will continue to piece together this colorful story of Pluto and Charon – from all sides.

Getting ready for the 2015 Pluto encounter: summer 2014’s annual checkout brings high data payoff.

Reposted from http://pluto.jhuapl.edu/News-Center/Science-Shorts.php?page=ScienceShorts_07_11_2014.

You walk up to the Restaurant at the End of the Solar System, ready to try that slice of “Pluto on ice” that you heard amazing things about. The chef behind the counter asks, “So, how would you like your data?” Without hesitation, you reply, “Well calibrated.”

Pretty pictures or spectra make no sense without context. For images, we need to know how many kilometers map to a pixel and, for each raw digitized value, a mapping from bits to energy units (like magnitude or ergs/cm^2/s). For spectra, we need to know how much spatial information is covered per pixel, plus each pixel’s response to wavelength and brightness. For particle instruments, we need to know what energy and from which direction that ion or dust grain came.
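To make that idea concrete, here is a minimal sketch of the kind of bits-to-physical-units “translation” described above. This is purely my own illustration in Python, not the actual New Horizons pipeline; every function name and constant below is an invented placeholder.

```python
# A minimal sketch of converting raw digitized counts (DN) from an imaging
# detector into physical flux units. Illustrative only: the constants are
# placeholders, not real instrument values.
import numpy as np

def calibrate_image(raw_dn, bias, flat, exposure_s, dn_to_flux):
    """Convert a raw CCD frame into calibrated flux (e.g., erg/cm^2/s per pixel)."""
    counts = (raw_dn - bias) / flat   # remove detector offset and pixel-to-pixel response
    rate = counts / exposure_s        # counts per second
    return rate * dn_to_flux          # apply the lab-measured conversion factor

# Example with made-up numbers:
raw = np.random.randint(500, 4000, size=(1024, 1024)).astype(float)
flux = calibrate_image(raw, bias=540.0, flat=np.ones((1024, 1024)),
                       exposure_s=0.1, dn_to_flux=2.3e-16)
```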

Before launch, every New Horizons instrument underwent intensive laboratory characterization: “pre-flight calibration.” The instruments were subjected to spatial targets, integrating spheres, laser pulses, and particle accelerators, to name a few good “known” sources, to get “translations” from bits stored to disk to “real” units like wavelength, flux, energy, intensity, etc. After launch, these “translations” were verified with in-flight calibrations, where, for example, instead of a lab source, the instruments stared at stars or inspected Jupiter and its moons. Each year, the team executes an ACO, or Annual Check-Out, where instrument performance is trended and the teams look for changes. Additional observations provide the information needed to remove “unwanted artifacts” like hot pixels, readout smearing, ghosts, etc.

Summer 2014 is ACO-8, our eighth annual checkout since launch. It showcases our last calibrations prior to the 2015 Pluto encounter. It’s jam-packed with observations that are done yearly for trending, but also some new ones to make sure the New Horizons instrument suite is “well calibrated.” Highlights include new radiometric calibrations for the LEISA infrared spectrometer, a long stability test for the REX radio experiment, and a test of revised thresholds for PEPSSI, the high-energy particle detector. More calibration data will be taken during the 2015 Pluto flyby, and together these data sets feed the data reduction pipeline that translates bits into “real” values. Resources and time aboard the spacecraft to execute these observations are limited, so a series of reviews and assessments is done prior to each checkout.

The team is eager to get the data from ACO-8. We wake the spacecraft up on June 15th. After a series of spacecraft subsystem checkouts, the New Horizons payload calibrations begin and continue through August. It may not be the Pluto flyby, but this summer’s data will play a big role in the science return from New Horizons next year!

jupiter-nh-calibrated

Demonstration of read-out smear removal, preserving the photon count, in LORRI’s calibration pipeline. The smear is caused by imperfections in the CCD readout when it is illuminated by a lot of light. Those photons come from the object being imaged, so we need to correctly relocate that information rather than simply throw it away. Data without good calibration is messy.
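For the curious, here is a simplified sketch of the standard column-wise correction for frame-transfer read-out smear. This is a textbook-style approximation of my own, not LORRI’s actual algorithm (a real pipeline, as noted above, also relocates the smeared photons back to their source rather than just subtracting them), and the timing parameters are placeholders.

```python
# Toy frame-transfer smear removal (an illustrative sketch, not LORRI's
# pipeline). During readout the image is shifted past every bright source
# in its column, so each column picks up a near-constant smear offset
# proportional to that column's total signal.
import numpy as np

def remove_readout_smear(image, t_transfer, t_exposure):
    """Subtract the estimated column-wise smear from a CCD frame."""
    n_rows = image.shape[0]
    # Each pixel spends t_transfer/n_rows seconds over each row of its
    # column during the shift, so the smear scales with the column sum.
    smear = (t_transfer / (n_rows * t_exposure)) * image.sum(axis=0)
    return image - smear[np.newaxis, :]

# Example with a made-up 100 ms exposure and 10 ms frame transfer:
frame = np.random.poisson(50.0, size=(1024, 1024)).astype(float)
clean = remove_readout_smear(frame, t_transfer=0.010, t_exposure=0.100)
```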


As a New Horizons deputy project scientist, Kimberly Ennico manages instrument readiness and calibration aspects of the mission. Her expertise includes instrument development, space qualification and calibration; optical/infrared astronomy; optical/infrared detectors, optics, cameras and spectrometers; and science communication.

Focus is the name of the game! Flight Day (Nov 17, 2013): Part 2, Experiencing Microgravity for the First Time!

Reposted from https://blogs.nasa.gov/mission-ames/2013/12/17/focus-is-the-name-of-the-game-flight-day-nov-17-2013-part-2-experiencing-microgravity-for-the-first-time/.

My final blog entry summarizes my experiences of the flight and my evolving perspective on this type of platform for doing engineering, science and technology experiments. (Earlier posts: part 1 here, part 2 here.)

First Impressions. For a first-timer, the first question asked is, “So what was it like?” I am so glad I had an audio recorder, since my reaction at the onset of microgravity for the first time (and hopefully not the last time) in my life was delivered in deadpan fashion (totally not typical for me): “Alright. That’s interesting. Oh, wow. Okay. Yeah. We’re good.”

(The second question is, “Did you get sick?” Well, it was challenging to stay disciplined and keep my head straight, especially during the 1.8-2G periods. I did not get sick, but I came close on Parabola #25. That was totally my fault, since I looked out the window between Parabolas #24 and #25, saw the horizon almost vertical, and that messed with my head. Lesson learned: don’t look out the window.)

In the interest of full disclosure, one payload had been having some intermittent issues that, like all intermittent issues, reared their head during a pre-flight end-to-end test the day before the flight. Luckily I had a contingency operation sketched out, which performed perfectly. So when we were on the plane doing the set-up and startup, I was really “uber-focused” on the payload and not on myself for the first few cycles. When things started to get into a rhythm around Parabola #5, I had no idea we were 1/5th of the way done. Wow.

between_parabola1_and_2

“That was short. That was very short.” My comments after the very first parabola, which was a Martian (0.33 G) scenario. This image shows our team’s positions between Parabolas 1 & 2. We did not have enough space to fully lie down, so we reclined against the side of the aircraft. Left to right: Con Tsang, myself (monitoring a payload via a tablet), Cathy Olkin, and Alan Stern (face not visible). The photo was taken by the GoPro camera on the head of Dan Durda, who was across the way. Eric Schindhelm, who rounded out our team, was next to Dan and not in this view.

The rapid cycling was very unexpected: the onset of low gravity for about 10-15 seconds, followed by a 2-3 second transition into what appeared to be about 30 seconds of 1.8-2G forces. With each parabola I started to realize that the set-up time for the manual operation of one payload took way too long. (Lesson learned.)

Sometimes we had unexpected escapes: I escaped my foot-holds on Parabola #7, and Eric Schindhelm (shown below) escaped on the next one.

eric_parabola8

Con was monitoring BORE and deftly diverted Eric’s collision path. For BORE, the key thing was to keep the box free from any jostling by others or the cables.

The payloads. We had two payloads, each with different goals for the flight. The decision to tether them together (made a few weeks before the flight) complicated the conops (concept of operations). One was a true science experiment: BORE, the Box of Rocks Experiment. The other was primarily an operations test for SWUIS, the Southwest Universal Imaging System. Both are pathfinder experiments for the emerging class of reusable commercial suborbital vehicles, from providers like Virgin Galactic, XCOR, Masten Space Systems, UP Aerospace, Whittinghill Aerospace, etc. You can read more about this fleet of exciting platforms at NASA’s Flight Opportunities page https://flightopportunities.nasa.gov/, which has links to all the providers.

swuis_bore_setup_1

From left to right: Dan & Con monitoring BORE (the aluminum box with foamed edges) while Cathy holds onto the SWUIS camera, doing a “human factors” test using a glove (yellow). Image from the GoPro camera affixed to the SWUIS control box.

swuis_pib_target_2

View of the SWUIS control box and GoPro camera (used for situational awareness) while Dan holds it. You can see the SWUIS target that we used for the operations testing. Image from a GoPro camera affixed to Dan’s head. Multiple cameras for context recording were definitely a must! (Lesson learned.)

dan_swuis_parabola23

Dan Durda taking a test run with SWUIS on Parabola #23 (the 19th zero-G parabola).

With BORE, we ask the question: how do macro-sized particles interact in zero gravity? When you remove gravity from the equation, other forces (electrostatic, van der Waals, capillary, etc.) dominate. In a nutshell, BORE is a simple experiment to examine the settling effects of regolith – the layer of loose, heterogeneous material covering rock – on small asteroids.

Our goal is to measure the effective coefficient of restitution (http://en.wikipedia.org/wiki/Coefficient_of_restitution) of inter-particle collisions in zero-g conditions. The experiment consists of two boxes of rocks: one filled with rocks of known size and density, the other filled with random rocks. Video imagery (30 fps) is taken of the contents of each box during the flight. After the flight, the plan is to use several software packages (ImageJ, Photoshop, and SynthEyes) to identify the rocks and track their movements from frame to frame. The total cost of BORE is less than $1K, putting it within reach of the proceeds of a high school bake sale!
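As a toy illustration of what that frame-to-frame analysis might look like, here is a short Python sketch that estimates a coefficient of restitution from two tracked particle paths. This is my own invention, not the BORE analysis code; the tracks, frame numbers and function names are made up, and real tracks would come from tools like ImageJ or SynthEyes.

```python
# Estimate the coefficient of restitution e from frame-to-frame particle
# tracks: e = |relative speed after collision| / |relative speed before|.
import numpy as np

def coefficient_of_restitution(track_a, track_b, collision_frame, fps=30):
    """Estimate e from two (n_frames, 2) position arrays in pixel units."""
    dt = 1.0 / fps
    # Velocities from finite differences just before and after the collision.
    v_a_pre = (track_a[collision_frame] - track_a[collision_frame - 1]) / dt
    v_b_pre = (track_b[collision_frame] - track_b[collision_frame - 1]) / dt
    v_a_post = (track_a[collision_frame + 2] - track_a[collision_frame + 1]) / dt
    v_b_post = (track_b[collision_frame + 2] - track_b[collision_frame + 1]) / dt
    rel_pre = np.linalg.norm(v_a_pre - v_b_pre)
    rel_post = np.linalg.norm(v_a_post - v_b_post)
    return rel_post / rel_pre

# Two invented particles approaching head-on, colliding at frame 10,
# then separating more slowly:
track_a = np.array([[i * 0.5, 0.0] for i in range(10)]
                   + [[5.0 - i * 0.2, 0.0] for i in range(10)])
track_b = np.array([[10 - i * 0.5, 0.0] for i in range(10)]
                   + [[5.0 + i * 0.2, 0.0] for i in range(10)])
print(coefficient_of_restitution(track_a, track_b, collision_frame=10))  # ~0.4
```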

BORE does need more than 20 s of microgravity to enable a better assessment of rock movement, and this is exactly why the experiment is planned for a suborbital flight, where 4-5 minutes of microgravity can be achieved. Here, we used the parabolic flight campaign to test the instrumentation and get a glimpse of the first few seconds of rock behavior. With this series of 15-20 s microgravity periods, we made leaps forward from previous tests using drop towers, which provide only 1-2 s of microgravity (the quick calculation after the list below shows why drop towers are so limited).

Today’s microgravity platforms and durations of zero-G, from http://www.nasa.gov/audience/foreducators/microgravity/home/:

  • Drop Towers (1-5 s)
  • Reduced-Gravity Aircraft (10-20s)
  • Sounding Rockets (several minutes)
  • Orbiting Laboratories such as the International Space Station (days)
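
A quick back-of-envelope calculation (my own arithmetic, using h = ½gt²) shows why drop towers sit at the bottom of that list: free-fall time grows only with the square root of drop height.

```python
# Free-fall drop height needed for a given microgravity duration (ignoring
# air drag, which real drop towers fight with evacuated shafts or shields).
g = 9.81  # m/s^2

for t in (1, 2, 5):              # seconds of microgravity
    h = 0.5 * g * t**2           # h = 1/2 * g * t^2
    print(f"{t} s of free fall needs a drop of about {h:.0f} m")
# 1 s -> ~5 m, 2 s -> ~20 m, 5 s -> ~123 m
```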

bore_samplezg_images

Some BORE images from one of the zero-G parabolas. Top row: (left) the rest position and (right) free-floating bricks of known size (they are actually bathroom tiles from Home Depot) with an L:W:H ratio of 1.0:0.7:0.5. Surprisingly, this is near the size and ratio of fragments created in laboratory impact experiments (e.g., Capaccioni, F. et al. 1984 & 1986; Fujiwara, A. et al. 1978) and similar to the shape ratios of boulders discovered on the rubble-pile asteroid Itokawa (see below).

Why is this important? Well, if you want to visit an asteroid someday and are designing tools to latch onto it, drill or dig into it, collect samples, etc., the behavior of colliding particles in this micro/zero-gravity environment is important. Scientifically, if you want to understand more about the formation, history and evolution of an asteroid where collisional events are significant, knowing more about how bombardment and repeated fragmentation events work is key.

itokawa_iss_forscale

Source: NASA & JAXA. The first unambiguously identified rubble pile: asteroid 25143 Itokawa, observed by JAXA’s Hayabusa spacecraft (Fujiwara, A. et al. 2006). The BORE experiment explored some of the settling processes that would have played a role in this object’s formation.

SWUIS was more of an “operations experiment.” This camera system has been flown on aircraft before to hunt down elusive observations that require observing from a specific location on Earth. For example, to observe an occultation event, when an object (asteroid, planet, moon) in our solar system crosses in front of a distant star, the projected “path” of the occultation on our planet is derived from the geometry and time of the observation, similar to how the more familiar solar and lunar eclipses are only visible from certain parts of the Earth at certain times. Having a high-performance astronomical camera system on a flying platform that can go wherever you need to observe is powerful. SWUIS got its start in the 1990s, when it was used on a series of aircraft. You can read more about those earlier campaigns at http://www.boulder.swri.edu/swuis/swuis.instr.html.

Over the past few years I have been helping a team at the Southwest Research Institute update this instrument for use on suborbital vehicles, which get higher above the Earth’s atmosphere than conventional aircraft. Suborbital vehicles can reach 100 km (328,000 ft.; 62 miles) altitude, whereas aircraft fly mainly at 9-12 km (30,000-40,000 ft.; 5.6-7.5 miles). Flying higher provides a unique observational space: spectrally (great for infrared and UV, as you are above almost all of the water and ozone, respectively), temporally (you can look along the Earth’s limb longer before an object “sets” below the horizon), and from a new vantage point (you can look down on particle debris streams created by meteors or observe sprites and elves phenomena in the mesosphere). 100 km altitude is still pretty low compared to where orbiting spacecraft live, which is 160-2,000 km (99-1,200 miles) up (LEO/Low Earth Orbit). For example, our orbiting laboratory, the International Space Station, orbits at about 400 km (250 miles) altitude.

suborbital_flight_trajectory

The SWUIS system today consists of a camera and lens, connected by one cable to an interface box. The interface box, which is from the 1990s version, allows one to manually control gain and black-level adjustments via knobs. It also provides a viewfinder in the form of a compact LCD screen. The data are analog but are digitized by a frame-grabber housed in a laptop. The 1990s version had a VCR to record the data, but since we are in the digital age, the battery-operated laptop was a natural and easy upgrade. The camera electronics are powered by a battery, which makes the system portable and compact. For this microgravity flight I introduced a tablet to control the laptop, allowing the laptop to be stowed away. In practice this worked better than expected; my main takeaway is that the tablet is best fixed to something rather than hand-held, to prevent unwanted “app closure.” A remote terminal for the laptop would also work.

Here’s a series of three short videos (no sound) from three legs when I got to hold the Xybion camera, on Parabolas #13, 14 & 15. They capture how terribly short all the parabolas are and, if you are doing an operations experiment, how utterly important it is to be positioned correctly at the start. One test was to position myself, get control of the camera, and focus on a test target. A second test was to practice aiming at one target and then repositioning for another target within the same parabola.

SWUIS_ZeroG_Parabola13_Web2

SWUIS_ZeroG_Parabola14_Web2

SWUIS_ZeroG_Parabola15_Web2

Above, the links are for low-res, no-sound videos of Parabolas #13, 14 and 15 (the 7th, 8th and 9th in microgravity), sized to fit within the upload file-size restrictions on this site. By the third time I was getting faster at set-up and on-target time.

In 1 G this camera and lens weigh 6.5 lbs. (3 kg). Held at arm’s length, when I was composing the test in my lab and scripting the steps, I had trouble controlling the camera; in fact, after a few seconds I was shaking just to keep it on target. I was amazed at how easy it was to hold in zero-G and complete the task. The Zero-G flight also told us many things we need to redesign. One issue we found was that the tethered cabling was not a good idea: in some cases the camera, held by one person, was jerked away from the control box, held by another person. In the next iteration, one of those items will need to be affixed to a structure to remove this weakness.

My lessons learned from the whole experience:

  • Everything went by very quickly.
  • Being tethered was difficult to maintain; design the conops differently next time (what we did seemed awkward).
  • The laptop and tablet worked better than expected.
  • It is hard to concentrate on anything other than the task at hand, so don’t plan too much.
  • Have multiple cameras viewing the experiment. We need to inspect the cable motion via video, as it was hard to view in situ.
  • It was very loud: hard to hear, hard to know what other people were working on. The video playback caught a lot more whoops during the transitions to zero-G than I remembered. I heard the feet-down call clearly, but not the onset of zero-G.
  • The timing between parabolas is very short; the level breaks were good times to reassemble the cabling.
  • Next time, don’t hang onto the steady-wire attached to the plane (I got that idea from Cathy & Alan next to me), as it caused more motion than needed (the plane kept moving into me). Instead, stay fixed in the footholds, crouch like Con & Dan did, and let the body relax (Con & Dan were most elegant).

And, my biggest take-away of all: if you want to do a microgravity experiment, I strongly recommend doing a “reconnaissance” flight first. Request to tag along on a research flight to observe, and perhaps lend a hand, as some research teams might need another person. Observe the timing, cadence and space limitations, and use that knowledge to best design your experiment. It is an amazing platform for research and engineering development: it can truly explore unique physics, and it provides a place to explore your gizmo’s behavior in zero-G and find ways to make it robust before taking it to the launch pad.

I am very much hoping to experience microgravity again, with these same two payloads or with others. One of the key points of these reduced-gravity flights is that they fly multiple times a year, so in theory experiment turn-around is short. Ideally, I wish we could have flown again the next day: I could have implemented many changes in payload operations and also in Kimberly operations.

Our team is now working to assess what worked and what did not work on this flight. We achieved our baseline goals, so that is great! Personally, I wish I had not been so focused on certain aspects of the payload performance and had made more time to look around. That said, my focus kept me on the task at hand, the payload performed better than expected, and when you have 10-15 s, focus is the name of the game!

It’s time to fly and go weightless! Microgravity Flight Day (Nov 17, 2013): Part 1, TSA Check, Board, Ascent, and Flight Profile.

Reposted from https://blogs.nasa.gov/mission-ames/2013/12/17/its-time-to-fly-and-go-weightless-microgravity-flight-day-nov-17-2013-part-1-tsa-check-board-ascent-and-flight-profile/.

This is the second entry (part one here, part three here) of a three-part blog series about my recent experience in microgravity.

pre-flightphoto

The team outfitted in flight suits, ready to go! Left to right: Kimberly Ennico (me), Con Tsang, Eric Schindhelm, Dan Durda, and Cathy Olkin. The photo was taken by Alan Stern, another member of the team, rounding us out to six flyers. Con & Cathy had each flown once before; Dan & Alan had multiple flights in their histories. Eric & I got to savor the first-time-flyer award. All my colleagues work at the Southwest Research Institute in Boulder, Colorado.

In writing this blog entry, I still giggle recalling the moments before the flight. We actually had a TSA check before boarding the plane. These days pretty much every person has a GoPro or a recorder strapped to some limb or carefully secured in the pockets of his/her flight suit. So as each person went through security, all the pockets had to be emptied before the TSA wand-scan, and then all the devices got re-pocketed, ready for the adventure.

So what was in my pockets? I had some spare duct tape affixed to plastic (for easy removal) to do patch-taping (it came in super handy), a Nexus tablet for one of the experiments (with Velcro on its backside, as it needed to be velcroed to the floor), six fresh AA batteries (for me to put in one of the payloads during the setup leg), two checklists (both velcroed to me), and an audio recorder (affixed to my arm with an iPod armband). Any item that could not be strapped or velcroed down had to be lanyarded to you (such as a camera).

kim_boarding

That’s me outfitted with documentation. I’m “walking documentation.” My right shoulder holds an audio recording device, my right thigh our checklists, and my left wrist (not shown) the list of tests vs. parabola. If you look closely, my name badge is upside down, indicating I am a first-time flyer. (Photo by Dan Durda.)

I was assigned seat 3C for takeoff (and yes, they actually gave us boarding passes!). There are a few rows of seats in the back, in which all fliers have to be buckled for takeoff. We boarded from the rear of the 727-200. There was an in-flight safety briefing (oxygen, life jacket, seatbelts), and there is an emergency card tailored for Zero-G, similar to what was provided for SOFIA. The plane is operated by the Zero-G corporation but registered under Amerijet; its call sign was AJT213. The main body is empty, with padded floors, walls and ceilings. There are specific areas to bolt down footstraps and equipment. For items that cannot be bolted down, there are a series of Velcro strips, which we placed the day before. This turned out to be important: between the microgravity parabolas you experience 1.2-2 G, and any free-floating equipment you are holding will immediately come crashing down. For our experiment, which involved five separate pieces of free-floating equipment, having a “safe place to store” them was essential.

At approximately 9:16 am EST (local time), we taxied, and the takeoff felt just like a normal plane’s. About 10 minutes after takeoff, we were told we could begin our set-up. This set-up leg is about 15-30 minutes long. From our practice sessions the week before, we knew that setting up SWUIS took about 15 minutes (with no glitches). BORE took a similar amount of time, and the two were dovetailed in such a way that we needed to work in parallel but also stage certain setup steps first. So the checklist came in handy to remind us of our setup “dance.” We put fresh batteries in our equipment and got it up and running in a wee bit more than 15 minutes, after a momentary pause when a known interference issue might have reared its head, but it played nice that morning. We had a pretty complex set-up, which I realized we should simplify on future flights, and I made some oral notes into the audio recorder.

We knew from the review the day before that we would be experiencing 25 parabolas in total, performed in bunches of five with a flat 1-2 minutes of 1-G “level” flight in between. The first set would be four Martian (1/3 G) and one zero-G. The second set would be one lunar (1/6 G) and four zero-G. All the remaining parabolas would be zero-G. Only one experiment on board had requested Martian gravity; all the others needed zero-G. I gathered that the tourist flights get 15 parabolas, also grouped in sets of five, and that the number of Martian and lunar parabolas is tailored to the experiments on each flight.

Besides the research teams, Zero-G assigns at least one “coach” per experiment group. He or she can help with the experiment logistics and also provide assistance if one of the team comes down with motion sickness. To avoid motion sickness, I was strongly advised not to turn my head, or, if I had to, to turn my entire upper torso, and slowly; this was especially important during the high-G parts of the parabolas.

Let me divert from the experience to summarize what the plane does to provide these “periods” of reduced gravity. The reduced-gravity environment is created as the plane flies a parabolic path: the plane climbs rapidly at a 45-degree angle (“pull up”), traces a parabola (“pushover”), and then descends at a 45-degree angle (“pull out”). During the pull-up and pull-out segments, everything on board, crew and experiments alike, experiences accelerations of about 2 g (and boy did I feel this! It was actually more striking than the <1 g). During the parabola (pushover), net accelerations are supposed to drop as low as 1.5×10^-2 g for about 15-20 seconds. For me, this was the largest take-away of the entire experience: those periods of zero-G went by very, very, very quickly. The periods of 2 G felt like they went by much more slowly, yet they were essentially of similar duration. I was very surprised, but when I reviewed my voice recorder results and looked at the time-stamped camera data taken by our two experiments, those “pushover” events were indeed in roughly 20-second chunks.
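As a sanity check on those numbers (my own back-of-envelope, not Zero-G’s flight data): during the pushover the aircraft is essentially a projectile, so a ~20 s zero-G period fixes the vertical speed at entry and the height of the arc.

```python
# Ballistic arc behind a ~20 s pushover (illustrative numbers only).
g = 9.81           # m/s^2
t_zero_g = 20.0    # s, approximate zero-G duration

# Vertical velocity reverses from +v to -v during free fall: t = 2v/g.
v_vertical = g * t_zero_g / 2         # ~98 m/s at the start of the parabola
arc_height = v_vertical**2 / (2 * g)  # ~490 m (~1,600 ft) above entry altitude

print(f"entry vertical speed ~{v_vertical:.0f} m/s, "
      f"arc height ~{arc_height:.0f} m")
```

That few-hundred-meter arc, plus the altitude lost in the pull-out, is roughly consistent with the few-thousand-foot excursions visible in the flightaware.com track later in this post.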

After five parabolas, the aircraft leveled off to get us back to the “old familiar” 1 G. This, I learned, was a key time to re-position cables (and in many cases, people!) to get ready for the next series of five. We erred in our conops design by rotating tasks in threes, which did not mesh well with the break coming after every five parabolas. Knowing now the importance of using those breaks, I would have designed the operations experiment differently. The other (science) experiment was not affected by that issue.

After speaking with other folks, I learned that the ~20 s duration of zero-G is driven by safety limits on the aircraft’s flight profile, which allow it to drop only a few thousand feet during the parabolas. Here’s where the suborbital rockets (single-use) and the emerging new reusable commercial suborbital platforms come in, as they promise 4-5 minutes of microgravity in a single flight. This longer duration of zero-G is highly attractive for some experiments. Others, however, may still want multiple zero-G test periods in a short time, and those are nicely provided by these aircraft flying parabolic profiles.

Our entire flight from nose-up to nose-down was only 2 hrs. The time between the start of parabola 1 and the end of parabola 25 was about 1 hr. It was quick.

After the flight I looked up the flight path on flightaware.com and we were doing some pretty neat aerobatics over the Gulf of Mexico. Our altitude ranged from 25,000 ft. to 20,000 ft. during the parabolic maneuvers.

flightaware_1

flightaware_2


My final blog entry will summarize my experiences of the flight and my evolving perspective on this type of platform for doing engineering, science and technology experiments.