Helmet-mounted displays

Cockpit, radar, helmet-mounted display, and other avionics
neurotech (Elite 2K)

  • Posts: 2346
  • Joined: 09 May 2012, 21:34

Unread post19 Sep 2012, 18:28

spazsinbad wrote:neurotech says: "...The F-35 is even better instrumented, but they are still bumping into avionics issues that can be debugged and tested on the ground, given enough data." Where do you see this information? The CATBird has sorted out a bunch of avionics stuff that has been successfully tested in exercises such as Northern Edge.

Perhaps I was unclear. I was suggesting that the avionics associated with the HMDS are problematic. The data from the helmet is processed in the ICP. The HMDS is more than a helmet; it's an integration of associated avionics software and hardware.

spazsinbad wrote:This thread is about the Helmet and associated bits and pieces. AFAIK the avionics testing is OK. What is not known or tested is the HMDS in the operational environment of High G and bumpy fast speed. Hence a dedicated F-35. It is important to get the HMDS working correctly and nice to see that this may happen soon enough.

The problems of jitter and lag can be analyzed more thoroughly if the inputs and outputs are recorded in flight, then re-run on the ground and debugged by the programmers.

There is nothing more frustrating than debugging an issue in a real-time system where there is insufficient data (or logfiles) to get a clear picture.
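neurotech's record-and-replay idea can be sketched in a few lines. This is purely a hypothetical illustration -- the frame format, the `record`/`replay` names, and the JSON-lines log are my assumptions, not anything from the F-35 program:

```python
import json

def record(log_path, frames):
    # Append input/output frames to a JSON-lines log, as an in-flight
    # recorder might.
    with open(log_path, "a") as f:
        for frame in frames:
            f.write(json.dumps(frame) + "\n")

def replay(log_path, process):
    # Re-run the recorded inputs through `process` on the ground and
    # flag every frame whose recomputed output disagrees with what was
    # captured in flight -- the discrepancies are where debugging starts.
    mismatches = []
    with open(log_path) as f:
        for line in f:
            frame = json.loads(line)
            got = process(frame["input"])
            if got != frame["output"]:
                mismatches.append((frame["t"], frame["output"], got))
    return mismatches
```

Given enough logged data, the ground-side `process` can be stepped through in a debugger against the exact in-flight inputs -- precisely the luxury a live real-time system denies you.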
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post28 Sep 2012, 02:22

Lockheed Martin Tests Some JSF Fixes By Amy Butler 24 Sep 2012

http://www.aviationweek.com/Article.asp ... 21.xml&p=1

"Lockheed Martin is inching closer to solving some of the technical challenges encountered with the F-35 during developmental testing....

...Meanwhile, program officials have set aside a single aircraft for up to 90 days of tests solely of the Vision Systems International helmet that has plagued the program for more than a year with jitter, latency and other operational problems discovered in testing, says Maj. Gen. Christopher Bogdan, deputy F-35 program director.

Helmet performance is key to ramping up pilot training on the F-35, which is slated to begin early next year. Without this helmet—or possibly a backup model being quickly developed by BAE—the U.S. Marine Corps cannot declare operational capability as planned, as early as 2015. Bogdan doubts fixes to all of the problems will be implemented by then, but the government's plan to dedicate a test aircraft to the problem signals how critical the helmet is to the program's future...."
RAN FAA A4G Skyhawk 1970s: https://www.faaaa.asn.au/spazsinbad-a4g/ AND https://www.youtube.com/channel/UCwqC_s6gcCVvG7NOge3qfAQ/
twistedneck (Enthusiast)

  • Posts: 34
  • Joined: 05 Jul 2009, 18:06
  • Location: Dearborn, MI

Unread post29 Sep 2012, 07:48

Interesting that out of all the technologies to surprise the F-35, it's the lack of very fast computing in a small space that's the issue. Moore's law will take care of this aspect over the next 5 years, and we will see reduced latency in the digital processors, which will improve the jitter and the night vision. I'm glad they bit off more than they could chew; now it's time to get some really fast ARM processors and data pathways.
quicksilver (Elite 2K)

  • Posts: 2647
  • Joined: 16 Feb 2011, 01:30

Unread post29 Sep 2012, 14:08

Most of the reporting seems to forget that the "problem" helmet is in use every day in both flight test and at Eglin. What constitutes a "problem" in an engineering sense is not necessarily a problem when it gets to operational pilots. Jitter occurs in a relatively narrow part of the flight envelope (at some G levels above sustained turn performance) as a consequence of aircraft motion. They're tweaking the algorithms in the DMCH to compensate. But, for the most part, jitter is not apparent to the Eglin guys.
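The sort of compensation algorithm being tuned can be illustrated -- purely hypothetically, since the actual DMCH algorithms are not public -- with the simplest jitter filter there is, a one-pole low-pass, which trades a little added lag for a steadier symbol:

```python
def smooth(samples, alpha=0.2):
    # One-pole low-pass filter over a head-tracker channel.
    # Smaller alpha = steadier output but more lag: the classic
    # jitter-versus-latency trade-off being tuned in flight test.
    out, y = [], samples[0]
    for s in samples:
        y = alpha * s + (1 - alpha) * y
        out.append(y)
    return out
```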

Latency "problem" was analytical -- i.e., engineers arguing over what latency value was acceptable and what was not (and only relative to the projection of DAS images on the inside of the visor). Some expected that it should be on the order of the human eye (about 30-35 ms). A very bright PhD (from AFRL, I think) constructed a test event in a simulator where they could "dial a (latency) value" and have real live-in-the-flesh pilots perform a variety of routine tacair tasks -- air refueling, low-level flight, and recoveries to an amphib ship. Pilots were unaware of the latency value dialed in for any of their tasks and afterward rated the ease or degree of difficulty of each task performed. The results were compelling. Expect that we'll hear reporting in the next few months that reflects same -- and just like all the hoo-haa about melting flight decks and TRO and (name your issue), it will magically be "solved" and the reporting will move on to the next "problem."
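A "dial a value" rig like the one described can be approximated with a simple delay line. This sketch (the names and the sample-stream framing are my own, not the simulator's) delays a sample stream by a configurable latency before it reaches the display:

```python
from collections import deque

def delay_line(samples, latency_ms, sample_period_ms):
    # Delay a stream by round(latency / period) sample intervals,
    # emitting None until the line has filled -- the same idea as
    # dialing a latency value into a simulator's display path.
    depth = max(1, round(latency_ms / sample_period_ms))
    buf = deque([None] * depth)
    out = []
    for s in samples:
        out.append(buf.popleft())
        buf.append(s)
    return out
```

Because the pilots never see the dialed-in value, their task ratings give an unbiased measure of where latency actually starts to hurt.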

Was watching Apollo 13 last night and took note of the scene where Gene Kranz (Ed Harris) goes to the blackboard and draws a big circle for the earth and a smaller one for the moon and starts to discuss with a substantial group of engineers whether or not they should perform a direct abort or make the swing around the back side of the moon. Group of engineers immediately react with much hand waving and drama (ach, eke, screech...it will never work) but Kranz sets them off to find solutions to a range of challenges for the option. Later after considerable work, Kranz (Harris) reassembles the group when it becomes clear that the path they have chosen will only get them to a point about 2/3 of the way from the moon back to the earth. Upon delivery of the message and Kranz' dictate that they find a solution that gets them the rest of the way home -- more hand waving and drama (ach, eke, screech...it will never work). But it did 'work.'

Well, in spite of the "ach, eke, screech...it will never work" by many over (name your issue) -- the helmet will work. For the most part it already does -- and the pilots at Eglin will tell you so.
batu731 (Active Member)

  • Posts: 122
  • Joined: 23 Jun 2010, 23:26

Unread post01 Oct 2012, 18:49

quicksilver wrote: [snipped -- full post quoted above]


Very informative, thanks
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post03 Oct 2012, 01:33

Excerpt from Main Article posted elsewhere.

Slow Climb for the F-35 By John A. Tirpak Executive Editor Oct 2012

http://www.airforce-magazine.com/Magazi ... 2slow.aspx
OR
http://www.airforce-magazine.com/Magazi ... 12slow.pdf (0.8Mb)

"...A number of fixes are being considered for the F-35 helmet. A new short-range night vision camera will be installed, Lawson [LM VP] said. The existing one was "the very best camera that was available at the time the helmet was designed and built," but the improved version should eliminate some of the concerns. The program office and Lockheed are discussing whether to retrofit existing helmets or build new ones.

Software fixes may resolve problems with jitter, in which data displays on the inside of the faceplate are not as rock-solid as pilots would like. There’s also some lag in displaying night imagery from cameras all around the aircraft, as the pilot’s head traverses the field of view of one camera to another. That latency will require another software fix.
"We’ve had over 2,000 flights" on the F-35, "and every one of those flights has been with this helmet." There’s no concern that it’s a safety issue, he asserted...."
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post20 Oct 2012, 11:41

Repeated here for the record (from: http://www.f-16.net/index.php?name=PNph ... 384#233384 )

Lockheed Martin Provides F-35 Flight-test Update AIN Defense Perspective
October 19, 2012 by Chris Pocock

http://www.ainonline.com/aviation-news/ ... est-update

"...O’Bryan also described the status of efforts to resolve development problems with the F-35’s unique helmet-mounted sight. In the latest simulations, the device demonstrated a latency of only 130 milliseconds, against a 150-millisecond requirement. A new near-infrared camera to improve night-vision acuity is being tested at MIT Lincoln Laboratories and will be flight-tested next year. The “micro-IMUs” (inertial measurement units) that are designed to solve the “jitter” problem are already in flight-test...."
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post30 Oct 2012, 22:27

Lockheed cites good reports on night flights of F-35 helmet By Dan Williams Oct 30, 2012

http://uk.reuters.com/article/2012/10/3 ... 4320121030

"Oct 30 (Reuters) - Lockheed Martin Corp said on Tuesday that it was making progress on resolving technical issues facing the cutting-edge helmet being developed for use by F-35 fighter pilots, and it cited positive initial reports from night flight tests of the system.

Lockheed Martin Executive Vice President Tom Burbage said that night vision performance was the "only real question" left on the helmet, which was designed by a joint venture of Rockwell Collins Inc and Israel's Elbit Systems to display all the information F-35 pilots need to fly the plane.

The question was whether the helmet system would allow pilots to see well enough at night to carry out precision tasks such as refueling or landing on a ship, Burbage told Reuters before an event at the Fisher Institute for Air and Space Strategic Studies in Herzliya....

...Yossi Ackerman, president and chief executive of Elbit, declined to comment except to cite what he called "dramatic progress" on the helmet.

No comment was immediately available from the Pentagon's F-35 program office.

Burbage said the company had logged almost 5,000 flights (HOURS?) using the primary helmet, and it would be used by the U.S. Marine Corps when they start flying the new fighter in 2015, a deadline Bogdan had called into question last month.

He said until it received full approval from the Pentagon for the primary helmet, the company was continuing to fund work on a less ambitious, alternate helmet being developed by BAE Systems, which uses goggles.

"No one really wants to use goggles in a fifth-generation airplane. It affects your ejection envelopes and everything else. We are trying to get away from the goggles," he said.

He said the primary issue now facing the Rockwell-Elbit helmet was whether pilots could see well enough to refuel the plane from a dark refueling aircraft and land the F-35B variant, which lands like a helicopter, on a dark ship at night.

Burbage said the night flights under way now would help answer that question. He said there had also been concerns about a lag in getting sensor data to the helmet, but that was not an issue anymore."
count_to_10 (Elite 3K)

  • Posts: 3282
  • Joined: 10 Mar 2012, 15:38

Unread post30 Oct 2012, 23:25

For very delicate work, wouldn't it be possible to just display the relevant video stream on the main display?
Einstein got it backward: one cannot prevent a war without preparing for it.

Uncertainty: Learn it, love it, live it.
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post30 Oct 2012, 23:28

That happens if the pilot selects it; however, bear in mind that giving up night vision via the HMDS II gives up a marvellous boon for any kind of night flying, particularly when there are potentially no other lights. Even taxiing on an airfield - particularly an unfamiliar one - at night can be hazardous. Go aboard to find out what that might mean. :D Then go ahead and land son, this is where the food is.... :roll: :twisted:
SpudmanWP (Elite 5K)

  • Posts: 8390
  • Joined: 12 Oct 2006, 19:18
  • Location: California

Unread post30 Oct 2012, 23:33

For delicate, close up lights-out night refueling they will likely use the camera built into the helmet itself as it has the best resolution and lowest latency.
"The early bird gets the worm but the second mouse gets the cheese."
count_to_10 (Elite 3K)

  • Posts: 3282
  • Joined: 10 Mar 2012, 15:38

Unread post30 Oct 2012, 23:35

Well, just using the big screen when the HMDS isn't stable enough doesn't mean giving up the HMDS.
I was thinking more along the lines of "could it be good enough as is".
Gums (Elite 2K)

  • Posts: 2279
  • Joined: 16 Dec 2003, 17:26

Unread post31 Oct 2012, 00:14

Salute!

Good grief.

If the requirement is a tenth of a second latency, all bets are off.

In the latest simulations, the device demonstrated a latency of only 130 milliseconds, against a 150-millisecond requirement.


You can use software to "smooth out" the display, but for realistic use the frame time should be 20 milliseconds or less. The data updates could be 50 Hz or so, but even the 40-year-old Marconi HUDs in the SLUF and Jaguar were closer to 60 Hz. Ditto for the Viper. They were also vector rather than digital raster scan. Hard to add new symbology, but real smooth.

My old roomie worked on the shuttle HUD and they had the same problem. The display was "jittery", and the old farts like Young and Crippen and others didn't like it. He had flown the Jaguar and A-7D, and knew what a "real" HUD should be like. The shuttle main data bus had "x" frame rate to work with, and it was decent. So it was a matter of getting the flight path vector and such to the HUD at a good update rate, then letting the HUD smooth it out. He got the problem solved, and you can see several shuttle HUD displays and landings on YouTube.

With everything being digital these days, you must have the display inputs at a higher update rate than the actual display rate. Hell, 60 frames per second is super for the display. But the data update has to be faster; then let the display doofer smooth it. In the olden days it was analog, so there were no "frame rates" except for the display itself - like the older TV sets. Even the new digital TV sets show you a difference between refresh rates. Higher refresh rates cost more $$$, but you can see the difference, especially on sports programming.

To see what I am talking about, try to find a video game where you can adjust the frame rate for the display. Good luck. If you build your own input device, you could use your PC's inherent updates for the screen, then interpolate between data updates. OTOH, try iEnt's online Warbirds simulation, which likes 10 Hz or so, same as the Darth Vader display +/-. They let your home PC front end smooth things out between data updates (display frame rates of 100 frames per second depending on your PC), but we're not talking about a real-world attack jet with cosmic sensors and such.
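Gums' point -- take the data updates at whatever rate the bus delivers, then let the display "doofer" fill in between them -- can be sketched as a toy linear interpolator (illustrative only, not any real HUD code; the function name and signature are mine):

```python
def display_frames(updates, data_dt, display_dt):
    # `updates` arrive at a fixed interval data_dt (seconds); emit
    # linearly interpolated values at the display interval display_dt,
    # so the symbology moves smoothly between real data updates.
    # Assumes at least two updates.
    frames, t = [], 0.0
    end = (len(updates) - 1) * data_dt
    while t <= end + 1e-9:
        i = min(int(t / data_dt), len(updates) - 2)
        frac = t / data_dt - i
        frames.append(updates[i] + frac * (updates[i + 1] - updates[i]))
        t += display_dt
    return frames
```

Running 50 Hz data up to a 100 Hz display this way doubles the apparent smoothness without the sensor suite ever producing a faster update.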

I shall still maintain my position that the jet should have a simple, cheap, off-the-shelf fixed HUD to back up the helmet. We have used them for 40 years, and they work just fine. They are firmly mounted on the jet's airframe, so there is no jitter when the plane is shaking at high AoA. Your head might be shaking, but not the HUD display. OTOH, the cosmic helmet has to read pilot eyeball angle (not basic helmet angle) at "x" rate, then get with the sensor suite, then put up a new display frame before the pilot just moves his eyeball less than a degree or so in 20 milliseconds. BEAM ME UP!

gotta go,

Gums sends, opines....
Gums
Viper pilot '79
"God in your guts, good men at your back, wings that stay on - and Tally Ho!"
spazsinbad (Elite 5K)

  • Posts: 23260
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯

Unread post31 Oct 2012, 00:24

I guess we all have to see it - then believe it. However those wot see it in flight are impressed and they think improvements are working. Perhaps the high angle of attack tests may change opinions. We are patient.
quicksilver (Elite 2K)

  • Posts: 2647
  • Joined: 16 Feb 2011, 01:30

Unread post31 Oct 2012, 02:07

Gums wrote: [snipped -- full post quoted above]


How 'bout we inject a little precision into the 'seemantics' of the HMDS. The HMDS provides a range of functions for the pilot. One is the virtual HUD or VHUD. The VHUD is projected on the visor where one would find a conventional HUD in a legacy jet -- oriented on a fixed reference to the aircraft -- e.g. the waterline, FRL or whatever they call it in the jets some of you have flown. VHUD has no latency issues; other issues from early DT have been resolved.

The latency question involves the projection of imagery on the inside of the visor. The latency value of the human eye is between 30 and 35 ms. IOW, when we humans shift our central vision from one object to another, it takes about 30-35 ms for the updated image to register in our gray matter as something other than what we were just viewing. That is a total 'system' latency -- transmission of light through a medium where it is focused onto certain receptors, communicated to the brain, etc. During the period your central vision is in motion you have no ability to focus; you can 'see' and recognize things, particularly in familiar surroundings, but you have to stop the motion in order to register detailed differences. Try it in the room you're sitting in right now.

Once you stop the motion of your central vision -- or more importantly, once you shorten the distance you shift your central vision and/or slow down the rate at which you shift it -- the apparent 'latency' diminishes. Try it -- move your head around really fast and see what your brain registers; slow it down and do the same.

It is the same in the jet. The greater the distance, or the faster one moves ones head, the greater the 'apparent' latency. Conversely, one can manage the 'apparent' latency of the imagery in the HMDS in the same fashion depending on the task.
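To put rough numbers on that 'apparent' latency (illustrative arithmetic only, not a flight-test figure): the angular smear of a projected image is simply head slew rate multiplied by the latency, so slowing the head movement shrinks the error proportionally:

```python
def smear_deg(head_rate_dps, latency_ms):
    # Apparent angular displacement of projected imagery: how far the
    # scene moves during one latency interval at a given head slew rate.
    return head_rate_dps * latency_ms / 1000.0

# A fast 120 deg/s head snap at 150 ms latency smears 18 deg of arc,
# while a deliberate 20 deg/s scan at the same latency smears only 3 deg --
# which is why the stop-look-move-stop technique tames the effect.
```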

When we fly at night using artificial vision, we do not use our eyes in the same fashion, mechanically speaking. We tend to stop and look, move our head, and then stop and look again over and over and over. The reason is that the artificial vision provides an image to our visual system that is, in a word...artificial. It's not the usual stuff with color and other intrinsic values that our brain 'naturally' recognizes and thus it takes more time to process that into action. We also tend to build in more dwell time on each object of our focus since errors in perception or recognition can cost us our lives as well as the lives of those we're carting around.

USG has done latency testing in task-oriented simulator events where they could 'dial a latency value' without pilot knowledge and then get the pilots to characterize a range of relative performance metrics. The assessed cut-off value for acceptable latency was apparently 150ms. F-35 HMDS DAS latency performance is better than that value.

Gums, talk to the guys at Eglin flying the jets. To a man, they are some of the most highly qualified aviators in the USAF and the USMC. They don't want a conventional HUD.