F-35 Sensor Fusion and Networking

Cockpit, radar, helmet-mounted display, and other avionics
blindpilot (Elite 1K)
  • Posts: 1199
  • Joined: 01 Mar 2013, 18:21
  • Location: Colorado

Posted: 01 Apr 2016, 16:58

cantaz wrote:Maybe BP can chime in on this, but aren't all databus in use on 4th gen variations of 1553? Meaning architecturally limited to half-duplex, command/response? Doesn't matter if the implementation is copper or fiber. That should by definition differentiate how each gen does fusion, given that beside the massive difference in speed between even the fastest 1553 implementation vs the slowest 1394, the way data can be routed within each bus type is quite different.


Hornetfinn's answer is sufficient, and I am not going to dig into the various technical points here. First, we'd eventually just get to eyes glazing over. Let's let garrya collect that info.

Second, the purpose of my input here is to show how things change at the application level when new technologies force a paradigm shift in the system structure. The best way to look at that, from a reasonably informed lay perspective, follows.

I can collect information from several sources and filter and process that info to a specified output for use. This is several pipes, preprocessed, with results coming into a central location (e.g. a display), all programmed to provide a predetermined output. But if you are familiar with metadata, things work differently. As large-throughput channels and storage technologies grow, we can bypass the "hardcoding" to a purpose. That "hardcoding" is typical of the evolved 4th gen fusion systems.

Rather, if we get to a point where the speed/size of the technology can "grab it all" and "hold it all," things change. I can just throw all the data into a big pot with some form of metatagging. Think Google, or NSA phone searches.
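BP's "big pot with metatagging" idea can be sketched in a few lines. Everything here (the tag names, the `Report` type, the `query` helper) is invented for illustration; the point is only that the "app" asks its question after the data is pooled, rather than being wired into a fixed pipeline:

```python
# Hypothetical sketch: instead of hardcoded pipelines, every report goes
# into one tagged pool, and "apps" query it after the fact.
from dataclasses import dataclass, field

@dataclass
class Report:
    source: str                               # e.g. "radar", "esm", "das"
    tags: set = field(default_factory=set)    # free-form metadata tags
    payload: dict = field(default_factory=dict)

pool = [
    Report("radar", {"air", "fast-mover"}, {"range_km": 80}),
    Report("esm",   {"air", "x-band"},     {"bearing_deg": 312}),
    Report("das",   {"ground", "flash"},   {"bearing_deg": 145}),
]

def query(pool, *wanted):
    """An 'app' that was never pre-programmed into any pipeline:
    select reports whose tags match the ad-hoc question."""
    return [r for r in pool if set(wanted) <= r.tags]

air_picture = query(pool, "air")
print([r.source for r in air_picture])
```

A new question ("everything ground-related", "everything in X-band") needs no new pipe, only a new query against the same pot.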

At this point the "apps" can be built to reach into the information sources and construct new "metadata" responses that are not even collected or stored anywhere. Siri does not store a table of "If I-25 has traffic, take an alternate route from this table of choices." It looks at the metadata, the position and speed of all the cell phones on the highway, and calculates real-time "what if" responses. The reply "turn left" may not even be in a lookup of phrases. Siri constructs the grammar on the fly: we need to go that way = "turn", that way = "right", conversation party = "BP"; the grammar engine yields "BP turn right." There is no lookup phrase "BP turn right" in the working data engine.
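The grammar-on-the-fly point can be sketched the same way. The heading table and function below are hypothetical, but they show a reply being assembled from pieces at query time rather than looked up whole:

```python
# Hypothetical sketch: the phrase "BP turn right" exists nowhere in the
# data; it is composed from metadata (who is listening, which way we
# need to go) at the moment it is needed.
HEADINGS = {0: "straight", 90: "right", 180: "around", 270: "left"}

def compose_reply(listener, current_deg, desired_deg):
    """Assemble a direction phrase from a relative-bearing calculation."""
    turn = HEADINGS[(desired_deg - current_deg) % 360]
    return f"{listener} turn {turn}"

print(compose_reply("BP", 0, 90))   # assembled, not retrieved
```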

The main point of this 5th gen evolution is not that this metadata approach is better or faster than a hardcoded "track generator" from installed inputs. It is that the approach allows queries and responses, frameworks and decision matrices, that aren't programmed at all in the classic sense.

"Siri, if it looks like I'm going to be late, please buy me a ticket for the movie and save it on the phone so I can use it when I get there." A 4th gen written program that "checks time of arrival and buys a ticket" is not the same thing, although it appears to provide the same response (maybe even quicker). The advantage of the 5th gen fusion models is that

- We don't know yet what all we can do with this system, -

but we are seeing that the possibilities are nearly limitless. If I can think it, it can do it. In general, brute-force 4th gen improvements do not do this, even if their responses seem faster or better at a specific task here or there. That's not the point. The point is that some 23-year-old kid pilot is going to strap in and wave his hands and create a whole new operating environment that his instructor didn't even know existed. And with that, change the battlespace.

That is why the F-22 experience puts US (and cross-assigned) users over a decade ahead of someone (Chinese/Euro types) trying to build a copy "just like the F-22."

FWIW,
BP

PS: It's not just this working-with-metadata approach. 5th gen differences include more paradigm-changing features. (Think clouds and megadata collection, totally different competing cell technologies talking on the same virtual "phone network," Lego-style display changing, etc.)
jetblast16 (Forum Veteran)
  • Posts: 539
  • Joined: 23 Aug 2004, 00:12
  • Location: USA

Posted: 01 Apr 2016, 18:15

Here's a digestible link over at GlobalSecurity about the F-22 Raptor's avionics.

http://www.globalsecurity.org/military/systems/aircraft/f-22-avionics.htm

An interesting excerpt:

Integrated avionics means different things to different people.

To the pilot, it means all the information is coordinated and available from a single source.
To the software engineer, it means access to shared data about the situation, the mission, and the aircraft systems.
To the hardware designer, it means common modules in a single backplane with the connectivity and bandwidth to support the required processing.


To me, you can put all the sensor fusion and networking you want on non-(V)LO craft, but if you can't hide and don't have sufficient weapons to employ, then your advantage is marginal (debatable, of course). However, if you have sensor fusion with advanced (high-speed, secure) networking AND you are stealthy, there is a paradigm shift in capability. You now have a "multiplier" effect, whereby mass surveillance of the battlespace can be done fairly clandestinely.
Bringing BLAST since 2004...(In my opinion)
hornetfinn (Elite 2K)
  • Posts: 2726
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 04 Apr 2016, 14:48

Scorpion82 wrote:
One significant advantage of 5th gen sensor fusion systems is that in the event of one sensor detecting something, but still not strong enough to create a track or even good detection, the sensor fusion can use other sensors to try to find out if that was actually something important.


There is no real practical limitation why that shouldn't work on 4th gens either, though I doubt any of them does this at this point. However, there are twofold limitations to this philosophy. First, a strong enough single detection to track isn't going to work, as tracking is a continuous process, as I have explained before. Second, there are limitations with respect to cueing/looking after something. The only two sensors capable of actually looking for something on board the F-35 are the radar and the EOTS. The DAS "sees" or not, the ESM receives or not; neither can be cued, but both can cue the radar or EOTS. In the case of the F-22, only the AN/APG-77 could look after something, but one would assume that it can't be in the interest of a VLO platform to ping at any conceivable position where something has been noticed briefly, as this would compromise LO. In some situations it might surely make sense, but there are both limitations and trade-offs to accept when doing so.


Tracking is a continuous process, but that doesn't mean every update has to come from the same sensor or even the same type of sensor, that every update has to be a full update, or that every sensor has to get detections continuously. Even tracking does not have to be totally continuous: modern sensors can miss detections and still continue tracking successfully without dropping tracks.
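That tracks survive missed detections can be illustrated with a textbook alpha-beta filter. The gains and measurements below are invented and have nothing to do with any real fire-control tracker; the point is only that the filter coasts on its prediction when a detection is missed:

```python
# Hedged sketch: a minimal alpha-beta tracker that keeps a track alive
# through gaps in the measurement stream (None = missed detection).
def alpha_beta_track(measurements, dt=1.0, alpha=0.85, beta=0.005):
    x, v = measurements[0], 0.0        # initial position estimate, velocity
    history = []
    for z in measurements[1:]:
        x_pred = x + v * dt            # predict forward one step
        if z is not None:              # sensor produced a detection
            r = z - x_pred             # innovation
            x = x_pred + alpha * r
            v = v + (beta / dt) * r
        else:                          # detection missed: coast on prediction
            x = x_pred
        history.append(x)
    return history

# Target moving ~10 units/step; two detections missed mid-track.
zs = [0, 10, 20, None, None, 50, 60]
print(alpha_beta_track(zs))
```

The track estimate dips during the gap but reacquires once detections resume, without ever being dropped and restarted.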

And yes, even EODAS can be cued, as can ESM systems. The signal-to-noise threshold can be altered by software in modern digital systems, and it can be done even within a small area of the whole sensor field of view. For example, let's say Barracuda hears something interesting behind and above the F-35 but cannot tell exactly what. The sensor fusion engine then cues EODAS by lowering the signal-to-noise threshold only in the part of the image coming from the angle where Barracuda received the interesting signals. Lowering the S/N threshold everywhere would result in a large number of false targets and similar problems; the S/N threshold is usually pretty high for exactly this reason. Doing so in a very limited part, however, would not, and would improve target detection probability if there is a target to detect. Sensor fusion could also cue EOTS to do the same, and control its FOV (magnification) for maximum performance in finding a possible target. Similarly, other sensors can cue Barracuda to concentrate on some part of the electromagnetic spectrum; for example, if other sensors indicate the target is a fighter, Barracuda can concentrate mostly on X-band emissions. Of course, with the F-35 and F-22 it does not have to be the same jet; several jets can be part of the sensor fusion process. I agree that currently there are many limitations in sensor fusion with the F-35, as things get complex when more sensors and participants are added to the equation.
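The regional-threshold cueing idea can be sketched as plain image processing. This is not any real EODAS interface, just invented numbers showing why lowering the threshold only in the cued sector can find a faint target without flooding the rest of the frame with false alarms:

```python
# Hypothetical sketch (not any real EODAS interface): lower the detection
# threshold only in the sector another sensor flagged, so false alarms
# stay low everywhere else in the frame.
import numpy as np

rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (64, 64))  # unit-sigma sensor noise
frame[40, 50] = 4.0                     # faint target: S/N of 4, below global threshold

GLOBAL_THR, CUED_THR = 5.0, 3.5

def detect(frame, cue_box=None):
    """Return pixel coordinates exceeding a (possibly regional) threshold."""
    thr = np.full(frame.shape, GLOBAL_THR)
    if cue_box is not None:             # (row0, row1, col0, col1) from the cueing sensor
        r0, r1, c0, c1 = cue_box
        thr[r0:r1, c0:c1] = CUED_THR    # regional sensitivity boost
    return np.argwhere(frame > thr)

uncued = detect(frame)                           # target sits below 5.0
cued = detect(frame, cue_box=(32, 64, 32, 64))   # cued sector uses 3.5
print(len(uncued), len(cued))
```

Dropping `GLOBAL_THR` to 3.5 across the whole frame would instead raise detections everywhere, which is the false-target problem the post describes.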
Scorpion82 (Elite 1K)
  • Posts: 1094
  • Joined: 07 Oct 2007, 18:52

Posted: 04 Apr 2016, 19:48

Is that image-processing optimization, let alone electronic zoom capability or selective listening by the ESM, even a confirmed and actually implemented capability? There is a lot of talk about capabilities, but I sometimes can't get rid of the feeling that people consider the potential and talk about imaginary capabilities as if they were real.
les_paul59 (Senior Member)
  • Posts: 330
  • Joined: 23 Jan 2016, 05:57

Posted: 04 Apr 2016, 22:28

Scorpion, this jet just hit IOC last year and it's revolutionary out of the box. In 20 years the F-35 will have capabilities we haven't even thought of yet, because of the open avionics architecture and its ability to integrate new software.
Scorpion82 (Elite 1K)
  • Posts: 1094
  • Joined: 07 Oct 2007, 18:52

Posted: 05 Apr 2016, 17:39

@les_paul

That's all well and fine, but some realism should be applied, and the differentiation between what's in now, what's planned for the future, and what could possibly be feasible even further down the road should be made. Otherwise we're discussing a mix of current, future, and what-if capabilities.
hornetfinn (Elite 2K)
  • Posts: 2726
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 08 Apr 2016, 13:30

Scorpion82 wrote:Is that image-processing optimization, let alone electronic zoom capability or selective listening by the ESM, even a confirmed and actually implemented capability? There is a lot of talk about capabilities, but I sometimes can't get rid of the feeling that people consider the potential and talk about imaginary capabilities as if they were real.


I'm sure many of the potential features have not been implemented, and the currently implemented features are likely not nearly perfect. I'm also sure that we will not hear many details about the capabilities these systems have, as they are the core of combat effectiveness. What is certain is that 5th gen systems have far more potential and room for growth than any known 4th gen system. I also don't doubt that sensor fusion in advanced 4th gen systems (like the Dassault Rafale and EF Typhoon) gives them serious advantages compared to earlier 4th gen systems.

Currently F-35 sensor fusion alone is about half a million lines of code, so I'm sure there are a lot of features already implemented. I'm also sure the amount of code will grow a lot during the lifetime of F-35.

http://www.lockheedmartin.com/us/news/features/2015/072015-f35-supercomputer.html

Highly sophisticated software enables the game-changing capabilities of the F-35, operating its navigation, communications and targeting systems. Each jet will have more than 8 million lines of code—more than any other U.S. or allied jet in history.

On board each F-35, nearly half a million lines of code are dedicated to capturing, analyzing and combining stunning amounts of information into an integrated picture for F-35 pilots. The F-35’s supercomputing brain even tracks maintenance needs and trends for the global fleet, thanks to the Autonomic Logistics Information System, better known as ALIS.


Btw, there is a lot of interest in image sensor fusion:
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4431292/

Interesting read about detecting point targets with omnidirectional IR cameras (like EODAS):
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4168490/

Very interesting patent from Lockheed Martin about data and sensor fusion:
https://www.google.ch/patents/US7283938

This patent sounds an awful lot like the F-35. From the patent:

Thus, in a very general aspect of the invention, a plurality of sensors observe an object, the sensor characteristics are fused to generate fused or combined sensor characteristics, and the raw sensor data is processed to produce evidence signals representative of characteristics which may be used to classify the object as to type. Thus, the fused sensor characteristics are equivalent to the characteristics of a single virtual sensor. The evidence and fused sensor characteristics are applied to a taxonomic classifier to determine the object type. Put another way, a virtual sensor according to an aspect of the invention incorporates fused sensor characteristics from plural sensors, together with a taxonomic classifier operating on (a) the unfused individual sensor evidence or information, (b) the individual sensor taxonomic classifications, and (c) the fused sensor characteristics, and classifies the target or object. While only three sensors have been illustrated in the arrangement of FIG. 4, any number of sensors may be included.

More particularly, a method according to an aspect of the invention is for fusing information from plural sources (312 1, 312 2, and 312 3). The method comprises the step of observing an object with at least first (312 1) and second (312 2) sensors, each of which (a) evaluates evidence or information and (b) based on the evidence, assigns a taxonomic classification to its observation of the object. The method further comprises the step of fusing the sensor characteristics (block 426) from the first and second sensors to produce compound sensor characteristics. The fusion of sensor characteristics may occur at any time prior to the combination of evidence and fused sensor characteristics. A classification is assigned (block 414) based on the evidence and compound sensor characteristics. In a particular embodiment of the invention, the classification based on compound evidence is taxonomic or type classification.

Fusion may be performed using any of a number of known methods including Bayes, Dempster-Shafer, evidence fusion, and other methods. An exemplary case is Bayes fusion resulting in probabilities P(a|E1,E2) that a given object type a, and P(b|E1,E2) that a given object type b, was observed when evidence E1 results from the observation by sensor 1, and evidence E2 results from the observation of the same object by sensor 2. No fusion of evidence from the actual observation is necessary, although additional fusion and other processing is not excluded.
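The Bayes fusion case in the excerpt can be made concrete. The priors and likelihoods below are invented, and the two evidence streams are assumed conditionally independent given the object type, which is the usual naive-Bayes simplification:

```python
# A minimal sketch of the Bayes fusion the patent describes: combine
# evidence E1 and E2 from two sensors into P(type | E1, E2).
# All the probability numbers here are invented for illustration.
def bayes_fuse(prior, lik1, lik2):
    """prior[t], lik1[t] = P(E1|t), lik2[t] = P(E2|t) -> posterior over types."""
    unnorm = {t: prior[t] * lik1[t] * lik2[t] for t in prior}
    z = sum(unnorm.values())                   # normalizing constant
    return {t: p / z for t, p in unnorm.items()}

prior = {"fighter": 0.5, "airliner": 0.5}
lik_radar = {"fighter": 0.7, "airliner": 0.2}  # E1: small, fast return
lik_esm = {"fighter": 0.8, "airliner": 0.1}    # E2: X-band fire-control emissions
post = bayes_fuse(prior, lik_radar, lik_esm)
print(post)
```

Neither sensor alone is conclusive here, but the fused posterior P(fighter | E1, E2) ends up far stronger than either single-sensor classification, which is the patent's point.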
hornetfinn (Elite 2K)
  • Posts: 2726
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 11 Apr 2016, 14:41

I dug up some more patents from Lockheed Martin or their personnel to learn more about how F-35 sensor fusion works. Here are some:

Method and system for multi-sensor data fusion using a modified dempster-shafer theory:
http://www.google.com.gh/patents/US6944566

Method and system for data fusion using spatial and temporal diversity between sensors:
http://www.google.com.gh/patents/US6909997

Target detection improvements using temporal integrations and spatial fusion
http://www.google.com.gh/patents/US7742620

Bernoulli taxonomic discrimination method:
https://www.google.si/patents/US7499833

Determination of the presence of closely spaced targets
https://www.google.si/patents/US7221307

Of course, these are not said to be about F-35 sensor fusion, but the descriptions fit it extremely well, as do the dates of these patents. I'd be really surprised if they were about anything else besides F-35 sensor fusion.

Some interesting points can be found. What I gather is:

The sensor fusion described in these patents concentrates a lot on detecting difficult targets in difficult situations and then IDing them with high accuracy and reliability. We are talking about doubling the detection range with the sensor fusion system vs. using the sensors by themselves (like 4th gen systems do) in demanding situations. Of course, the sensors themselves in the F-35 are really good, which means the difference is likely even larger.

This sensor fusion system uses all the sensors basically as one virtual sensor with many ways of detecting and IDing targets. It's not about combining tracks into a single track file, but rather about creating detections and tracks from scratch using all the signals coming from all the sensors.
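The first patent listed above names a modified Dempster-Shafer rule. As a rough illustration only, here is the classic (unmodified) combination rule with invented mass assignments; the modified rule in US6944566 differs in details not sketched here:

```python
# Hedged sketch of classic Dempster-Shafer combination: each sensor
# assigns belief mass to sets of hypotheses (including "could be
# either"), and combination concentrates mass where sensors agree.
from itertools import product

def ds_combine(m1, m2):
    """Combine two mass functions keyed by frozenset hypotheses."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb        # mass assigned to contradictory sets
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

F, A = frozenset({"fighter"}), frozenset({"airliner"})
EITHER = F | A                         # explicit "don't know" hypothesis
m_radar = {F: 0.6, EITHER: 0.4}        # radar: probably a fighter, some doubt
m_esm = {F: 0.7, A: 0.1, EITHER: 0.2}  # ESM: strongly fighter
print(ds_combine(m_radar, m_esm))
```

Unlike plain Bayes, Dempster-Shafer lets a sensor explicitly reserve mass for "don't know," which suits partial, low-confidence detections of the kind these patents discuss.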
hornetfinn (Elite 2K)
  • Posts: 2726
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 14 Apr 2016, 12:13

KamenRiderBlade wrote:I can't wait till they port the Sensor Fusion technology to other platforms.

Subs with Sensor Fusion


Subs are definitely a potential candidate for sensor and data fusion technologies. There is, for example, this project:
http://www.navysbir.com/n15_1/N151-035.htm

Basically, you could do sensor and data fusion using all the sensors, databases, and other such systems all around the submarine. One could, for example, use acoustic noise sensors around the submarine and fuse their information with sonar array information to lower the effect of noise generated by the submarine itself.

Of course, even one sensor can be used with sensor and data fusion. For example, it's possible to generate significantly sharper pictures with a camera when you take several pictures of the same object and fuse them together. Similar techniques can probably be used in sonar systems to improve their performance, although here the problem is the low resolution of the main sensors. Of course, most submarines have several sonar arrays (fixed and towed), which are distinct sensors themselves and can be used as such for sensor fusion. We could also add the torpedoes to sensor fusion in the case of wire-guided torpedoes. A torpedo closing in on an enemy ship or submarine might get a pretty good sonar image of the target, which might be very useful, especially if the torpedo didn't kill the target (missed or didn't detonate, for example).
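The multi-exposure point can be made concrete: averaging N registered frames of a static scene reduces zero-mean noise by a factor of sqrt(N) while preserving the signal. The scene, target, and noise levels below are all invented:

```python
# Hypothetical sketch: fuse several noisy exposures of the same static
# scene by averaging; zero-mean noise shrinks by sqrt(N) while the
# weak target signal is preserved.
import numpy as np

rng = np.random.default_rng(42)
scene = np.zeros((32, 32))
scene[16, 16] = 1.0                    # weak static target

def snapshot():
    """One noisy exposure: true scene plus unit-sigma sensor noise."""
    return scene + rng.normal(0.0, 1.0, scene.shape)

single = snapshot()
stacked = np.mean([snapshot() for _ in range(25)], axis=0)  # fuse 25 frames

print(float(np.std(single - scene)))   # residual noise near sigma = 1
print(float(np.std(stacked - scene)))  # near 1/sqrt(25) = 0.2
```

The same averaging logic is why a wire-guided torpedo's repeated sonar looks at a closing target can build a sharper picture than any single ping.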

I see sensor and data fusion being the next very big thing. We have large amounts of sensors and data available, but to get the most out of them we need better and better automated fusion systems. For example, self-driving cars need very good sensor fusion to really work in the real world.
KamenRiderBlade (Elite 2K)
  • Posts: 2632
  • Joined: 24 Nov 2012, 02:20
  • Location: USA

Posted: 14 Apr 2016, 15:35

hornetfinn wrote:Subs are definitely a potential candidate for sensor and data fusion technologies. [...] I see sensor and data fusion being the next very big thing.


The Brits' latest sonar system is effectively DAS but with hydrophones: lots of little hydrophones surrounding the sub.

https://en.wikipedia.org/wiki/Astute-class_submarine
Last edited by KamenRiderBlade on 14 Apr 2016, 18:17, edited 1 time in total.
flighthawk128 (Active Member)
  • Posts: 163
  • Joined: 24 Dec 2011, 23:25

Posted: 14 Apr 2016, 17:17

Huh....

Is... is this one of us? It's much too reasonable for a random basement dweller...

http://dsmboarder.kinja.com/why-the-f-3 ... 1769973618

I ran into this article while going through http://jerryofgarcia.kinja.com/ for the photos :mrgreen:
XanderCrews (Elite 3K)
  • Posts: 5840
  • Joined: 16 Oct 2012, 19:42

Posted: 15 Apr 2016, 04:33
Unread post15 Apr 2016, 04:33

flighthawk128 wrote:Huh....

Is... is this one of us? It's much too reasonable for a random basement dweller...

http://dsmboarder.kinja.com/why-the-f-3 ... 1769973618

I ran into this article while going through http://jerryofgarcia.kinja.com/ for the photos :mrgreen:


Cool!

Christ on a cracker, again with this A-10 stuff in the comments? Jesus. Why?
Choose Crews
hornetfinn (Elite 2K)
  • Posts: 2726
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 15 Apr 2016, 09:15

flighthawk128 wrote:Huh....

Is... is this one of us? It's much too reasonable for a random basement dweller...

http://dsmboarder.kinja.com/why-the-f-3 ... 1769973618

I ran into this article while going through http://jerryofgarcia.kinja.com/ for the photos :mrgreen:


Pretty rare to see such a good article about current 5th gen fighters and the potential they offer. I have to object to this part, though:

This isn’t because the F-35 blows you away in the traditional fighter jet metrics that we often compare different 4th generation fighter jets with (it doesn’t). Its because the F-35 will fundamentally alter which metrics are truly important. Range, speed, maneuverability etc. are still highly important, but they have been usurped by sensor fusion and its close, but arguably more important cousin data fusion.


Almost all 4th gen fighters offer equal traditional fighter-jet metrics (range, speed, and maneuverability) only when you compare one or two of those metrics at a time, and only when those 4th gen fighters are not carrying weapons. The F-35 seems to have a pretty unique combination of range, speed, and maneuverability (all at the same time), especially in air-to-ground combat configuration. Even in air-to-air combat configuration, I think only the much larger and heavier Su-35S might equal it, and all the others have to carry several EFTs, which they then have to drop to have equal speed and maneuverability. That blows me away, especially considering the other capabilities.
les_paul59 (Senior Member)
  • Posts: 330
  • Joined: 23 Jan 2016, 05:57

Posted: 16 Apr 2016, 00:00

That link above was a pretty reasonable analysis of 5th gen fighters. I wish more people viewed the F-35 in the same light.
KamenRiderBlade (Elite 2K)
  • Posts: 2632
  • Joined: 24 Nov 2012, 02:20
  • Location: USA

Posted: 16 Apr 2016, 02:15

les_paul59 wrote:That link above was a pretty reasonable analysis of 5th gen fighters. I wish more people viewed the F-35 in the same light.

I wish more people could get away from their confirmation bias and use pure logic, without bias.