F-35 JSF vs Eurofighter Typhoon

The F-35 compared with other modern jets.

mas

Banned

  • Posts: 344
  • Joined: 31 Aug 2017, 13:16

Posted: 30 Nov 2017, 21:15

PIRATE IRST does have some rough ranging capability and can be used by itself to cue and fire IR missiles (see about 1:30 of the Thales video).



It can also be sensor fused with the radar to provide a more accurate combined targeting solution for all missiles

Image

hornetfinn

Elite 2K

  • Posts: 2843
  • Joined: 13 Mar 2013, 08:31
  • Location: Finland

Posted: 01 Dec 2017, 09:34

mas wrote:PIRATE IRST does have some rough ranging capability and can be used by itself to cue and fire IR missiles (see about 1:30 of the Thales video).


I know, but that would work only in very limited scenarios, namely air-to-air situations firing IR missiles at close range. At longer ranges it would need range information from other sources, and radar is the only sensor in the Typhoon that gives accurate enough information. Of course a Typhoon pilot could fly silently (and with lower RCS with the radar stowed), use only passive sensors (PIRATE and DASS), and turn the radar on right before (like 10-20 seconds before) the engagement. The difference is that a Super Hornet pilot could use the AN/APG-79 all the time without compromising RCS at all and thus have superior situational awareness. I agree that Captor-E will change that, but it will take years before it reaches FOC.

mas wrote:It can also be sensor fused with the radar to provide a more accurate combined targeting solution for all missiles

Image


Yes it can, and I mentioned that in my previous reply. It basically combines different tracks into central tracks, which declutters the displays and gives the pilot and the combat system a better understanding of the situation. It's a very good capability that the Dassault Rafale also has and that the Super Hornet and JAS Gripen E will have. Like the picture shows, FLIR/IRST has crude range resolution but fine bearing resolution. Radar is the opposite, and thus they complement each other very well if sensor information (like tracks) can be correlated. Of course, that works only when both sensors are used at the same time.
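The radar/IRST complementarity described above can be sketched with simple inverse-variance weighting. This is only a toy Python illustration (all numbers are made up for the example; a real fusion engine runs full tracking filters):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent estimates:
    the sensor with the smaller uncertainty dominates the result."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Radar: fine range (sigma ~50 m), coarse bearing (sigma ~1.0 deg)
# IRST:  coarse range (sigma ~5000 m), fine bearing (sigma ~0.05 deg)
rng, rng_var = fuse(40000.0, 50.0**2, 42000.0, 5000.0**2)
brg, brg_var = fuse(30.5, 1.0**2, 30.1, 0.05**2)
```

The fused range comes almost entirely from the radar and the fused bearing almost entirely from the IRST, and each fused variance is smaller than either input, which is exactly why correlating the two tracks pays off.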

ricnunes

Elite 2K

  • Posts: 2203
  • Joined: 02 Mar 2017, 14:29

Posted: 01 Dec 2017, 14:32

hornetfinn wrote:
mas wrote:PIRATE IRST does have some rough ranging capability and can be used by itself to cue and fire IR missiles (see about 1:30 of the Thales video).


I know, but that would work only in very limited scenarios, namely air-to-air situations firing IR missiles at close range. At longer ranges it would need range information from other sources, and radar is the only sensor in the Typhoon that gives accurate enough information.


Precisely, and moreover IRST effectiveness is heavily affected by adverse weather conditions such as clouds.


hornetfinn wrote:
mas wrote:It can also be sensor fused with the radar to provide a more accurate combined targeting solution for all missiles

Image


Yes it can, and I mentioned that in my previous reply. It basically combines different tracks into central tracks, which declutters the displays and gives the pilot and the combat system a better understanding of the situation. It's a very good capability that the Dassault Rafale also has and that the Super Hornet and JAS Gripen E will have. Like the picture shows, FLIR/IRST has crude range resolution but fine bearing resolution. Radar is the opposite, and thus they complement each other very well if sensor information (like tracks) can be correlated. Of course, that works only when both sensors are used at the same time.


Here on f-16.net there's a thread with a couple of very nice diagrams (posted by eloise) that explain the difference between the "sensor fusion" in 4th gen fighter aircraft and the sensor fusion in the F-35:
viewtopic.php?f=36&t=28397

Basically, in a 4th gen fighter aircraft, namely in the Rafale and Typhoon, there's no real sensor fusion. What there is, is that when a target is detected by different sensors - for example by radar and ESM - that same target will be represented by the best/most accurate track, which in the previous example would be the radar track (and thus the ESM track would be hidden from the pilot).

Now, looking at the "Eurofighter - Sensor Fusion" diagram above, the closest thing to actual sensor fusion is the combined Radar and FLIR/IRST track. But then again, ESM and Data Link tracks don't seem to be truly fused (with radar tracks, for example) as can be seen in the link I shared above.
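As a contrast, that "best source selection" behaviour can be caricatured in a few lines of Python (the track records and field names here are invented purely for illustration):

```python
def best_track(candidates):
    """4th-gen style source selection: keep only the most accurate
    track per target (lowest position uncertainty), suppress the rest."""
    return min(candidates, key=lambda t: t["sigma_m"])

target_tracks = [
    {"sensor": "radar", "sigma_m": 50.0},    # precise track
    {"sensor": "ESM",   "sigma_m": 4000.0},  # bearing-only, coarse
]
shown = best_track(target_tracks)
# The ESM symbol is hidden; only the radar track reaches the display.
```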
A 4th/4.5th gen fighter aircraft stands about as much chance against a F-35 as a guns-only Sabre has against a Viper.

loke

Forum Veteran

  • Posts: 769
  • Joined: 14 Nov 2008, 19:07

Posted: 01 Dec 2017, 15:37

ricnunes wrote:Basically, in a 4th gen fighter aircraft, namely in the Rafale and Typhoon, there's no real sensor fusion. What there is, is that when a target is detected by different sensors - for example by radar and ESM - that same target will be represented by the best/most accurate track, which in the previous example would be the radar track (and thus the ESM track would be hidden from the pilot).

According to my understanding the above is not quite right. I believe that at least for the Rafale and Typhoon the "fusion engine" is doing much more than just selecting the "best/most accurate track" -- information from the different sensors is indeed fused (i.e., combined). However, this happens at a "processed" level and not at a "raw sensor" level as in the F-35.

So for Rafale/Typhoon/Gripen E each sensor will process the data, send to the fusion engine, which will then fuse it.

In the F-35 the data will arrive in a more raw format and then be fused. To quote from the thread you are linking to yourself:

"The sensor fusion process produces a unique track of a single target which may be reported by several sensors simultaneously, each one providing a subset of target attributes which are compiled to produce an as complete as possible view of the target," Friemer says. Algorithms weigh the reliability of each report before merging them to produce a fused target identity and priority.


Notice the last sentence: "Algorithms weigh the reliability of each report before merging them to produce a fused target identity and priority." This is quite different from just "selecting the best track" in two respects: they are talking about reports (from each sensor), not tracks as you did; and they are talking about merging (i.e., fusing) the reports, not just selecting the best one.

Note that the above description is for the Typhoon; other 4.5 gen fighters like the SH may have a simpler, more primitive "sensor fusion".
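The "weigh the reliability of each report before merging" idea can be illustrated with a toy report-level identity fusion in Python (the sensor declarations and confidence numbers below are invented for the example):

```python
from collections import defaultdict

def fuse_identity(reports):
    """Merge per-sensor identity reports into one fused declaration.
    Each report is (declared_type, confidence in 0..1); scores are
    accumulated per type and the winner is normalised to a confidence."""
    scores = defaultdict(float)
    for declared, conf in reports:
        scores[declared] += conf
    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())

identity, confidence = fuse_identity([
    ("Su-27", 0.8),    # ESM: emitter parameters match a Flanker radar
    ("Su-27", 0.6),    # IRST: IR signature consistent with a large twin
    ("unknown", 0.3),  # radar: NCTR inconclusive
])
```

The point is that no single report is "selected": every report contributes weight to the fused identity and to its final confidence.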

sprstdlyscottsmn

Elite 4K

  • Posts: 4535
  • Joined: 10 Mar 2006, 01:24
  • Location: Phoenix, Az, USA

Posted: 01 Dec 2017, 17:00

Two things.

First - I didn't realize PIRATE was a staring IIR array.

Second - I don't see an appreciable difference between the above statement about how fusion works in the Tiffy vs how people say it works in Stubby. The end effect is still that the pilot has tracks on his display that come from a variety of sources. Does the Tiffy have a "God's Eye" view display? Does the SHornet? I think what REALLY separates the Stubby from the rest is the number and type of sensors plus all the processing of the threats. The Threat Blossom and Bat Symbol are amazing pieces of SA.
"Spurts"

-Pilot
-Aerospace Engineer
-Army Medic
-FMS Systems Engineer

blindpilot

Elite 1K

  • Posts: 1228
  • Joined: 01 Mar 2013, 18:21
  • Location: Colorado

Posted: 01 Dec 2017, 21:00

sprstdlyscottsmn wrote:... I don't see an appreciable difference between the above statement about how fusion works in the Tiffy vs how people say it works in Stubby. ... Does the Tiffy have a "God's Eye" view display? Does the SHornet? I think what REALLY separates the Stubby from the rest is the number and type of sensors plus all the processing of the threats. ...


I've been searching for an analogy to show laymen (reporters etc.) what the F-35 (and the F-22, actually) do differently from other "sensor amalgamators".

This is as close as I've come so far.

----

Imagine that your dog has cornered a skunk in the back yard.

1. The dog is barking ... you hear barking.
2. There is a smell ... you smell a skunk.
3. You scan the horizon and look into the night and see shadows of the two..
etc. etc.

Sensor amalgamators might/will connect the sound of barking to the shadow of the dog. They will connect the smell to the shadow of the skunk. You will have a basic picture: dog, skunk. Now you put together a tactic. Run to the dog? Yell at the dog? Throw a rock at the dog?

With full 5th gen sensor fusion,
... the sound directs your view to look in the direction of the sound; you don't scan, you look "over there", and you see the dog, that it is a dog. That sight causes you to listen more carefully ... you now notice that the sound also says it's "your" dog. The smell is less directional, but as you make out the scene with your eyes, you connect the ambient smell to the additional figure, a skunk ... furthermore, you cringe as you put together a complete picture of what is to come: tomato baths, skunk smell all night ... you need some strategy to minimize this, to move the dog away, and to direct the skunk away from your house. That strategy takes into account the fence location, current wind direction, etc. You decide based on a "God's Eye" picture and understanding. Your "database" of memory of the layout, along with the new elements, goes into the decision. You maneuver upwind and call your dog to you, while devising and encouraging an escape route for the skunk that doesn't go past your bedroom window.

Simple sensor amalgamating doesn't do all of that. It helps you do it in your head, but it doesn't give you the "God's eye picture."

Not sure that helps, or is even close to an analogy, but it's a start.

FWIW,
BP
Last edited by blindpilot on 01 Dec 2017, 21:09, edited 2 times in total.

spazsinbad

Elite 5K

  • Posts: 23485
  • Joined: 05 May 2009, 21:31
  • Location: ɐıןɐɹʇsn∀¯\_(ツ)_/¯
  • Warnings: -2

Posted: 01 Dec 2017, 21:07

Thanks 'BP'. The F-35 may suggest a course of action(s) do this/that also? Suggest/Ready a weapon for use if required?
A4G Skyhawk: www.faaaa.asn.au/spazsinbad-a4g/ & www.youtube.com/channel/UCwqC_s6gcCVvG7NOge3qfAQ/videos?view_as=subscriber

blindpilot

Elite 1K

  • Posts: 1228
  • Joined: 01 Mar 2013, 18:21
  • Location: Colorado

Posted: 01 Dec 2017, 21:19

spazsinbad wrote:Thanks 'BP'. The F-35 may suggest a course of action(s) do this/that also? Suggest/Ready a weapon for use if required?


Well, it could in software, if we wanted an AI engine to do it. But more importantly, with the display approaches ... calculated SAM radar probable-detection bubbles, weapon range markers, flight path lines etc. already in the system ... the mission data files and "God's eye view" naturally suggest such things, because the sensor presentation and data merging were designed with those things in mind, not simple de-cluttering.

MHO
BP

blindpilot

Elite 1K

  • Posts: 1228
  • Joined: 01 Mar 2013, 18:21
  • Location: Colorado

Posted: 01 Dec 2017, 21:35

blindpilot wrote:... the display approaches, ... designed with those things in mind, not simple de-cluttering.

MHO
BP


When I visited the Pentagon as a NORAD SME to work on missile detection display strategies, the biggest and most important decision driver was to provide a context picture to help make specific decisions ... such as "Start (or not) WW III."

It didn't matter whether the missile picture/track presentation etc. was "pretty" or "cartoonish." What mattered was whether the overall picture shown clearly revealed what was actually happening. Ex: had the bad guys launched a massive attack at cities or military ... or not? All the input (IR launch detection, radar data, intel feedback, operator system confidence) needed to be focused on that purpose. The display could be an arrow, with all the data merged to be seen in that arrow, but the important thing was that the picture given to the decision makers was clear and accurate.

I didn't want a "HAL" to decide what to do. But I surely did want the NCA (national command authority) to be able to rightly decide, easily. (so clearly that "HAL" could have decided perhaps, but not for that purpose)

BP

PS - As in all these discussions... there is a difference between . "My Garmin and flip phone can give me maps and make phone calls." ... and an iPhone. Even when they do exactly the same thing.
Last edited by blindpilot on 01 Dec 2017, 22:03, edited 1 time in total.

playloud

Senior member

  • Posts: 279
  • Joined: 13 Nov 2006, 04:07

Posted: 01 Dec 2017, 21:50

blindpilot wrote:
blindpilot wrote:... the display approaches, ... designed with those things in mind, not simple de-cluttering.

MHO
BP


When I visited the Pentagon as a NORAD SME to work on missile detection display strategies, the biggest and most important decision driver was to provide a context picture to help make specific decisions ... such as "Start (or not) WW III."

It didn't matter whether the missile picture/track presentation etc. was "pretty" or "cartoonish." What mattered was whether the overall picture shown clearly revealed what was actually happening. Ex: had the bad guys launched a massive attack at cities or military ... or not? All the input (IR launch detection, radar data, intel feedback, operator system confidence) needed to be focused on that purpose. The display could be an arrow, with all the data merged to be seen in that arrow, but the important thing was that the picture given to the decision makers was clear and accurate.

I didn't want a "HAL" to decide what to do. But I surely did want the NCA (national command authority) to be able to rightly decide, easily. (so clearly that "HAL" could have decided perhaps, but not for that purpose)

BP

Now I feel the need to rewatch "Wargames" tonight.

ricnunes

Elite 2K

  • Posts: 2203
  • Joined: 02 Mar 2017, 14:29

Posted: 01 Dec 2017, 23:01

loke wrote:
ricnunes wrote:Basically, in a 4th gen fighter aircraft, namely in the Rafale and Typhoon, there's no real sensor fusion. What there is, is that when a target is detected by different sensors - for example by radar and ESM - that same target will be represented by the best/most accurate track, which in the previous example would be the radar track (and thus the ESM track would be hidden from the pilot).

According to my understanding the above is not quite right. I believe that at least for the Rafale and Typhoon the "fusion engine" is doing much more than just selecting the "best/most accurate track" -- information from the different sensors is indeed fused (i.e., combined). However, this happens at a "processed" level and not at a "raw sensor" level as in the F-35.

So for Rafale/Typhoon/Gripen E each sensor will process the data, send to the fusion engine, which will then fuse it.


From everything that I read in that thread (great thread, by the way) I believe I understood it well, especially regarding the Rafale.
In the thread (which you also referenced) you can read the following about the Rafale's sensor fusion:

Implementation of the “multi-sensor data fusion” into the RAFALE translates into accurate, reliable and strong tracks, uncluttered displays, reduced pilot workload, quicker pilot response, and eventually into increased situational awareness.

It is a full automated process carried out in three steps:

Establishing consolidated track files and refining primary information provided by the sensors,
Overcoming individual sensor limitations related to wavelength / frequency, field of regard, angular and distance resolution, etc, by sharing track information received from all the sensors,
Assessing the confidence level of consolidated tracks, suppressing redundant track symbols and decluttering the displays.


Another great example in the following picture:

Image

Note that it can be clearly seen in the picture above that multiple tracks coming from the several sensors enter the "Sensor Correlator", but only one of those tracks (the best one) is shown to the pilot.
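The correlator step in such a diagram amounts to gating raw single-sensor tracks together and emitting one consolidated track per group. A rough Python sketch (bearing-only gating with invented numbers; real correlators gate in range, angle, and kinematics):

```python
def consolidate(tracks, gate_deg=2.0):
    """Group raw tracks whose bearings fall within a correlation gate,
    then emit one consolidated track per group (the declutter step)."""
    tracks = sorted(tracks, key=lambda t: t["bearing"])
    groups, current = [], [tracks[0]]
    for t in tracks[1:]:
        if t["bearing"] - current[-1]["bearing"] <= gate_deg:
            current.append(t)  # correlates with the running group
        else:
            groups.append(current)
            current = [t]
    groups.append(current)
    return [{"bearing": sum(t["bearing"] for t in g) / len(g),
             "sensors": [t["sensor"] for t in g]} for g in groups]

raw = [
    {"sensor": "radar", "bearing": 30.4},
    {"sensor": "IRST",  "bearing": 30.1},  # same target: correlates
    {"sensor": "ESM",   "bearing": 95.0},  # separate emitter: kept
]
central = consolidate(raw)  # two central tracks instead of three symbols
```

Whether the consolidated track merely picks the best contributor or actually blends them is exactly the distinction being debated in this thread.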


loke wrote:"The sensor fusion process produces a unique track of a single target which may be reported by several sensors simultaneously, each one providing a subset of target attributes which are compiled to produce an as complete as possible view of the target," Friemer says. Algorithms weigh the reliability of each report before merging them to produce a fused target identity and priority.

Notice the last sentence: Algorithms weigh the reliability of each report before merging them to produce a fused target identity and priority. This is quite different from just "selecting the best track" in two respects; they are talking about reports (from each sensor) not tracks like you did; and they are talking about merging (i.e., fusing) the reports, not just selecting the best.

Note the above description is for Typhoon; other 4.5 gen fighters like the SH may have a simpler, and more primitive "sensor fusion".


Note that I was talking about "tracks" (only). I did not say that there aren't other kinds of information which can be generated by each sensor and then shared.
For example, a radar detects an enemy Su-27 which has its radar on and as such is also detected by ESM. My interpretation of what happens in this case (which seems more evident with the Rafale) is that the track used for this contact/enemy aircraft is the one generated by the radar (since it's more precise), and the information generated by the ESM (in this case the Su-27's radar emissions detected by ESM) is added to that radar track, but the two tracks aren't truly merged as one (the ESM track is simply hidden). The end result is a radar track with a Su-27 tag added to it, shown to the pilot as a single merged contact.

But I also understand that the Typhoon has some better level of integration/fusion between the radar and IRST, which seems to be able to create a single track with the bearing from the IRST and the range from the radar. This is probably more similar or closer to the sensor fusion seen in 5th gen fighter aircraft like the F-35, but again it is only a partial "true" sensor fusion.
A 4th/4.5th gen fighter aircraft stands about as much chance against a F-35 as a guns-only Sabre has against a Viper.

ricnunes

Elite 2K

  • Posts: 2203
  • Joined: 02 Mar 2017, 14:29

Posted: 01 Dec 2017, 23:03

@blindpilot,

Excellent dog skunk analogy there! Thanks for sharing it. :wink:
A 4th/4.5th gen fighter aircraft stands about as much chance against a F-35 as a guns-only Sabre has against a Viper.

mas

Banned

  • Posts: 344
  • Joined: 31 Aug 2017, 13:16

Posted: 02 Dec 2017, 00:41

Both the Rafale and Typhoon share all sensor information to obtain consolidated tracks. It's not just radar and IRST; in the Typhoon sensor fusion diagram the lower enemy aircraft is being tracked using a fusion of MIDS and ESM. The redundant tracks that are being dropped are the raw ones that have been improved upon with other sensor information (consolidated), whereas some raw tracks only have one sensor source and so are kept. I am not sure whose 4th gen sensor fusion LMT is referring to when they say only one source (the best) is kept, but it is not what is happening with the Rafale or Typhoon.

What the sensor fusion in the F-35 does above this is that each sensor automatically cues every other sensor on the aircraft to obtain more fused information before a single (consolidated at source) track is shown, whereas in a 4th gen aircraft it is up to the pilot what sensors he uses and when, so fusion is a manual process in terms of which sensors have been fused. So the quality of the fused track is always optimal in the F-35, as all potential sensors have gone into it automatically.
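That automatic-cueing difference can be sketched in Python (the sensor names, coverage limits, and numbers below are invented for illustration; the real cueing logic is obviously far more involved):

```python
class Sensor:
    def __init__(self, name, coverage_deg):
        self.name = name
        self.coverage_deg = coverage_deg  # (min, max) bearing it can cover

    def look(self, bearing):
        lo, hi = self.coverage_deg
        if lo <= bearing <= hi:
            return {"sensor": self.name, "bearing": bearing}
        return None  # cannot slew/see that far

def auto_cue(detection, suite):
    """5th-gen style: an initial detection by any one sensor automatically
    cues every other sensor, and all resulting reports feed one track."""
    reports = [detection]
    for sensor in suite:
        if sensor.name != detection["sensor"]:
            report = sensor.look(detection["bearing"])
            if report:
                reports.append(report)
    return reports

suite = [Sensor("radar", (-60, 60)),
         Sensor("EOTS", (-90, 90)),
         Sensor("DAS", (-180, 180))]
initial = {"sensor": "DAS", "bearing": 75.0}
reports = auto_cue(initial, suite)
# DAS detection plus an EOTS report; the radar cannot cover that bearing.
```

In the 4th gen case, the equivalent of `auto_cue` is the pilot deciding which sensor to point where, so the consolidated track only contains whatever sensors happened to be looking.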

blindpilot

Elite 1K

  • Posts: 1228
  • Joined: 01 Mar 2013, 18:21
  • Location: Colorado

Posted: 02 Dec 2017, 03:33

mas wrote:Both Rafale and Typhoon share all sensor information to obtain consolidated tracks with all this information. .. I am not sure whose 4th gen sensor fusion LMT is referring to where they say ...
What the sensor fusion in the F-35 does above this is that each sensor automatically cues every other sensor on the aircraft to obtain more fused information before a single (consolidated at source) track is shown, whereas in a 4th gen aircraft it is up to the pilot what sensors he uses and when, so fusion is a manual process in terms of which sensors have been fused. So the quality of the fused track is always optimal in the F-35, as all potential sensors have gone into it automatically.


Again we are dealing with "faster, higher, better" 4th gen ways of thinking instead of what 5th Gen brings to the situation. That makes descriptions difficult.

Basically, 4/4.5 gen sensor management is amalgamation of data (correlating tracks etc.). This is certainly "better" in that it de-clutters and gives a clearer picture of the data/tracks being reported. BUT it is still only "better" in a 4th gen sense. It is still a 4th gen track, although a whole lot better than that 2nd gen green blip in the little 4-inch scope used to be.

Fifth gen is not "sensor/track management," even though it does manage sensors and track data. The system manages sensors to build a situational overview of reality. That reality contains a lot of things including track path, speed, nature and such but it's not the track we are watching. It's the element in the context. ex: SU-27 flying over the border. It's almost as if it is the context that is being managed. It is certainly the entire picture. Normally that's done (even with 4.5 gen fusion) in the pilot's head. What this does is create new applications (see iPhone etc.) that single function actions (Garmin and flip phone), even merged cannot duplicate. You see things as a whole, that you could never see plainly with simple amalgamation. The pilot can certainly ( and with a cleaner picture more easily) do it in his head but it is still "in his head."

Back to my dog and skunk. There is a difference between ...
1. Hearing barking ... considering what it is ... listen ... try to find direction with your head placement for hearing better. Looking ... left ... then right ... look again ... deciding what you see .... taking the sound and sight and connecting them ... ah ha! that's where the barking is happening ... etc. etc.

and

2. Observing and filling in the dynamic elements with all your senses at once, smelling all, hearing all, seeing all and constructing the full picture with that data, all the while unconsciously using one sense and the next together to fill in the blanks. The pilot doesn't do anything in his head. He just sees the "God's eye picture" of all around him. He is able to focus in a more strategic way ... "I don't want to go that way, perhaps have my friend fake left, and I will go down and under there to the opening at the right .. I wonder if that destroyer could .." ... etc. That's a completely higher level of thinking for the pilot, and a totally new concept of Situational Awareness.

As I have noted before, even the long time F-22 pilots, and current batch of Marine F-35 guys say that they are just figuring out what this means. People are still writing new "Did you know what you can do with an iPhone!" applications.

Which is to say 4th Gen sensor fusion is not what the F-22/35 fusion is all about. The Typhoon (or SH, Rafale, Gripen) sensor fusion is not what the F-35 is doing. Explaining the how of the camera, radar, ESM functions is just more "faster, higher, better" speak. I don't care how many sensors you "fuse into one track." Fifth Gen is not even knowing (or caring) where the data comes from ... it's from the "ether" (often from off platform sources, but you don't even care then).

I'm not concerned with what I can see... better (I don't need NVG to see clearer or laser range finding)
..... I'm concerned that my dog is going to smell like a skunk! and I can see, hear and smell clearly with what I know, that this is exactly what is about to happen if I don't ...

MHO,
BP

optimist

Forum Veteran

  • Posts: 991
  • Joined: 20 Nov 2014, 03:34
  • Location: australia

Posted: 02 Dec 2017, 04:44

I've seen this version, which seems more like it because of PIRATE's very narrow field of view:
"You scan the horizon and look into the night through a straw and may see shadows of the one or two..
etc. etc."

Would radar modes normally tell you where to look, rather than doing a wider IRST search?
Aussie fanboy