MH-370 Back Tracking

For each ping between the aircraft and the Inmarsat satellite we have the range/elevation angle between the satellite (which is moving) and the aircraft at that time, and we have the Burst Frequency Offset (BFO) Doppler data.

Doppler Diagram

So the LOS speed between the satellite and the aircraft along the LOS vector is as follows:

LOS – Line Of Sight
EA – Elevation Angle between the aircraft and the satellite
Vs – Satellite’s velocity vector
Va – Aircraft’s velocity vector
α – Bearing angle from the satellite to the aircraft’s position
β – Angle between the aircraft’s velocity vector and the LOS vector

Doppler Formula 1

If we assume a particular position for the aircraft on the ping ring and a particular ground speed for the aircraft then given we have the LOS speed from the BFO Doppler data we can solve for β.

Doppler Formula 2

A couple of assumptions/simplifications:

- Assuming that Vx and Vy are 0 for the satellite and only using the Vz component.
- Assuming that Vx is 0 for the aircraft
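With those simplifications in place, the β solve can be sketched roughly as below. The relationship used, v_los = Vz·sin(EA) + Va·cos(β), is my reading of the geometry described above, not a transcription of the original formulas, so treat it as illustrative only:

```python
import math

def solve_beta(v_los_kt, sat_vz_kt, elevation_deg, aircraft_speed_kt):
    """Solve for beta, the angle between the aircraft's velocity and the LOS.

    Assumed relationship (my simplification of the geometry):
        v_los = sat_vz * sin(elevation) + v_aircraft * cos(beta)
    Returns beta in degrees, or None if the assumed ground speed cannot
    produce the required LOS speed.
    """
    sat_component = sat_vz_kt * math.sin(math.radians(elevation_deg))
    cos_beta = (v_los_kt - sat_component) / aircraft_speed_kt
    if abs(cos_beta) > 1:
        return None  # no geometry matches this ground speed
    return math.degrees(math.acos(cos_beta))

# Last ping (00:11UTC): POL LOS speed 250.43kt, satellite Vz -159.58kt, EA 39.33 deg
print(solve_beta(250.43, -159.58, 39.33, 480))
```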

At 00:11UTC, the time of the last ping, the satellite’s velocity components were:

Vz = -82.1 m/s, Vx = 1.5 m/s, Vy = -1.5 m/s

Using the following data from Duncan Steel – duncansteel.com.

Time (UTC)   Elevation Angle   Ping Radius (nm)   LOS Speed DS (kt)   LOS Speed POL (kt)   Satellite Z (nm)   Satellite Vz (kt)
18:29        53.53             1880               39.77               76.24                623.87             49.12
19:40        55.80             1760               39.14               4.97                 651.30             -3.22
20:40        54.98             1806               65.80               57.02                625.88             -47.42
21:40        52.01             1965               79.85               118.34               557.62             -88.38
22:40        47.54             2206               100.64              175.49               451.20             -123.31
00:11        39.33             2652               125.35              250.43               233.91             -159.58

- DS – Duncan Steel & Mike Exner BFO analysis
- POL – Polynomial fit of BFO data – BFO = 0.3673 * ABS(LOS vel in km/hr) + 94.975
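The polynomial fit in the last bullet can be inverted to turn a BFO value back into an LOS speed. Note the fit gives speed in km/hr, so a conversion to knots is needed (1kt = 1.852 km/hr). The function name below is mine, chosen for illustration:

```python
def bfo_to_los_speed_kt(bfo_hz):
    """Invert BFO = 0.3673 * |LOS speed in km/hr| + 94.975 and convert to knots."""
    los_kmh = (bfo_hz - 94.975) / 0.3673
    return los_kmh / 1.852

# A BFO of 94.975 Hz corresponds to zero LOS speed by construction
print(bfo_to_los_speed_kt(94.975))
```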

I’ve written a Python script that uses the formula and data above to iterate over a number of sample aircraft positions along the 00:11UTC ping ring, stepping α from 1 degree to 89 degrees. An assumed aircraft ground speed is entered as a parameter, and for each sample position along the ping arc the script calculates β and derives the 2 possible ground tracks at that position that match the known LOS speed from the BFO Doppler data.

Given the calculated ground tracks, the supplied aircraft ground speed and the time between this ping and the previous one, the script then computes the aircraft’s position at the time of the previous ping, assuming a constant ground speed and a constant ground track.

If the computed aircraft position at the time of the previous ping is within 2% of the previous ping ring, then the aircraft’s position on the current ping ring and its calculated position on the previous ping ring are plotted with a line joining them.

The script then continues recursively: for each point that fell within 2% of the previous ping ring it computes β from that ring’s LOS speed and works backwards to the ring before it.
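A simplified flat-plane sketch of that back-tracking step is below. The real script works on a sphere, and only the 2% tolerance and the constant speed/track assumptions come from the description above; the coordinate frame and function names here are illustrative:

```python
import math

def backtrack_position(x_nm, y_nm, ground_track_deg, ground_speed_kt, dt_hr):
    """Dead-reckon backwards along a constant track at constant ground speed."""
    dist = ground_speed_kt * dt_hr
    track = math.radians(ground_track_deg)
    # Fly the track in reverse to get the earlier position
    return (x_nm - dist * math.sin(track), y_nm - dist * math.cos(track))

def on_ring(x_nm, y_nm, ring_radius_nm, tolerance=0.02):
    """Accept the candidate if it lies within 2% of the previous ping ring."""
    r = math.hypot(x_nm, y_nm)
    return abs(r - ring_radius_nm) / ring_radius_nm <= tolerance

# Candidate on the 00:11 ring (radius 2652nm), 91 minutes back to the 22:40 ring (2206nm)
prev = backtrack_position(0.0, 2652.0, 45.0, 450.0, 91 / 60)
print(prev, on_ring(*prev, 2206.0))
```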

For a given aircraft ground speed the resulting plot shows the range of possible aircraft positions and ground tracks on the 00:11UTC ping ring that match the BFO LOS speed and that track back to intersect the previous ping rings at the correct times.

The idea was to get a feel for the range of possible locations on the 00:11UTC ping ring for a given aircraft ground speed that also matches the BFO LOS speed data at that time and to see how many of those tracks track back to the 19:40UTC ping ring.

The plots also include the last known location for the aircraft at 18:22UTC from the Malaysian military radar, indicated as a yellow diamond. I’ve then assumed a maximum ground speed of 500kts from this time and location for 78 minutes until the 19:40UTC ping and rendered the maximum range that could be covered (650nm) in this time as the yellow circle.
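The 650nm figure above is simply speed multiplied by time; a quick check (function name mine):

```python
def max_range_nm(speed_kt, minutes):
    """Distance covered at a constant ground speed over the given time."""
    return speed_kt * minutes / 60

# 500kt for the 78 minutes between 18:22UTC and 19:40UTC
print(max_range_nm(500, 78))  # -> 650.0
```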

The intersection of this circle with the 19:40UTC ping ring places a maximum northern and southern limit to where the aircraft could be at 19:40UTC along this arc.

When I initially started thinking about writing this program I was only aware of the BFO LOS speeds that Duncan Steel and Mike Exner had calculated. However none of the routes Duncan tried in STK could match the LOS speeds calculated from the BFO data: the routes matched the relevant ping ring times, but there was no match for the BFO Doppler data.

Subsequently I became aware, via a comment on Duncan Steel’s blog posting, of an alternative set of BFO-derived LOS speeds. I’ve listed both in the table of data above, and the Python script takes the LOS speed for the ping ring as an input parameter.

Running the script, however, produced no matches for any airspeed tested between 350kts and 500kts when using Duncan Steel and Mike Exner’s BFO-derived LOS speeds, so in all the plots below I’ve only used the LOS speeds from the polynomial fit approach.

Source code for the Python program – MH370Backtrack.py

Plots for 350kt, 400kt, 450kt, 480kt and 500kt.

MH370 Backtrack - 350kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 400kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 450kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 480kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 500kt - Last Doppler LOS Speed - 250kt


Live Photo Gallery People Tags

I’ve used the tagging feature in the original Windows Vista Photo Gallery and in Windows Live Photo Gallery to tag the people in photos and the photo’s location.

The latest version of Windows Live Photo Gallery now includes a specific ‘People’ category separate from the generic tags category for identifying people in photos. In addition it includes a feature to automatically identify faces in photos and to associate the ‘People’ tags with specific faces in the photo.

The sample below, taken from the Windows Live Photo blog, shows the new face-tagging features. You can also manually draw a rectangle over a face and tag it if the Photo Gallery doesn’t automatically detect the face.

One issue I’ve noticed is that the new people tags aren’t indexed by Windows Search. I often use Windows Search to search for photos based on my people and location tags and the new people tags aren’t found. So at the moment you have to search or filter photos based on the new people tags in Live Photo Gallery itself.

I took a look at the XMP meta-data that is stored by the new people tagging feature in the associated photos.

In the snippet below you can see how the rectangular region for each face is stored (if there is one), along with the PersonDisplayName. There are APIs in WIC, plus associated .Net wrappers in the .Net Framework, that allow you to read this meta-data, so an application of your own that displays photos could make use of the rectangular regions to highlight the tagged faces.

<rdf:Description xmlns:prefix0="http://ns.microsoft.com/photo/1.2/">
  <prefix0:RegionInfo>
    <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <prefix1:Regions xmlns:prefix1="http://ns.microsoft.com/photo/1.2/t/RegionInfo#">
        <rdf:Bag xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
          <rdf:li>
            <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
              <prefix2:Rectangle xmlns:prefix2="http://ns.microsoft.com/photo/1.2/t/Region#">0.209985, 0.526367, 0.167401, 0.111328</prefix2:Rectangle>
              <prefix3:PersonDisplayName xmlns:prefix3="http://ns.microsoft.com/photo/1.2/t/Region#">Sarah McLeod</prefix3:PersonDisplayName>
            </rdf:Description>
          </rdf:li>
          <rdf:li>
            <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
              <prefix4:Rectangle xmlns:prefix4="http://ns.microsoft.com/photo/1.2/t/Region#">0.430250, 0.148438, 0.284875, 0.189453</prefix4:Rectangle>
              <prefix5:PersonDisplayName xmlns:prefix5="http://ns.microsoft.com/photo/1.2/t/Region#">Gwen De Roubaix</prefix5:PersonDisplayName>
            </rdf:Description>
          </rdf:li>
          <rdf:li>
            <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
              <prefix6:PersonDisplayName xmlns:prefix6="http://ns.microsoft.com/photo/1.2/t/Region#">Marcelle De Roubaix</prefix6:PersonDisplayName>
            </rdf:Description>
          </rdf:li>
        </rdf:Bag>
      </prefix1:Regions>
    </rdf:Description>
  </prefix0:RegionInfo>
</rdf:Description>
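Since the schema is plain XMP, the region data can also be read without WIC. The following is a sketch using Python’s standard library; the namespace prefixes and the embedded sample are adapted from the snippet above (with the rdf namespace declared so it parses standalone), and `extract_people` is my own helper name:

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
REGION = "http://ns.microsoft.com/photo/1.2/t/Region#"

xmp = """<rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
        xmlns:mp="http://ns.microsoft.com/photo/1.2/"
        xmlns:mpri="http://ns.microsoft.com/photo/1.2/t/RegionInfo#"
        xmlns:mpreg="http://ns.microsoft.com/photo/1.2/t/Region#">
  <mp:RegionInfo>
    <rdf:Description>
      <mpri:Regions>
        <rdf:Bag>
          <rdf:li>
            <rdf:Description>
              <mpreg:Rectangle>0.209985, 0.526367, 0.167401, 0.111328</mpreg:Rectangle>
              <mpreg:PersonDisplayName>Sarah McLeod</mpreg:PersonDisplayName>
            </rdf:Description>
          </rdf:li>
        </rdf:Bag>
      </mpri:Regions>
    </rdf:Description>
  </mp:RegionInfo>
</rdf:Description>"""

def extract_people(xmp_text):
    """Return (name, rectangle-or-None) for each tagged person in the XMP."""
    root = ET.fromstring(xmp_text)
    people = []
    for li in root.iter(f"{{{RDF}}}li"):
        name = li.find(f".//{{{REGION}}}PersonDisplayName")
        rect = li.find(f".//{{{REGION}}}Rectangle")
        # Rectangle is "left, top, width, height" as fractions of the image size
        coords = tuple(float(c) for c in rect.text.split(",")) if rect is not None else None
        people.append((name.text, coords))
    return people

print(extract_people(xmp))
```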

The following snippet shows the XMP meta-data that is stored if you tag one of your Messenger contacts as a People tag.

<rdf:Description rdf:about="uuid:faf5bdd5-ba3d-11da-ad31-d33d75182f1b" xmlns:prefix0="http://ns.microsoft.com/photo/1.2/">
  <prefix0:RegionInfo>
    <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <prefix1:Regions xmlns:prefix1="http://ns.microsoft.com/photo/1.2/t/RegionInfo#">
        <rdf:Bag xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
          <rdf:li>
            <rdf:Description xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
              <prefix2:PersonDisplayName xmlns:prefix2="http://ns.microsoft.com/photo/1.2/t/Region#">gerhard</prefix2:PersonDisplayName>
              <prefix3:PersonEmailDigest xmlns:prefix3="http://ns.microsoft.com/photo/1.2/t/Region#">89C386678731AB3D7DEE0E14E11E633387FBDBCD</prefix3:PersonEmailDigest>
              <prefix4:PersonLiveIdCID xmlns:prefix4="http://ns.microsoft.com/photo/1.2/t/Region#">8765613456339678115</prefix4:PersonLiveIdCID>
            </rdf:Description>
          </rdf:li>
        </rdf:Bag>
      </prefix1:Regions>
    </rdf:Description>
  </prefix0:RegionInfo>
</rdf:Description>

And lastly, a snippet showing how the regular tags, which are indexed by Windows Search, are stored: basically in a <dc:subject> element and in a <MicrosoftPhoto:LastKeywordXMP> element.

<rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:subject>
    <rdf:Bag xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <rdf:li>Party</rdf:li>
      <rdf:li>People/Sarah McLeod</rdf:li>
    </rdf:Bag>
  </dc:subject>
</rdf:Description>
<rdf:Description xmlns:MicrosoftPhoto="http://ns.microsoft.com/photo/1.0">
  <MicrosoftPhoto:LastKeywordXMP>
    <rdf:Bag xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <rdf:li>Party</rdf:li>
      <rdf:li>People/Sarah McLeod</rdf:li>
    </rdf:Bag>
  </MicrosoftPhoto:LastKeywordXMP>
</rdf:Description>


Bitmap snapshots of WPF Visuals

Recently I needed to create a bitmap of some WPF controls to be used in another program. Doing a quick search turned up references to the RenderTargetBitmap class in WPF with sample code along the lines of:

RenderTargetBitmap bmp = new RenderTargetBitmap((int)element.Width, (int)element.Height, 96, 96, PixelFormats.Pbgra32);
bmp.Render(element);

However, if the WPF control had a margin, the rendered bitmap had transparent pixels in the margin area. As an example, here is a button inside a StackPanel with a margin applied.

 

And the following is the bitmap that is created via the sample code above:

Doing some more searching turned up the following code, which creates a VisualBrush from the target Visual, renders that into a DrawingVisual, and finally uses RenderTargetBitmap to take a snapshot of the DrawingVisual. With this approach the margins are ignored and the bitmap consists only of the target WPF control/visual, as shown below:

// Requires System.IO, System.Windows, System.Windows.Media and System.Windows.Media.Imaging
void CreateBitmapFromVisual(Visual target, string filename)
{
    if (target == null)
        return;

    Rect bounds = VisualTreeHelper.GetDescendantBounds(target);

    RenderTargetBitmap rtb = new RenderTargetBitmap((Int32)bounds.Width, (Int32)bounds.Height, 96, 96, PixelFormats.Pbgra32);

    DrawingVisual dv = new DrawingVisual();

    using (DrawingContext dc = dv.RenderOpen())
    {
        VisualBrush vb = new VisualBrush(target);
        dc.DrawRectangle(vb, null, new Rect(new Point(), bounds.Size));
    }

    rtb.Render(dv);

    PngBitmapEncoder png = new PngBitmapEncoder();
    png.Frames.Add(BitmapFrame.Create(rtb));

    using (Stream stm = File.Create(filename))
    {
        png.Save(stm);
    }
}


Geeks and Fast Jets

I was approached recently to help develop some Flight Test Instrumentation (FTI) for a Hawker Hunter jet to be used during a test pilot course involving high angle of attack manoeuvres and spinning.

The following data needed to be recorded:

  • Angle of Attack (AoA)
  • Sideslip angle
  • Attitude (pitch, roll and heading)
  • Indicated airspeed and altitude
  • Stick and rudder position
  • GPS
  • Video

In addition to recording all the data listed above they also wanted to display a subset of the data in the cockpit as an aid for spin recovery.

Sensors

AoA and sideslip – the sensors are basically small weather vanes that act as potentiometers. They were ordered and physically mounted on a pitot/static boom, which was then mounted on the front of the Hawker Hunter’s nose. One minor issue was finding a route for the wires from the vanes to the PC in the avionics bay that didn’t have to pass through the cockpit’s pressure capsule.

Attitude – we ordered an Attitude and Heading Reference System (AHRS) from Crossbow that uses solid state sensors and outputs at 25Hz. Ideally the AHRS needs to be mounted as close as possible to the centre of gravity. Luckily the avionics bay is fairly close to the centre of gravity, just slightly forward, and there was space to mount it quite easily, directly on the centre line of the aircraft.

GPS – the Crossbow AHRS also includes a built-in GPS which it uses to aid its attitude solution, and which also outputs standard NMEA data via a second serial port. We mounted the GPS antenna on the spine of the aircraft behind the cockpit, giving a fairly short vertical run for the antenna cable down to the avionics bay.

Air data – we used an existing unit from MGL Avionics that we had been using on the Ikarus C-42. It wasn’t ideal, since its maximum indicated airspeed is 217kt, which meant it was maxed out a lot of the time. However the stall speed was generally below 217kt, so it did provide useful data during the stall and spin.

Control positions – we ordered 3 linear transducers, which basically have a shaft that moves in and out and act as potentiometers. They were mounted and attached to the 3 control rods that pass through the spine of the aircraft behind the cockpit. It was again a fairly short straight drop down into the avionics bay for the wires.

PC

We needed, at fairly short notice, a PC that was ruggedized enough to handle vibration and fairly cold operating temperatures, since it was going to be mounted in an unpressurized and unheated avionics bay and the spins would normally be starting at 35,000ft.

The only real change we made to the embedded PC we found was to replace the hard drive with a CF card and run everything off the CF card. Besides the high dynamics during the spin and the regular vibration causing problems for a normal hard drive, most hard drives aren’t rated to operate above 10,000ft pressure altitude anyway.

We used XP Embedded and its Enhanced Write Filter (EWF) to turn the system partition into a read-only volume so it couldn’t be corrupted during power down, since the power was simply going to be switched off with no proper shutdown. The data from the sensors was logged to a separate partition on the CF card.

The embedded PC also came with 4 onboard RS232 ports which we needed in order to connect to all the RS232 based sensors (AHRS, GPS, air data). To read the analog data from the AoA, sideslip and control position sensors we installed an A/D card into the single PCI slot.
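Reading the vane and control position potentiometers through the A/D card comes down to a linear mapping from raw counts to a calibrated angle or deflection. The sketch below shows the idea; the count range and angle range are made-up illustrative values, not the real calibration:

```python
def counts_to_angle(counts, counts_min=0, counts_max=4095,
                    angle_min=-30.0, angle_max=30.0):
    """Linearly map a raw A/D reading from a vane potentiometer to degrees.

    The count and angle ranges here are hypothetical; the real values come
    from calibrating each sensor after installation.
    """
    frac = (counts - counts_min) / (counts_max - counts_min)
    return angle_min + frac * (angle_max - angle_min)

print(counts_to_angle(2048))  # mid-scale reading, roughly 0 degrees
```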

Display

For the in-cockpit display of the ‘spin panel’, which mainly consisted of the AoA and sideslip angles plus the control positions, we initially developed a Pocket PC application connected to the embedded PC via an RS232 connection. However, after some concerns about its sunlight readability and about physically mounting it in the cockpit, the Pocket PC was only used on 2 initial test flights.

We then bought a 6" LCD rated at 1200 nits for sunlight readability and removed the gunsight to mount the LCD in its place on the student’s side of the cockpit. The ‘spin panel’ software was then modified to run on the embedded PC we were using for data capture, which was connected to the LCD via a VGA connection.

The one complication with the ‘spin panel’ display being mounted in the cockpit was the need to pass wires through the pressure bulkhead from the avionics bay into the cockpit. Special bulkhead connectors had to be used to pass the wires through without causing a pressure leak.

Video

An SD based video camera recorder with a remote lens was mounted in the cockpit and used to capture video footage of the flight.

Playback

The last part of the project involved the development of a playback system to take the captured data from the sensors and from the video camera and allow the data to be played back and analysed by the pilots and flight test engineers.

The attitude data and air data are displayed as part of a standard Primary Flight Display (PFD). The spin panel display that is used in the cockpit is also rendered plus playback of the video. Lastly we have line graphs displaying the current, past and future data for each of the recorded parameters.

All of the data including the video is synchronised during playback and can be paused, seeked and played back at different speeds, i.e. various slow motion and fast motion playback speeds. The playback application is a Windows Presentation Foundation (WPF) application and makes heavy use of the animation framework within WPF.

And finally getting ready for the last of the 34 sorties on a glorious Cape Town winter’s day.


MapReduce implementation using C# generics

I came across the Google paper on MapReduce again the other day and decided to try a simple implementation using C# generics allowing you to specify specific types for the keys and values as opposed to being forced to use strings for all keys and values.

The initial version doesn’t include any automatic parallelism across multiple CPUs or clusters of machines.

The core implementation below is only about 50-60 lines of code. I’ve also included sample map and reduce functions making use of this library and mirroring some of the sample applications mentioned in the Google paper.

using System;
using System.Collections.Generic;

// Common shortcut where both keys and both values are strings
using MapReduceAllStrings = MapReduce.MapReduce<string, string, string, string>;

namespace MapReduce
{
    // Map delegate
    public delegate IEnumerable<KeyValuePair<K2, V2>> Map<K, V, K2, V2>(KeyValuePair<K, V> input);

    // Reduce delegate
    public delegate IEnumerable<V2> Reduce<K2, V2>(KeyValuePair<K2, IEnumerable<V2>> input);

    public class MapReduce<K, V, K2, V2>
    {
        // Process input data using user supplied map and reduce delegates
        public IEnumerable<KeyValuePair<K2, IEnumerable<V2>>> Process(IEnumerable<KeyValuePair<K, V>> input, Map<K, V, K2, V2> map, Reduce<K2, V2> reduce)
        {
            // Use dictionary to store intermediate data - (k2, list(v2))
            Dictionary<K2, IEnumerable<V2>> intermediateData = new Dictionary<K2, IEnumerable<V2>>();

            // Perform map over all input
            foreach (KeyValuePair<K, V> inputItem in input)
            {
                // Add map results to intermediate dictionary
                foreach (KeyValuePair<K2, V2> mapOutput in map(inputItem))
                {
                    IEnumerable<V2> enumerableList;

                    // If k2 already exists in dictionary then just add this v2 to its list(v2)
                    if (intermediateData.TryGetValue(mapOutput.Key, out enumerableList))
                    {
                        List<V2> v2List = (List<V2>)enumerableList;
                        v2List.Add(mapOutput.Value);
                    }
                    else
                    {
                        // Add new k2 to dictionary and create initial list(v2) with this v2 value
                        List<V2> v2List = new List<V2>();
                        v2List.Add(mapOutput.Value);
                        intermediateData.Add(mapOutput.Key, v2List);
                    }
                }
            }

            // Setup final output data structure
            List<KeyValuePair<K2, IEnumerable<V2>>> finalOutput = new List<KeyValuePair<K2, IEnumerable<V2>>>();

            // Perform reduce over all intermediate data
            foreach (KeyValuePair<K2, IEnumerable<V2>> intermediateVal in intermediateData)
            {
                // Setup final output value, i.e. k2 and an empty list(v2) in preparation for reduce operation
                KeyValuePair<K2, IEnumerable<V2>> outputVal = new KeyValuePair<K2, IEnumerable<V2>>(intermediateVal.Key, new List<V2>());
                finalOutput.Add(outputVal);

                // Run reduce over this key's intermediate values
                foreach (V2 val in reduce(intermediateVal))
                {
                    // Add resultant values from reduce to final output list(v2)
                    ((List<V2>)(outputVal.Value)).Add(val);
                }
            }

            return finalOutput;
        }

        public static IEnumerable<V2> IdentityReduce(KeyValuePair<K2, IEnumerable<V2>> input)
        {
            List<V2> output = new List<V2>();
            foreach (V2 val in input.Value)
            {
                output.Add(val);
            }
            return output;
        }
    }

    class MapReduceTest
    {
        static void Main(string[] args)
        {
            WordCountTest();
            ReverseWebLinkGraph();
        }

        /////////////////// - WordCount test

        static void WordCountTest()
        {
            List<KeyValuePair<string, string>> input = new List<KeyValuePair<string, string>>
            {
                new KeyValuePair<string, string>("a", "the quick brown fox jumps over the log"),
                new KeyValuePair<string, string>("b", "while the monkey jumps on the fox"),
                new KeyValuePair<string, string>("c", "and the cow jumps over the moon")
            };

            MapReduce<string, string, string, int> WordCount = new MapReduce<string, string, string, int>();

            foreach (KeyValuePair<string, IEnumerable<int>> val in WordCount.Process(input, WordCountMap, WordCountReduce))
            {
                PrintResult<string, int>(val);
            }
        }

        static IEnumerable<KeyValuePair<string, int>> WordCountMap(KeyValuePair<string, string> input)
        {
            List<KeyValuePair<string, int>> output = new List<KeyValuePair<string, int>>();
            string[] words = input.Value.Split(' ');
            foreach (string word in words)
            {
                output.Add(new KeyValuePair<string, int>(word, 1));
            }
            return output;
        }

        static IEnumerable<int> WordCountReduce(KeyValuePair<string, IEnumerable<int>> input)
        {
            List<int> output = new List<int>();
            int total = 0;
            foreach (int wc in input.Value)
            {
                total += wc;
            }
            output.Add(total);
            return output;
        }

        /////////////////// - Reverse web link graph test

        static void ReverseWebLinkGraph()
        {
            List<KeyValuePair<string, string>> input = new List<KeyValuePair<string, string>>
            {
                new KeyValuePair<string, string>("abc.com", "cnn.com ibm.com dti.com"),
                new KeyValuePair<string, string>("cnn.com", "ms.com slashdot.org abc.com iol.com dti.com"),
                new KeyValuePair<string, string>("dti.com", "ibm.com avd.com abc.com")
            };

            MapReduceAllStrings ReverseWebLinks = new MapReduceAllStrings();

            foreach (KeyValuePair<string, IEnumerable<string>> val in ReverseWebLinks.Process(input, ReverseWebLinksMap, MapReduceAllStrings.IdentityReduce))
            {
                PrintResult<string, string>(val);
            }
        }

        static IEnumerable<KeyValuePair<string, string>> ReverseWebLinksMap(KeyValuePair<string, string> input)
        {
            List<KeyValuePair<string, string>> output = new List<KeyValuePair<string, string>>();
            string[] targets = input.Value.Split(' ');
            foreach (string target in targets)
            {
                output.Add(new KeyValuePair<string, string>(target, input.Key));
            }
            return output;
        }

        /////////////////// - Helper

        static void PrintResult<K, V>(KeyValuePair<K, IEnumerable<V>> result)
        {
            Console.Write(result.Key.ToString());
            Console.Write(" - ");
            foreach (V val in result.Value)
            {
                Console.Write(val.ToString());
                Console.Write(" ");
            }
            Console.WriteLine("");
        }
    }
}


High Dynamic Range (HDR) Photo Test

I started playing around with generating HDR photos. Here is one of my first attempts using our Pentax DSLR.

I’m using a trial version of Photomatix to perform the HDR processing and tone mapping which is why you’ll notice their watermark in the HDR image below.

I set the auto-bracketing option on the camera to shoot a normal exposure, a -2 stop exposure and a +2 stop exposure.

This is the resultant HDR image.

Next up is the camera’s normal exposure, so effectively what you would end up with if you weren’t doing any HDR processing. Camera selected f/9.5 and 1/60 sec.

Now the -2 stop image, so f/9.5 1/250 sec.

And lastly the +2 stop image, so f/9.5 1/15 sec.
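The bracketing arithmetic checks out: each stop halves or doubles the exposure time, so ±2 stops is a factor of four either way from the 1/60 sec base exposure. A quick sketch (function name mine):

```python
def shutter_for_stops(base_time, stops):
    """Each stop doubles (+) or halves (-) the exposure time."""
    return base_time * (2 ** stops)

# -2 stops from 1/60 sec is 1/240 sec, which the camera rounds to 1/250
print(1 / shutter_for_stops(1 / 60, -2))
# +2 stops from 1/60 sec is the 1/15 sec exposure
print(1 / shutter_for_stops(1 / 60, 2))
```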


Windows Desktop Search Powershell Cmdlet

 
Microsoft are introducing a new command line shell named PowerShell, previously known as Monad. The main difference between PowerShell and other shells like cmd, bash, ksh etc. is the pipelining and processing of objects as opposed to plain text.

I had previously used the Windows Desktop Search API to write a music browser, and thought it would be useful to create a Windows Desktop Search PowerShell cmdlet to allow users to query their search index from the command line. The results of the query can then be pipelined to further cmdlets to act on the result set, or simply to display it.

A more detailed write-up, with the source code available for download, is at CodeProject.