MH-370 Forward Tracking

The last location reported for MH-370 by Malaysian military radar is N6.6381 E96.408 at 18:22UTC. If we assume some maximum ground speed for the next 78 minutes until the 19:40UTC satellite ping, then the maximum range the aircraft could cover from the point of last radar contact defines the northernmost and southernmost points on the 19:40UTC ping ring that the aircraft could have reached, effectively turning the 19:40UTC ping ring into an arc.

The following plot shows the possible positions for the aircraft at 19:40UTC as blue dots on the ping ring, based on a maximum ground speed of 520kts from the time of the last radar contact. The blue dots are spaced 1 degree apart in bearing from the satellite sub-point at 19:40UTC.

1940 Ping Arc

The 4 subsequent satellite pings at 20:40UTC, 21:40UTC, 22:40UTC and 00:11UTC show the aircraft moving further away from the satellite at each ping.

What I’ve done is to write a simple Python program that takes an assumed constant ground speed as input and then, for each starting point indicated by the blue dots on the 19:40UTC ping arc, tries to fit a projected ground track at that speed to all the subsequent ping rings.

It does that by iterating, for each starting point, over all possible ground tracks from 0-359 degrees in 1 degree increments and working out the resulting ground position at the time of each subsequent satellite ping. If there is a match to within a user-defined error range, currently set at 4%, then it plots that ground track.
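The fitting loop above can be sketched roughly as follows. This is a simplified illustration, not the actual program: the helper names, the satellite sub-point coordinates and the ring timings are all placeholders you would substitute with the real values.

```python
from math import radians, degrees, sin, cos, asin, atan2, sqrt

EARTH_R_NM = 3440.1  # mean Earth radius in nautical miles

def project(lat, lon, track_deg, dist_nm):
    # Great-circle dead reckoning: position after flying dist_nm along track_deg
    d = dist_nm / EARTH_R_NM
    lat1, trk = radians(lat), radians(track_deg)
    lat2 = asin(sin(lat1) * cos(d) + cos(lat1) * sin(d) * cos(trk))
    lon2 = radians(lon) + atan2(sin(trk) * sin(d) * cos(lat1),
                                cos(d) - sin(lat1) * sin(lat2))
    return degrees(lat2), degrees(lon2)

def gc_dist_nm(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in nautical miles
    a = (sin(radians(lat2 - lat1) / 2) ** 2 +
         cos(radians(lat1)) * cos(radians(lat2)) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_R_NM * asin(sqrt(a))

def fit_tracks(start, speed_kt, sub_point, rings, tol=0.04):
    # rings: list of (hours since start, ping ring radius in nm).
    # A track matches if every projected position lands within tol of its ring.
    matches = []
    for track in range(360):
        ok = True
        for hours, radius_nm in rings:
            lat, lon = project(start[0], start[1], track, speed_kt * hours)
            if abs(gc_dist_nm(lat, lon, *sub_point) - radius_nm) / radius_nm > tol:
                ok = False
                break
        if ok:
            matches.append(track)
    return matches
```

Each starting point on the 19:40UTC arc would be run through `fit_tracks` with the subsequent ping ring radii and times, and any surviving tracks plotted.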

I’ve plotted the resulting possible ground tracks for 3 different ground speeds from 19:40UTC onwards: 350kt, 400kt and 450kt.

You can see why the initial ‘red arcs’ that Inmarsat released based on the 00:11UTC ping had a gap between them.

MH370 Forwardtrack - 350kt - Doppler Filter - False

MH370 Forwardtrack - 400kt - Doppler Filter - False

MH370 Forwardtrack - 450kt - Doppler Filter - False

In addition to being able to determine a range to the satellite for each ping, Inmarsat also recorded the frequency change due to the Doppler effect at the time of each ping. The Doppler shift data in effect allows you to work out the LOS (Line Of Sight) speed between the aircraft and the satellite, given each one’s position and velocity.

The authorities released a graph showing the BFO (Burst Frequency Offset) recorded for each satellite ping. Mike Exner, with the help of Duncan Steel, has converted these into LOS speeds for each ping.

So my idea was to then filter out the possible ground tracks shown in the plots above by only including those that also matched the Doppler data for each ping.

Given the satellite’s 3D position (SPx, SPy, SPz) and velocity (SVx, SVy, SVz) and the aircraft’s position (APx, APy, APz) and velocity (AVx, AVy, AVz) then the LOS speed between them can be calculated as follows.

LOS Formula
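In code the formula amounts to projecting the relative velocity onto the unit line-of-sight vector, something like this sketch (a generic dot-product implementation, not necessarily the exact form used in my program):

```python
def los_speed(sat_pos, sat_vel, ac_pos, ac_vel):
    # Component of the relative velocity along the satellite->aircraft
    # line of sight; positive means the range is increasing.
    los = [a - s for a, s in zip(ac_pos, sat_pos)]
    mag = sum(c * c for c in los) ** 0.5
    rel_vel = [a - s for a, s in zip(ac_vel, sat_vel)]
    return sum(v * c for v, c in zip(rel_vel, los)) / mag
```

The positions and velocities just need to be in a common Earth-centred 3D frame, and the result comes out in whatever speed units the velocities use.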

The following plots show the result of including the Doppler data to filter out ground tracks that don’t match both the satellite ping rings and the LOS speed derived from the Doppler data.

Unfortunately, in order for any tracks to match based on the Doppler data, I had to increase the error margin for the Doppler data to 50-95%, compared to the 4% used for the ping ring positions.

The Doppler error margin used for each of the plots is included in the plot title: 50% for 350kt, 70% for 400kt and 95% for 450kt.

MH370 Forwardtrack - 350kt - Doppler Filter - 50

MH370 Forwardtrack - 400kt - Doppler Filter - 70

MH370 Forwardtrack - 450kt - Doppler Filter - 95

All three of the plots above show only southern routes matching both the ping rings and the Doppler data. However, if I drop the ground speed to 300kt and use a Doppler error margin of 70%, then some of the northern routes also match.

MH370 Forwardtrack - 300kt - Doppler Filter - 70

Given the need to use such large error margins for the Doppler data filtering, I see 3 main possibilities.

– I’ve made a mistake in some of the calculations in my program. Feel free to look at the source code and point any errors out.

– I’ve made a typing error when entering some of the data into the program, e.g. the satellite co-ordinates, velocities etc. Again take a look at the source code and let me know if you spot any such errors.

– The data I’m relying on isn’t accurate.

In terms of the data I’m relying on, most of it has been derived by others rather than coming directly from the authorities. For the ping ring ranges, the authorities have only released the elevation angle of 40 degrees for the last ping at 00:11UTC. All the other ping ring ranges have been calculated by someone digitizing a plot the authorities released showing a possible 450kt and 400kt route the aircraft could’ve taken, the assumption being that the two possible ground tracks matched the ping ring range data that Inmarsat had.

The authorities haven’t released the LOS speeds based on the Doppler data for each of the pings. Instead they’ve only released a graph showing the Burst Frequency Offsets (BFO) recorded, and others, in particular Mike Exner, have then tried to work backwards to calculate the LOS speeds. However, there is a lot of debate regarding the accuracy of this process, mainly based on the assumptions used.

I’ve sourced all my data from Duncan Steel’s blog at

In particular the satellite position and velocity data can be found at –

Time      Ping Ring Range  LOS Speed
19:40UTC  1762nm           -53.74kt
20:40UTC  1805nm           -70.87kt
21:40UTC  1962nm           -84.20kt
22:40UTC  2199nm           -97.14kt
00:11UTC  2642nm           -111.18kt

The source code for my program, only about 100 lines of Python, can be found at – Python source code

Posted in Uncategorized | 9 Comments

MH-370 Back Tracking

For each ping between the aircraft and the Inmarsat satellite we have the range/elevation angle between the satellite (which is moving) and the aircraft at that time, and we have the Burst Frequency Offset (BFO) Doppler data.

Doppler Diagram

So the LOS speed between the satellite and the aircraft along the LOS vector is as follows:

LOS – Line Of Sight
EA – Elevation Angle between the aircraft and the satellite
Vs – Satellite’s velocity vector
Va – Aircraft’s velocity vector
α – Bearing angle from the satellite to the aircraft’s position
β – Angle between the aircraft’s velocity vector and the LOS vector

Doppler Formula 1

If we assume a particular position for the aircraft on the ping ring and a particular ground speed for the aircraft then given we have the LOS speed from the BFO Doppler data we can solve for β.

Doppler Formula 2
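Solving the relationship above for β can be sketched as below. This assumes the simplified decomposition where the LOS speed is the satellite’s contribution plus Va·cos(β); the function name and argument layout are illustrative, and note that both +β and -β about the LOS vector satisfy the equation, which is why two candidate ground tracks fall out of each sample position.

```python
from math import acos, degrees

def solve_beta(los_speed_kt, sat_los_component_kt, aircraft_speed_kt):
    # Solve LOS = sat_component + Va*cos(beta) for beta.
    # Returns the pair of angles (degrees) or None if no ground track at
    # this speed can produce the required LOS speed.
    c = (los_speed_kt - sat_los_component_kt) / aircraft_speed_kt
    if abs(c) > 1:
        return None
    beta = degrees(acos(c))
    return beta, -beta
```

For example, an LOS speed of 250kt with no satellite contribution and a 500kt aircraft gives cos(β) = 0.5, i.e. β = ±60 degrees.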

A couple of assumptions/simplifications:

– Assuming that Vx and Vy are 0 for the satellite and only using the Vz component.
– Assuming that Vx is 0 for the aircraft.

At 00:11UTC, the time of the last ping, the satellite’s velocity components were:

Vz = -82.1m/s
Vx = 1.5m/s
Vy = -1.5m/s

Using the following data from Duncan Steel –

Time (UTC)  Elevation Angle  Ping Radius (nm)  LOS Speed DS (kt)  LOS Speed POL (kt)  Satellite Z (nm)  Satellite Vz (kt)
18:29       53.53            1880              39.77              76.24               623.87            49.12
19:40       55.80            1760              39.14              4.97                651.30            -3.22
20:40       54.98            1806              65.80              57.02               625.88            -47.42
21:40       52.01            1965              79.85              118.34              557.62            -88.38
22:40       47.54            2206              100.64             175.49              451.20            -123.31
00:11       39.33            2652              125.35             250.43              233.91            -159.58

– DS – Duncan Steel & Mike Exner BFO analysis
– POL – Polynomial fit of BFO data – BFO = 0.3673 * ABS(LOS vel in km/hr) + 94.975
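Inverting that polynomial fit gives the implied LOS speed for a given BFO reading. A small sketch (using the standard conversion 1kt = 1.852km/h; the sign of the LOS speed is lost because the fit uses the absolute value):

```python
def bfo_to_los_kt(bfo_hz):
    # Invert BFO = 0.3673 * |LOS km/h| + 94.975 and convert km/h to knots.
    los_kmh = (bfo_hz - 94.975) / 0.3673
    return los_kmh / 1.852
```
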

I’ve written a Python script that uses the formula and data above to iterate over a number of sample aircraft positions along the 00:11UTC ping ring, stepping α from 1 degree to 89 degrees. An assumed aircraft ground speed is entered as a parameter, and for each sample position along the ping arc the script calculates β and derives 2 possible ground tracks at that position which match the known LOS speed from the BFO Doppler data.

Given the calculated ground tracks, the supplied aircraft ground speed and the time between this ping and the previous ping, the script calculates the aircraft’s position at the time of the previous ping, assuming a constant ground speed and constant ground track.

If the computed aircraft position at the time of the previous ping is within 2% of the previous ping ring, then the aircraft’s position on the current ping ring and its calculated position on the previous ping ring are plotted with a line joining them.

The script then continues recursively, using the points that are within 2% of the previous ping ring to compute β for each of them based on the LOS speed at that ping ring, and working backwards to the ping ring before that.
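A single backtrack step can be sketched like this. Again this is an illustration rather than the script itself: the sub-point coordinates and the helper names are placeholders, and the real script carries the recursion and β recomputation around this core check.

```python
from math import radians, degrees, sin, cos, asin, atan2, sqrt

R_NM = 3440.1  # mean Earth radius in nautical miles

def project(lat, lon, track_deg, dist_nm):
    # Great-circle dead reckoning
    d = dist_nm / R_NM
    lat1, trk = radians(lat), radians(track_deg)
    lat2 = asin(sin(lat1) * cos(d) + cos(lat1) * sin(d) * cos(trk))
    lon2 = radians(lon) + atan2(sin(trk) * sin(d) * cos(lat1),
                                cos(d) - sin(lat1) * sin(lat2))
    return degrees(lat2), degrees(lon2)

def gc_dist_nm(lat1, lon1, lat2, lon2):
    a = (sin(radians(lat2 - lat1) / 2) ** 2 +
         cos(radians(lat1)) * cos(radians(lat2)) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * R_NM * asin(sqrt(a))

def backtrack_step(pos, track_deg, speed_kt, dt_hr, prev_ring_nm, sub_point, tol=0.02):
    # Fly the reciprocal track back for dt_hr hours, then test whether the
    # resulting point lies within tol of the previous ping ring radius.
    back = project(pos[0], pos[1], (track_deg + 180.0) % 360.0, speed_kt * dt_hr)
    radius = gc_dist_nm(back[0], back[1], sub_point[0], sub_point[1])
    return back, abs(radius - prev_ring_nm) / prev_ring_nm <= tol
```

Points that pass the 2% test are kept and the same step is applied again for the next ping ring back.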

For a given aircraft ground speed the resulting plot shows the range of possible aircraft positions and ground tracks on the 00:11UTC ping ring that match the BFO LOS speed and that track back to intersect the previous ping rings at the correct times.

The idea was to get a feel for the range of possible locations on the 00:11UTC ping ring for a given aircraft ground speed that also matches the BFO LOS speed data at that time and to see how many of those tracks track back to the 19:40UTC ping ring.

The plots also include the last known location for the aircraft at 18:22UTC from the Malaysian military radar, indicated as a yellow diamond. I’ve then assumed a maximum ground speed of 500kts from this time and location for the 78 minutes until the 19:40UTC ping and rendered the maximum range that could be covered in this time (650nm) as the yellow circle.

The intersection of this circle with the 19:40UTC ping ring places a maximum northern and southern limit to where the aircraft could be at 19:40UTC along this arc.

When I initially started thinking about writing this program I was only aware of the BFO LOS speeds that Duncan Steel and Mike Exner had calculated. However, none of the routes that Duncan tried in STK could match the LOS speeds calculated from the BFO data. The routes did match the relevant ping ring times, but there was no match for the BFO Doppler data.

Subsequently, I became aware from a comment on Duncan Steel’s blog of an alternative set of BFO-derived LOS speeds. I’ve listed both in the table of data above, and the Python script takes the LOS speed for each ping ring as an input parameter.

Running the script, however, produced no matches for any airspeed tested between 350kts and 500kts using Duncan Steel and Mike Exner’s BFO-derived LOS speeds, so in all the plots below I’ve only used the BFO-derived LOS speeds from the polynomial fit approach.

Source code for the Python program –

Plots for 350kt, 400kt, 450kt, 480kt and 500kt.

MH370 Backtrack - 350kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 400kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 450kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 480kt - Last Doppler LOS Speed - 250kt

MH370 Backtrack - 500kt - Last Doppler LOS Speed - 250kt

Posted in Uncategorized | Leave a comment

Live Photo Gallery People Tags

I’ve used the tagging feature in the original Windows Vista Photo Gallery and in subsequent Windows Live Photo Gallery versions for tagging the people in photos and the location of the photo.

The latest version of Windows Live Photo Gallery now includes a specific ‘People’ category separate from the generic tags category for identifying people in photos. In addition it includes a feature to automatically identify faces in photos and to associate the ‘People’ tags with specific faces in the photo.

The sample below taken from the Windows Live Photo blog shows the new face tagging features. You can also manually draw a rectangle over a face and tag it if the Photo Gallery doesn’t automatically detect the face.

One issue I’ve noticed is that the new people tags aren’t indexed by Windows Search. I often use Windows Search to search for photos based on my people and location tags and the new people tags aren’t found. So at the moment you have to search or filter photos based on the new people tags in Live Photo Gallery itself.

I took a look at the XMP meta-data that is stored by the new people tagging feature in the associated photos.

In the snippet below you can see how the rectangular region for the relevant face is stored, if there is one, plus the PersonDisplayName. There are APIs in WIC, plus associated .Net wrappers in the .Net Framework, that allow you to read this meta-data, so you could make use of the rectangular regions in your own application to display photos and show the tagged faces etc.

<rdf:Description xmlns:prefix0="">
  <rdf:Description xmlns:rdf="">
    <prefix1:Regions xmlns:prefix1="">
      <rdf:Bag xmlns:rdf="">
        <rdf:Description xmlns:rdf="">
          <prefix2:Rectangle xmlns:prefix2="">0.209985, 0.526367, 0.167401, 0.111328</prefix2:Rectangle>
          <prefix3:PersonDisplayName xmlns:prefix3="">Sarah McLeod</prefix3:PersonDisplayName>
        </rdf:Description>
        <rdf:Description xmlns:rdf="">
          <prefix4:Rectangle xmlns:prefix4="">0.430250, 0.148438, 0.284875, 0.189453</prefix4:Rectangle>
          <prefix5:PersonDisplayName xmlns:prefix5="">Gwen De Roubaix</prefix5:PersonDisplayName>
        </rdf:Description>
        <rdf:Description xmlns:rdf="">
          <prefix6:PersonDisplayName xmlns:prefix6="">Marcelle De Roubaix</prefix6:PersonDisplayName>
        </rdf:Description>
      </rdf:Bag>
    </prefix1:Regions>
  </rdf:Description>
</rdf:Description>
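As an aside, the Rectangle values above appear to be normalized fractions of the image size in left, top, width, height order (an assumption on my part), so converting one to pixel coordinates for your own display code is a one-liner:

```python
def region_to_pixels(rect_str, img_w, img_h):
    # Parse the "x, y, w, h" Rectangle string from the People-tag XMP,
    # treating each value as a fraction of the image dimensions (assumed).
    x, y, w, h = (float(v) for v in rect_str.split(","))
    return round(x * img_w), round(y * img_h), round(w * img_w), round(h * img_h)
```
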

The following snippet shows the XMP meta-data that is stored if you tag one of your Messenger contacts as a People tag.

<rdf:Description rdf:about="uuid:faf5bdd5-ba3d-11da-ad31-d33d75182f1b" xmlns:prefix0="">
  <rdf:Description xmlns:rdf="">
    <prefix1:Regions xmlns:prefix1="">
      <rdf:Bag xmlns:rdf="">
        <rdf:Description xmlns:rdf="">
          <prefix2:PersonDisplayName xmlns:prefix2="">gerhard</prefix2:PersonDisplayName>
          <prefix3:PersonEmailDigest xmlns:prefix3="">89C386678731AB3D7DEE0E14E11E633387FBDBCD</prefix3:PersonEmailDigest>
          <prefix4:PersonLiveIdCID xmlns:prefix4="">8765613456339678115</prefix4:PersonLiveIdCID>
        </rdf:Description>
      </rdf:Bag>
    </prefix1:Regions>
  </rdf:Description>
</rdf:Description>

And lastly a snippet showing how the regular tags, which are indexed by Windows Search, are stored: basically in a <dc:subject> element and in a <MicrosoftPhoto:LastKeywordXMP> element.

<rdf:Description xmlns:dc="">
  <rdf:Bag xmlns:rdf="">
    <rdf:li>People/Sarah McLeod</rdf:li>
  </rdf:Bag>
</rdf:Description>
<rdf:Description xmlns:MicrosoftPhoto="">
  <rdf:Bag xmlns:rdf="">
    <rdf:li>People/Sarah McLeod</rdf:li>
  </rdf:Bag>
</rdf:Description>

Posted in Uncategorized | 3 Comments

Bitmap snapshots of WPF Visuals

Recently I needed to create a bitmap of some WPF controls to be used in another program. Doing a quick search turned up references to the RenderTargetBitmap class in WPF with sample code along the lines of:

RenderTargetBitmap bmp = new RenderTargetBitmap((int)element.Width, (int)element.Height, 96, 96, PixelFormats.Pbgra32);

However if the WPF control had a margin then the rendered bitmap had transparent pixels for the margin area. As an example here is a button inside a StackPanel with a margin applied.


And the following is the bitmap that is created via the sample code above:

Doing some more searching turned up the following approach, which creates a VisualBrush from the target Visual, renders that into a DrawingVisual, and finally uses RenderTargetBitmap to take a snapshot of the DrawingVisual. With this approach the margins are ignored and the bitmap consists only of the target WPF control/visual, as shown below:

void CreateBitmapFromVisual(Visual target, string filename)
{
    if (target == null)
    {
        return;
    }

    // Bounds of the visual itself, excluding any margin
    Rect bounds = VisualTreeHelper.GetDescendantBounds(target);

    RenderTargetBitmap rtb = new RenderTargetBitmap((Int32)bounds.Width, (Int32)bounds.Height, 96, 96, PixelFormats.Pbgra32);

    // Paint the target visual into a DrawingVisual via a VisualBrush
    DrawingVisual dv = new DrawingVisual();
    using (DrawingContext dc = dv.RenderOpen())
    {
        VisualBrush vb = new VisualBrush(target);
        dc.DrawRectangle(vb, null, new Rect(new Point(), bounds.Size));
    }

    rtb.Render(dv);

    // Encode the bitmap as a PNG and write it to disk
    PngBitmapEncoder png = new PngBitmapEncoder();
    png.Frames.Add(BitmapFrame.Create(rtb));

    using (Stream stm = File.Create(filename))
    {
        png.Save(stm);
    }
}
Posted in Uncategorized | 23 Comments

Geeks and Fast Jets

I was approached recently to help develop some Flight Test Instrumentation (FTI) for a Hawker Hunter jet to be used during a test pilot course involving high angle of attack manoeuvres and spinning.

The following data needed to be recorded:

  • Angle of Attack (AoA)
  • Sideslip angle
  • Attitude (pitch, roll and heading)
  • Indicated airspeed and altitude
  • Stick and rudder position
  • GPS
  • Video

In addition to recording all the data listed above they also wanted to display a subset of the data in the cockpit as an aid for spin recovery.


AoA and sideslip – basically small weather vanes that act as potentiometers; these were ordered and then physically mounted on a pitot/static boom, which was in turn mounted on the front of the Hawker Hunter’s nose. One minor issue was finding a route for the wires from the vanes to the PC in the avionics bay that didn’t have to pass through the cockpit’s pressure capsule.

Attitude – ordered an Attitude and Heading Reference System (AHRS) from Crossbow that uses solid state sensors and outputs at 25Hz. Ideally the AHRS needs to be mounted as close as possible to the centre of gravity. Luckily the avionics bay is fairly close to the centre of gravity, just slightly forward but there was space to mount it quite easily and definitely on the centre line of the aircraft.

GPS – the Crossbow AHRS also includes a built-in GPS which it uses for aiding its attitude solution, and which outputs standard NMEA data via a second serial port. We mounted the GPS antenna on the spine of the aircraft behind the cockpit, with a fairly short vertical run for the antenna cable down to the avionics bay.

Air data – used an existing unit from MGL Avionics that we had been flying on the Ikarus C-42. It wasn’t ideal, since its maximum indicated airspeed is 217kt, which meant it was maxed out a lot of the time. However, the airspeed during the stall and spin was generally below 217kt, so it did provide useful data where it mattered.

Control positions – we ordered 3 linear transducers, which basically have a shaft that moves in and out and act as potentiometers. They were mounted on and attached to the 3 control rods that pass through the spine of the aircraft behind the cockpit. It was again a fairly short, straight drop down into the avionics bay for the wires.


We needed a PC on fairly short notice that was fairly ruggedized in terms of handling vibration and also fairly cold operating temperatures since it was going to be mounted in an unpressurized and unheated avionics bay and the spins would normally be starting at 35,000ft.

The only real change we made to the embedded PC we found was to replace the hard drive with a CF card and run everything off the CF card. In addition to the high dynamics during the spin and the regular vibration causing problems for a normal hard drive, most hard drives aren’t rated to operate above 10,000ft pressure altitude anyway.

We used XP Embedded and its Enhanced Write Filter (EWF) to turn the system partition into a read-only volume so it couldn’t be corrupted during power down, since the power was simply going to be switched off with no proper shutdown. The data from the sensors was logged to a separate partition on the CF card.

The embedded PC also came with 4 onboard RS232 ports which we needed in order to connect to all the RS232 based sensors (AHRS, GPS, air data). To read the analog data from the AoA, sideslip and control position sensors we installed an A/D card into the single PCI slot.


For the in-cockpit display of the ‘spinning panel’, which mainly consisted of the AoA and sideslip angles plus the control positions, we initially developed a Pocket PC application connected to the embedded PC via an RS232 connection. However, after some concerns regarding its sunlight readability and about physically mounting it in the cockpit, the Pocket PC was only used on 2 initial test flights.

We then bought a 6" LCD rated at 1200nits for sunlight readability and removed the gunsight to mount the LCD in its place on the student’s side of the cockpit. The software for the ‘spin panel’ was then modified to run on the embedded PC that we were using for data capture, which was connected to the LCD via a VGA connection.

The one issue or complication with the ‘spin panel’ display being mounted in the cockpit was the need to have wires passing through the pressure bulkhead from the avionics bay into the cockpit. Special bulkhead connectors have to be used in order to pass the wires through without causing a pressure leak.


An SD based video camera recorder with a remote lens was mounted in the cockpit and used to capture video footage of the flight.


The last part of the project involved the development of a playback system to take the captured data from the sensors and from the video camera and allow the data to be played back and analysed by the pilots and flight test engineers.

The attitude data and air data are displayed as part of a standard Primary Flight Display (PFD). The spin panel display that is used in the cockpit is also rendered plus playback of the video. Lastly we have line graphs displaying the current, past and future data for each of the recorded parameters.

All of the data, including the video, is synchronised during playback and can be paused, scrubbed and played back at different speeds, i.e. various slow motion and fast motion playback speeds. The playback application is a Windows Presentation Foundation (WPF) application and makes heavy use of the animation framework within WPF.

And finally getting ready for the last of the 34 sorties on a glorious Cape Town winter’s day.

Posted in Uncategorized | 13 Comments

MapReduce implementation using C# generics

I came across the Google paper on MapReduce again the other day and decided to try a simple implementation using C# generics allowing you to specify specific types for the keys and values as opposed to being forced to use strings for all keys and values.

The initial version doesn’t include any automatic parallelism across multiple CPUs or clusters of machines.

The core implementation below is only about 50-60 lines of code. I’ve also included sample map and reduce functions making use of this library and mirroring some of the sample applications mentioned in the Google paper.

using System;
using System.Collections.Generic;

// Common shortcut where both keys and both values are strings
using MapReduceAllStrings = MapReduce.MapReduce<string,string,string,string>;

namespace MapReduce
{
    // Map delegate
    public delegate IEnumerable<KeyValuePair<K2,V2>> Map<K,V,K2,V2>(KeyValuePair<K,V> input);

    // Reduce delegate
    public delegate IEnumerable<V2> Reduce<K2,V2>(KeyValuePair<K2, IEnumerable<V2>> input);

    public class MapReduce<K, V, K2, V2>
    {
        // Process input data using user supplied map and reduce delegates
        public IEnumerable<KeyValuePair<K2, IEnumerable<V2>>> Process(IEnumerable<KeyValuePair<K,V>> input, Map<K,V,K2,V2> map, Reduce<K2,V2> reduce)
        {
            // Use dictionary to store intermediate data - (k2, list(v2))
            Dictionary<K2, IEnumerable<V2>> intermediateData = new Dictionary<K2, IEnumerable<V2>>();

            // Perform map over all input
            foreach (KeyValuePair<K, V> inputItem in input)
            {
                // Add map results to intermediate dictionary
                foreach (KeyValuePair<K2,V2> mapOutput in map(inputItem))
                {
                    IEnumerable<V2> enumerableList;

                    // If k2 already exists in dictionary then just add this v2 to its list(v2)
                    if (intermediateData.TryGetValue(mapOutput.Key, out enumerableList))
                    {
                        List<V2> v2List = (List<V2>)enumerableList;
                        v2List.Add(mapOutput.Value);
                    }
                    else
                    {
                        // Add new k2 to dictionary and create initial list(v2) with this v2 value
                        List<V2> v2List = new List<V2>();
                        v2List.Add(mapOutput.Value);
                        intermediateData.Add(mapOutput.Key, v2List);
                    }
                }
            }

            // Setup final output data structure
            List<KeyValuePair<K2, IEnumerable<V2>>> finalOutput = new List<KeyValuePair<K2,IEnumerable<V2>>>();

            // Perform reduce over all intermediate data
            foreach (KeyValuePair<K2, IEnumerable<V2>> intermediateVal in intermediateData)
            {
                // Setup final output value, i.e. k2 and an empty list(v2) in preparation for reduce operation
                KeyValuePair<K2, IEnumerable<V2>> outputVal = new KeyValuePair<K2, IEnumerable<V2>>(intermediateVal.Key, new List<V2>());
                finalOutput.Add(outputVal);

                // Add resultant values from reduce to final output list(v2)
                foreach (V2 val in reduce(intermediateVal))
                {
                    ((List<V2>)(outputVal.Value)).Add(val);
                }
            }

            return finalOutput;
        }

        public static IEnumerable<V2> IdentityReduce(KeyValuePair<K2, IEnumerable<V2>> input)
        {
            List<V2> output = new List<V2>();
            foreach (V2 val in input.Value)
            {
                output.Add(val);
            }
            return output;
        }
    }

    class MapReduceTest
    {
        static void Main(string[] args)
        {
            WordCountTest();
            ReverseWebLinkGraph();
        }

        /////////////////// - WordCount test

        static void WordCountTest()
        {
            List<KeyValuePair<string, string>> input = new List<KeyValuePair<string, string>>
            {
                new KeyValuePair<string,string>("a", "the quick brown fox jumps over the log"),
                new KeyValuePair<string,string>("b", "while the monkey jumps on the fox"),
                new KeyValuePair<string,string>("c", "and the cow jumps over the moon")
            };

            MapReduce<string, string, string, int> WordCount = new MapReduce<string, string, string, int>();

            foreach (KeyValuePair<string,IEnumerable<int>> val in WordCount.Process(input, WordCountMap, WordCountReduce))
            {
                PrintResult<string, int>(val);
            }
        }

        static IEnumerable<KeyValuePair<string, int>> WordCountMap(KeyValuePair<string, string> input)
        {
            List<KeyValuePair<string, int>> output = new List<KeyValuePair<string, int>>();
            string[] words = input.Value.Split(' ');
            foreach (string word in words)
            {
                output.Add(new KeyValuePair<string,int>(word, 1));
            }
            return output;
        }

        static IEnumerable<int> WordCountReduce(KeyValuePair<string, IEnumerable<int>> input)
        {
            List<int> output = new List<int>();
            int total = 0;
            foreach (int wc in input.Value)
            {
                total += wc;
            }
            output.Add(total);
            return output;
        }

        /////////////////// - Reverse web link graph test

        static void ReverseWebLinkGraph()
        {
            List<KeyValuePair<string, string>> input = new List<KeyValuePair<string, string>>
            {
                new KeyValuePair<string,string>("", ""),
                new KeyValuePair<string,string>("", ""),
                new KeyValuePair<string,string>("", "")
            };

            MapReduceAllStrings ReverseWebLinks = new MapReduceAllStrings();

            foreach (KeyValuePair<string, IEnumerable<string>> val in ReverseWebLinks.Process(input, ReverseWebLinksMap, MapReduceAllStrings.IdentityReduce))
            {
                PrintResult<string, string>(val);
            }
        }

        static IEnumerable<KeyValuePair<string, string>> ReverseWebLinksMap(KeyValuePair<string, string> input)
        {
            List<KeyValuePair<string, string>> output = new List<KeyValuePair<string, string>>();
            string[] targets = input.Value.Split(' ');
            foreach (string target in targets)
            {
                output.Add(new KeyValuePair<string, string>(target, input.Key));
            }
            return output;
        }

        /////////////////// - Helper

        static void PrintResult<K,V>(KeyValuePair<K, IEnumerable<V>> result)
        {
            Console.Write(result.Key.ToString());
            Console.Write(" - ");
            foreach (V val in result.Value)
            {
                Console.Write(val.ToString());
                Console.Write(" ");
            }
            Console.WriteLine("");
        }
    }
}
Posted in Uncategorized | 12 Comments

High Dynamic Range (HDR) Photo Test

I started playing around with generating HDR photos. Here is one of my first attempts using our Pentax DSLR.

I’m using a trial version of Photomatix to perform the HDR processing and tone mapping which is why you’ll notice their watermark in the HDR image below.

I set the auto-bracketing option on the camera to shoot a normal exposure, a -2 stop exposure and a +2 stop exposure.
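The bracketing arithmetic is simple: each stop halves or doubles the light, so at a fixed aperture the shutter time scales by a factor of 2 per stop. A quick sketch (note that exact doubling from 1/60 gives 1/240, which the camera rounds to its nominal 1/250 marking):

```python
base_shutter = 1 / 60  # base exposure chosen by the camera, in seconds

def bracket_shutter(stops):
    # Shutter time for an exposure offset by the given number of stops
    # at a fixed aperture: each stop doubles (or halves) the time.
    return base_shutter * 2 ** stops

for stops in (-2, 0, +2):
    t = bracket_shutter(stops)
    print(f"{stops:+d} stops -> 1/{round(1 / t)} sec")
```
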

This is the resultant HDR image.

Next up is the camera’s normal exposure, so effectively what you would end up with if you weren’t doing any HDR processing. Camera selected f/9.5 and 1/60 sec.

Now the -2 stop image, so f/9.5 1/250 sec.

And lastly the +2 stop image, so f/9.5 1/15 sec.

Posted in Uncategorized | 17 Comments