Monday, December 27, 2010

Hull Stripping and Attempted Staining

Stripping was relatively easy and very quick. The wood reminded me of making balsa wood models when I was a kid:)

Main points were:

  • Make sure strips are even thickness – if I do this again I’ll invest in a planer/thicknesser.

[image: WP_000011]

  • I mostly just stapled the strips, only occasionally using a bead of glue. This had one advantage: towards the end, as I looked at the forms and the shape of the hull, I realised I had one form at a slightly wrong angle – not sure how this happened, as I was sure I had it right at the beginning. Never mind – I unscrewed it, re-adjusted just a tiny bit, and suddenly it looked great.
  • I thought I’d stain the strips before using epoxy to glue them all together – unfortunately, Resene interior stain turns out not to be as epoxy compatible as the Resene staff thought it would be. It beaded a bit but wasn’t irrecoverable – I might even put more stain on after I rub it all back in preparation for fibreglassing – but if I do I’ll wash with acetone and use a much thinner application of stain. Check the image below…

[image: WP_000024]

 

Hull looks great but I can’t get a good photo showing it the way I see it in real life. Perhaps for the next posting…

Wednesday, December 08, 2010

Spatial Notes–Getting Started

Thought this would be harder than it turned out to be.

Objective: chart some metrics across geographic regions (in my case looking at customer metrics over New Zealand).

To get started I took advantage of SQL Server Spatial. It looks a bit daunting at first, but once you try it it turns out to be fairly easy.

  • Finding the GIS data
    • for NZ, Statistics NZ publish digital boundaries, versioned by publication year (2007, 2001, 1996 etc.): http://www.stats.govt.nz/browse_for_stats/people_and_communities/geographic-areas/download-digital-boundaries.aspx
    • The data is available in both ESRI Shapefile format and MapInfo format. You can use shape2sql to load this data into SQL Server spatial. Unfortunately it has a bug: it won’t create a table from the .shp file inside shape2sql. I had to use SQL Profiler to capture the CREATE TABLE statement and run it interactively – after that the data load works fine. Not sure why. (There’s a quick sanity check for the loaded data sketched after this list.)
    • The data has a hierarchy of increasingly larger geometrical areas built up from units of the smallest size (the meshblock). There are Area Units, Urban Areas, Regional Councils etc.
  • Connecting people to areas
  • Aggregating
    • Use SQL Server Analysis Services, or – much faster –
    • use PowerPivot – this allowed me to very quickly aggregate customer metrics from an internal database across the areas loaded from the Stats NZ files
  • Reporting
    • It’d be nice if Excel had spatial reporting support, but it doesn’t – big thumbs down, Microsoft…
    • I used SQL Server Reporting Services and the Map control. The problem is that I need to have the map data in SQL/SSAS. It’d be nice to load from PowerPivot, but I can’t do that unless I store the Excel PowerPivot doc in SharePoint 2010. Jumping through hoops to do this though… Microsoft – just get the spatial support into Excel, please!
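
Back to that shape2sql load – once the boundaries are in, a quick sanity check from F# is easy enough. A minimal sketch, assuming a local database called Spatial and a table called AreaUnits with AU_NAME and geom columns (use whatever names shape2sql actually created):

open System.Data.SqlClient

let checkSpatialLoad () =
    use conn = new SqlConnection(@"Server=.;Database=Spatial;Integrated Security=true")
    conn.Open()
    // read a few area names plus their geometry as well-known text
    use cmd = new SqlCommand("SELECT TOP 5 AU_NAME, geom.STAsText() FROM AreaUnits", conn)
    use reader = cmd.ExecuteReader()
    while reader.Read() do
        let wkt = reader.GetString 1
        printfn "%s: %s..." (reader.GetString 0) (wkt.Substring(0, min 60 wkt.Length))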

Sunday, December 05, 2010

Strip cutting

I followed Bjorn’s instructions and cut the strips with a circular saw. The planks were pinned to a folding stand at each end. The paulownia was extremely stable: it didn’t split when I pinned it at the edge of the plank, and not one of the strips split or snapped. In the end it was much easier than I expected. I suspect it would’ve been even easier if I’d had a decent saw…

[image: Photo0045]

Ebonizing Paulownia

Vinegar and steel wool, left for a few days, filtered and then applied to a paulownia strip – thinking of using this on my kayak. The top of the strip has been painted twice.

[image: Photo0047]

Sunday, November 28, 2010

Black Pearl Forms

So terribly slow but here’s a snippet of the forms.

[image: IMG_1180]

 

Lessons learned:

  • Level everything first – makes it much easier to align cross sections
  • Don’t start screwing cross sections to wooden cross bars until they all line up right. Use clamps first.
  • Drill a hole about 5mm across at the centre of the waterline on each form and shine a bright light from one end – you should be able to see the light all the way down
  • I pre-cut the forms for the deck and re-attached them – supposedly to make it easier for later. Not sure it was worth it.

Here’s that light shining through a couple of the holes.

[image: IMG_1182]

Tuesday, October 05, 2010

New Kayak Build: Black Pearl

Two years on from building a Guillemot Kayaks Night Heron stitch and glue I’ve just purchased plans for a Bjorn Thomasson Black Pearl. The Night Heron is a great kayak that’s very quick in the water but I just enjoyed the building side too much so I’ve decided to try building a strip kayak. Rather than go with something closer to the Night Heron I decided to go down a slightly different path:

  • As light as possible – means small or low volume
  • Doesn’t need speed – the Night Heron fulfils that requirement for me
  • Play boat – I’ve taken up canoe polo so I want the most manoeuvrable sea kayak experience I can get – something that will respond to hip/torso movement.

I think the Black Pearl will do this, but the proof will be in the pudding. You don’t just order this off the shelf – the plans are custom printed to your height, arm span and hip measurements.

I’ll update the blog as I go. It will take a while – no rush, as I already have a kayak.

Monday, October 04, 2010

Populating a Word Document from a Sharepoint List

I thought this would be simpler since SharePoint and Word are so closely related. A mail merge based on list data, perhaps? Maybe it’s possible with an Office data connection file, but I couldn’t figure it out.

Well, it was useful to me, so here’s how I did it with PowerShell and Word automation. The example here is a list of standards, and the output is those standards formatted into one Word document. The SOAP call from PowerShell comes from somewhere on the net, with an added SOAPAction header. The list has been created with folder items so it needs to list the items recursively – this wouldn’t apply to a normal list. You’ll figure it out as you start looking at the XML attributes. The best way to make progress with these SharePoint SOAP calls is to use Wireshark in combination with the Stramit CAML Viewer.

Here we go…

function Execute-SOAPRequest
(
    [Xml] $SOAPRequest,
    [String] $SOAPAction,
    [String] $URL
)
{
    write-host "Sending SOAP Request To Server: $URL"
    $soapWebRequest = [System.Net.WebRequest]::Create($URL)
    $soapWebRequest.Headers.Add("SOAPAction","`"" + $SOAPAction + "`"")

    $soapWebRequest.ContentType = "text/xml;charset=`"utf-8`""
    $soapWebRequest.Accept = "text/xml"
    $soapWebRequest.Method = "POST"

    write-host "Authenticating"
    $soapWebRequest.Credentials = [System.Net.CredentialCache]::DefaultCredentials
    if ($soapWebRequest.Proxy -ne $null) {
        $soapWebRequest.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
    }

    write-host "Initiating Send."
    $requestStream = $soapWebRequest.GetRequestStream()
    $SOAPRequest.Save($requestStream)
    $requestStream.Close()

    write-host "Send Complete, Waiting For Response."
    $resp = $soapWebRequest.GetResponse()
    $responseStream = $resp.GetResponseStream()
    $soapReader = [System.IO.StreamReader]($responseStream)
    $ReturnXml = [Xml] $soapReader.ReadToEnd()
    $responseStream.Close()

    write-host "Response Received."

    return $ReturnXml
}

function Add-Standard
(
    [String] $Standard,
    [String] $Justification,
    $Doc
)
{
    $p = $Doc.Paragraphs.Add()
    $p.Range.Text = $Standard
    $p.Format.Style = "Heading 1"
    $p.Range.InsertParagraphAfter()

    $p = $Doc.Paragraphs.Add()
    $p.Range.Text = $Justification
    $p.Range.InsertParagraphAfter()
}

$d4 = [xml] @"
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <soap:Body>
    <GetListItems xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>{CF6E48BA-650A-489E-83AB-8EF1E545388A}</listName>
      <queryOptions>
        <QueryOptions>
          <ViewAttributes Scope="Recursive"/>
          <IncludeMandatoryColumns>False</IncludeMandatoryColumns>
        </QueryOptions>
      </queryOptions>
    </GetListItems>
  </soap:Body>
</soap:Envelope>
"@


# create a document from list data!
$w = new-object -com Word.Application
$w.Visible = $true
$d = $w.Documents.Add()

Execute-SOAPRequest -SOAPRequest $d4 -SOAPAction "http://schemas.microsoft.com/sharepoint/soap/GetListItems" -URL "http://<some sharepoint site>/sites/CTO/_vti_bin/lists.asmx" |
    foreach {$_.Envelope.Body.GetListItemsResponse.GetListItemsResult.listitems.data.row |
        foreach {Add-Standard -Standard $_.ows_Standard -Justification $_.ows_Justification -d $d }
    }

$d.SaveAs([ref]"c:\temp\standards.docx")
$d.Close()
$w.Quit()

Sunday, August 15, 2010

FST Auckland Conference Presentation

You can find the presentation I did for the Financial Services Technology conference in Auckland last week here.

Brief synopsis: systems/operational management provides a useful analogue to the problems faced by implementers of business intelligence environments: huge volumes of data streaming in for which you need to provide health indicators and predictions of the future.

In the case of Microsoft’s System Center Operations Manager tool this is done in a manner that is very different from conventional BI efforts that rely on staging large quantities of data, building relational and dimensional models and producing KPIs. The solutions developed for operations management can be usefully applied to BI efforts, particularly when not dealing with financial data.

Monday, August 09, 2010

CIO Conference–Enterprise 2.0 Presentation

I’ve put the presentation here: http://cid-0e5ff53cd7f11485.office.live.com/browse.aspx/Public/CIO%20Conference%20July%2010

Questions from the audience about my criticism of data-focussed BI got me thinking about what might be a better model. I’ve changed the slide deck to cover more of a BI story for presentation at the FST conference in Auckland this Wednesday.

Seq.unfold

Don’t know why this had escaped me for so long!

 

let PascalsTriangle =
    let nextLine aLine =
        let partialLine =
            aLine
            |> Seq.pairwise
            |> Seq.map (fun (x,y) -> x+y)
            |> Seq.toList
        List.append (1::partialLine) [1]
    [1] |> Seq.unfold (fun s -> Some(s, nextLine s))
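
A quick check in F# interactive:

PascalsTriangle |> Seq.take 4 |> Seq.toList
// [[1]; [1; 1]; [1; 2; 1]; [1; 3; 3; 1]]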

Compare to: Pascal’s Triangle in F#

Friday, July 30, 2010

Heat/Image Mapping with .Net/F#

As part of a demonstration to the Wellington .Net user group this week I put together a demo showing a heat map of the performance of a transactional system as a function of time of day and arrival rate. My objective was to provide a simple eyeballing test of whether performance was being impacted more by daytime customer loading or by competing overnight batch processing. I’ve tried a similar task before with Python and MatPlotLib, but this time I wanted to see how easy it was to accomplish in .Net/F#.

The answer was ‘very easy’.

The interactive F# environment in VS2010 gives you JIT compiled code so it’s very fast to work with. The data structures/collections available within F# make importing and manipulating data simple – once it’s in memory you spend most of the time using filter, map and fold operations on elements – explicit recursion or iteration is rarely necessary.

I found I started with sequences then moved to arrays as data volumes increased and performance slowed. Array.Parallel.map/mapi seemed to make a significant difference whenever the operation being performed took on the order of a few milliseconds. It’s pointless parallelising when the operation is short – the overhead of managing threads is just too great.
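
To make the threshold concrete, here’s a rough sketch of the two regimes – the per-element work is synthetic, not from the real demo:

let data = Array.init 1000 float

// cheap per-element work: thread management overhead tends to swamp any gain
let cheap = data |> Array.Parallel.map (fun x -> x + 1.0)

// simulated expensive work (order of milliseconds per element) – roughly
// where Array.Parallel.map started to pay off for me
let slowOp x =
    let mutable acc = x
    for i in 1 .. 500000 do acc <- acc + sin acc
    acc
let costly = data |> Array.Parallel.map slowOp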

Most of the effort was spent aggregating and binning the data; the actual image prep was trivial.
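
For what it’s worth, the binning looked roughly like this – a sketch only, where the (hour, rate, duration) input shape and the bin choices are my assumptions:

// average call duration binned by hour of day (columns) and arrival rate (rows)
let binData (samples: (int * float * float) seq) rateBins maxRate =
    let totals = Array2D.zeroCreate<float> rateBins 24
    let counts = Array2D.zeroCreate<int> rateBins 24
    for (hour, rate, duration) in samples do
        let r = min (rateBins - 1) (int (rate / maxRate * float rateBins))
        totals.[r, hour] <- totals.[r, hour] + duration
        counts.[r, hour] <- counts.[r, hour] + 1
    totals |> Array2D.mapi (fun r h total ->
        if counts.[r, h] = 0 then 0.0 else total / float counts.[r, h])

The display code that follows assumes you have such an array of data to display.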

open System.Drawing
open System.Drawing.Imaging
open System.Windows.Forms

// adapted some code here from http://thecodedecanter.wordpress.com/2010/04/30/modelling-the-2d-heat-equation-in-f-using-100-lines-of-code/
let toBitmap (arr:Color[,]) =
    let image = new Bitmap(arr.GetLength(0), arr.GetLength(1), Imaging.PixelFormat.Format24bppRgb)
    for i = 0 to image.Width - 1 do
        for j = 0 to image.Height - 1 do
            image.SetPixel(i, j, arr.[i,j])
    image

let maxInArray2D (ar:'a[,]) =
    seq { for r in 0 .. Array2D.length1 ar - 1 do
            yield Seq.max (ar.[r..r,*] |> Seq.cast<'a>) }
    |> Seq.max

let intensityMap intensity = Color.FromArgb((int (intensity * 255.0)),0,0)

let bitmap imageArray =
    let max = imageArray |> maxInArray2D
    imageArray
    |> Array2D.map (fun f -> f / max)
    |> Array2D.map (fun f -> intensityMap f)
    |> toBitmap

let ShowForm (f : Form) =
#if INTERACTIVE
    f.Show()
#endif
#if COMPILED
    Application.Run(f)
#endif

let BitmapForm bitmap =
    let picBox = new PictureBox(BorderStyle = BorderStyle.Fixed3D, Image = bitmap, Size = bitmap.Size,
                                Dock = DockStyle.Fill, SizeMode = PictureBoxSizeMode.StretchImage)
    let form = new Form(Text = "F# Connection Performance versus Connection Rate and Time", Size = bitmap.Size, AutoSize = true)
    form.Controls.Add(picBox)
    form

bitmap surfaceArray |> BitmapForm |> ShowForm


If you don’t have many points that’s going to generate a small image, which you can expand by dragging the corner – Windows will magically interpolate for you. I’m sure there’s a way to do this explicitly through BitmapForm as well – I just didn’t have time to figure that out.



If you want to interpolate yourself, you could try something simple like a bilinear approach:



// requires FSharp.PowerPack for RowVector/Matrix/Vector
let bilinearInterpolation f00 f01 f10 f11 x y =
    let boundingBox = Array2D.zeroCreate<float> 2 2
    boundingBox.[0,0] <- f00
    boundingBox.[0,1] <- f01
    boundingBox.[1,0] <- f10
    boundingBox.[1,1] <- f11
    let A = RowVector.ofList [1. - x; x]
    let B = Matrix.ofArray2D boundingBox
    let C = Vector.ofList [1. - y; y]
    A * B * C


Or why not just double the resolution?



// take an array m x n and return (2m-1) x (2n-1)
let interpolate (A:float[,]) =
    let B = Array2D.zeroCreate<float> (2*(A.GetLength 0) - 1) (2*(A.GetLength 1) - 1)
    for row in 0..((A.GetLength 0) - 1) do
        for col in 0..((A.GetLength 1) - 1) do
            B.[2*row, 2*col] <- A.[row,col]
        done
    done
    //
    for row in 0..((B.GetLength 0) - 1) do
        if row % 2 = 0 then
            for col in 0..((B.GetLength 1) - 1) do
                if col % 2 = 1 then
                    B.[row, col] <- (B.[row,col-1] + B.[row,col+1]) / 2.
            done
        else
            for col in 0..((B.GetLength 1) - 1) do
                if col % 2 = 0 then
                    B.[row, col] <- (B.[row-1,col] + B.[row+1,col]) / 2.
                else
                    B.[row, col] <- (B.[row-1,col-1] + B.[row+1,col-1] + B.[row-1,col+1] + B.[row+1,col+1]) / 4.
            done
    done
    B

// now I can do this...
bitmap (surfaceArray |> interpolate |> interpolate |> interpolate |> interpolate)|> BitmapForm |> ShowForm


And generate results like this (where connection rate increases down, time goes across, and redness indicates duration of calls):



[image: performancemap]

Monday, July 26, 2010

F# for Analysis: Wellington .Net User Group Meeting Wed 28th July

Some brief notes for anyone attending this session.

There’s a brief slide deck here.

There’s sample code working through the basics of the language here.

Sample code for the cross correlation of two time series in F# here.

I’ll put a separate post together for image generation of a heat map showing web performance data.

System Center Operations Manager and BI

Last week as part of a presentation to the CIO summit in Auckland I had a bit of fun criticising conventional corporate BI efforts as being too focussed on data management. Basically, the argument went that companies spend too much of the budget worrying about getting the data right and not about enabling people to use it – and that ‘easy to use’ BI front ends are a pointless exercise. It’s not a hard argument to make – you can expend a great deal of time and effort on staging, transforming, cubing, KPI development, and reconciliation quality checks – usually driven by the fact that corporate BI is focussed on financial measures which drives people to want the data correct to the cent.

The natural question that comes back in response is what should be done instead?

Operations management using a model driven tool like System Center Operations Manager provides some useful pointers.

Think about it – what’s operations management?

= Data + Model + Health logic + Alerting for response

What’s BI?

= Data + Model + Business Health Logic + “Strategy execution” for response

What’s smart about model driven operations management?

The tools (like SCOM) monitor systems that can produce huge volumes of data; they give that data context with hierarchical models (à la cubes) that can even be graphical (LiveMaps/Visio 2010 diagrams); they provide health indicators (cf KPIs) and they empower end users by making the information available for more detailed analysis at the desktop.

So how would you put this kind of model into action for BI?

How about mirrored, true-to-source data stores using cheap databases (maybe SQL Express/MySQL)? You’d need a grooming schedule to stay on top of the data volumes – perhaps an agent model, just like the operations management tools have.

To collect the data from multiple sources and give it some context, a cube model is attractive – but what about implementing it with PowerPivot? It puts the end user in charge. They can create health indications in an Excel-like environment, and they can submit back into a shared (SharePoint) repository.

I’ve got to present at the long-windedly titled FST 3rd Annual Technology & Innovation – the Future of Banking & Financial Services New Zealand conference in August. I’ll be exploring this in more detail and hopefully have a few examples to test out.

Sunday, July 18, 2010

Dynamically changing format statements in the F# ISE

This had me stuck yesterday – how to change the width of some formatted printf output. In the end, with a bit of searching, I realised the TextWriterFormat member gives you a way to define the format such that you can change it in your program. An example is a better viewer for the Pascal’s Triangle sequence generator I made recently – all related to a school maths project. Also – where does it tell you that to print a % you need to double it? Why didn’t they do as with speech marks and use a leading \?
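
For the record, the short version (a trivial distillation of the code below – widthFormat is just my name for it):

printfn "escaped: 100%%"   // prints: escaped: 100%
let widthFormat w = Printf.TextWriterFormat<int -> unit>(sprintf "%%%dd" w)
printf (widthFormat 6) 42  // prints '    42', right-aligned in a width of 6

Anyway, back to the triangle: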

let rec PascalsTriangle = seq {
    yield [1]
    for aLine in PascalsTriangle ->
        let newLine =
            aLine
            |> Seq.pairwise
            |> Seq.map (fun (x,y) -> x+y)
            |> Seq.toList
        List.append (1::newLine) [1]
}

PascalsTriangle
|> Seq.take 10
//|> Seq.iter (fun i -> printfn "%A" i)
|> Seq.iter (fun v ->
    v |> Seq.iter (fun w -> printf "%3i" w)
    printfn "")

// smarten up the printing...
let samplePT = PascalsTriangle |> Seq.take 12
let maxDigits = (Seq.concat samplePT |> Seq.max).ToString().Length
let numberFormatter digits = Printf.TextWriterFormat<int->unit>(sprintf "%%%dd" digits)
let spaceFormatter digits = Printf.TextWriterFormat<string->unit>(sprintf "%%%ds" digits)
let rows = Seq.length samplePT
samplePT |> Seq.iteri (fun u v ->
    for i in 1 .. (rows - u) do
        printf (spaceFormatter ((maxDigits + 1) / 2)) " "
    v |> Seq.iter (fun w -> printf (numberFormatter (maxDigits + 1)) w)
    printfn "")


The output is like this:

                      1
                    1   1
                  1   2   1
                1   3   3   1
              1   4   6   4   1
            1   5  10  10   5   1
          1   6  15  20  15   6   1
        1   7  21  35  35  21   7   1
      1   8  28  56  70  56  28   8   1
    1   9  36  84 126 126  84  36   9   1
  1  10  45 120 210 252 210 120  45  10   1
1  11  55 165 330 462 462 330 165  55  11   1


Wish I knew how to do this in WPF…



Wish I knew a better way to post into the blog…

Friday, June 04, 2010

IP Address to Country Map

I’m learning how to use the F# HashMultiMap and I thought up a sample exercise – an in-memory lookup from IP address to country code. This is a task I’ve seen embedded in a relational table before, but that seemed daft – it’s on disk, so it will be slow. It seems smarter to do this all in memory and just serve out a result.

It’s relatively easy to get hold of IP address country/region/ISP lookups in the form of a single file; for example, http://software77.net/geo-ip/ provides one for country lookups under a GNU licence.

First I load up the text file:

open System
open System.IO
open System.Text.RegularExpressions

let (|ActiveRegex|_|) regex str =
    let ms = Regex(regex).Matches(str)
    if ms.Count > 0
    then Some ((Seq.cast ms : Match seq))
    else None

let matches s re =
    match s with
    | ActiveRegex re results -> results
    | _ -> Seq.empty

let capturesSeq s p =
    seq { for m in matches s p ->
            Seq.skip 1 (seq { for g in m.Groups -> g.Value }) }
    |> Seq.concat

let csvRegex = "\"([\w\s:;'`~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)(?:\",|\"$)"

let isInt64 i =
    let v,_ = Int64.TryParse(i)
    v

let parseIpToCountryLine lineNo (line:String) =
    try
        let values =
            capturesSeq line csvRegex
            |> Seq.toArray
        isInt64 values.[0] |> fun test -> if not test then failwith (sprintf "Bad IP FROM on line %i" lineNo)
        isInt64 values.[1] |> fun test -> if not test then failwith (sprintf "Bad IP TO on line %i" lineNo)
        int64 values.[0], int64 values.[1], string values.[6]
    with
    | :? System.IndexOutOfRangeException -> failwithf "Failed on line %A, contents: %A" lineNo line

let IpToCountryLines = File.ReadAllLines(@"c:\temp\IpToCountry.csv", Text.Encoding.Default)

(*
File format like this... detail in the header comments of the file
# IP FROM      IP TO          REGISTRY   ASSIGNED   CTRY  CNTRY  COUNTRY
# "1346797568","1346801663","ripencc","20010601","il","isr","Israel"

There is a tricky line in there: it contains an 'Å' character, which seems to require that I explicitly define the text encoding. Don't see why, since it's just the default...
*)

let getIpToCountry (lines:string []) =
    lines
    |> Seq.filter (fun i -> not (i.StartsWith("#")))
    |> Seq.mapi parseIpToCountryLine

let Ip2Country = getIpToCountry IpToCountryLines

Then a function to create a numeric form of an IP address:

let numIP (a:int) (b:int) (c:int) (d:int) = (int64 d) + ((int64 c)*256L) + ((int64 b)*256L*256L) + ((int64 a)*256L*256L*256L)
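
A quick check in F# interactive:

numIP 127 0 0 1   // 2130706433L
numIP 192 168 1 1 // 3232235777L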


 


Then the hash and a quick test:

#r "FSharp.PowerPack.dll"
// count down until a key is found - could be speed up by creating extra entries when to-from is a large gap
let Ip2Country3 = HashMultiMap<_,_>(
Ip2Country
|> Seq.map (fun (ipFrom,ipTo,country) -> ipFrom, country )
, HashIdentity.Structural)

let rec countryFromIP (ip:int64) =
if Ip2Country3.ContainsKey(ip) then
Ip2Country3.[ip]
else
countryFromIP (ip-1L)


// test with 10 addresses taken from our public site web logs
let testIPs = [… in the interests of confidentiality stick some of your own in here… as (a,b,c,d) tuples…]

let time f x =
    let timer = System.Diagnostics.Stopwatch.StartNew()
    try f x
    finally printf "Execution duration: %gms\n" timer.Elapsed.TotalMilliseconds

// note the parentheses around the mapping – time needs to receive it as a single function
testIPs |> List.map (fun (a,b,c,d) -> numIP a b c d) |> time (List.map (fun nIP -> countryFromIP nIP))

I’ve found it takes about 1 millisecond for 10 random lookups on my virtual workstation instance (running under Hyper-V with 3GB allocated on a desktop Dell Optiplex 760).
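
One afterthought on countryFromIP: counting down one address at a time gets slow when the gaps between ranges are large. A binary search over the sorted range starts would be an alternative – just a sketch, reusing Ip2Country from above and assuming every address looked up is at or above the first range start:

let starts, countries =
    Ip2Country
    |> Seq.map (fun (ipFrom, _, country) -> ipFrom, country)
    |> Seq.sortBy fst
    |> Seq.toArray
    |> Array.unzip

let countryFromIPFast (ip: int64) =
    // BinarySearch returns the index of an exact hit, or the bitwise
    // complement of the index of the next larger element
    match System.Array.BinarySearch(starts, ip) with
    | i when i >= 0 -> countries.[i]
    | i -> countries.[(~~~i) - 1]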

Wednesday, May 26, 2010

My Regex Lessons (Probably 1 of many…)

OK, that previous post had a couple of minor problems.

First that regular expression – terribly unwieldy. I haven’t managed to shorten it fully yet, but this is marginally better:

"([\w\s:;… add whatever characters are valid here…]*)(?:\",|\"$)


Secondly, I wasn’t fully thinking through the matches/groups/captures hierarchy. I kept getting empty captures reported back without understanding why I was getting them.
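
For anyone else puzzling over it, here’s a toy example of the hierarchy (mine, not from the CSV code):

open System.Text.RegularExpressions

// each match of "(a)+" has one group; the group records every repeat
// of the subexpression as a separate capture
for m in Regex.Matches("aa baa", "(a)+") |> Seq.cast<Match> do
    printfn "match='%s' lastCapture='%s' captureCount=%d"
        m.Value m.Groups.[1].Value m.Groups.[1].Captures.Count
// match='aa' lastCapture='a' captureCount=2 (and the same again for the 'aa' in "baa")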



Updated code here:



open System
open System.Text.RegularExpressions

let (|ActiveRegex|_|) regex str =
    let ms = Regex(regex).Matches(str)
    if ms.Count > 0
    then Some ((Seq.cast ms : Match seq))
    else None

let matches s re =
    match s with
    | ActiveRegex re results -> results
    | _ -> Seq.empty

let capturesSeq s p =
    seq { for m in matches s p ->
            Seq.skip 1 (seq { for g in m.Groups -> g.Value }) }
    |> Seq.concat

let csvRegex = "\"([\w\s:;~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)(?:\",|\"$)"

let testLine = "\"31\",\"a 1\",\"b-2\",\"c+3\",\",.;~!@#$%^&*()\/?><,.|{}[]_+-\",\"\",\"14/05/2010 12:12:20 a.m.\",\"1: 2; 3. 4? 5[ 6] 7& 8*\",\"a,b\""

capturesSeq testLine csvRegex
|> Seq.iter (fun x -> printfn "%A" x)

Tuesday, May 25, 2010

Parsing CSV Escaped with Speech Marks

I just know this isn’t the right way to do this, but what the heck – it seems to work for me.

The simple match is "([\w,]*)",+|"([\w,]*)"$ but I needed to account for all the other commonly occurring characters such as %&* etc. ( I would’ve thought I could just use .* but I can’t seem to get it to work.)

I’m using the F# Active Pattern approach to divvy up the matches – and returning the Match objects into the seq rather than breaking out the capture in the active pattern. This was more useful to me when I was using the matches later on in the code.

open System.Text.RegularExpressions

let (|ActiveRegex|_|) regex str =
    let ms = Regex(regex).Matches(str)
    if ms.Count > 0
    then Some ([ for m in ms -> m ])
    else None

let matches s re =
    match s with
    | ActiveRegex re results -> results
    | _ -> []

let testLine = "\"31\",\"a 1\",\"b-2\",\"c+3\",\",.;~!@#$%^&*()\/?><,.|{}[]_+-\",\"\",\"14/05/2010 12:12:20 a.m.\",\"1: 2; 3. 4? 5[ 6] 7& 8*\",\"a,b\""

matches testLine "\"([\w\s:;~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)\",+|\"([\w\s:;~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)\"$"

let printMatches s p =
    for m in matches s p do
        seq { for g in m.Groups -> g }
        |> Seq.skip 1
        |> Seq.iter (fun x -> printfn "%A" x)

printMatches testLine "\"([\w\s:;~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)\",+|\"([\w\s:;~!@#$%\^&\*_<>,\.\\\/\|\[\]\{\}\(\)\-\+\?]*)\"$"
  

Thursday, May 20, 2010

System Center Operations Manager Health Explorer Context

I’ve been wondering for a while why the Health Explorer doesn’t always display the context for a state change. I found one recent link through both Bing and Google that explains what’s going on: http://social.technet.microsoft.com/Forums/en-US/systemcenteressentials/thread/df725076-e307-4d63-9be9-5d875c6924b5/.

Basically, at the Entity level, if there are multiple sources of state change information then the UI doesn’t know how to represent that information, so it just says “No context was available for this state change event” and displays a bunch of state changes with no text… something like this:

[image]

What you need to do is drill down into Availability/Configuration/Performance etc., and as you do you’ll see the state change context appear.

Wednesday, May 19, 2010

Pascal’s Triangle in F#

Fun with F# :)

let rec PascalsTriangle = seq {
    yield [1]
    for aLine in PascalsTriangle ->
        List.append (1 :: Seq.toList (Seq.map (fun (x,y) -> x+y) (Seq.pairwise aLine))) [1]
}



> PascalsTriangle |> Seq.take 10 |> Seq.toList;;
val it : int list list =
  [[1]; [1; 1]; [1; 2; 1]; [1; 3; 3; 1]; [1; 4; 6; 4; 1]; [1; 5; 10; 10; 5; 1];
   [1; 6; 15; 20; 15; 6; 1]; [1; 7; 21; 35; 35; 21; 7; 1];
   [1; 8; 28; 56; 70; 56; 28; 8; 1]; [1; 9; 36; 84; 126; 126; 84; 36; 9; 1]]
>


 



I actually started this with a pipe-forward approach, but (at least with my knowledge of the syntax) it seemed longer to write down.



let rec PascalsTriangle1 = seq {
    yield [1]
    for aLine in PascalsTriangle1 ->
        let newLine =
            aLine
            |> Seq.pairwise
            |> Seq.map (fun (x,y) -> x+y)
            |> Seq.toList
        List.append (1::newLine) [1]
}

Friday, May 14, 2010

Discovering the presence of a database in a System Center Powershell script

I’m pretty sure there’s a better way to do this using the native SQL Server management pack but this worked for me (peppered with debugging statements). It looks for a database called staging and if it finds it, it returns the discovery information back.

      <Discovery ID="B.Staging.DiscoverProcessingComponent" Enabled="true" Target="B.Staging.ComputerRole" ConfirmDelivery="true" Remotable="true" Priority="Normal">
<
Category>Discovery</Category>
<
DiscoveryTypes>
<
DiscoveryClass TypeID="B.Staging.ProcessingComponent" />
</
DiscoveryTypes>
<
DataSource ID="PSScript" TypeID="Windows!Microsoft.Windows.TimedPowerShell.DiscoveryProvider">
<
IntervalSeconds>180</IntervalSeconds>
<
SyncTime />
<
ScriptName>DiscoverBStagingProcessingComponent.ps1</ScriptName>
<
ScriptBody><![CDATA[
param($sourceId, $managedEntityId, $computerName)

$api = New-Object -comObject 'MOM.ScriptAPI'
$api.LogScriptEvent("Processing discovery",101,2,"Created MOM.ScriptAPI with param sourceId = $sourceId , managedEntityId = $managedEntityId , computerName = $computerName ")

$discoveryData = $api.CreateDiscoveryData(0, $sourceId, $managedEntityId)
$api.LogScriptEvent("Processing discovery",101,2,"Executed CreateDiscoveryData")

[system.reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

$s=new-object('Microsoft.SqlServer.Management.Smo.Server') $computerName
$dbs=$s.databases
$dbs | get-member -membertype property
$staging = ($dbs | select name | where {$_.name -eq 'staging'})."Name"
$api.LogScriptEvent("Processing discovery",101,2,"Found anything: $staging")

$instance = $discoveryData.CreateClassInstance("$MPElement[Name='B.Staging.ProcessingComponent']$")
$api.LogScriptEvent("Processing discovery",101,2,"Executed CreateClassInstance")

if ($staging -eq 'staging') {
$instance.AddProperty("$MPElement[Name='Windows!Microsoft.Windows.Computer']/PrincipalName$", $computerName)
$instance.AddProperty("$MPElement[Name='B.Staging.ProcessingComponent']/B.Staging.ProcessingComponentKey$", "staging")

$api.LogScriptEvent("Processing discovery",101,2,"In the staging branch")

} else {
$instance.AddProperty("$MPElement[Name='Windows!Microsoft.Windows.Computer']/PrincipalName$", "")
$instance.AddProperty("$MPElement[Name='B.Staging.ProcessingComponent']/B.Staging.ProcessingComponentKey$", "")

$api.LogScriptEvent("Processing discovery",101,2,"In the not-staging branch")

}

$api.LogScriptEvent("Processing discovery",101,2,"About to AddInstance")

$discoveryData.AddInstance($instance)

$api.LogScriptEvent("Processing discovery",101,2,"About to return discoveryData")

$discoveryData

]]></ScriptBody>
<
Parameters>
<
Parameter>
<
Name>sourceID</Name>
<
Value>$MPElement$</Value>
</
Parameter>
<
Parameter>
<
Name>managedEntityID</Name>
<
Value>$Target/Id$</Value>
</
Parameter>
<
Parameter>
<
Name>computerName</Name>
<
Value>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Value>
</
Parameter>
</
Parameters>
<
TimeoutSeconds>300</TimeoutSeconds>
</
DataSource>
</
Discovery>

Lessons in System Center Operations Manager Discovery

Some quick notes before I forget them.

General Hints

  • Run through the Management Pack Authoring Guide first. It’s useful, but I feel it didn’t give me enough of an explanation of what’s actually going on under the covers. Much of the inner workings of System Center remain a mystery to me.
  • Set up a single all-in-one dev environment to allow yourself the chance to really play around when you’re not sure what’s going on. Then load an agent onto your normal desktop workstation/laptop and test discoveries out on that machine at the same time as you begin deploying your management packs into the production environment.
  • Get a copy of Savision Live Maps – a 5-diagram copy is freely available. It has a simple, easy to use interface for finding your class instances and testing out your discoveries. And, I’ve just discovered – there’s a new version out – v5 is RTM – gotta try it.

Registry Discovery

  • Use the filtered registry discovery, not the unfiltered discovery. This ensures you get the Build Event Expression page, which enables you to test for the existence of a registry key.

WMI Discovery

  • Use wbemtest.exe to try out queries and drill down into the CIM classes.
  • Documentation to help you with WMI is again weak. Classic example: at the time of writing there’s a community comment highlighting that the SQL Server 2008 ComputerManagement namespace is incorrectly labelled (it should be root\Microsoft\SqlServer\ComputerManagement10 – you can verify by running select * from __namespace under root\Microsoft\SqlServer).

Powershell Script Discovery

Thursday, May 06, 2010

Adding SCOM Class Properties to the DGML

Updated transform for SCOM/OpsMgr management pack diagramming with DGML. This time I added the SCOM class attributes as properties on DGML nodes – perhaps I should’ve used categories? The legend in the Visual Studio viewer then lets you highlight parts of the class model based upon properties.

[image: scr1]

Next thing would be to include attributes about relationships as properties on the links. Also, what about referencing other System Center management packs?

<?xmlversion="1.0" encoding="utf-8"?>
<
xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://schemas.microsoft.com/vs/2009/dgml" version="1.0">
    <
xsl:outputmethod="xml" indent="yes"/>
    <
xsl:templatematch="/">
        <
DirectedGraph>
            <
Nodes>
                <
xsl:apply-templatesselect="ManagementPack/TypeDefinitions/EntityTypes/ClassTypes/ClassType" />
            </
Nodes>
            <
Links>
                <
xsl:apply-templatesselect="ManagementPack/TypeDefinitions/EntityTypes/RelationshipTypes/RelationshipType" />
            </
Links>
            <
Categories>
            </
Categories>
            <
Properties>
                <
PropertyId="Accessibility" Label="Accessibility" Description="Accessibility" DataType="System.String" />
                <
PropertyId="Abstract" Label="Abstract" Description="Whether this is an abstract class" DataType="System.Boolean" />
                <
PropertyId="Base" Label="Base" Description="Whether this is a base class" DataType="System.String"/>
                <
PropertyId="Hosted" Label="Hosted" Description="Whether this is hosted" DataType="System.Boolean"/>
                <
PropertyId="Singleton" Label="Singleton" Description="Whether this is singleton" DataType="System.Boolean"/>
            </
Properties>
        </
DirectedGraph>
    </
xsl:template>
    <
xsl:templatematch="ManagementPack/TypeDefinitions/EntityTypes/ClassTypes/ClassType">
        <
xsl:elementname="Node">
            <
xsl:attributename="Id">
                <
xsl:value-ofselect="@ID"/>
            </
xsl:attribute>
            <
xsl:attributename="Accessibility">
                <
xsl:value-ofselect="@Accessibility"/>
            </
xsl:attribute>
            <
xsl:attributename="Abstract">
                <
xsl:value-ofselect="@Abstract"/>
            </
xsl:attribute>
            <
xsl:attributename="Base">
                <
xsl:value-ofselect="@Base"/>
            </
xsl:attribute>
            <
xsl:attributename="Hosted">
                <
xsl:value-ofselect="@Hosted"/>
            </
xsl:attribute>
            <
xsl:attributename="Singleton">
                <
xsl:value-ofselect="@Singleton"/>
            </
xsl:attribute>
        </
xsl:element>
    </
xsl:template>

    <
xsl:templatematch="ManagementPack/TypeDefinitions/EntityTypes/RelationshipTypes/RelationshipType">
        <
Link>
            <
xsl:attributename="Source">
                <
xsl:value-ofselect="Source"/>
            </
xsl:attribute>
            <
xsl:attributename="Target">
                <
xsl:value-ofselect="Target"/>
            </
xsl:attribute>
        </
Link>
    </
xsl:template>
</
xsl:stylesheet>

Monday, May 03, 2010

WMI Web Site Discovery with System Center Operations Manager

Learning how to perform a discovery with WMI under SCOM/OpsMgr isn’t easy – documentation is appalling and there are few examples on the net. Here’s my eventually successful attempt, with some lessons learned along the way.

First off a quick description of the original problem – I want to discover multiple instances of web services running on servers.

After much, much time spent building a single server SCOM/OpsMgr development environment (just exactly what do people call this product anyway?) and figuring my way tentatively through SCOM and the SCOM Authoring Console I defined a class based on Microsoft.Windows.LocalApplication. If you ever have any issues with this stuff a good place to go is the Operations Manager Authoring discussion board – one of my questions and the helpful response from Elizabeth 1978 is here.

[image: clip_image002[17]]

Then I created a discovery based upon a WMI query against the root\webadministration namespace to retrieve a list of sites. (Try it out in PowerShell first: get-wmiobject -query "select * from site" -namespace root\webadministration.)

[image: clip_image002[19]]

 

Now try to simulate this (learning to use the simulate tool is an experiment in itself – remember to load the MP into SCOM, then click the arrow heads to attach to your development RMS so expressions can be resolved)… and it doesn’t work.

[image: clip_image002[21]]

[image: clip_image002[23]]

[image: clip_image002[29]]

Strange. Then by trial and error I found that the asterisk doesn’t allow SCOM to pick up the data element for the name from the WMI query. Basically, it seems you need to identify each value you want returned in the WMI query to be sure you can map it in the Authoring Console mapper screen.

So this works:

[image: clip_image002[25]]

You can check by running a simulation.

[image: clip_image002[27]]

And to be doubly sure I imported the management pack into the SCOM environment with a monitor that targeted the class and checked when the ASP.NET Apps v4.0.30319 Requests/Sec counter was in excess of 10 requests/second.

As a web service I used a WCF service – just a simple hello world affair (in F# – why is there no template in Visual Studio??) – and exercised it from a PowerShell console.

1..1000 | foreach-object {(new-object system.net.webclient).downloadstring("http://localhost/service.svc") }

Thursday, April 22, 2010

“Ah Ha” Moment

So if I can convert a system definition model from OpsMgr/SCOM to directed graph markup language… then presumably I could use VS2010 to create my architecture model and then generate the framework for my OpsMgr/SCOM management pack!

Update… it seems to me that none of the VS2010 model types make sense with SCOM. Guess it might be better to wait till the System Center team publish a modelling tool.

Wednesday, April 21, 2010

SCOM Class Hierarchy to Directed Graph Markup

As part of a struggle to get to grips with management pack authoring for System Center Operations Manager I wanted a way to diagram the relationships between classes. You can do it with the OpsMgr Authoring kit and Visio… but I don’t have them on my machine. However, with the management pack being XML, it seemed a simple job to transform to directed graph markup language to view inside Visual Studio 2010. Here’s a sample XSL transform.

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:template match="/">
<DirectedGraph xmlns="http://schemas.microsoft.com/vs/2009/dgml">
<Nodes>
</Nodes>
<Links>
<xsl:for-each select="ManagementPack/TypeDefinitions/EntityTypes/RelationshipTypes/RelationshipType">
<Link>
<xsl:attribute name="Source"><xsl:value-of select="Source"/></xsl:attribute>
<xsl:attribute name="Target"><xsl:value-of select="Target"/></xsl:attribute>
</Link>
</xsl:for-each>
</Links>
<Categories>
</Categories>
</DirectedGraph>
</xsl:template>
</xsl:stylesheet>




This seems to work with the sample management pack created on Brian Wren’s blog for an article about embedded PowerShell scripts. Using the transform and either the Altova XSLT 1.0 processor or the native transform in VS2010, I got a DGML file that displays like this.



[image]
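
For the record, the transform can also be run from F# interactive using .NET’s XslCompiledTransform – the file paths here are placeholders:

open System.Xml.Xsl

let xslt = XslCompiledTransform()
xslt.Load(@"c:\temp\mp2dgml.xsl")                              // the stylesheet above
xslt.Transform(@"c:\temp\MyPack.xml", @"c:\temp\MyPack.dgml")  // management pack in, DGML out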

Monday, January 18, 2010

PowerShell Data Munging

There’s a data munging exercise on Rosetta Code which was missing a PowerShell solution. Since I spend a lot of time importing data to graph and present in my day job – often using PowerShell – I thought I’d add a couple of PowerShell options for the import of the data in the Data Munging 2 problem.

First just using iteration alone to look up good values.

$dateHash = @{}
$goodLineCount = 0
get-content c:\temp\readings.txt |
ForEach-Object {
    $line = $_.split(" `t",2)
    if ($dateHash.containskey($line[0])) {
        $line[0] + " is duplicated"
    } else {
        $dateHash.add($line[0], $line[1])
    }
    # split up the 24 instrument values and count the total number of entries with flag >= 1
    $readings = $line[1].split()
    $goodLine = $true
    if ($readings.count -ne 48) { $goodLine = $false; "incorrect line length : $line[0]" }
    for ($i = 0; $i -lt $readings.count; $i++) {
        if ($i % 2 -ne 0) {
            if ([int]$readings[$i] -lt 1) {
                $goodLine = $false
            }
        }
    }
    if ($goodLine) { $goodLineCount++ }
}
$goodLineCount




And secondly, taking advantage of regular expression syntax.



$dateHash = @{}
$goodLineCount = 0
ForEach ($rawLine in (get-content c:\temp\readings.txt)) {
    $line = $rawLine.split(" `t",2)
    if ($dateHash.containskey($line[0])) {
        $line[0] + " is duplicated"
    } else {
        $dateHash.add($line[0], $line[1])
    }
    $readings = [regex]::matches($line[1],"\d+\.\d+\s-?\d")
    if ($readings.count -ne 24) { "incorrect number of readings for date " + $line[0] }
    $goodLine = $true
    foreach ($flagMatch in [regex]::matches($line[1],"\d\.\d*\s(?<flag>-?\d)")) {
        if ([int][string]$flagMatch.groups["flag"].value -lt 1) {
            $goodLine = $false
        }
    }
    if ($goodLine) { $goodLineCount++ }
}
[string]$goodLineCount + " good lines"




The F# solution on the site includes use of Seq.forall – I thought maybe this would be useful to implement for the PowerShell solution as well, but I couldn’t figure it out. Fortunately, the good folks at Stack Overflow helped out on that...
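
For reference, the Seq.forall idea looks something like this in F# (with the flags assumed already parsed to ints):

let isGoodLine (flags: int seq) = flags |> Seq.forall (fun f -> f >= 1)

isGoodLine [1; 2; 1]  // true
isGoodLine [1; -2; 1] // false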