Outlook: Decline Meeting Invite but Save a Copy

Outlook doesn’t offer a built-in feature to decline a meeting invitation whilst keeping a copy of the meeting in your calendar. In this blog post I’ll explain how to add this in.

Why is this even useful though? Sometimes I get invited to a meeting that I can’t / won’t attend but want to keep the attached papers for, or at least a record of when the meeting took place and who was invited. I’d like to let the organiser know that I’ve received their invitation but won’t be attending, however Outlook’s “Decline” response auto-deletes the invitation and appointment – so I’ve lost all that good information. Alas there’s no option to change the behaviour of “Decline”.

What I want is a button I can click (from the meeting invitation email) that will send a “decline” response back to the organiser, keep (a copy of) the meeting in my calendar, and mark the appointment as “Free”.

This is actually pretty easy in Outlook 2013, and I’ve drawn on some neat code to accept or decline a meeting request using VBA from Diane Poremsky at Slipstick, and a handy sub-routine to copy attachments on Outlook items from Sue Mosher.

Here’s the VBA code you need to add:

' Decline the selected meeting request, but first save a copy of the
' appointment (marked as Free) in the calendar.
Public Sub SaveAndDecline()
    Dim oAppt As AppointmentItem
    Dim cAppt As AppointmentItem
    Dim oResponse As MeetingItem

    ' cAppt is the appointment behind the selected meeting request;
    ' oAppt is the standalone copy that will survive the decline.
    Set cAppt = GetCurrentItem.GetAssociatedAppointment(True)
    Set oAppt = Application.CreateItem(olAppointmentItem)

    With oAppt
        .Subject = "DECLINED: " & cAppt.Subject
        .Location = cAppt.Location
        .Body = cAppt.Body
        .Categories = cAppt.Categories
        .Importance = cAppt.Importance
        .Start = cAppt.Start
        .Duration = cAppt.Duration
        .BusyStatus = olFree
        .RequiredAttendees = cAppt.RequiredAttendees
        .OptionalAttendees = cAppt.OptionalAttendees
        .Resources = cAppt.Resources
        .ReminderSet = cAppt.ReminderSet
        .ReminderMinutesBeforeStart = cAppt.ReminderMinutesBeforeStart
        .ResponseRequested = cAppt.ResponseRequested
        .AllDayEvent = cAppt.AllDayEvent
        Call CopyAttachments(cAppt, oAppt)
        .Save
    End With

    ' Send the decline; Outlook deletes the original invitation and
    ' appointment, but our copy remains.
    Set oResponse = cAppt.Respond(olMeetingDeclined, True)
    oResponse.Send

    Set cAppt = Nothing
    Set oAppt = Nothing
End Sub

Function GetCurrentItem() As Object
    Dim objApp As Outlook.Application
 
    Set objApp = Application
    On Error Resume Next
    Select Case TypeName(objApp.ActiveWindow)
        Case "Explorer"
            Set GetCurrentItem = objApp.ActiveExplorer.Selection.Item(1)
        Case "Inspector"
            Set GetCurrentItem = objApp.ActiveInspector.CurrentItem
    End Select
    Set objApp = Nothing
End Function

Sub CopyAttachments(objSourceItem, objTargetItem)
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set fldTemp = fso.GetSpecialFolder(2) ' TemporaryFolder
    strPath = fldTemp.Path & "\"

    For Each objAtt In objSourceItem.Attachments
        strFile = strPath & "outlook_att_copy-" & objAtt.FileName
        objAtt.SaveAsFile strFile
        objTargetItem.Attachments.Add strFile, , , objAtt.DisplayName
        fso.DeleteFile strFile
    Next

    Set fldTemp = Nothing
    Set fso = Nothing
End Sub

Notice that I’ve copied quite a few more properties of the appointment over – setting the Free/Busy status to Free on the way.

Once you’ve added this to a VBA module in Outlook you can associate it with a button on the Main / Home ribbon for easy access when reading email.

Digital Photography: The Raw or JPG Question

Most digital photographers who go beyond simply taking a few snapshots encounter the question of whether to “shoot in raw” or work with the JPG images produced by most cameras. There are some excellent guides on the internet that explain the technical details behind this decision and offer balanced lists of pros and cons for you to consider. For me though, these articles seemed to say it was just a trade-off between quality and storage space. In terms of technical details that’s a fair assessment, but it’s only when this makes a significant difference to your photography that it actually matters.

When you take a picture with a digital camera, the sensor is exposed to light which produces a pattern of electrical signals that are read off by the camera electronics. This relates to the amount of red / blue / green light falling on each part of the sensor, and a bit of maths can convert the signal details into a picture. The maths needs some other inputs to convert the electrical signals into the coloured pixels making up the resulting image, and these take into account various details such as white balance, ISO, lens distortion and image enhancement features such as sharpening. The raw image format for your camera typically stores the direct readings from the camera sensor, plus the other settings at the time you took the photograph. The JPG image from your camera stores the resulting picture after the maths has been done – and for a common and fairly wide range of scenes it’ll look great.

The key difference here is that, provided that you have software that can replicate the raw -> JPG conversion that your camera can do (and in many cases this is actually possible on the camera itself), then you can re-run the conversion with other settings. For example, lighting conditions might have changed but you didn’t adjust the white balance on your camera, or maybe you underexposed an image in manual mode – no problem: you can simply go back and create a second rendering of the image from the raw data, with the settings adjusted to correct the white balance or effectively raise the ISO. That’s amazing – it means that you can worry less about your camera settings and more about photographing things that might never happen again.

Raw images typically also record more detail in the image (this is the quality factor) – often using 12-, 14-, or even 16-bits per colour per pixel, whereas most JPG output will be at 8-bits per colour per pixel. The most wonderful thing about this is that the extra detail recorded in very dark (or very light) parts of your image – whilst all but invisible to the human eye – can be used to recover detail using the Shadow and Highlight recovery tools in your photo processing software.
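The difference in tonal resolution is easy to put a number on – each extra bit doubles the number of distinguishable levels per channel:

```shell
# Levels per colour channel for common raw and JPG bit depths.
for bits in 8 12 14 16; do
  echo "${bits}-bit: $((1 << bits)) levels per channel"
done
```

So a 12-bit raw file records 16 times as many tonal levels per channel as an 8-bit JPG – that’s the headroom that shadow and highlight recovery exploits.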

I use a Nikon D3300 which can give me the Nikon proprietary NEF (raw) format, and/or a JPG output. A common situation where I struggle with taking photographs is at stage performances. Anyone who has tried to photograph a school play will be familiar with the combination of low-level lighting, moving subjects, an awkward location to photograph from, and the pressure to come out with a photo that does justice to the performers (and the photographer). I like to ensure that the exposure is fast enough to avoid motion blur, but this often results in very dark images even with the largest aperture available, or sometimes I want to narrow the aperture to increase the depth of field. I also don’t want to use flash as it’s often the case that the ambient light is key to the mood of the image.

By using the raw format I can shoot with my desired exposure and aperture settings (often using manual mode and/or keeping the ISO low), bring the dark images down to my computer, run them through Shadow recovery (I use Adobe Lightroom – often the Auto Tone feature does 90% of the work for me), adjust the sharpening a little, and then print or generate a JPG for uploading to Flickr. Sometimes a little smoothing is needed to reduce the noise that is inherent in low-light photography.

What started off looking like an almost completely black image ends up as a nice clear picture with the main subjects picked out in the ambient light – and any highlights aren’t blown out because I’ve used Shadow recovery rather than just increasing the overall exposure. Yes, my raw files are around 12MB where the JPGs were only 4MB, but at the rate I’m storing pictures it’s going to take several years to fill up my 2TB drive – so I’m happy to say “Hurrah for shooting in raw!”.

Follow-up / Review Meetings in Outlook

Sometimes you use Outlook to arrange a meeting with a bunch of people, and the last action you take is to organise a review or follow-up meeting a few days later. Wouldn’t it be handy if you could “clone” the existing meeting request to another date with small adjustments to the timing, location, and invitee list?

The trouble is, whilst the Copy command in Outlook allows you to create a duplicate event, this has limited use for all-day events, and doesn’t enable you to amend meeting details before the copy is saved to your calendar. Outlook provides handy Quick Steps to let you create a new Task from a Message, or a new meeting with preset Subject and Recipients, but nothing to help with creating new meetings based on an existing appointment.

The good news is that help is at hand! The good folk over at HowTo-Outlook have provided a guide to adding a Create new meeting based on this meeting button for all recent versions of Outlook.

Their simple guide provides a short piece of VBA code that you can cut-and-paste into Outlook and link to a new “Clone meeting” button on your toolbar/ribbon. It took me about 2 minutes to get this in place, and it’s already saved more time than that.

Towards the end of their guide, HowTo-Outlook show how you can make modifications to the fields that are copied to the new appointment. I found that a number of key fields were missing from the original code, so here’s my modified block under With olApptCopy:

With olApptCopy
    .Subject = olAppt.Subject
    .Location = olAppt.Location
    .Body = olAppt.Body
    .Categories = olAppt.Categories
    .Importance = olAppt.Importance
    .Duration = olAppt.Duration
    .RequiredAttendees = olAppt.RequiredAttendees
    .OptionalAttendees = olAppt.OptionalAttendees
    .Resources = olAppt.Resources
    .ReminderSet = olAppt.ReminderSet
    .ReminderMinutesBeforeStart = olAppt.ReminderMinutesBeforeStart
    .ResponseRequested = olAppt.ResponseRequested
    .AllDayEvent = olAppt.AllDayEvent
End With

This will copy over your attendee lists, any resources needed, the meeting duration, importance flag, reminders, and whether a response from attendees is requested.

Decrypt (or Encrypt) NAS4Free Config

I’ve written about NAS4Free before – it’s a super NAS solution that packs a tonne of great features and offers good performance even on older hardware, ideal for a home file/media server. During a recent upgrade (9.3 to 10.2) I found myself wanting to decrypt the encrypted backup that you (should) normally take prior to any upgrade attempt. There doesn’t seem to be much documentation on this elsewhere, so I’ve recorded a procedure here in case anyone else wants to do this.

I’m doing this under Windows 7 Enterprise 64-bit, but there are straightforward equivalents for most (all?) of this on other platforms.

The key element to all of this is that NAS4Free uses a plain and sane way to create the encrypted configuration file: the configuration is first encrypted with the admin password (and “salt”) using the AES-256-CBC cipher, the result is then base64 encoded, and finally compressed with GZIP.
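On a platform with a shell, the whole transform can be sketched with standard tools – the file names, sample content, and `adminpw` password here are purely illustrative:

```shell
# Forward direction (what NAS4Free does when you download a backup):
# encrypt with AES-256-CBC, base64 encode, then gzip.
printf '<nas4free><system/></nas4free>' > config.xml
openssl enc -aes-256-cbc -a -salt -pass pass:adminpw \
    -in config.xml -out config.b64
gzip -c config.b64 > config-backup.gz

# Reverse direction (recovering the XML):
# gunzip, then base64 decode and decrypt in one openssl call.
gunzip -c config-backup.gz > config.b64.out
openssl enc -aes-256-cbc -d -a -pass pass:adminpw \
    -in config.b64.out -out config-restored.xml
cmp config.xml config-restored.xml && echo "round trip OK"
```

Note that this demonstrates the shape of the pipeline rather than byte-for-byte what NAS4Free produces – key-derivation defaults vary between OpenSSL builds, so always decrypt with the same approach you encrypted with.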

Decrypt NAS4Free Config

You can download an unencrypted config from NAS4Free, but it’s generally not a good idea to store these as they contain the passwords for your admin and user accounts in plain text. Also, if you are recovering from a broken NAS4Free installation then you may not be able to download a config at all, in which case being able to see the settings from a previous installation may help you figure out what broke it.

  1. You should start out with an encrypted backup file from NAS4Free (System | Backup/Restore), the default filename will be something like C:\Temp\config-nas4free.local-20160213090000.gz.
  2. Begin by decompressing this file. I use 7-zip, so I can just right-click the file and choose Extract here.
  3. Now you can decrypt this file using OpenSSL. A Windows binary is available from Shining Light Productions. I installed the 64-bit Windows OpenSSL binaries to the default location (C:\OpenSSL-Win64), so hold Shift, right-click that folder, and choose Open command window here.
  4. OpenSSL can remove the base64 encoding and decrypt using the command
    openssl enc -aes-256-cbc -d -a -in C:\Temp\config-nas4free.local-20160213090000 -out C:\Temp\config-nas4free.local-20160213090000.xml. Unless you’ve done some previous work with OpenSSL you’ll get a warning (about being unable to open a config file) that can safely be ignored, followed by a prompt for the decryption password. With NAS4Free 9.3 and earlier this is your admin account password; for later versions it is the encryption password you entered when you downloaded the configuration.
    N4F-Decrypt-Config-OpenSSL
  5. This will drop your NAS4Free config into an XML file ready for you to browse. Note that NAS4Free does not allow you to upload an unencrypted configuration.

Encrypt NAS4Free Config

Used in conjunction with the decryption routine above, this would enable you to download an encrypted configuration file (or take a previous one), make modifications to it, and upload / restore the result. This is not for the faint-hearted or inexperienced user as it could seriously break your NAS4Free setup and potentially damage the data you have stored in it – you should be confident that you know what you’re doing before heading down this route.

  1. Start with your unencrypted XML configuration, I’ll assume that’s in C:\Temp\config-nas4free.local-20160213090000-new.xml
  2. Use OpenSSL to encrypt and base64 encode this, same as step 4 above, but with a slightly different command: openssl enc -aes-256-cbc -a -in C:\Temp\config-nas4free.local-20160213090000-new.xml -out C:\Temp\config-nas4free.local-20160213090000-new
  3. Right-click the output file and use 7-Zip to create a new archive.
  4. Set the archive type to GZIP and click OK.
  5. Your compressed, encrypted config will now be ready to upload.

High-Def Settings for Handbrake

I’ve been using Handbrake to rip films for years – it does a great job – but my preferred settings have evolved over time as I’ve learned from successes and failures on an ever widening digital library. Some key refinements took place when I switched from DVD to Bluray recently, as part of the inexorable move to high-def and the wonderful visual detail that it brings. Here are my settings, with the key elements of rationale behind them.

First up, the outline process: I rip my DVD and Bluray media using MakeMKV, encode using Handbrake, store with NAS4Free, and watch through Kodi.

When ripping, find the main feature and then keep the video track, your preferred audio track (I use the DTS-Core as the DTS-Master is just huge and I don’t have the decoder – or ears – to benefit from it just now), and all of the subtitle tracks (you’ll want all the ones in your own language, and it’s quicker than unchecking each one that you don’t need).

Load it up into Handbrake, and choose the following options:

Output & Picture Settings:

Handbrake-Picture-Settings

Output Settings:

  • Container: MKV

Picture Settings:

  • Width: match source, usually 1920 for HD films
  • Anamorphic: Loose
  • Modulus: 2
  • Cropping: Automatic

You’ll notice that the source was 1920×1080, but that Handbrake is going to automatically trim a horizontal strip of 140 pixels (or thereabouts, depending on your film) from the top and bottom. Don’t worry, this is expected on an HD film, and will still result in 1080p output – it just saves having to encode those black strips all the way through.

Note that the size here has a huge impact on output size. The French film Switch is in 4:3 aspect ratio at 1080p which means that there are no black bars to trim off – there is nearly 50% more screen to cover than usual so the encoded output is proportionately larger.

Filters Settings

Handbrake-Filters-Settings

  • Decomb: Decomb, Default
  • Everything else is Off

For some very grainy films (read Matt Gadient’s post on de-noising in Handbrake) you might need to adjust the Denoise filter. I found Bourne Legacy to benefit from some denoising – strong spatial and a little temporal denoising did little damage to the final quality but reduced the file size by around 40%.

Video Settings

Handbrake-Video-Settings

  • Video Codec: H.264
  • Framerate: As source
  • Constant Quality: 21
  • x264 Preset: Very Fast
  • x264 Tune: Film (adjust to suit media)
  • H.264 Profile: High
  • H.264 Level: 4.1

Two things that might be contentious here.

First of all the RF setting of 21. I used to encode DVDs with a much lower (higher quality) rate factor, but with the extra detail of a high-def source it just doesn’t seem to need it, and RF 21 is a fair balance on file size for me too. Adjust to taste I guess.

The second contention is the x264 preset of Very Fast. This turns off lots of the x264 features that are designed to enable better quality & smaller file size at the expense of encoding speed. The trouble is, after extensive testing on a variety of films, I just didn’t find the trade-off worthwhile. Very Fast – uniquely – seems to hit the quality point bang on, often gets the best compression of all the presets, and (of course) is pretty nippy.

Audio Settings

Handbrake-Audio-Settings

  • Add the DTS-Core track with these settings:
    • Codec: AC3
    • Bitrate: 448
    • Mixdown: 5.1 Channels
  • Add the DTS-Core track again with these settings:
    • Codec: AAC (avcodec)
    • Bitrate: 160
    • Mixdown: Dolby Pro Logic II

I add both an AC3 and an AAC soundtrack: whilst we mainly watch films on our media systems, which can handle AC3, sometimes other family members want to watch on “less enabled” devices.
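For anyone scripting batch encodes, the container, video, and audio choices above map onto HandBrakeCLI options roughly as follows. The file names are placeholders, and the flag names are from HandBrakeCLI’s documented options – they drift between releases (e.g. the decomb and format flags), so check `HandBrakeCLI --help` for your version. The command is printed here rather than executed:

```shell
# Build the HandBrakeCLI equivalent of the GUI settings above
# (printed for inspection; source.mkv/output.mkv are placeholders).
HB_CMD="HandBrakeCLI -i source.mkv -o output.mkv \
  -f av_mkv --loose-anamorphic --modulus 2 \
  -e x264 -q 21 --encoder-preset veryfast --encoder-tune film \
  --encoder-profile high --encoder-level 4.1 \
  --decomb \
  -a 1,1 -E ac3,av_aac -B 448,160 --mixdown 5point1,dpl2"
echo "$HB_CMD"
```

Note the `-a 1,1`: listing the same source track twice is how the CLI expresses “add the DTS-Core track again”, with the second `-E`/`-B`/`--mixdown` values applying to the second copy. Once you’re happy with the string, drop the echo and run the command directly.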

Subtitles Settings

Handbrake-Subtitles-Settings

  • Add a suitable subtitle track for your preferred language, leave all options unticked

This would correspond to just the second row in the screenshot above. Why did I add the first? This was from an encode of The Bourne Ultimatum, which has those teletype overlays telling you the location of a scene (e.g. “Moscow, Russia”). I wanted these shown, but not the full subtitling. It turned out that for this source, rather than using the “Forced” flag to identify these, a separate subtitle track (track 28) had been provided and set as default. I used VLC Player to determine which track I wanted – I just played the relevant section through with each of the subtitle tracks.

Conclusion

I’m getting output of around 3.5GB for a 2hr film – that’s 1080p with an AC3 soundtrack, which is perfect for my combination of playback kit.

I hope these settings may be of use to others!

Why Aren’t All Recipes Like This?

Working in IT means that I’m pretty familiar with process diagrams and explaining technical terms in plain English. I also cook a fair bit and just recently it occurred to me that, despite being around for centuries, recipes are still written like Victorian science experiments.

Traditional recipes don’t make it easy to work out up front where or when you’re going to need each ingredient, when the busy points are, and when you can let things cook on while you wash up or grab that glass of wine. You have to read the whole thing in detail before you can get a feel for how many steps there are. Each ingredient is duplicated – once in the ingredients list and again in the method. It’s not always clear when you are going to be preparing things in parallel (marinate one item while chopping another) or in series (cook 1 minute then add an ingredient). These are exactly the things that process diagrams are highly effective at communicating quickly and easily.

I tried re-writing a recipe as a process diagram, and found that it really was easy to work with. So why aren’t all recipes like this one?

Beef and Kale Recipe – Process Version

Upgrading NAS4Free Embedded

I chose NAS4Free as my home storage solution. As the name suggests, it’s free, but it’s also packed with file storage goodness such as UPNP/DLNA and DAAP support. This put it a touch ahead of similar packages that stop once they’ve got CIFS support and a dabble of all-but-useless things like FTP.

For my own configuration, a 6-year old desktop PC seems to run everything fast enough, even with ZFS on the storage. It’s only got a Core2 Duo 2.13GHz CPU and 4GB DDR2 standard issue RAM, and the drive is a Seagate 2TB SATA drive. Only one drive inside the system – experience to date indicates that user error and malware are more likely to deal a data-blow than disk failure. As a result I prefer to take backups to an external disk (I can also keep a copy off-site to protect against fire and theft).

I opted to deploy using the embedded NAS4Free version on a bootable USB. After a bit of hassle turning off the Intel MBEx BIOS features which prevented a USB boot, it all worked a treat. For the initial setup I followed the official installation guide – download and burn the LiveCD image, boot off it and install to a USB key (I did this on the NAS itself, but you could just as easily do this from your normal machine as it doesn’t touch any of your other drives). Insert the USB key into my NAS box (with a keyboard and monitor temporarily attached), boot up and go through the first time configuration, job done.

When it came to my first upgrade I hit the 9.3.0.2 issue that the firmware upgrade (System|Firmware) wouldn’t work. I had two options, both wrapped by saving and restoring my NAS4Free configuration: (1) write an updated LiveCD and use this to install a NAS4Free bootable USB with the updated embedded NAS4Free image on it; (2) write the updated embedded image file directly to USB (following some good instructions) and save loads of waiting around.

Obviously I went for option (2) as it was quicker – and it worked really smoothly. There was an error message in System|Firmware about the boot partition being too small, but it was clear that this wouldn’t affect operations. Alas, it does affect upgrades and when the next one came along neither the web GUI firmware nor the LiveCD would upgrade my USB key. So I had to sort it out properly:

  1. Download the latest Live CD image (remember to pick the right architecture) and burn to CD;
  2. Boot the LiveCD, insert a new USB key, select option 3 to install the embedded version (the options changed between versions – check carefully) – you can do this step on any PC as it doesn’t touch the internal drives;
  3. Take an encrypted backup of your NAS4Free configuration (System|Backup/Restore);
  4. Shutdown NAS4Free, remove your old USB boot key, insert the new one, and boot;
  5. When you hear the start-up beeps, you are ready to login to the NAS4Free web GUI. However, it’ll be on the default IP address of 192.168.1.250 so unless you happen to be using this subnet too (and that’s a fairly common situation so you may be lucky) you’ll either need to plug in a keyboard and monitor to configure this, or use something like WinIPConfig to temporarily add another address in that subnet (e.g. 192.168.1.15);
  6. Once you’ve got access to the Web GUI, login with the default credentials (admin:nas4free) and set the admin password to the one you used before in System|General|Password;
  7. Restore your saved configuration (System|Backup/Restore) – this will automatically reboot and you’ll be up and running (NB: the restore will fail if you haven’t set the admin password correctly).

Easy! This method took about 15 minutes in all, but meant that my USB embedded image was now correctly sized for future in-place upgrades.

Going Walking? Pack It All!

I’d managed to get three overnight bags, two children, the birthday presents, and the camp bed into the car ready to head down to a family weekend. As I reversed out of the drive the usual panic set in – I’d obviously forgotten something critical, just not sure whether it was underwear, toothbrushes, or the laptop (you’re never really off duty when you work in IT). Nope, I remembered packing them all. Great!

We must have made it a whole mile down the road before I remembered that the laptop battery was dead and I’d left the charger at home. Given the traffic at that time of day it was a 15 minute round trip to remedy the mistake.

Heading back out of our road, my eldest daughter sighed and said “Daddy, you should make a list so you don’t forget things.” Quite right. So I did. One list for when we go camping, and one for going walking. A few friends heard about my lists, but rather than giving me one of those “Aw, old age setting in” expressions, they asked for a copy. I guess it gets us all in the end!

Anyway, here are my lists. I tend to adjust them a bit for each trip – with a pen rather than electronically.

Get the check lists here:

UCISA Support Services Conference 2014

Last week I attended this year’s UCISA SSC at Crewe Hall (which is near Crewe – ha!). The conference pulls in staff working in IT support roles from across UK Higher Education, from every corner of the land and every echelon of our organisations. The theme for this year was Supporting the Business with a focus on relationship management and customer service.

The quick version of this post is that I really enjoyed the conference and have come back with renewed enthusiasm, a whole raft of links with counterparts elsewhere who are facing similar challenges to us, and a stack of ideas for improvement that will take until the next USSC to implement!

This is a long post, so I’ve tried to make each session stand-alone – i.e. you can read one session and walk away with something gained.

Day 1 (2nd July)

We’re All in Customer Services Now (Carolyn Blunt)

Did you hear about the Sky customer who called up to say that their service was down, had been for days, and their young child was going to miss the most important show of the week…Peppa Pig? The engineer who was despatched turned up at the door on their own initiative to fix the problem, but with a surprise: a small cuddly Peppa Pig toy for the young child whose TV show had been missed.

Do we give our staff the freedom and resources to show initiative, to go the extra mile, to make a difference? Or do we confine them with processes, KPIs, and machine-like automation and efficiency? Neither is good in extreme, but both have something crucial to offer when you’re seeking customer satisfaction.

Discussions With a Difference (Service Portfolio)

DwD means break-out groups with a facilitator helping to open up a discussion on a given topic. Each session is different, and I’m sure that everyone takes away their own unique version of what happened.

My take on this was that no-one really knows what a Service Portfolio should look like. Several people referred to ITIL’s “Service Catalog plus future and retired services”, whilst others had some kind of high-level summary of the IT department’s output.

What did seem to be common is that this is probably not near the top of anyone’s to-do list.

Business Showcase (FrontRange)

I can’t really comment here as I was on stage for this one. I briefly outlined the challenge that Oxford’s IT Services department faced around consolidation of our IT support functions, and how we came to select FrontRange HEAT SaaS as the toolset to underpin our new operations. Steve Gardner (FrontRange UK & Ireland Sales Director) then presented the story from where FrontRange began working with us – how the relationship has developed and how the toolset has been implemented at Oxford.

Big Ideas: Innovation at University of Surrey (Gareth Edwards)

If the presenter’s name is familiar then you might remember that Gareth Edwards worked in Oxford (Local) IT before moving to Surrey.

His talk provided a brief insight into how his team set out to stimulate and collect ideas for change – innovation, engagement, ideas, and invention. At the heart of this was a system that allows people to post up their ideas – however crazy – and a periodic vote/poll to indicate interest. Top ideas were then taken forward in some way or another. This operated across IT Services (at Surrey) and elicited some 65 ideas, of which 8 have now been completed, and a further 9 are in the pipeline.

Some of the key devices to make this work were:

  1. Recognise that failure is an option – this is innovation and invention so the risks are high, but we’re guaranteed not to win if we don’t try
  2. Awards (see later talk on gamification) helped stimulate ideas and encourage engagement
  3. Tactics – voting became tactical (one idea received huge numbers of votes from one part of the organisation and none from elsewhere), and awards ended up being used tactically to bring out quiet areas or recognise input even when ideas weren’t taken forward immediately

Pecha Kucha (Various)

Pecha Kucha, that fast-paced helter-skelter presentation style that has become popular in IT circles, made the last formal session for the day. Jim Higham (Kent) reported on their approach to Change Management; Martin Putwain (MMU) gave us an insight into their approach to supplier engagement when buying service management software; Paul Mazumdar (Cambridge) talked eloquently about the challenges of staff development in federated organisations (he talked about Cambridge but described the situation at Oxford equally well); John Ireland (yep, me again) looked at recent thoughts around incident management in the distributed IT environment at Oxford; Tony Brett (more Oxford!) managed to fit a 40-minute talk into 6m40s on the subject of good communication and assertiveness – and won an award for his PK; and Chris Sexton (Sheffield – but also on Oxford IT Committee) subtly broke all the rules of the PK session and inspired us to love cows! (You had to be there).

Day 2 (3rd July)

Continuous Service Improvement (Lorraine Brown, St Andrews)

This is an IT Services team that put in for and won SDI 1-star certification, went right back and upped their targets to get 2-star certification, and then went round the circuit again and got 3-star certification! Of course, their sights are now set on 4-star certification.

Whilst certification might not be the end goal, it was clear that this has driven fundamental and highly visible improvements for customers and, increasingly, for the business.

A very simple mechanism that proved highly successful was an “improvements register” – a simple Excel spreadsheet of improvements that were needed to attain the next target level. Their use of incentives to get user feedback, and an eye-catching web site that focussed on user services from the very front page, showed a clear commitment to customer experience. Lessons to be learned here for the rest of us!

Business Relationship Management (Dean Phillips, Aberdeen)

Aberdeen’s IT organisational structure has a Director in charge of four main divisions: Application Management, Service Management, Infrastructure Management, and Business Relationship Management. That BRM sits alongside the other three says a lot about the importance attached to it.

If we were taking on BRM then Dean would suggest: Relationship Managers need to be senior posts – reporting to the Director (cf. senior lecturer level) and full-time; make sure the role is clear so it doesn’t become a catch-all for stuff that the Service Desk doesn’t deal with; and have a strategy.

(Dean also showed us an impressively simple 1-side-of-A4 strategy – something for us to aspire to perhaps?)

I met up with Dean later in the day and asked how the BRMs actually work in practice. At Aberdeen there is one for each of their 6 main colleges; each works 4 days a week in the college, attending committees and meetings, and one day a week back in IT to compare notes with their counterparts from other colleges and feed into IT management.

Panel Session: What Does a Successful Relationship Look Like?

Time flew by on this, indicating that a very useful time was had. Rhys Davies (Leeds) tackled the question head-on, offering that a successful relationship is “one that you wish to continue through choice”.

An interesting part of this discussion was around partnership rather than customer/supplier models, and equality in the relationship. It set me off on a train of thought. Start with an IT user who has no real choice or alternative about which IT supplier they use, and a supplier who is excited about IT but disengaged from the user’s raison d’être. Now contrast that with a supplier who, while still the only option, finds out about and takes a genuine interest in the user’s goals. Finally, imagine a situation where we – in-house IT – actually share our customers’ (research & education) goals. That could be an exciting world, where the value of IT increases dramatically through synergy and understanding of the real business of the University.

Discussions With a Difference (Managing Customer Expectations)

The second DwD session, and a new topic for me. Three clear messages for me:

  1. We can get more value out of our IT departmental rep relationships if we all come together and report back on user expectations & needs. A quick poll can indicate which needs are widespread and which are local. Food is a good incentive for getting reps together!
  2. Provide Customer Services training for staff outside of the service desk. “We are all in Customer Services now!” (did you read the first session on day 1?)
  3. Digital natives love self-help. Video guides and PDF instructions for things like diagnosing and fixing VPN issues go down really well and reduce calls to the service desk.

One IT: Integrated Approach to Service Delivery (Rhys Davies, Leeds)

What might we recognise from Rhys’ introduction as issues for Oxford:

  • SMT concerned about high levels of investment in IT but feeling of not getting much return;
  • “Academic Freedom” cited as an excuse for users to do whatever they want, regardless of supportability, efficiency, and effectiveness;
  • Change fatigue after 10+ years without real fruition;
  • No-one could state what “Good IT” would look like;
  • Lots of capital money around, but small operational budgets – created unsupportable ivory towers of IT project output.

The solution? Look at IT in the round, as a whole, instead of picking off fragments here and there.

Rhys had a great slide with a Ronseal tin of “One IT Service” in the middle. Their governance then focussed on two strands: Service Delivery and Change Delivery (aka projects) – moving away from a strand per service line.

Rhys also mentioned UniVault. All I’ll say here is that we should look at it.

Gamification (Brian Smith, Edge Hill)

An inspiring and highly amusing look at the use of gamification (rather than games, per se) in academic and service delivery contexts.

Two take-aways for me here.

The first was Dan Pink’s motivational theory. I’ve always struggled to accept that Maslow’s Hierarchy of Needs was a comprehensive explanation of what makes some people “driven” and others “parked”. Looking at this from the three angles of “sense of purpose”, “autonomy”, and “mastery” seems to fit our industry far better and explains why some people will work late into the evenings on an interesting challenge, whilst others will drift around not quite dealing with the issue in front of them.

The second was the behavioural matrix:

(Figure: the behavioural flow matrix)
This explains how people whose skills are increasing will need a greater challenge to avoid boredom, whilst those who are given ever harder challenges without the opportunity to increase their skills to match will hit anxiety. The “flow” in the centre channel is an area where skill level and challenge intensity increase in tandem – something that game designers are seriously good at in order to keep us engaged and wanting more!

Business Showcase (Bomgar)

Doug Sharples (Bomgar) and Ben Faire and Lindsay Roberts (Cardiff) talked through their recent implementation of Live Chat and Remote Support for their combined Libraries and IT Service Desk. It sounded great and is something we’re looking at, but the quote of the day came up in the middle of their session, in reference to service desk analysts not being put on phones and live chat at the same time:

“It’s not fair to split their brains like that!”

Establishing Utility and Warranty for Service Collaboration (Sandra Whittleston, Lecturer in ITSM, Northampton)

Did you know that ITILv3 Service Strategy was rushed and hasn’t been well received by industry to this day? Apparently so.

People were getting tired by this point and the session struggled to find a rhythm, but there was a gem in there. Taking ITIL’s favourite maxim that “utility × warranty = value”, a magic quadrant of utility vs warranty can be used to prioritise IT service changes according to their potential value to the business. The table towards the end of page 2 of Sandra’s handout provides a graphical representation that could come in handy.

Day 3 (4th July)

Report on the UK HE Service Desk Benchmark (Sally Bogg, Leeds)

Did you know that 35% of UK HE IT Service Desks offer 8am-8pm opening, and 19% provide a 24/7 service?

Or that, on average, we spend 30% of our time firefighting, 25% implementing ITIL, and 13% justifying our existence? That only leaves 20% of our time to speak with customers and build relationships!

70% of UK HE Service Desks are adopting/have adopted ITIL compared with 62% across IT as an (horizontal) industry sector.

Only 38% of us have written SLAs (mind you, other talks noted that customer satisfaction is taking over from SLA performance as a measure of success).

This was the first year that a UK Higher Education benchmark had been carried out, and provides a handy comparator for us to see how we’re doing against our peers and against IT as an industry sector. Many thanks to UCISA, Cherwell, and the SDI for this.

Partnership Working & Supporting Students (Neill Clark, Glasgow Caledonian Students Association)

This talk provided some ideas about how “the student voice” can be given a place in University decision making. The case in point was the design and construction of a new building, where inclusion of the student perspective enabled the design to be adapted to meet unanticipated but forward-looking changes in what students want from study, domestic, and social space.

If there was a clear take away from this talk for me it was not to second-guess what students want.

How to be Radical – Living with 100 Things (Jonathan Munn)

In summary, this chap broke up with his girlfriend, decided to sell ALL his stuff, and held onto just 73 things. About 65 of those things are what you need to remain decent in public – i.e. clothes. 2 more were the bags that he carries it all round in, nomadic style.

What he found was an amazing freedom and an ability to focus on what is really important in his life. It was empowering, inspirational, or maybe both. What should we learn (apart from not to split up with our partner if we like all the stuff we’ve built up together)?

  1. Your amygdala (Lizard brain) makes you want to eat, sleep, stay alive, and reproduce. It is the bit of your brain that makes you want to stay in bed in the morning. “Beat the Lizard” and get out of bed – you’ll do twice as much if you start now!
  2. Stop talking. Make the decision. Make everyone drink a pint of water and stand up in meetings so they don’t drift off into pointless waffle.
  3. Simplify everything and eliminate distractions; Focus on what really matters.
  4. Failure is an option. Better to fail now than in six years when the stakes are much higher.
  5. For each day, write down 3 things you’re grateful for, and 3 things you’re going to achieve that day.

We finished by making a commitment to achieve in the next 3 months something that we’ve been delaying. We shared our commitments with another delegate, who will be in touch to make sure we deliver…!

Conference Close

All good things must come to an end, in this case with awards and lunch. Lots to digest on the way back though, and I don’t mean the salmon pâté (!)

Enhanced Software Updates for Windows

You may have read my previous post about managing Windows updates with WPKG. I’ve been using this system for over a year now, and it has saved me a huge amount of time and provided peace of mind that the applications on my various Windows machines are properly up-to-date.

Things have grown from those early days, and there are now 91 “packages” under management, including approximately 57 different applications, 3 sets of application configuration, and 20 plugins.

Here are a few tips / patterns that I have adopted to make life easier along the way. If you’re using Wpkg and are interested in a chat, or any of my package files, then do feel free to get in touch on LinkedIn or by email.

WPKG-GP

WPKG-GP is a Group Policy extension for Windows Vista/7 that effectively hooks your application checks, installations, and upgrades into the pre-login screen where the progress of Windows updates is displayed.

The other advantage of doing things this way is that it uses a user-space network connection to the file share where your WPKG source files are held, avoiding some hiccups in WPKG where the System user can’t establish the network connection required to update in the background.

The reliability and visibility of updates shot up when I installed this.
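(For reference, the job that runs at boot is just the standard wpkg.js synchronisation pass – something along these lines, with the server and share names obviously placeholders for your own setup:)

```
cscript //B \\server\wpkg\wpkg.js /synchronize /quiet /nonotify /noreboot
```

/synchronize compares the installed state against your profiles and installs, upgrades, or removes packages as needed; the other switches keep the run silent and stop WPKG forcing a reboot in the middle of boot.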

Package Templates

I have now built a set of template package files for each of the main installer types – MSI, InstallShield (Basic, MSI, Web MSI, etc.), NSIS, and InnoSetup. They contain my favourite basic models for installing these types of package, including the relevant silent-install switches.
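As a flavour, a minimal MSI-style package built from such a template looks roughly like this (the package id, paths, version check, and the %SOFTWARE% share variable are all illustrative – adjust to your own wpkg.xml and profiles):

```xml
<package id="pdfsam" name="PDF Split and Merge" revision="2.2.4" reboot="false" priority="10">
  <!-- Considered installed when this uninstall entry is at the target version or later -->
  <check type="uninstall" condition="versiongreaterorequal"
         path="PDF Split and Merge" value="2.2.4" />
  <!-- Silent MSI install/upgrade; %SOFTWARE% points at the package file share -->
  <install cmd='msiexec /qn /norestart /i "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
  <upgrade cmd='msiexec /qn /norestart /i "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
  <remove  cmd='msiexec /qn /norestart /x "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
</package>
```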

Each of my packages now has a header section providing key information for upgrades and a version history, e.g.:

<!--
 VERSION HISTORY

 2.2.4 (2014-07-08)
 - New upstream release 2.2.4

 2.2.2 (2014-04-13)
 - Original package, based on upstream 2.2.2
-->
<!--
 Download URL : http://sourceforge.net/projects/pdfsam/files/pdfsam/
 Notification : SourceForge update subscription
 Package type : MSI
-->

These templates have improved consistency over the package files, and made it quite a bit quicker to write a new package for a new application.

Spotting Updates

That Notification line above is handy – it reminds me how I find out about updates. The most common options are:

  1. Subscription to a mailing list or SourceForge update service;
  2. Built-in application checks (“hey a new version is ready!”); and
  3. urlwatch – a Linux tool that does a nightly check of specified web pages to look for changes (in the stated version number).

Applications Now Under Wpkg

I guess someone might be interested in knowing which applications I’ve got set up. Here’s a sample of the more interesting ones:

  • Productivity: LibreOffice; BizAgi; Open Workbench; XMind;
  • PDF Tools: PDF Split and Merge; PDF eXchange; SumatraPDF
  • Graphics: Dia; GIMP; Inkscape; IrfanView
  • Browsers & plugins: Firefox; Adobe Flashplayer Plugins; WebEx
  • Email & plugins: Thunderbird
  • Media: Audacity; iTunes; MediaMonkey; Stream What You Hear; VLC Player; XBMC
  • Utilities: 7-Zip; CCleaner; CutePDF; InfraRecorder; inSSIDer; Malwarebytes AM; Notepad++; WinMerge
  • File Transfer: FastStone; HTC Sync Manager
  • Libraries: .Net; Java JRE; Silverlight
  • Network: Citrix ICA Client; Kerberos for Windows; OpenAFS; PuTTY; Cisco AnyConnect VPN; Wireshark
  • Other: TortoiseSVN