Outlook: Decline Meeting Invite but Save a Copy

Outlook doesn’t offer a built-in way to decline a meeting invitation whilst keeping a copy of the meeting in your calendar. In this blog post I’ll explain how to add one.

Why is this even useful though? Sometimes I get invited to a meeting that I can’t / won’t attend but want to keep the attached papers for, or at least a record of when the meeting took place and who was invited. I’d like to let the organiser know that I’ve received their invitation but won’t be attending; however, Outlook’s “Decline” response auto-deletes the invitation and appointment – so I’ve lost all that good information. Alas there’s no option to change the behaviour of “Decline”.

What I want is a button I can click (from the meeting invitation email) that will send a “decline” response back to the organiser, keep (a copy of) the meeting in my calendar, and mark the appointment as “Free”.

This is actually pretty easy in Outlook 2013, and I’ve drawn on some neat code to accept or decline a meeting request using VBA from Diane Poremsky at Slipstick, and a handy sub-routine to copy attachments on Outlook items from Sue Mosher.

Here’s the VBA code you need to add:

Public Sub SaveAndDecline()
    ' Decline the selected meeting request, but keep a copy of the
    ' appointment (marked as Free) so the details and papers survive
    Dim oAppt As AppointmentItem
    Dim cAppt As AppointmentItem
    Dim oResponse

    ' cAppt is the appointment behind the selected/open meeting request;
    ' oAppt is the standalone copy that stays in the calendar
    Set cAppt = GetCurrentItem.GetAssociatedAppointment(True)
    Set oAppt = Application.CreateItem(olAppointmentItem)

    With oAppt
        .Subject = "DECLINED: " & cAppt.Subject
        .Location = cAppt.Location
        .Body = cAppt.Body
        .Categories = cAppt.Categories
        .Importance = cAppt.Importance
        .Start = cAppt.Start
        .Duration = cAppt.Duration
        .BusyStatus = olFree
        .RequiredAttendees = cAppt.RequiredAttendees
        .OptionalAttendees = cAppt.OptionalAttendees
        .Resources = cAppt.Resources
        .ReminderSet = cAppt.ReminderSet
        .ReminderMinutesBeforeStart = cAppt.ReminderMinutesBeforeStart
        .ResponseRequested = cAppt.ResponseRequested
        .AllDayEvent = cAppt.AllDayEvent
        Call CopyAttachments(cAppt, oAppt)
        .Save
    End With

    ' Send the decline back to the organiser; Outlook then deletes the
    ' original invitation and appointment as it normally would
    Set oResponse = cAppt.Respond(olMeetingDeclined, True)
    oResponse.Send

    Set cAppt = Nothing
    Set oAppt = Nothing
End Sub

Function GetCurrentItem() As Object
    ' Return the item selected in the Explorer, or open in the active
    ' Inspector window, whichever is in front
    Dim objApp As Outlook.Application

    Set objApp = Application
    On Error Resume Next
    Select Case TypeName(objApp.ActiveWindow)
        Case "Explorer"
            Set GetCurrentItem = objApp.ActiveExplorer.Selection.Item(1)
        Case "Inspector"
            Set GetCurrentItem = objApp.ActiveInspector.CurrentItem
    End Select
    Set objApp = Nothing
End Function

Sub CopyAttachments(objSourceItem, objTargetItem)
    ' Copy attachments between Outlook items by saving each one to a
    ' temporary file and re-attaching it (based on Sue Mosher's routine)
    Dim fso As Object, fldTemp As Object, objAtt As Object
    Dim strPath As String, strFile As String

    Set fso = CreateObject("Scripting.FileSystemObject")
    Set fldTemp = fso.GetSpecialFolder(2) ' 2 = TemporaryFolder
    strPath = fldTemp.Path & "\"

    For Each objAtt In objSourceItem.Attachments
        strFile = strPath & "outlook_att_copy-" & objAtt.FileName
        objAtt.SaveAsFile strFile
        objTargetItem.Attachments.Add strFile, , , objAtt.DisplayName
        fso.DeleteFile strFile
    Next

    Set fldTemp = Nothing
    Set fso = Nothing
End Sub

Notice that I’ve copied quite a few more properties of the appointment over – setting the Free/Busy status to Free on the way.

Once you’ve added this to a VBA module in Outlook you can associate it with a button on the Main / Home ribbon for easy access when reading email.


Follow-up / Review Meetings in Outlook

Sometimes you use Outlook to arrange a meeting with a bunch of people, and the last action you take is to organise a review or follow-up meeting a few days later. Wouldn’t it be handy if you could “clone” the existing meeting request to another date with small adjustments to the timing, location, and invitee list?

The trouble is, whilst the Copy command in Outlook allows you to create a duplicate event, this is of limited use for all-day events, and doesn’t let you amend the meeting details before the copy is saved to your calendar. Outlook provides handy Quick Steps to let you create a new Task from a Message, or a new meeting with preset Subject and Recipients, but nothing to help with creating a new meeting based on an existing appointment.

The good news is that help is at hand! The good folk over at HowTo-Outlook have provided a guide to adding a Create new meeting based on this meeting button for all recent versions of Outlook.

Their simple guide provides a short piece of VBA code that you can cut-and-paste into Outlook and link to a new “Clone meeting” button on your toolbar/ribbon. It took me about 2 minutes to get this in place, and it’s already saved more time than that.

Towards the end of their guide, HowTo-Outlook show how you can make modifications to the fields that are copied to the new appointment. I found that a number of key fields were missing from the original code, so here’s my modified block under With olApptCopy:

With olApptCopy
    ' Note: .Start is deliberately not copied here – the whole point of
    ' the clone is that you pick a new date/time for the follow-up
    .Subject = olAppt.Subject
    .Location = olAppt.Location
    .Body = olAppt.Body
    .Categories = olAppt.Categories
    .Importance = olAppt.Importance
    .Duration = olAppt.Duration
    .RequiredAttendees = olAppt.RequiredAttendees
    .OptionalAttendees = olAppt.OptionalAttendees
    .Resources = olAppt.Resources
    .ReminderSet = olAppt.ReminderSet
    .ReminderMinutesBeforeStart = olAppt.ReminderMinutesBeforeStart
    .ResponseRequested = olAppt.ResponseRequested
    .AllDayEvent = olAppt.AllDayEvent
End With

This will copy over your attendee lists, any resources needed, the meeting duration, importance flag, reminders, and whether a response from attendees is requested.

Decrypt (or Encrypt) NAS4Free Config

I’ve written about NAS4Free before – it’s a super NAS solution that packs a tonne of great features and offers good performance even on older hardware, ideal for a home file/media server. During a recent upgrade (9.3 to 10.2) I found myself wanting to decrypt the encrypted backup that you (should) normally take prior to any upgrade attempt. There doesn’t seem to be much documentation on this elsewhere, so I’ve recorded a procedure here in case anyone else wants to do this.

I’m doing this under Windows 7 Enterprise 64-bit, but there are straightforward equivalents for most (all?) of this on other platforms.

The key element to all of this is that NAS4Free uses a plain and sane way to create the encrypted configuration file: the configuration is first encrypted with the admin password (and “salt”) using the AES-256-CBC cipher, the result is then base64 encoded, and finally compressed with GZIP.

Decrypt NAS4Free Config

You can download an unencrypted config from NAS4Free, but it’s generally not a good idea to store these as they contain the passwords for your admin and user accounts in plain text. Also, if you are recovering from a broken NAS4Free installation then you may not be able to download a config at all, in which case being able to see the settings from a previous installation may help you figure out what broke it.

  1. You should start out with an encrypted backup file from NAS4Free (System | Backup/Restore), the default filename will be something like C:\Temp\config-nas4free.local-20160213090000.gz.
  2. Begin by decompressing this file. I use 7-Zip, so I can just right-click the file and choose Extract here. [Screenshot: extracting the .gz with 7-Zip]
  3. Now you can decrypt this file using OpenSSL. A Windows binary is available from Shining Light Productions. I installed the 64-bit Windows OpenSSL binaries to the default location (C:\OpenSSL-Win64), so hold Shift and right-click that folder and choose Open command window here. [Screenshot: opening a command window in the OpenSSL folder]
  4. OpenSSL can remove the base64 encoding and decrypt using the command
    openssl enc -aes-256-cbc -d -a -in C:\Temp\config-nas4free.local-20160213090000 -out C:\Temp\config-nas4free.local-20160213090000.xml
    Unless you’ve done some previous work with OpenSSL, you’ll get a warning that can safely be ignored, followed by a prompt for the decryption password. With NAS4Free 9.3 and earlier this will be your admin account password; for later versions it is the encryption password you entered when you downloaded the configuration. [Screenshot: running the OpenSSL decrypt command]
  5. This will drop your NAS4Free config into an XML file ready for you to browse. Note that NAS4Free does not allow you to upload an unencrypted configuration.
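
If you’d rather script this than click through 7-Zip and a command window, the whole decrypt pipeline can be expressed in a few lines of Python. This is a minimal sketch under the assumptions above – the OpenSSL binary is on your PATH, and the function and file names are my own invention, not anything NAS4Free provides:

import gzip
import subprocess

def decrypt_nas4free_config(gz_path, xml_path, password):
    # Layer 1: the backup file is GZIP-compressed
    with gzip.open(gz_path, "rb") as f:
        encoded = f.read()
    # Layers 2 and 3: "-a" strips the base64 encoding and "-d" decrypts
    # the result with AES-256-CBC using the supplied password
    result = subprocess.run(
        ["openssl", "enc", "-aes-256-cbc", "-d", "-a",
         "-pass", "pass:" + password],
        input=encoded, capture_output=True, check=True)
    with open(xml_path, "wb") as f:
        f.write(result.stdout)

decrypt_nas4free_config("config-nas4free.local-20160213090000.gz",
                        "config-nas4free.local-20160213090000.xml",
                        "your-admin-or-encryption-password")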

Encrypt NAS4Free Config

Used in conjunction with the decryption routine above, this would enable you to download an encrypted configuration file (or take a previous one), make modifications to it, and upload / restore the result. This is not for the faint-hearted or inexperienced user as it could seriously break your NAS4Free setup and potentially damage the data you have stored in it – you should be confident that you know what you’re doing before heading down this route.

  1. Start with your unencrypted XML configuration; I’ll assume that’s in C:\Temp\config-nas4free.local-20160213090000-new.xml
  2. Use OpenSSL to encrypt and base64 encode this, the same as step 4 above but with a slightly different command:
    openssl enc -aes-256-cbc -a -in C:\Temp\config-nas4free.local-20160213090000-new.xml -out C:\Temp\config-nas4free.local-20160213090000-new
  3. Right-click the output file and use 7-Zip to create a new archive. [Screenshot: creating an archive with 7-Zip]
  4. Set the archive type to GZIP and click OK. [Screenshot: selecting the GZIP archive type]
  5. Your compressed, encrypted config will now be ready to upload.
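
And here’s the mirror image in Python, a sketch under the same assumptions as before (OpenSSL on the PATH; the names are mine):

import gzip
import subprocess

def encrypt_nas4free_config(xml_path, gz_path, password):
    with open(xml_path, "rb") as f:
        plain = f.read()
    # Encrypt with AES-256-CBC and base64-encode ("-a") in one call
    result = subprocess.run(
        ["openssl", "enc", "-aes-256-cbc", "-a",
         "-pass", "pass:" + password],
        input=plain, capture_output=True, check=True)
    # Final layer: GZIP-compress to produce an uploadable backup
    with gzip.open(gz_path, "wb") as f:
        f.write(result.stdout)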

High-Def Settings for Handbrake

I’ve been using Handbrake to rip films for years – it does a great job – but my preferred settings have evolved over time as I’ve learned from successes and failures on an ever-widening digital library. Some key refinements took place when I switched from DVD to Blu-ray recently, as part of the inexorable move to high-def and the wonderful visual detail that it brings. Here are my settings, with the key elements of the rationale behind them.

First up, the outline process: I rip my DVD and Blu-ray media using MakeMKV, encode using Handbrake, store with NAS4Free, and watch through Kodi.

When ripping, find the main feature and then keep the video track, your preferred audio track (I use the DTS-Core as the DTS-Master is just huge and I don’t have the decoder – or ears – to benefit from it just now), and all of the subtitle tracks (you’ll want all the ones in your own language, and it’s quicker than unchecking each one that you don’t need).

Load it up into Handbrake, and choose the following options:

Output & Picture Settings:

[Screenshot: Handbrake output & picture settings]

Output Settings:

  • Container: MKV

Picture Settings:

  • Width: match source, usually 1920 for HD films
  • Anamorphic: Loose
  • Modulus: 2
  • Cropping: Automatic

You’ll notice that the source was 1920×1080, but that Handbrake is going to automatically trim a horizontal strip of 140 pixels (or thereabouts, depending on your film) at the top and bottom. Don’t worry, this is expected on an HD film, and will still result in 1080p output – it just saves having to encode those black strips all the way through.

Note that the size here has a huge impact on output size. The French film Switch is in 4:3 aspect ratio at 1080p which means that there are no black bars to trim off – there is nearly 50% more screen to cover than usual so the encoded output is proportionately larger.

Filters Settings

[Screenshot: Handbrake filters settings]

  • Decomb: Decomb, Default
  • Everything else is Off

For some very grainy films (read Matt Gadient’s post on de-noising in Handbrake) you might need to adjust the Denoise filter. I found Bourne Legacy to benefit from some denoising – strong spatial and a little temporal denoising did little damage to the final quality but reduced the file size by around 40%.

Video Settings

[Screenshot: Handbrake video settings]

  • Video Codec: H.264
  • Framerate: As source
  • Constant Quality: 21
  • x264 Preset: Very Fast
  • x264 Tune: Film (adjust to suit media)
  • H.264 Profile: High
  • H.264 Level: 4.1

Two things that might be contentious here.

First of all, the RF setting of 21. I used to encode DVDs with a much lower (higher quality) rate factor, but with the extra detail of a high-def source it just doesn’t seem to need it, and RF 21 is a fair balance on file size for me too. Adjust to taste, I guess.

The second contention is the x264 preset of Very Fast. This turns off lots of the x264 features that are designed to enable better quality & smaller file size at the expense of encoding speed. The trouble is, after extensive testing on a variety of films, I just didn’t find the trade-off worthwhile. Very Fast – uniquely – seems to hit the quality point bang on, often gets the best compression of all the presets, and (of course) is pretty nippy.

Audio Settings

[Screenshot: Handbrake audio settings]

  • Add the DTS-Core track with these settings:
    • Codec: AC3
    • Bitrate: 448
    • Mixdown: 5.1 Channels
  • Add the DTS-Core track again with these settings:
    • Codec: AAC (avcodec)
    • Bitrate: 160
    • Mixdown: Dolby Pro Logic II

I add both an AC3 and an AAC soundtrack: we mainly watch films on media systems that can handle AC3, but sometimes other family members want to watch on “less enabled” devices.

Subtitles Settings

[Screenshot: Handbrake subtitles settings]

  • Add a suitable subtitle track for your preferred language, leave all options unticked

This corresponds to just the second row in the screenshot above. Why did I add the first? That was from an encode of The Bourne Ultimatum, which has those teletype overlays telling you the location of a scene (e.g. “Moscow, Russia”). I wanted to have these shown, but not the full subtitling. It turned out that for this source, rather than using the “Forced” flag to identify these, a separate subtitle track (track 28) had been provided and set as default. I used VLC Player to determine which track I wanted – just played the relevant section through with each of the subtitle tracks.
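
If you want to batch-encode with these settings, they can also be driven through HandBrake’s command-line interface. Here’s a sketch of how I’d approximate the settings above from Python – but be warned that HandBrakeCLI’s flag and encoder names have drifted between releases, so treat every flag below as an assumption to check against HandBrakeCLI --help for your version (subtitle selection is omitted as it’s per-film):

import subprocess

def encode(src, dst):
    # Rough CLI equivalent of the GUI settings described above
    subprocess.run([
        "HandBrakeCLI", "-i", src, "-o", dst,
        "-f", "mkv",                      # MKV container
        "--loose-anamorphic", "--modulus", "2",
        "--decomb",                       # default Decomb filter
        "-e", "x264", "-q", "21",         # H.264 at constant quality RF 21
        "--x264-preset", "veryfast", "--x264-tune", "film",
        "--h264-profile", "high", "--h264-level", "4.1",
        "-a", "1,1",                      # the DTS core audio track, twice
        "-E", "ffac3,av_aac",             # encode to AC3 and to AAC
        "-B", "448,160",                  # bitrates for each audio pass
        "-6", "5point1,dpl2",             # mixdowns: 5.1 and Dolby PL II
    ], check=True)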

Conclusion

I’m getting output of around 3.5GB for a 2hr film – that’s 1080p with an AC3 soundtrack, which is perfect for my combination of playback kit.

I hope these settings may be of use to others!

Upgrading NAS4Free Embedded

I chose NAS4Free as my home storage solution. As the name suggests, it’s free, but it’s also packed with file storage goodness such as UPnP/DLNA and DAAP support. This put it a touch ahead of similar packages that stop once they’ve got CIFS support and a smattering of all-but-useless extras like FTP.

For my own configuration, a 6-year old desktop PC seems to run everything fast enough, even with ZFS on the storage. It’s only got a Core2 Duo 2.13GHz CPU and 4GB DDR2 standard issue RAM, and the drive is a Seagate 2TB SATA drive. Only one drive inside the system – experience to date indicates that user error and malware are more likely to deal a data-blow than disk failure. As a result I prefer to take backups to an external disk (I can also keep a copy off-site to protect against fire and theft).

I opted to deploy using the embedded NAS4Free version on a bootable USB. After a bit of hassle turning off the Intel MEBx BIOS features which prevented a USB boot, it all worked a treat. For the initial setup I followed the official installation guide – download and burn the LiveCD image, boot off it and install to a USB key (I did this on the NAS itself, but you could just as easily do it from your normal machine as it doesn’t touch any of your other drives). Insert the USB key into my NAS box (with a keyboard and monitor temporarily attached), boot up and go through the first-time configuration, job done.

When it came to my first upgrade I hit the 9.3.0.2 issue that the firmware upgrade (System|Firmware) wouldn’t work. I had two options, both wrapped by saving and restoring my NAS4Free configuration: (1) write an updated LiveCD and use this to install a NAS4Free bootable USB with the updated embedded NAS4Free image on it; (2) write the updated embedded image file directly to USB (following some good instructions) and save loads of waiting around.

Obviously I went for option (2) as it was quicker – and it worked really smoothly. There was an error message in System|Firmware about the boot partition being too small, but it was clear that this wouldn’t affect operations. Alas, it does affect upgrades and when the next one came along neither the web GUI firmware nor the LiveCD would upgrade my USB key. So I had to sort it out properly:

  1. Download the latest Live CD image (remember to pick the right architecture) and burn to CD;
  2. Boot the LiveCD, insert a new USB key, select option 3 to install the embedded version (the options changed between versions – check carefully) – you can do this step on any PC as it doesn’t touch the internal drives;
  3. Take an encrypted backup of your NAS4Free configuration (System|Backup/Restore);
  4. Shutdown NAS4Free, remove your old USB boot key, insert the new one, and boot;
  5. When you hear the start-up beeps, you are ready to login to the NAS4Free web GUI. However, it’ll be on the default IP address of 192.168.1.250, so unless you happen to be using this subnet too (and that’s a fairly common situation so you may be lucky) you’ll either need to plug in a keyboard and monitor to configure this, or use something like WinIPConfig to temporarily add another address in that subnet (e.g. 192.168.1.15);
  6. Once you’ve got access to the Web GUI, login with the default credentials (admin:nas4free) and set the admin password to the one you used before in System|General|Password;
  7. Restore your saved configuration (System|Backup/Restore) – this will automatically reboot and you’ll be up and running (NB: the restore will fail if you haven’t set the admin password correctly).

Easy! This method took about 15 minutes in all, but meant that my USB embedded image was now correctly sized for future in-place upgrades.

UCISA Support Services Conference 2014

Last week I attended this year’s UCISA SSC at Crewe Hall (which is near Crewe – ha!). The conference pulls in staff working in IT support roles from across UK Higher Education, from every corner of the land and every echelon of our organisations. The theme for this year was Supporting the Business with a focus on relationship management and customer service.

The quick version of this post is that I really enjoyed the conference and have come back with renewed enthusiasm, a whole raft of links with counterparts elsewhere who are facing similar challenges to us, and a stack of ideas for improvement that will take until the next USSC to implement!

This is a long post, so I’ve tried to make each session stand-alone – i.e. you can read one session and walk away with something gained.

Day 1 (2nd July)

We’re All in Customer Services Now (Carolyn Blunt)

Did you hear about the Sky customer who called up to say that their service was down, had been for days, and their young child was going to miss the most important show of the week…Peppa Pig? The engineer who was despatched went out on their own initiative and turned up at the door to fix the problem, but with a surprise – a small cuddly Peppa Pig toy for the young child whose TV show had been missed.

Do we give our staff the freedom and resources to show initiative, to go the extra mile, to make a difference? Or do we confine them with processes, KPIs, and machine-like automation and efficiency? Neither is good in extreme, but both have something crucial to offer when you’re seeking customer satisfaction.

Discussions With a Difference (Service Portfolio)

DwD means break-out groups with a facilitator helping to open up a discussion on a given topic. Each session is different, and I’m sure that everyone takes away their own unique version of what happened.

My take on this was that no-one really knows what a Service Portfolio should look like. Several people referred to ITIL’s “Service Catalog plus future and retired services”, whilst others had some kind of high-level summary of the IT department’s output.

What did seem to be common is that this is probably not near the top of anyone’s to-do list.

Business Showcase (FrontRange)

I can’t really comment here as I was on stage for this one. I briefly outlined the challenge that Oxford’s IT Services department faced around consolidation of our IT support functions, and how we came to select FrontRange HEAT SaaS as the toolset to underpin our new operations. Steve Gardner (FrontRange UK & Ireland Sales Director) then presented the story from where FrontRange began working with us – how the relationship has developed and how the toolset has been implemented at Oxford.

Big Ideas: Innovation at University of Surrey (Gareth Edwards)

If the presenter’s name is familiar then you might remember that Gareth Edwards worked in Oxford (Local) IT before moving to Surrey.

His talk provided a brief insight into how his team set out to stimulate and collect ideas for change – innovation, engagement, ideas, and invention. At the heart of this was a system that allows people to post up their ideas – however crazy – and a periodic vote/poll to indicate interest. Top ideas were then taken forwards in some way or another. This operated across IT Services (at Surrey) and elicited some 65 ideas, of which 8 have now been completed, and a further 9 are in the pipeline.

Some of the key devices to make this work were:

  1. Recognise that failure is an option – this is innovation and invention so the risks are high, but we’re guaranteed not to win if we don’t try
  2. Awards (see later talk on gamification) helped stimulate ideas and encourage engagement
  3. Tactics – voting became tactical (one idea received huge numbers of votes from one part of the organisation and none from elsewhere), and awards ended up being used tactically to bring out quiet areas or recognise input even when ideas weren’t taken forwards immediately

Pecha Kucha (Various)

Pecha Kucha, that fast-paced helter-skelter presentation style that has become popular in IT circles, made the last formal session for the day. Jim Higham (Kent) reported on their approach to Change Management; Martin Putwain (MMU) gave us an insight into their approach to supplier engagement when buying service management software; Paul Mazumdar (Cambridge) talked eloquently about the challenges of staff development in federated organisations (he talked about Cambridge but described the situation at Oxford equally well); John Ireland (yep, me again) looked at recent thoughts around incident management in the distributed IT environment at Oxford; Tony Brett (more Oxford!) managed to fit a 40-minute talk into 6m40s on the subject of good communication and assertiveness – and won an award for his PK; and Chris Sexton (Sheffield – but also on Oxford IT Committee) subtly broke all the rules of the PK session and inspired us to love cows! (You had to be there).

Day 2 (3rd July)

Continuous Service Improvement (Lorraine Brown, St Andrews)

This is an IT Services team that went for and won SDI 1-star certification, went right back and upped their targets to get 2-star certification, and then went round the circuit again and got 3-star certification! Of course, their sights are now set on 4-star certification.

Whilst certification might not be the end goal, it was clear that this has driven fundamental and highly visible improvements for customers and, increasingly, for the business.

A very simple mechanism that proved highly successful was an “improvements register” – a simple Excel spreadsheet of improvements that were needed to attain the next target level. Their use of incentives to get user feedback, and an eye catching web site that focussed on user services from the very front page showed a clear commitment to customer experience. Lessons to be learned here for the rest of us!

Business Relationship Management (Dean Phillips, Aberdeen)

Aberdeen’s IT organisational structure has a Director in charge of four main divisions: Application Management, Service Management, Infrastructure Management, and Business Relationship Management. Thus is shown the importance of BRM.

If we were taking on BRM then Dean would suggest: Relationship Managers need to be senior posts – reporting to the Director (cf. senior lecturer level) and full-time; Make sure the role is clear so it doesn’t become a catch-all for stuff that the Service Desk doesn’t deal with; Have a strategy.

(Dean also showed us an impressively simple 1-side-of-A4 strategy – something for us to aspire to perhaps?)

I met up with Dean later in the day and asked how the BRMs actually worked in practice. At Aberdeen there is one for each of their 6 main colleges, and each works 4 days a week in the college, attending committees and meetings, and one day a week back in IT to round up with their counterparts from other colleges and feed into IT management.

Panel Session: What Does a Successful Relationship Look Like?

Time flew by on this, indicating that a very useful time was had. Rhys Davies (Leeds) tackled the question head-on, offering that a successful relationship is “one that you wish to continue through choice”.

An interesting part of this discussion was around partnership rather than customer/supplier models, and equality in the relationship. The discussion here led me on a train of thought that started with an IT user who has no real choice or alternatives about which IT supplier they use and the IT supplier is excited about IT but disengaged from the users’ raison d’etre; Now contrast that with a supplier who, still being the only supplier, finds out about and takes a genuine interest in the user’s goals; Now imagine a situation where we – in-house IT – share our customer/users’ (research & education) goals – that could be an exciting world where the value of IT increases dramatically through synergy and understanding of the real business of the University.

Discussions With a Difference (Managing Customer Expectations)

The second DwD session, and a new topic for me. Three clear messages for me:

  1. We can get more value out of our IT departmental rep relationships if we all come together and report back on user expectations & needs. A quick poll can indicate which needs are widespread and which are local. Food is a good incentive for getting reps together!
  2. Provide Customer Services training for staff outside of the service desk. “We are all in Customer Services now!” (did you read the first session on day 1?)
  3. Digital natives love self-help. Video guides and PDF instructions for things like diagnosing and fixing VPN issues go down really well and reduce calls to the service desk.

One IT: Integrated Approach to Service Delivery (Rhys Davies, Leeds)

What might we recognise from Rhys’ introduction as issues for Oxford:

  • SMT concerned about high levels of investment in IT but feeling of not getting much return;
  • “Academic Freedom” cited as an excuse for users to do whatever they want, regardless of supportability, efficiency, and effectiveness;
  • Change fatigue after 10+ years without real fruition;
  • No-one could state what “Good IT” would look like;
  • Lots of capital money around, but small operational budgets – created unsupportable ivory towers of IT project output.

The solution? Look at IT in the round, as a whole, instead of picking off fragments here and there.

Rhys had a great slide with a Ronseal tin of “One IT Service” in the middle. Their governance then focussed on two strands: Service Delivery and Change Delivery (aka projects) – moving away from a strand per service line.

Rhys also mentioned UniVault. All I’ll say here is that we should look at it.

Gamification (Brian Smith, Edge Hill)

An inspiring and highly amusing look at the use of gamification (rather than games, per se) in academic and service delivery contexts.

Two take-aways for me here.

The first was Dan Pink’s motivational theory. I’ve always struggled to accept that Maslow’s Hierarchy of Needs was a comprehensive explanation of what makes some people “driven” and others “parked”. Looking at this from the three angles of “sense of purpose”, “autonomy”, and “mastery” seems to fit our industry far better and explains why some people will work late into the evenings on an interesting challenge, whilst others will drift around not quite dealing with the issue in front of them.

The second was the behavioural matrix:

[Diagram: behavioural flow]
This explains how people whose skills are increasing will need a greater challenge to avoid boredom, whilst those who are given ever harder challenges without the opportunity to increase their skills to match will hit anxiety. The “flow” in the centre channel is an area where skill level and challenge intensity increase in tandem – something that game designers are seriously good at in order to keep us engaged and wanting more!

Business Showcase (Bomgar)

Doug Sharples (Bomgar) and Ben Faire and Lindsay Roberts (Cardiff) talked through their recent implementation of Live Chat and Remote Support for their combined Libraries and IT Service Desk. It sounded great and is something we’re looking at, but the quote of the day came up in the middle of their session, in reference to service desk analysts not being put on phones and live chat at the same time:

It’s not fair to split their brains like that!

Establishing Utility and Warranty for Service Collaboration (Sandra Whittleston, Lecturer in ITSM, Northampton)

Did you know that ITILv3 Service Strategy was rushed and hasn’t been well received by industry to this day? Apparently so.

People were getting tired by this point and the session struggled to find a rhythm, however there was a gem in there. Noting ITIL’s favourite maxim that “utility x warranty = value”, a magic quadrant of utility vs warranty can be used to prioritise various IT service changes according to their potential value to the business. The table towards the end of page 2 of Sandra’s handout provides a graphical representation that could come in handy.

Day 3 (4th July)

Report on the UK HE Service Desk Benchmark (Sally Bogg, Leeds)

Did you know that 35% of UK HE IT Service Desks offer 8am-8pm opening, and 19% provide a 24/7 service?

Or that on average we spent 30% of our time firefighting, 25% of our time implementing ITIL, and 13% of our time justifying our existence? This only left 20% of our time to speak with customers and build relationships!

70% of UK HE Service Desks are adopting/have adopted ITIL compared with 62% across IT as an (horizontal) industry sector.

Only 38% of us have written SLAs (mind you, other talks noted that customer satisfaction is taking over from SLA performance as a measure of success).

This was the first year that a UK Higher Education benchmark had been carried out, and provides a handy comparator for us to see how we’re doing against our peers and against IT as an industry sector. Many thanks to UCISA, Cherwell, and the SDI for this.

Partnership Working & Supporting Students (Neill Clark, Glasgow Caledonian Students Association)

This talk provided some ideas about how “the student voice” can be given a place in University decision making. The case in point was the design and construction of a new building, where inclusion of the student perspective enabled the design to be adapted to meet unanticipated but forward-looking changes in what students want for study, domestic, and social space.

If there was a clear take away from this talk for me it was not to second-guess what students want.

How to be Radical – Living with 100 Things (Jonathan Munn)

In summary, this chap broke up with his girlfriend, decided to sell ALL his stuff, and held onto just 73 things. About 65 of those things are what you need to remain decent in public – i.e. clothes. 2 more were the bags that he carries it all round in, nomadic style.

What he found was an amazing freedom and ability to focus on what is really important in his life. It was empowering or inspirational, maybe both. What should we learn (not to split up with our partner if we like all the stuff we’ve built up together?):

  1. Your amygdala (Lizard brain) makes you want to eat, sleep, stay alive, and reproduce. It is the bit of your brain that makes you want to stay in bed in the morning. “Beat the Lizard” and get out of bed – you’ll do twice as much if you start now!
  2. Stop talking. Make the decision. Make everyone drink a pint of water and stand up in meetings so they don’t drift off into pointless waffle.
  3. Simplify everything and eliminate distractions; Focus on what really matters.
  4. Failure is an option. Better to fail now than in six years when the stakes are much higher.
  5. For each day, write down 3 things you’re grateful for, and 3 things you’re going to achieve that day.

We finished by making a commitment to achieve in the next 3 months something that we’ve been delaying. We shared our commitments with another delegate, who will be in touch to make sure we deliver…!

Conference Close

All good things must come to an end, in this case with awards and lunch. Lots to digest on the way back though, and I don’t mean the salmon pate (!)

Enhanced Software Updates for Windows

You may have read my previous post about managing Windows updates with WPKG. I’ve been using this system for over a year now, and it has saved me a huge amount of time and provided peace of mind that the applications on my various Windows machines are properly up-to-date.

Things have grown from those early days, and there are now 91 “packages” under management, including approximately 57 different applications, 3 sets of application configuration, and 20 plugins.

Here are a few tips / patterns that I have adopted to make life easier along the way. If you’re using WPKG and are interested in a chat, or any of my package files, then do feel free to get in touch on LinkedIn or by email.

WPKG-GP

WPKG-GP is a Group Policy Extension for Windows Vista/7 integration, and effectively hooks your application checks, installations, and upgrades into the pre-login screen where the progress of MS Windows updates is displayed.

The other advantage of this way of doing things is that it uses a user-space network connection to the fileshare where your WPKG source files are held, thus avoiding some hiccups in WPKG where the System user can’t establish the network connection required to update in the background.

The reliability and visibility of updates shot up when I installed this.

Package Templates

I have now built a set of template package files for each of the main installer types – MSI, InstallShield (Basic, MSI, Web MSI, etc), NSIS, and InnoSetup. They contain my favourite basic models for installing these types of package, including the relevant silent install switches.

Each of my packages now has a header section providing key information for upgrades and a version history, e.g.:

<!--
 VERSION HISTORY

 2.2.4 (2014-07-08)
 - New upstream release 2.2.4

 2.2.2 (2014-04-13)
 - Original package, based on upstream 2.2.2
-->
<!--
 Download URL : http://sourceforge.net/projects/pdfsam/files/pdfsam/
 Notification : SourceForge update subscription
 Package type : MSI
-->

These templates have improved consistency over the package files, and made it quite a bit quicker to write a new package for a new application.
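
For anyone who hasn’t seen a WPKG package file before, here’s a minimal sketch of what one of my MSI templates might look like once filled in. The element names follow WPKG’s package schema as I understand it, but the paths, the %SOFTWARE% share variable, and the version numbers are all illustrative – check the documentation on wpkg.org before copying this:

<?xml version="1.0" encoding="UTF-8"?>
<packages>
  <package id="pdfsam" name="PDF Split and Merge"
           revision="2.2.4" reboot="false" priority="10">
    <!-- Treat the package as installed if this uninstall entry exists -->
    <check type="uninstall" condition="exists" path="PDF Split and Merge 2.2.4" />
    <install cmd='msiexec /qn /i "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
    <upgrade cmd='msiexec /qn /i "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
    <remove  cmd='msiexec /qn /x "%SOFTWARE%\pdfsam\pdfsam-2.2.4.msi"' />
  </package>
</packages>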

Spotting Updates

That Notification line above is handy – it reminds me how I find out about updates. The most common options are:

  1. Subscription to a mailing list or SourceForge update service;
  2. Built-in application checks (“hey a new version is ready!”); and
  3. urlwatch – a Linux tool that does a nightly check of specified webpages to look for changes (in the stated version number).

Applications Now Under WPKG

I guess someone might be interested in knowing which applications I’ve got set up. Here’s a sample of the more interesting ones:

  • Productivity: LibreOffice; BizAgi; Open Workbench; XMind
  • PDF Tools: PDF Split and Merge; PDF-XChange; SumatraPDF
  • Graphics: Dia; GIMP; Inkscape; IrfanView
  • Browsers & plugins: Firefox; Adobe Flash Player plugins; WebEx
  • Email & plugins: Thunderbird
  • Media: Audacity; iTunes; MediaMonkey; Stream What You Hear; VLC Player; XBMC
  • Utilities: 7-Zip; CCleaner; CutePDF; InfraRecorder; inSSIDer; Malwarebytes Anti-Malware; Notepad++; WinMerge
  • File Transfer: FastStone; HTC Sync Manager
  • Libraries: .NET; Java JRE; Silverlight
  • Network: Citrix ICA Client; Kerberos for Windows; OpenAFS; PuTTY; Cisco AnyConnect VPN; Wireshark
  • Other: TortoiseSVN

Web Browser Safety: Cookie Control in Firefox

I’m a huge fan of Firefox and use it as my main browser. I found myself reviewing my cookie settings recently and thought it might be worth documenting this for future reference and in case anyone finds it helpful.

The issue I’m concerned with here is the profusion of cookies for tracking, auto-completing forms, and so forth. Call me paranoid, but the potential for 3rd party cookies to be involved in collusion activities is something I’d prefer to avoid (try the Firefox Collusion add-on if you’d like a picture of potential collusion interactions). If you’re a regular web user and don’t have any cookie controls in place then you might get a scare by looking in Tools > Options > Privacy > Show Cookies.

Many cookies are quite unnecessary – try switching to private browsing in Firefox to see how life would look if you deleted all your cookies. However, some cookies do make life a lot easier. At Oxford University a lot of websites are secured with Webauth and use “session” cookies to store information needed for single sign-on to work; cookies also help to make interactive websites work smoothly.

I use some Firefox settings that enable me to approve the helpful cookies – on a long-term basis – but filter out the advertising, tracking, and otherwise unnecessary stuff. Here’s how I set it up…

Step 1: Configure Firefox to ask your permission when a website wants to set/change a cookie

Open the privacy settings options at Tools > Options > Privacy and set the History and Cookie options highlighted in the screenshot below:

[Screenshot: Firefox privacy settings]
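
As an aside, the same behaviour can be set via about:config or a user.js file. The pref names below are my recollection of the Firefox preferences of this era (they have since been removed or renamed), so verify them against your own version:

user_pref("network.cookie.cookieBehavior", 1);   // never accept third-party cookies
user_pref("network.cookie.lifetimePolicy", 1);   // ask me every time a cookie is set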

Step 2: Clear out your old cookies

Still in the privacy settings dialog, click “Show Cookies” on the screen above, then choose Remove All Cookies.

[Screenshot: Firefox cookies dialog]

Step 3: Work normally and selectively allow / deny cookies

Just go ahead and browse. When a website wants to set/change a cookie, Firefox will pop up a dialog like the one below. Roughly speaking, for sites you recognise and trust you want to tick the “Use my choice for all cookies from this site” and allow the operation, and for sites that look undesirable (e.g. ads.doubleclick.com) you’ll want to tick that box and deny the operation. For sites that you’re not sure about you could “Allow for session” which will clear the cookie out when you close Firefox.

[Screenshot: Firefox cookie confirmation prompt]

Step 4: Check your cookies in a few weeks’ or months’ time and be amazed!

Obviously, in your first few hours you’ll find yourself allowing/denying a lot of cookies. This will die down as you build up a list of sites that you allow cookies for though. If you go back to your list of cookies (Step 2, above), then you should find that it is much shorter than before – and you’ll recognise nearly all of the sites on there.

You may have spotted a number of additional privacy options in the screenshots above. Personally, I normally set “Accept third-party cookies” to Never – but this can cause problems with some sites. You can find more information in the Mozilla support article Disable third-party cookies in Firefox to stop some types of tracking by advertisers. The tracking options at the top of that dialog can also help discourage some “nice” tracking sites from setting cookies or using other tracking mechanisms – but it’s a bit like the olden days of robots.txt files: you’re dependent on someone else honouring the preference you’ve expressed.

UCISA Support Services Conference 2013: Day Three

Day 3: Thursday 4th July

Thursday morning started with a pair of shorter plenaries looking at two recent service desk initiatives.

The first was a Bomgar deployment at UCA. Being distributed over five sites led UCA to feel that a remote support solution was an absolute must – this would also help with the Senior Officer who uses a Mac at home and needed assistance in setting it up with his printer – a familiar story! They had introduced Bomgar as a tool to fit this need, in order to drive up first-line-fix rates and reduce load on second-level / technical teams. Their previous tools were tied to particular platforms, and therefore the quality / level of service available depended on whether a user was running Windows, OS X, Linux, or something else. Bomgar were also part of the supplier exhibition, and were promoting their recently developed direct integration with the Cherwell service desk tool.

This was followed by a review of the recent Service Desk Institute (SDI) certification undertaken at the University of Leeds. They are a 14 FTE service desk, with distributed ITSS, handling around 80,000 calls each year. They had been trying to move from “catch and dispatch” or “break/fix” to a customer-focussed model of working and wanted an independent assessment of their operation. SDI was chosen for this. An initial 2-day assessment unpicked everything they did, and took a deep look at their documentation, processes, training, etc. The score wasn’t where Leeds had hoped, but this initial assessment is followed by a series of recommendations to help you achieve your target rating. After 6 months of hard work, Leeds made it through their certification assessment at their desired target level. This is the kind of work that a Service Desk Manager does, and it was hard. Leeds are now hoping to build on their score and go back for recertification at a higher level in future.

During the break I met with the head of the desktop PC team at Royal Holloway. They manage around 900 student PCs and a further 1200 staff machines – so a similar size of user base to Oxford and picking up similar deployment scenarios. Their organisation comprises 5 service desk staff and 5 desktop technicians, using WSUS/SCCM for management. Interestingly, student calls go to the service desk, and all staff calls go straight through to the technicians. The current push in this team is for collaboration tools – providing collaboration from the desktop using Microsoft Lync and Yammer. This will be a move away from a plethora of self-selected tools currently in use.

The final session of the conference was a look at Social Media as a communication channel, and brought together many ideas that were raised in other talks. Six main types of SM were identified: blogs, networks (groups based on people’s affinities with each other), content communities (groups based on content type such as YouTube), bookmarking services, micro-blogging like Twitter, and wikis. Internal SM platforms – used for employees to communicate with each other – tend to get low take-up because these don’t build on existing networks / profiles / communities but require people to re-invest in a less established arena, often with little idea of the likely benefits. Working in SM requires us to think about “the 4 C’s”:

  1. Creation (of content) – what is relevant, existing, interesting?
  2. Context – how do people find our content (search engines, sharing, discovery)?
  3. Conversation – listening, interpreting, and responding to other SM publishers, and the reaction to our own content;
  4. Conversion – why are we doing this at all? We must have a goal, which will often be based on someone doing something (buy our product, solve a problem for themselves, find and use a new service, …)

There are lots of stories about SM faux pas. Customer Service often features in SM scare stories. A video showing how to open a Kryptonite cycle lock using the cap of a Bic biro stirred up a lot of chatter, and Kryptonite responded after about 10 days (with a refund/upgrade for customers) – however this incident is estimated to have cost them around $18m.

What is also important though is that the preferred SM platforms change rapidly. Even now, as businesses are getting to grips with blogging, Twitter, and Facebook, students coming up from school are using Snapchat and Vine. In a few years these two will also be outmoded. Our approach needs to be to develop an overarching strategy that remains applicable as the platforms change, and recognise that flexibility can be powerful here.

The conference finished with a final chance to meet colleagues, and the start of a conference trophy’s journey around UK HEIs – to be passed from person to person on a tour around the UK, arriving back at next year’s conference with a series of photos of interesting things learned at each institution.

UCISA Support Services Conference 2013: Day Two

Day 2: Wednesday 3rd July

Breakfast, and time for the first meeting of the day, with colleagues from Plymouth and Liverpool John Moores. Do we have security training for all our staff (IT or otherwise)? At Oxford we have policies, a toolkit, lots of guidance, and some systems, but the only routine training with periodic “refreshers” is on DSE assessment and recruitment (although some security training is on the way as I type). The pattern seems good – initial training, followed by periodic updates (frequency as appropriate), via a self-service interface with some means of checking that the user has taken on board key aspects of the learning. At LJM all IT staff go through a similar programme for DPA, anti-bribery, and diversity training. Sounds like it has potential for Oxford, and the quid pro quo is that our ISBP toolkit is of interest to Plymouth and LJM.

The first plenary was from Druva, who have provided a “dropbox alternative that meets corporate data storage and security requirements” to the Economics Dept and Business School at Warwick. I spoke to Druva the day before, and the solution sounded interesting – compliance (policy) driven storage with refresh and commit to/from a local copy from multiple clients. The talk also had some interesting points – our users are shouting for “style”, “freedom”, “productivity”, and “convenience”, whilst our employers are demanding “managed risk”, “privacy”, “security” and “protection” – the ever present availability vs. security tension. Unfortunately the rest of the talk didn’t reassure me. “Druva is NOT a backup company” we were told, but then “…and we can deliver all of these core features because, at its heart, Druva is a backup company”. Later on “We configured it to use port 22 as we already had a hole in the firewall and didn’t want to open any more holes up for obvious security reasons”. As configured at Warwick, users can restore to any 12hr snapshot in the last two weeks, from any device (and there is a web interface for retrieval too). One other note, from Warwick’s Economics Dept, was that they had abandoned whole disk encryption due to (a) direct cost, (b) support and productivity cost through lost passwords, and (c) impact on imaging and desktop management.

Plenary 2, and Steven Beavis (Cherwell) talked about measuring customer satisfaction. His key point was that at the end of the day, that’s what it is all about. He suggested that customer satisfaction follows primarily from two areas – perception of value, and ease of access to desired services. Metrics should be balanced across several main areas – productivity and efficiency among them – with satisfaction as the end result. Deriving metrics is a cycle – you set Goals, define Critical Success Factors that reflect achievement of the Goal, set Key Performance Indicators to tell you when the CSFs are hit, record Metrics which can be used to derive KPIs, and generate feedback as a result, which then helps to set next year’s Goals. Users are increasingly requesting a focus on satisfaction – “I want it fixed, now” – rather than operational targets (we had 10,000 calls last month). Further, reports should be actionable – there is no point in reporting a figure unless it leads to actions for improvement.

This led into a lightning talk about surveying. Lots of people had anecdotal evidence of high levels of apathy – low response rates, mid-range scores, etc. A few success stories were related though. Noel Bruton’s “Random 5” – 5 callbacks per week to users who had tickets closed, to gauge their satisfaction in 3 questions or so – had worked well for one person. Another got a 30% response rate by sending all callers a “less-than-one-minute” survey. The idea of an email with (two or) three HTML buttons – Poor, OK, Good – to all callers won some favour. One University IT department had sent staff out with bright t-shirts and a list of 12 (yes or no style) questions; these staff then took up post near doors to campus cafés etc and asked just one question of anyone walking through – of course most people gave up the answer before they realised that they were being surveyed, and a very high rate of data collection was achieved.

The morning closed with a talk from Edinburgh University about UniDesk. Edinburgh, St Andrews, and the University of Abertay had all been looking to buy new ITSM tools, and thought that doing this together would get better leverage. This led to thinking about sharing the cost of developing ITSM processes, and then the whole hog – a shared service desk. TOPdesk came in as a fourth partner, bringing their ITSM tool and commercial knowledge. From kick-off in June 2008 the whole thing was done in c. 30 months (Nov 2010).

The initial shared service had Incident Management and Request Fulfilment, and extension to incorporate Problem Management, Change & Release Management, and Configuration Management (or at least a CMDB) followed. Sheffield Hallam University have taken this up for their service desk.

UniDesk’s life had not been without lessons to be learned – and many of these seemed to be good advice for our own service desk project. Five things came though:

  1. Keep it simple – no customisations for the initial partnerships makes it easy to maintain, avoids politics, keeps costs down, and makes it much simpler when potential new partners come along. You need to decide not to overengineer it from the outset!
  2. This isn’t going to save the bank – although efficiencies may lead to small savings, these are not generally substantial, especially for the initial group who set it up.
  3. A simple business model is critical. UniDesk agreed a cost per annum, based on JISC bandings of institution size – but then usage was unlimited (i.e. cost is absolutely defined for the service period ahead – “fill y’r boots!”).
  4. Trust is very important. Each institution had to make compromises, adopt aspects of the others, give up customisations of their own. This can only be done successfully if you trust that everyone is in this for mutual benefit.
  5. You need to be rigorous in accounting for costs (e.g. staff time) in the shared service – as resources come from the partner organisations, people will want to question how much they are being charged (the transparency aids the building of trust).

Edinburgh are also using the shared service desk for their Finance and Registry teams. They are getting on well with using the tool, but don’t have the ITIL knowledge to “get” the processes – especially the two-part closure (resolve first, then close with data cleansing).

Innovative Communications was the topic of conversation before lunch. Most people seemed to be focussing on how to engage with students (“lose the suit, grab a hoodie” was the key to success here). Ideas that broke out of the usual moulds included:

  • Poster promenades – IT and Library advertising their wares in departments (cake made these events popular);
  • Open Days – a sort of IT conference for academics and researchers – a tour of the data centre had proven particularly alluring!
  • A roadshow, going round the University to show off new/exciting developments. Short talks were recorded and delivered by video to save key staff from spending loads of time on this;
  • The use of 90-second videos to showcase services and projects had been well received. Several Universities used video as a core part of their communications, and had their own YouTube channel;
  • One University had a robot at their freshers fair – you could ask the robot a question and somehow it would try to provide an answer.

Over lunch I met up with two Relationship Managers from Hull University. This is not part of the student welfare scheme, but the bit of IT that links out across the University to do Business Relationship Management (a bit like Internal Account Management). Their role is to speak to departments about current issues, wishlists, strategies, plans etc, arrange for specialist staff to be available for suitable discussions, and advise on new developments / IT strategy. All engagements / contacts are recorded and reviewed, and the information collected is made widely available across the IT department. The role is tightly integrated with Communications – although Hull are yet to get staff for this. More than one person is needed for this, although with BRM activities spread over several staff, one person could coordinate several relationships. It was seen as key to success that relationships are trunked – i.e. one person oversees / handles relationships with a group of related customers. This helps to join up the customer needs, and fosters sharing / peer support. In terms of internal links and dependencies needed to deliver in this area, Relationship Management falls cleanly into the Customer Services area of IT, but depends a lot on Application and Web Development as these are the areas where most issues seem to crop up.

The afternoon kicked off with a discussion of Heriot-Watt’s reorganisation and merger of IT and Libraries. One of their larger issues was a 10 minute walk between the library and the data centre, and several people who had merged library and IT help desks a few years ago are now separating these functions out again – it seems that students want library help from libraries and IT help from IT.

Next we heard from Dan Batchelor (University of Wolverhampton) who is the outgoing president(?) of the Student Union there. The union had been failing, and he led them to ditch the bar/club/table-footy culture and become a support body for students. Part of this saw them become “the Student Voice” in IT decision making, and they really did seem to have great engagement through formal and informal channels at both individual and organisational levels. This meant that strengths established in one year were not lost when the executive was re-elected for the next. His story was very persuasive and I, amongst others, plan to investigate the potential of our own unions (OUSU in Oxford’s case) to help link with students.

The last formal session of the day was a talk about surviving in changing and challenging times. Paul McGee was certainly an animated and engaging speaker, and many people will doubtless remember some of his sound bites: “The future does not belong to the strongest, but to those most able to adapt (Darwin)”, “Shut Up, Move On”, the receptionist whose name card read “Director of First Impressions”, and the fact that “change makes us uncomfortable”.

Thus ended the second day, giving way to the second evening, the conference dinner, the after dinner talks, and the discussions of service desk and managed desktop issues into the wee hours.