Michael Moulsdale

- Information technology thought leadership

Implementing an IT Governance framework for an organisation with a federated business structure.

25 August, 2014
Case Studies


Due to the size of the organisation (700+ employees), there was a clear need for an IT Governance structure to be implemented as part of the IT Transformation project. In order to meet this need, a number of challenges would need to be taken into account.

  • There was no existing IT Governance structure, either at a high-level strategic business level or at a tactical departmental level. Although this meant there was a lot of work to be done, it also meant that a structure could be designed from scratch to accurately meet the needs of the organisation.
  • Not only was the organisation reasonably large (700+ employees), it was structured in a pseudo-federated manner. As business units were created around independently generated grants, there were no clear business reporting lines. Although these business units were reliant on IT functions, the roles, responsibilities, and linkages were unclear.
  • Mainly due to the federated nature of the organisation, there was no formal business strategy that IT could hook into and be measured against.
  • Although there was an informal business risk register, there was no defined approach to risk or risk appetite, and no systematic method of managing risk.
  • IT had no monitoring systems, or even management applications, in place to create metrics about the IT environment.


When looking at the problem and the set of challenges, this at first appeared to be a complex exercise, with the federated nature of the organisation posing some immediate challenges on how to create a governance group. The answer, as with most things, was to start small and simple and work from there.

  • An informal IT steering committee was created that included the Executive Director, the COO, and me as Head of IT. The Executive Director was involved to a large extent at the beginning, but then took a back seat for a large part of the project, until near the end. This was not ideal, but was the best way to manage things given the nature of the situation.
  • The organisation had a monthly forum that was attended by all stakeholders, at which the various operations and business groups presented their current situation. The meeting was far too large to conduct a meaningful discussion; however, I used it as a mechanism to present what was going on within IT. The reports started off as mainly descriptive, but included better information as the project progressed. The primary message, from the beginning and throughout, was that IT was transparent and happy to share successes and failures. This therefore became an IT Governance reporting group, but without any strong feedback.
  • The first data-related activities were to create a risk register and a problem log, both of which were made available to be read by the entire organisation. As there was no organisational risk strategy, the register was initially created by taking a common-sense approach to identifying the major risks within IT that would have an effect on the business. The problem register was created by looking at all recurring incidents, areas that were causing multiple incidents, and items that were inhibiting the use and advancement of IT services. This register was updated on a weekly basis and published on a monthly basis.
  • Next, an ITIL environment was implemented to address the tactical departmental governance deficiencies. This started with an Excel-based change control solution that augmented a basic ticketing system. These were quickly replaced by an ITIL-compliant help desk solution that included incident management, change management, problem management, and asset management. This system then enabled IT to better analyse what was going on within IT and among its customers, and to address any issues appropriately.
  • In order to better manage and understand the environment, and to provide better information about it, a suite of enterprise monitoring systems was implemented. Once these were in place, a series of database scripts was created to provide high-level real-time reports on the status of the environment (a sketch of this kind of script follows this list). These data were added to the monthly governance reports.
  • As the environment became more sophisticated and stable, an emphasis was placed on IT security governance. The approach was to apply a governance structure based on ISO 27001, which therefore included a more systematic approach towards risk management. This was implemented over a six-month period, with more detailed reporting generated. With these activities in place, an IT Security Council was formed that included the COO, the organisation’s regulatory officer, another member of IT, and the chair of the organisation’s Data Governance Committee.
  • The Data Governance Committee was an initiative started jointly between IT, Data Statistics, and the Executive Director. This committee included representatives from each of the federated groups.
  • A programme office was created that was largely based around PRINCE2, but also included an Agile methodology for the development team. Initially these frameworks were implemented using Excel as a reporting tool; this evolved into the Redmine project management system, and was finally transformed into a SharePoint environment.
  • Finally, the subject of an IT Steering Committee was re-addressed. As the organisation was going through a number of organisational changes, it was not feasible to create a single committee that would be sustainable. Instead, an approach of five separate monthly meetings with business heads was adopted, which ensured that the needs of the various business units were being met and that all members of the organisation understood the status of IT and where it was heading.
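To give a flavour of the reporting scripts mentioned above, here is a minimal sketch of the kind of query involved. It assumes the monitoring suite writes its results into a MySQL database; the host, table, and column names are hypothetical rather than the actual system.

```ruby
#!/usr/bin/env ruby
# Minimal sketch of a high-level status report. Assumes the monitoring
# suite stores check results in MySQL; the 'monitoring' database, 'checks'
# table, and 'status'/'checked_at' columns are hypothetical names.
require 'mysql2'

client = Mysql2::Client.new(
  host:     'monitoring-db',
  username: 'report',
  password: ENV['REPORT_DB_PASSWORD'],
  database: 'monitoring'
)

# Count checks by status over the last 24 hours, e.g. OK / WARNING / CRITICAL.
results = client.query(<<-SQL)
  SELECT status, COUNT(*) AS total
  FROM checks
  WHERE checked_at > NOW() - INTERVAL 1 DAY
  GROUP BY status
SQL

results.each do |row|
  puts format('%-10s %5d', row['status'], row['total'])
end
```

A script along these lines can be run from cron, with its output dropped straight into the monthly governance report.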


At this point the transformation project was coming to an end and a permanent Head of IT was employed. During the handover process it was reasonably easy to hand over a fully developed IT Governance structure that included detailed documentation and was completely transparent. The piece that was missing was a more coherent IT Steering Committee, and this would become one of the new Head’s primary challenges.

Cyber Essentials, a good thing

8 August, 2014
Notes from the field

Over the past 10 years I have provided Interim IT Director services to a wide variety of organisations. These have ranged from start-ups, to small 100-person organisations, to larger 700+ employee organisations. A common theme amongst all of these organisations has been the lack of a systematic approach to information security. For the larger organisations the simplest approach has been to look towards ISO 27001. However, this is a bit daunting, and in many cases unnecessary, for the smaller organisations, who still need an approach that can be recognised. Furthermore, when contacting suppliers and asking about their approach to security, the response has been either guarded or downright hostile. So, all in all, a massive problem, both in terms of actual risk and of the perception of security within business.

It’s for these reasons that I am a big fan of the UK Government’s Cyber Essentials scheme. Under this scheme, companies are required to satisfy a number of criteria with regard to their approach to IT security. This can either be self-checked to gain the basic accreditation or verified by an external auditor for the Cyber Essentials Plus badge.

Having looked through the audit questions, I believe the criteria are set at a sensible level, so that small companies will not see this as unnecessary bureaucracy, while at the same time it instils some really good IT security practices. I also believe that in carrying out the exercise, directors of small companies will get a good understanding of the kinds of things their organisations should be doing. The badge is also appropriate for larger organisations, and will lead them nicely into ISO 27001 if they want to go down that route.

So, to my mind, an excellent initiative by the Government, and one that all companies in the UK should be paying careful attention to.



The Importance of Documenting Transition Programmes

8 July, 2014
Notes from the field

Although this seems an obvious thing to say, it is something that is often overlooked, so I’m going to talk briefly about the importance of documenting any major project, and especially a large transition programme. By this I do not mean documenting the various components of the programme, but documenting what happened during the process itself.

Start with a baseline

When conducting any technical improvement project we would always take a baseline of how things are performing in order to fully understand the improvements that have been implemented. In the case of a large transition project I would advocate a baseline approach that goes much further than pure technical performance counters. The areas I would consider documenting include the following:

  • Users’ perception of the system
  • The quality of the system
  • How things visually looked
  • What the staffing levels and skills matrix looked like
  • The capabilities of the system
  • The process clients had to go through to achieve certain activities
  • The time and cost of achieving various activities

Document the process

Working on the premise that people have short memories, I further believe it is important to document the process of the project. Again, I think this should go further than the standard project management documentation that is needed to adequately manage and report on the project. It is important to get to the heart of how the project proceeded and the people who were involved. The obvious solution is to keep a diary of the project, being careful what you write in there. In addition, using the topics from the baseline, create stage reports of how things have changed from that baseline.

A Picture is Worth a Thousand Words

As the old, often misquoted, saying goes, a picture is worth a thousand words. I would add that photographs, diagrams, and video snippets are essential components of documenting the transition process. Not only will they bring back memories more vividly, they will probably also raise a few laughs.

So what is it for

OK, so you’ve created a baseline, kept a video diary, and photographed key moments; what is it all for? From a pure project management perspective it will provide an extra way of presenting information in the project closure report and lessons learned. However, these data are much more valuable than the dry aspects of the project management process.

The information can also be used in the following ways:

  • Publish it on your intranet to tell the story of the project to your non-IT colleagues
  • Have a way to remind people of how things used to be
  • Put together a presentation for all the project team members to remind them of the work they have done, and thereby show your appreciation.


So before you start your next transition project, start documenting!

Creating the dive management website

Notes from the field

A little over two years ago I had a conversation with a local dive shop owner, who commented that he was not happy with his current online database. From that conversation I came up with the concept of divemanagement.com, which is now available to scuba diving centres around the world.

Initially the brief was to create a database that could be used by the dive centre’s management team to carry out the following activities: manage customer details, manage customer orders and invoices, manage dive trips, manage courses, and manage dive equipment, including maintenance. As we discussed the criteria and how the system should work, it quickly became obvious that what needed to be a simple system from an end-user point of view was going to be quite complex from a database and coding point of view. What’s more, the owner wanted the database to be accessible by all his staff, who were based in three locations at least 100km apart from each other. We therefore determined that this needed to be an online database with a secure and easy-to-use interface.
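As an illustration of the shape this took, here is a minimal sketch of the kind of ActiveRecord data model the brief implies. The model and association names are my own illustration, not the production schema.

```ruby
# A minimal sketch of the data model implied by the brief; model and
# association names are illustrative, not the production schema.
class Customer < ActiveRecord::Base
  has_many :orders
  has_many :bookings
  has_many :dive_trips, through: :bookings
end

class Order < ActiveRecord::Base
  belongs_to :customer
  has_many :invoices
end

class Invoice < ActiveRecord::Base
  belongs_to :order
end

class DiveTrip < ActiveRecord::Base
  has_many :bookings
  has_many :customers, through: :bookings
end

class Booking < ActiveRecord::Base   # join model: a customer on a trip
  belongs_to :customer
  belongs_to :dive_trip
end

class EquipmentItem < ActiveRecord::Base
  has_many :maintenance_records      # servicing history for each item
end

class MaintenanceRecord < ActiveRecord::Base
  belongs_to :equipment_item
end
```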

Looking at the amount of work this was going to take, we evolved the idea into creating a system where any dive shop around the world could sign up for the service and have a personalised interface to carry out the same activities as the owner I had been talking to. This meant an extra layer of complexity for the application, but not that much in comparison to the total complexity of the application.

We therefore spent several weeks working through user stories and usage scenarios to fully understand how the system needed to work, which aspects were idiosyncrasies of this local dive centre, and which were things that all dive centres would need. To ensure the best feedback, I interviewed a number of divemasters who had worked all over the world, taking them through each of the scenarios and getting their feedback on what was too complex, what needed to be added, and what was a local oddity.

From here a database design was created and the job of architecting the application began.

Due to previous experience I decided to use Ruby on Rails (v3.x), making use of the Acts as Tenant gem for the multi-tenant aspect of the application. In addition, I made the decision to follow a Test-Driven Development paradigm from the start. With these decisions made, I started to create basic usage scenarios and began developing the application in an Agile way. These approaches gave us confidence in code quality and application functionality as the development progressed, and through regular test sessions with the owner and his team we were able to tweak the application as it went along to ensure it met all of their requirements.
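For those interested in the multi-tenancy mechanics, the sketch below shows roughly how the Acts as Tenant gem gets wired in. The model names and the subdomain-based lookup are my illustration of the approach, not the application’s actual code.

```ruby
# Gemfile: gem 'acts_as_tenant'
# A rough sketch of wiring up Acts as Tenant (Rails 3.x era); the model
# names and subdomain lookup are illustrative, not the app's actual code.

class DiveCentre < ActiveRecord::Base   # the tenant
  has_many :customers
end

class Customer < ActiveRecord::Base
  acts_as_tenant(:dive_centre)  # adds belongs_to :dive_centre and scopes
                                # every query to the current tenant
end

class ApplicationController < ActionController::Base
  set_current_tenant_through_filter
  before_filter :set_tenant     # before_filter, as this was Rails 3

  private

  def set_tenant
    # Resolve the tenant from the subdomain, e.g. bluewater.divemanagement.com
    set_current_tenant(DiveCentre.find_by_subdomain!(request.subdomain))
  end
end
```

With something like this in place, `Customer.all` inside a request only ever returns the signed-in dive centre’s records, which is what makes a single application safe to share between shops.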

In late 2013 the application was finally complete and handed over to the owner for use. As the application was used in production by his team, I was able to further enhance it as time went on. When all of the payment processing components were finally completed in mid-2014, the application went live to the general public.

Now that it is live, I am in continuous upgrade mode, adding features that have been requested both during the development process and by new customers.

So that is the very brief history of how divemanagement.com went live and is now being used.

The need to teach your steering committee

3 July, 2014
Notes from the field

As project, programme, and portfolio managers we have all been taught the need to communicate effectively with our stakeholders and, most importantly, with our steering committee. Moreover, we have learned that no matter how interested and engaged that steering committee is, they are generally, quite rightly, from the business side of things, and therefore have no time for jargon or techno-babble.

We have therefore learned to write reports that stick to high-level project progress, written in a way that all members of the steering committee can understand, where possible using business language. And this is definitely how I would advocate reporting on projects if you want to keep your steering committee sweet.

However, in a recent project that involved the rewrite of a business-critical software solution on a different database and software development platform, I saw the need for something slightly different.

The problem I saw was that the steering committee didn’t have even a basic understanding of what a database is, and, more importantly, of the difference in technical terms between the database being migrated away from and the database being migrated to. In normal circumstances I would say this is not a problem, and should be managed by the project manager.

However, as the stakeholder group were technologists, just not computer technologists, they gave the impression that some things were understood and that they should be able to input into how some things were being done. This of course led to a number of issues where basic concepts were not fully understood, but arguments and decisions were being made on that perceived understanding. The more the project progressed, the less the stakeholder group were able to admit that, actually, they didn’t really know what was being discussed, even though they had championed various positions.

What became clear to me at the end of the project is that training should have been carried out on some of the technical principles that would be important throughout the project. Although this is obvious for this very specific project, I feel it would be relevant in a number of projects with less technical people on the steering committee.

I would therefore advocate the following:

  1. Carry out a one-hour session at the beginning of the project to expose the steering committee to the concepts and terminology that will be important throughout the project
  2. Create a one-page summary sheet of that training session
  3. If the project is longer than three months, carry out refresher sessions where necessary. To help save face, call these ‘additional terminology sessions’, but go over previously discussed concepts

However, this is not an excuse to talk geek-speak to your steering committee. Don’t get carried away: they are still non-technical, and this is not a licence to litter steering committee meetings with jargon.

Sony announces a 185TB cartridge, but who’s going to use it?

5 May, 2014
Notes from the field

Sony has announced that they have developed a cartridge capable of storing 185TB of data: http://www.bbc.com/news/technology-27282732.

On the face of it this suggests that backing up to tape is not quite dead yet, which will be a welcome relief to those organisations that are required to back up to tape by their regulators. But is this really the way to go? With modern disk systems coming down in price by the hour, and block-level backups allowing multi-terabyte systems to be backed up in seconds, why would you want to back up to tape, which will take many hours or days to complete?

It may be that you have an archive solution on disk and want to back up a monthly copy of the archive to tape. This would allow for very fast backup of everything within that month using your block-level backup and restore solution, while also keeping a permanent full version of your month-end backups.

My other thought is that, with so much data on one cartridge, I hope they have improved the reliability of the tapes in terms of reading from old tapes. The last time I used tape backup was back in 2004, and when we recovered from tape there were many instances when the tape could not be read. And these were tapes that had been written using state-of-the-art backup systems and stored offsite in climatically (and electromagnetically) controlled environments.

So well done on achieving a remarkable technical breakthrough, but who is going to use it and how?

Windows 8.1, It’s not (all) about the Windows Button

19 January, 2014
Notes from the field

I’ve been using Windows 8.1 for some time now, so it is worth writing about this much-anticipated and somewhat controversial update to Microsoft’s desktop operating system.

Windows 8 was released in late 2012 to much hype and a fair amount of gnashing of teeth. As I have written before, I felt that Microsoft had made a bold decision to create an operating system that would work equally well on a traditional notebook / desktop computer and on tablet computers. For the most part I felt they had done a great job, and after a couple of days of transition I really enjoyed the interface. However, in using the system, something wasn’t quite right. This wasn’t because of some major design flaw, and it was certainly not in the same category as ME or Vista, but something was off. The press were a little more forthright, and the focus of the complaints was the removal of the Windows button. Although there was a sense that a lot of the criticism was coming from the Mac fanboys, the feeling that the operating system wasn’t any good did seem to seep into the consciousness of many people.

Microsoft therefore announced that they had heard and understood this criticism and would be releasing version 8.1. So in late 2013 Windows 8.1 was released to the world. Yes, it had a Windows button, but the functionality was very different from the button in previous versions. In 8.1 it takes you to the tablet view so that you can choose an application to run in a more visual way. Right-clicking on the Windows button allows you to select more advanced features that were a bit of a pain to access in the original version, so in that respect I am very grateful for the re-appearance of the button.

However, Windows 8.1 is much, much more than the introduction of a button. With Windows 8.1, it feels as though the entire operating system has had a tune-up: it feels much tighter than the original version, and things just work. Below are the areas that have been addressed.

Microsoft still want you to use the search button to find all of your applications, tools, documents, music, videos, and utilities, as well as to search the net. It now works much better, and you have a greater level of confidence that things are working. This is a nod to how the world seems to want to use computers: search rather than browse to a folder.

The Windows Store now actually works. Before, it always seemed a bit hit and miss whether the store would work, and finding applications was a painful experience. In 8.1 the store has its own search box, and finding applications is very easy. It’s a shame, as this seems a pretty obvious thing to get right, but in 8.0 it was far from right.

The music app is now much more feature-rich, and if you have a Windows Phone, an Xbox, or use Microsoft’s cloud, you find yourself in a highly connected environment where sharing music across all your devices is incredibly easy.

In 8.0 there was the choice of installing either the Windows App or the Desktop version of SkyDrive, or both. What’s more, you had little control over what you were downloading. The multiple versions also caused the behaviour to be at best clumsy and sometimes woefully erratic, with files being re-downloaded many times. In 8.1 there is just the app version of SkyDrive, which is now easy to configure and provides a much higher level of confidence in what it is doing. Again, if you buy into the ecosystem concept, using SkyDrive in a Windows environment provides a whole new level of integration.

With regard to tablets, there have been a number of incremental changes that make using a tablet more of a joy. Although I liked using Windows 8.0, I always had a nagging feeling it was not quite as well put together as an iPad. With 8.1 the whole experience is much improved, and you feel that the operating system was built for the tablet. With these changes I can see tablets being genuine contenders as laptop replacements for people using all but the most intensive of desktop applications.

One of the more exciting changes is that the operating system has now been optimised for use on 8″ tablets, so that they may compete with the iPad mini and Samsung Note. My initial experience with the Lenovo 8″ tablet is extremely good. However, I’m still struggling to work out how to restrict the number of devices I need when there is the option of a 4″ Windows Phone, 6″ Windows Phone, 8″ Windows tablet, 11″ Windows tablet, or full notebook computer. I’ll leave thoughts on that knotty problem for another time.

So all in all I’d say Windows 8.1 is a massive upgrade from 8.0, but in lots of small ways, so that the whole is greater than the sum of the parts. It’s just a shame these changes were not there from day one.

Post acquisition integration case study for two email systems with a total of 11,500 mailboxes

Case Studies


My team were briefed by the CEO that the firm was in the final stages of acquiring another firm. On the day the news was to be released, both email systems had to be fully integrated, but we were not allowed to contact the other firm’s IT team until 45 days before the completion date, and the deal was to be presented as a merger, not an outright takeover. This raised the following challenges.

  • There was no advance knowledge of the other firm’s email system in terms of technology or deployment methodology
  • Once we were allowed to talk with the other firm, we would have to present a solution in a way that would foster engagement rather than hostility.
  • There would be very little time for testing and planning once we were officially allowed to communicate with the other firm
  • We would be dependent on a reliable network link to be able to carry out the integration.


Given these challenges and a task that had to be completed within two months, it was clear that decisive action was needed.

  • Through some unobtrusive, indirect methods we were able to determine the technology being used by the other firm, and from the number of employees stated on their website we were able to make a reasonable estimate of the size of their environment. Using these parameters, my team carried out a number of tests to determine the best way to integrate the two systems. Once we had a good idea of how the migration was to take place, extra hardware was procured to help with the process.
  • Once communications were allowed, we exchanged architecture diagrams and set up a meeting at the other firm’s offices. At the same time a network link was commissioned that had a 30-day lead time. With the extra information to hand, we fine-tuned our tests and prepared a project plan covering how the migration would take place and how it would affect day-to-day activities.
  • A series of meetings was held with the other firm’s IT department, where our plans were presented and approved. We therefore quickly moved into fine-tuning the plan and gathering more information about the other environment. The tests were then re-run with the other team, using near-production data.
  • The plan was for a big-bang approach, but with an easy back-out route. The acquired firm’s email system would be migrated into our system and then tested. As both systems were Microsoft Exchange, and the client application in both cases was Outlook, no configuration changes were needed on the client side.
  • Contracts were finalised late on Friday, at which point the acquired company’s email system was shut down and the migration was initiated. All databases were fully migrated and tested by 4pm on Sunday afternoon. When their employees returned to work on Monday, and the announcement had been made, the two systems were fully integrated.
  • The only issue encountered was that the permissions on a few public folders had been corrupted and had to be resolved manually; this was completed by Tuesday afternoon.


Thanks to two highly professional teams and a large amount of preparation, the two email systems were integrated in one weekend. This allowed company-wide emails announcing the success of the merger to be sent by the CEO on Monday morning.

Developing Rails apps under Windows. No, really

16 October, 2013
Notes from the field

Common wisdom states that if you are going to develop a Ruby on Rails app then you should probably be doing it on a Mac. And there is good reason for this wisdom. First, Rails is built into OS X, it is easy to upgrade, and installing MySQL into the UNIX subsystem is also a doddle. Once you have all of that working, running rails or rake commands from the shell runs Rails natively, and so everything works ever so easily. Add to that the awesome TextMate application for editing your Rails app, and there are very good reasons to use a Mac. But there’s more: installing Rails on a Windows machine is best described as a hack, and I have never been confident that things are totally working the way they should.

About two years ago I decided to make the transition back from OS X to Windows. This was largely to do with the machines I was using at work; however, with the launch of Windows 8, and now 8.1, I see Windows as a beautiful OS with some very powerful linkages to Microsoft’s cloud-based ecosystems. So what to do?

Well, in truth it is a bit of a cheat, but if you read on, I’ll argue that the same cheat should be adopted by the Mac community.

The simple answer is to use VirtualBox. VirtualBox is free desktop virtualisation software that allows other operating systems to be installed within it and run at the same time as the host operating system. In this case I use the excellent Bitnami Ubuntu Linux image, which comes with a complete Rails and MySQL stack prebuilt, so with that you are good to go. One caveat: if you are really pushing the boundaries and using a Windows tablet, these machines currently only support 32-bit operating systems (including virtual operating systems), so you won’t be able to use the prebuilt Bitnami stack. However, you can use a standard 32-bit Ubuntu distro and then add the Rails stack yourself.

What I then do is create a shared folder on the host machine that the Linux virtual machine connects to; the VirtualBox documentation explains how to do this very nicely. At this point you have your Rails server running on Linux, but with the code on your Windows machine, so you can use your favourite application to edit it (in my case Sublime Text 2). Moreover, you can either keep that directory within your SkyDrive / Dropbox folder (careful to purge your log files though!) and so have a ‘real-time’ backup, or use the GitHub GUI to sync the files to GitHub.
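On the log file point: Rails already ships with a `rake log:clear` task that truncates everything under log/, and if you want something to run before a sync, a tiny standalone task along these lines would do (the task name and glob are my own illustration):

```ruby
# lib/tasks/purge_logs.rake -- a small sketch; the task name and glob are
# my illustration. Truncates log files so SkyDrive/Dropbox doesn't keep
# re-syncing multi-megabyte logs.
namespace :logs do
  desc 'Truncate all Rails log files in place'
  task :purge do
    Dir.glob('log/*.log').each do |file|
      File.truncate(file, 0)   # empty the file without deleting it
    end
  end
end
```

Run it with `rake logs:purge` from the app root before the folder syncs.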

You can then use an application such as PuTTY or SecureCRT (I prefer the latter as a more feature-rich application) to create an SSH session to your Linux server. You now have a shell and can carry out all those groovy rails and rake commands. Installing SQLyog then allows you to fully control your database.

So there you have it: you are running Rails in an OS it was meant to run in, but controlling it from an environment you are comfortable with.

I’d actually take this one step further and advocate that Mac users should also use this method to develop their Rails apps, albeit within OS X. The reason for this is twofold.

  1. The Mac UNIX subsystem is most likely not exactly the same as the server that will run the production code, so you will find occasions when the code runs fine in your dev environment but raises errors in production. However, if you build your dev Linux server to be of the same type as your production environment, that will mitigate the issue of differing platforms.
  2. As the Mac UNIX subsystem is used for all of the Mac’s functionality, there is the chance that by installing other apps on the Mac, or carrying out activities within UNIX that have nothing to do with Rails, you may mess up your Rails dev environment. Having that dev environment within a VM mitigates any risk to it.

I have now been developing in this mode for the past 18 months and am really enjoying it. So there you have it: how to develop Rails apps in Windows, and how the Mac crowd should actually be using their Macs to develop Rails apps.

Organisational change case study for a distance learning college with 3,000 students

13 September, 2013
Case Studies


After an initial assessment of the department, its capabilities, and its relationship with the wider college, the following issues were identified.

  • As the college grew, the Operations department had taken on more and more responsibilities, to the point where the mix of disciplines was not appropriate and was often misaligned with the core goals of the department
  • There was an organisational malaise, as there was a strong ‘them and us’ culture between academic and operational staff that was not healthy for the challenges ahead
  • Due to a previous cycle of underfunding, there was not an appropriate level of resources (personnel or equipment) to run the department to the standard that the college required
  • There was a poor office layout that reinforced the ‘them and us’ culture. Academics were on the first floor in spacious personal offices; operations were on the ground floor in noisy, poorly appointed offices. The way these offices were separated also created divides within the operations teams.


Given these challenges and a task that had to be completed within nine months, it was clear that decisive action was needed. As with all organisational change projects, success would largely be determined by how engaged those affected would be, and how much resistance to change there was going to be.

  • The first action was to propose an eventual structure and organisational change plan to the executive board, along with a plan of how those changes would be carried out. Once that structure and plan were approved, the following changes were made.
  • The sales and student service teams were moved to the business development department. This provided a department that was wholly engaged in the act of attracting and retaining students.
  • A new department was created that included the exams and assignments teams. Again, this created a department that was wholly engaged in managing the academic progression of students and was closely aligned with the academic side of the organisation.
  • The printing services team was moved under the control of the finance director. At first this seemed a somewhat odd decision; however, due to the heavy interaction between finance and that team, it ended up being another well-aligned team, focused on its goal of sending printed materials out to students.
  • What was left were the IT support, IT development, graphic design, and instructional design teams. As a major part of the new structure was to implement a virtual learning environment that would be supported by the two IT teams, and looking at the overall goals of these teams, the department was rebranded Knowledge Presentation. As part of this rebranding, those teams were brought together to look at what their own goals were and to create a mission and vision. As they were actively part of this work, there was a high level of buy-in.
  • Office space was re-aligned to provide better communication between the Knowledge Presentation teams
  • New roles were created within the two IT teams and the instructional design team; these roles added strength and depth to the teams, which then helped accelerate the other activities that were required as part of the overall engagement and the long-term goals of the College’s Principal.
  • Finally, a permanent head of department was recruited. As the restructure had been all but completed by that time, and there was therefore a very well-defined department, this was a much simpler exercise than it would have been at the start of the engagement.


As with all organisational changes, there was initial scepticism and resistance to change. However, as there was continuous conversation and delivery on promises throughout the project, the outcomes were overwhelmingly positive. Some specifics included:

  • An energised team that was focused on the goal of helping the college increase its student numbers
  • A high level of quality in the services provided to the college
  • A departmental structure that was able to implement and support the new virtual learning environment
  • Better relationships with academic staff due to better-defined roles, responsibilities, and expectations.


This was another clear case of the challenges and potential outcomes of large organisational projects. An external perspective can often provide clarity on what needs to be done, and can gain rapid acceptance from the executive board. There is always scepticism and resistance to change from the people on the ground. However, if they are included in the conversation throughout the process, treated with respect, given clear roles and responsibilities, and then given the space and resources to do their job successfully, there can be some excellent results.
