20 December 2018

How to speak with customer-facing colleagues


"Farming looks mighty easy when your plow is a pencil and you're a thousand miles from the corn field." 
Dwight D. Eisenhower


One of my favourite things to do in a service improvement role is spending time with customer-facing colleagues (by 'customer', I could also mean citizen, service user, client, patient, etc.)  I used to work in customer-facing roles myself, and miss being able to make an immediate difference.  I also find it incredibly useful for understanding how a service can be improved.

At the same time, I appreciate that some people don't enjoy the prospect of going to the coalface.

In this post, I'm going to give some reasons why you absolutely should spend time with customer-facing colleagues.  I'm also going to offer some advice and a few questions you could ask your colleagues.  These questions have worked really well for me, from a service improvement perspective.

I'm hoping that, by sharing this advice, some of you become a little less reluctant to go to the front-line, and feel encouraged to give it a go.  If you already spend time with customer-facing staff, reading on may help you get even more from the experience.

Why do it?


If you're not a customer-facing worker – for example, if you're a manager or work in a support role (IT, HR, Policy, somewhere in the 'corporate centre', etc.) – I'd say it's essential you spend regular time at the front-line.  I'd also recommend doing it if you are a customer-facing worker and want to learn more about another team.  Why?  Here are just a few reasons:
  • The legendary business writer Tom Peters has referred to a conversation he had with Howard Schultz,  former CEO of Starbucks*.  It's a huge organisation, yet despite the inevitable bureaucracy, hierarchy, meetings, and reports, Schultz says he visits at least 25 Starbucks stores a week.  He apparently said that "...we still sell one cup at a time, one customer at a time, one server at a time.  I need to see it, touch it, and feel it".  Peters calls this MBWA (Managing By Wandering Around).  If you're from a Lean background, you might call this the equivalent of doing a gemba walk or genchi genbutsu.
  • As the picture below says better than words could, your customer-facing colleagues are the ones who have the greatest knowledge of what matters to the customer, and what gets in the way of helping the customer.  They often have the best insights into how you can improve service.  I always find colleagues respond well when I spend time with them and seek their views.  They especially love it when I bring a leader from the organisation along with me – they appreciate the leader taking them seriously, and the leader becomes more highly respected as a result.  If you're not getting these insights directly, you're missing big opportunities to improve your organisation and build important relationships with your colleagues.  

  • Human Factors expert Steven Shorrock writes in this superb blog post about the 'four varieties of human work'.  I've summarised and simplified these in the quadrants below.  Work as imagined is what you think people do.  Work as prescribed is what the policy or procedure says people should do.  Work as disclosed is what people say about their work.  None of these represent work as done – what Shorrock describes as "the most important yet most neglected variety of human work".  And you're not going to get anywhere near to work as done by reading reports, staff survey results, process maps, or fancy charts.  You need to spend some time 'in the work' with your customer-facing colleagues.  

Questions to ask customer-facing colleagues


The following questions have worked really well for me, from a service improvement perspective.  I initially borrowed and adapted some questions recommended by the folks at Vanguard.  When I ask these questions, people tend to really engage, and their answers give lots of information and clues about how and where the service could improve.

  • Where does your work come from, and how fit for purpose is it when it arrives?  This one is good to ask people who receive work from someone else.  Referrals, for example.  It gives you some clues about the quality of work they receive, and how much rework they have to do to get it right.
  • Can you show me what you do with the work when it arrives?  As a general principle, it's best to aim for 'show' rather than 'tell'.  This gets you closer to work as done, instead of work as disclosed (see above).
  • Why do you do it this way?  The answers may reveal the rules, policies, and procedures that staff are expected to follow.  Or they may reveal workarounds people use to get around constraints.
  • What gets in the way of you doing a good job?  If you're only going to ask one question, ask this one.  I find people really open up to this one, pointing out many areas where they are hindered in doing what matters to the customer.  Keep asking "what else?" to learn even more.  
  • What wastes your time?  Similar to the last one.  I often find I start to learn more about the type and frequency of failure demand in the answers to this one.
  • How much, and how often?  A supplementary question for any of the above.  This helps you to start to quantify the waste in the service.  For example, I once found that around 60% of referrals received by one team needed to be reworked.  
  • How do you know if you are doing a good job? You'll often find out here if people are clear about purpose, or (linked to the previous question) if the focus is instead on trying to make the numbers look good.
  • How does change happen around here?  This will tell you how people feel, in terms of being involved and engaged with change.   

I recommend trying to ask these questions naturally in conversation rather than looking like you are coming around with a clipboard and a survey.  It may be worth committing a few of them to memory first.  

And most importantly, when you ask these questions, listen!  As my friend Jo Gibson says, "give people a damn good listening to".  Remember you're here to learn, not to judge or defend current ways of working.  As Dale Carnegie wrote, "become genuinely interested in other people".  And demonstrate what Amy Edmondson calls 'situational humility': make sure people know that you don't think you have all the answers, be curious, and emphasise that we can always learn more.

Over to you...


What are your thoughts?  What other reasons are there to spend time with customer-facing colleagues?  What can put people off doing it?  And what do you think of the questions I suggested?

Please feel free to comment below, or share this on Twitter or LinkedIn, or with someone who might appreciate it.


*Yes, I know Starbucks don't pay enough tax and put too much sugar in their drinks.  That's not the point here.  The point is that the CEO of an undeniably successful and huge organisation found it essential to regularly spend time with customer-facing colleagues and literally 'smell the coffee'.  

31 August 2018

Do 40 points mean your football team is safe from relegation?

This post is about football (soccer) and variation of data over time.  I hope you're a fan of both!  If you're not, you're still likely to find at least some of what follows useful in the workplace.

In the English Premier League, it's a commonly held assumption that when a team reaches 40 points they will be safe from getting relegated to the next league down.

I recently read an article from the Guardian newspaper that challenged this belief.  It suggested around 36 points would mean safety.  If you're not familiar with this 'rule of thumb', I recommend reading the article before continuing.  You'll find it here.

Just like at work, when people started throwing single numbers around, I began to wonder:  "how have they come up with that number?"  "have they understood the variation?" and "what would it tell us if we put it in a control chart?"  I did that, and here's what it looks like:

[Control chart: points needed to avoid Premier League relegation, by season]
As you can see, it's a type of line chart, and I've plotted it as a time series.  Each dot shows the points needed to escape relegation (one point more than the relegated team) each season.

Like just about any data, there is variation.  In some seasons more points would have been needed than in others.  The three coloured lines on the chart help us make sense of that variation.

Average line


The red line in the middle is the mean average.  Although it's useful, there are problems when people only report on the average.  In my experience, the average becomes the number.  This is how they came up with 36 points in the Guardian article.  But the average doesn't take into account variation.

For example, what's the average of 1 and 19?  And what's the average of 9 and 11?  The answer to both is 10, but the average doesn't tell us how much variation there is.  9 and 11 are much closer together than 1 and 19, but the average alone hides this information.   The same point is nicely made with this picture.
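If you want to see this for yourself, a couple of lines of Python make the point – both pairs share a mean of 10, but their spread (here measured as standard deviation) is very different:

```python
from statistics import mean, stdev

a = [1, 19]
b = [9, 11]

# Both pairs have the same mean average of 10...
print(mean(a), mean(b))

# ...but the spread around that average is very different
print(stdev(a), stdev(b))
```

Report only the mean and the two pairs look identical; report the spread as well and the difference is obvious.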

An average of 36 points (or 37 in my calculation) tells us little on its own about how many points will be needed to avoid relegation in any given season.  Roughly half the time more than 36 points will be needed, and the other half fewer will be needed.

Upper limit and lower limit lines


These are sometimes called the UCL (upper control limit) and LCL (lower control limit).

Data points going up and down between these lines are 'common cause' variation – the normal changes in the points needed for safety between seasons.  You shouldn't pay too much attention to the differences between these points.  They represent the 'noise' in the data.  The 43 points needed in 2002-03 is just as likely as the 31 points needed in 2009-10.

A mistake I often see people make is to pay attention to common cause variation, and act as if something out of the ordinary has happened when it hasn't.

These lines also help with prediction.  If you want to be confident your Premier League team avoids relegation this coming season, 44 points should be enough.  And as long as you've got 29 points you've still got a chance.  Anything less than that and you're most likely going down to the next division.
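For anyone who wants to build a chart like this themselves, here's a minimal sketch of the usual XmR (individuals chart) calculation – the mean line, plus limits at 2.66 times the average moving range.  The season data below is made up for illustration, not the actual figures behind the chart above:

```python
# A minimal sketch of the calculation behind an XmR (individuals)
# control chart, using the standard 2.66 scaling constant.
def control_limits(points):
    centre = sum(points) / len(points)  # the mean (red) line
    # Moving ranges: absolute differences between consecutive points
    moving_ranges = [abs(b - a) for a, b in zip(points, points[1:])]
    average_mr = sum(moving_ranges) / len(moving_ranges)
    ucl = centre + 2.66 * average_mr    # upper control limit
    lcl = centre - 2.66 * average_mr    # lower control limit
    return lcl, centre, ucl

# Made-up season points totals, for illustration only
season_points = [43, 40, 38, 35, 36, 41, 34, 37, 36, 39]
lcl, centre, ucl = control_limits(season_points)
print(f"LCL {lcl:.1f}, mean {centre:.1f}, UCL {ucl:.1f}")
```

Plot the points in time order with those three lines overlaid and you have a basic control chart.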

You might have spotted a couple of data points – 1992-93 and 1994-95 – above these lines.  That's what's called 'special cause' or 'assignable cause' variation.  It's a signal that something is different.  How you react to this would be different to how you'd react to common cause variation.  With special causes you'd ask "what's different?" or "why the change?"

In this case, there was a change.  For the first three seasons on the chart, the league was made up of 22 teams.  After that, it went down to 20.  From 1995-96 onwards, teams have been playing fewer games and therefore accumulating fewer points.

To show the change, the chart should really look like this:

[The same chart, split into two sections: the 22-team seasons up to 1994-95, and the 20-team seasons from 1995-96 onwards]
Here are some handy 'rules' for spotting other signals in control charts.  This is where the average line becomes particularly useful.
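As an illustration, one of those rules – a long run of consecutive points on the same side of the average – is easy to check in code.  This is only a sketch; sources differ on the exact threshold, and eight is just one commonly quoted figure:

```python
# A sketch of one common run rule: eight consecutive points on the
# same side of the average suggest a sustained shift in the data.
def has_run_of_eight(points):
    centre = sum(points) / len(points)
    run, last_side = 0, 0
    for p in points:
        # Which side of the average is this point on? (0 = exactly on it)
        side = 1 if p > centre else -1 if p < centre else 0
        if side != 0 and side == last_side:
            run += 1
        else:
            run = 1 if side != 0 else 0
        last_side = side
        if run >= 8:
            return True
    return False
```

Eight points in a row below the mean followed by eight above would trigger the rule; points bouncing alternately either side of it would not.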


Should football clubs set themselves a points target?


Probably not!  When you set a target, making the number becomes the focus, rather than doing the right thing – in this case, the right thing for the football club and its fans.  It can also encourage a behaviour where people ease off when the target is met, or looks like being met.  

In the 2012-13 season, Barnet Football Club were relegated from their division.  Their then manager, Edgar Davids, was quoted as saying:

"It's even more disappointing because we have reached all the objectives that the chairman set and reached the 51 points target but we've still gone down."

They would probably have fared better if they'd focused on winning as many games as possible, rather than achieving an arbitrary number.


How does this relate to work?


This is all well and good when looking at sports league tables, but this blog is supposed to be about work and improving services.  With that in mind, there are some lessons we can take away from this post.

1. Be suspicious when people quote a single number – often the average.  For example, I've seen the average time it takes to process benefit claims become the only figure used.  It was about 17 days.  Managers thought this was good performance.  Customers and stakeholders were given this figure, and they came to expect that's how long they'd be waiting.  But a control chart revealed the predictable variation to be anywhere between 0 and over 100 days.  The average alone tells you almost nothing about performance.

2. Be careful not to confuse common cause with special cause variation.  I was once at a meeting where a department's figures were all 'red' because they were worse than the previous month.  The manager was asked to go away, investigate what had happened, and write a report to bring to next month's meeting.  This was a complete waste of time.  I put the data into a control chart, and it was just normal common cause variation.  The senior managers had unwittingly reacted to it as though it was a special cause.  The next month they were back to 'green' – possibly because of regression toward the mean.

3. Comparing just two data points tells you almost nothing – certainly not about variation.  Although not covered in the above example, we often see performance reports that compare this month to last month, now to this time last year, etc.  They might have arrows or colours applied, to indicate whether performance is 'good' or 'bad' in relation to a target.  People are supposed to make judgements or decisions based on this information, with absolutely no context.  Displaying the same amount of information in a control chart would look like this:

[A control chart containing just two data points]
If you took this chart to a meeting, people would probably laugh at you or tell you to leave.  Yet it's seen as perfectly acceptable to present the same inadequate amount of information in a 'scorecard' or 'dashboard' report.

4.  If in doubt, plot the data in a control chart.  Or at the very least plot it over time.  This post has hopefully made it clear that data has no meaning without context, and that you need a way to separate signals (special causes) from noise (common cause variation).  That's why control charts were invented!

5. Don't use numerical targets.  They're arbitrary and make performance worse.  If you want to know why, have a read of my previous post on the subject.


2 May 2018

9 reasons why targets make performance worse

"When management sets targets and makes people's jobs dependant on meeting them, they will likely meet the targets  even if they have to destroy the enterprise to do it" 
If you're in the UK, you'll be aware of the Windrush Scandal. The Home Office threatened the children of Commonwealth immigrants (mainly from the Caribbean) who arrived before 1973 with possible deportation if they couldn't prove their immigration status.

Something that contributed to the scandal was the Home Office setting "a target of achieving 12,800 enforced returns in 2017-18".  Much has been written about whether the targets existed, and if certain people in charge knew about them. 

This post isn't about that.  Today I'm writing about the problems with targets themselves: when organisations set arbitrary numerical 'performance' targets which people or teams are judged by. 

Here are 9 reasons why targets make performance worse:

1. They narrow focus, so people concentrate on doing only what achieves the target.  This isn't the same as doing the right thing for the customer or citizen. 

For example, schools would give disproportionate attention to pupils expected to get a C or a D grade, as their target is pupils getting C or above. 

In hospitals, doctors were diverted from treating seriously ill patients to ones with minor problems in order to meet 4-hour waiting time targets.  As a result "people were left for hours covered in blood and without pain relief".   

Or Police didn't act upon warnings about child sex grooming because they were told by management to prioritise robbery and car theft, as that's what achieves the targets. 

2. People cheat so they meet their targets.  Studies have shown this.

Staff at an outsourced Police control room were making 999 calls at quiet times in order to meet their target of answering 92% of calls within 10 seconds. 

Confession time – in my formative years I did something similar.  I was working in a call centre, and we had an average call handling target.  A couple of colleagues and I worked out that calling each other and quickly hanging up was an easy way to meet our targets.  We were using our ingenuity to meet arbitrary numbers, instead of helping the customer.

3. Cherry picking. I was working with a housing repairs service.  Tradespeople were set targets for how many jobs they get through in a day.  This created a behaviour called 'sponge knocking'.  If a job looked like it would take a long time – and risk not achieving the daily target – they'd silently push a 'sorry we missed you' card through the letterbox, and scarper.

The UK government's troubled families programme was reported to have a 99% success rate of turning lives around.  However, investigations showed some councils were picking the families most likely to meet the success criteria, and not working with the ones that needed the most support. 

4. People lie.  For example, traffic wardens were fabricating evidence in order to issue parking tickets, for fear of losing their jobs if they didn't meet their target.   

5. Statistics are manipulated.  In the brilliant US crime drama The Wire they use the term 'juking the stats'.  This is brought to life in the 1.5 minute clip below (contains swearing).


This isn't just the stuff of TV drama.  For example, juking the stats has happened in the UK Police force, and at hospitals.  

6. They cause student syndrome (when a student only starts writing an essay at the last possible moment before its deadline).  When you apply for planning permission at your local council, they'll have a government target to meet of 8 weeks to decide your application.  The chart below – from one council I worked at – shows that a large proportion of applications were decided in week 7 or 8. 

It might be possible to decide the application earlier, but as far as performance measures are concerned, it doesn't matter as long as it was done in under 8 weeks. 

As a different council found: "to meet these targets, applications would get refused if they were unacceptable and the deadline was approaching. [This] meant that the applicant suffered delay, additional costs and frustration.... The Council in turn incurred additional costs either defending planning appeals or processing a second planning application".

I once did similar analysis looking at complaints, with a 15 day response target.  No prizes for guessing which day most of the complaints were responded to on.   
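If you have the raw decision times, spotting this pattern takes only a few lines of code.  The numbers below are made up for illustration, against a hypothetical 8-week deadline:

```python
from collections import Counter

# Made-up decision times (in weeks) for planning applications,
# against a hypothetical 8-week deadline
weeks_taken = [8, 7, 8, 6, 8, 7, 8, 5, 7, 8, 8, 7, 3, 8, 7, 8]

# Count how many applications were decided in each week; a pile-up
# in the last two weeks before the deadline is the signature of
# student syndrome
histogram = Counter(weeks_taken)
near_deadline = histogram[7] + histogram[8]
print(f"{near_deadline / len(weeks_taken):.0%} decided in weeks 7-8")
```

A quick bar chart of the same counts makes the pile-up even harder to ignore.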

7. Staff get sick.  Targets put people under pressure to meet arbitrary numbers, which causes stress. Police Officers felt "almost continually under threat of being blamed and subsequently punished for failing to hit targets".  This does not create an environment that's good for people's health. 

I've worked with call centre staff who had average call handling targets to meet.  They hated not being able to help the customer because their manager wanted them to end the call. 

At a target-driven DWP call centre, mental health-related absences tripled.  At the same government department, staff went on strike because "We are being prevented from providing good quality service because of unnecessary and unrealistic targets."

8. People do unethical stuff.  Targets are an extrinsic motivator (coming from outside the individual).  Studies have shown that extrinsic motivators undermine intrinsic motivation. 

Sometimes people do the wrong thing in order to avoid punitive consequences.  Like in this DWP example, where the employee is clearly upset.  They want to do the right thing, but at the same time they need to survive in a bad system.  

There are other examples, such as PPI mis-selling, where people are doing the wrong thing because the target is attached to a bonus (financial incentives act like a turbo-boost to targets). 

Then there was the Wells Fargo fraud in the US, where staff at the bank sold customers things they didn't ask for, in order to meet targets.  This also shows how arbitrary numerical targets are.  When John Stumpf, the CEO, was questioned about why staff had a sales target of eight financial products per customer, he said he chose the number because "eight rhymes with great". 

9. Targets can kill people.  A study looking at targets in the NHS found patients were required "to wait in queues of ambulances outside A&E Departments until the hospital in question was confident that that patient could be seen within four hours [the waiting time target]".  In one tragic instance, this practice led to the death of a 16-year-old boy.

*                                                 *                                                  *

I'm not blaming the people who are working to the targets.  When a manager, government, or someone in authority sets a target, people will tend to presume it's right.  The Milgram experiment showed us that.

People focus on meeting targets to avoid getting grief from their boss, to make sure they don't get sacked, or to secure a bonus.  We all have bills to pay, and can't afford to risk losing our income. 

To quote again from Deming: "A bad system will beat a good person every time".  Or to quote from John Little: "Targets turn good people into liars and cheats".

My question to anyone reading this is – given all the points above – why do people continue to think it's a good idea to set targets?

What are your thoughts?  Please feel free to comment below, or share this post with someone who could be interested. 

23 April 2018

9 things to avoid doing when studying demand


The people at Vanguard have written a lot about studying demand in services.  Why you should do it, how you should do it, etc.  They even invented the term 'failure demand' – so I'm not going to repeat what they say here.  For a short introduction to studying demand, I recommend this one minute read from Simon Pickthall.

Instead, I'm going to share a few things I've learned that are not helpful to do when you study demand.  I've either witnessed everything in the list below, or made the mistakes myself. 

1. Do all the studying yourself.  As mentioned in a previous post, the main purpose of studying a service isn't for analysis.  It's to help people unlearn and relearn.  The best way for them to do this is by studying the service from the customer's perspective.  And studying demand is a great place to start.   

I made this mistake more recently than I would have liked.  The demand analysis became just another slide in a PowerPoint presentation.  It lost all its impact, and definitely didn't change anyone's thinking about the service.

2. Guess what the demands are.  Get a bunch of managers together in a room.  Do some 'brainstorming' and – based on opinions – identify the most common types of failure demand.  Put these in a spreadsheet, with actions, due dates, etc, against each type of failure demand.

Or, decide the most common types of demand in advance.  Make them into a tick sheet, and give it to the people who receive the demand to fill in.  

Studying a service is all about learning and discovery – it's not about trying to have all the answers.

3. Use existing data.  Run a report from your CRM, using categories already defined by the consultants who implemented the CRM.  This is similar to the previous point, where you shouldn't assume in advance you know what the customer is calling about.  The difference here is you've paid someone else to do the guessing for you. 

4. Be constrained by the existing rules.  I was once with a team listening to demand for council housing repairs.  One of the most frequent demands – particularly around midday each day – was "are you still coming to today's appointment?"  The appointment was for a tradesperson to arrive at their home and carry out a repair. 

The existing appointment slot was 8am to 1pm.  For this reason, most of the team initially felt this can't be a failure demand.  The resident was calling during their appointment slot, not afterwards, so surely we've done nothing wrong?
 
They were viewing this demand through the lens of the existing organisational system, and all its constraints.  Yes, no person has done anything wrong.  But the system could be improved to stop this demand happening in the first place.  For example, you could shorten the appointment slot.  Or, as some councils have done, ask the resident when they want you to carry out the repair, and turn up then. 

5. Don't involve the people who receive demand.  When you're sitting with them listening to demand, don't explain why you're there or what you're doing.  Don't show them what you're writing down. 

It's important to show your findings to the people who receive the demand.  It removes some of the mystery about what you and your team are up to.  It's also an ideal way to validate your findings – show it to them and ask them if this looks like a 'normal' day.  Listen to what they say about it. 

6. Fear failure demand.  Look for excuses to categorise everything as value demand, for fear it will look bad or demotivate staff if you find too much failure demand.  If this happens, it starts to give you some clues about the organisational culture.  

7. Ignore the type of service you are studying.  I've made the mistake a couple of times of studying demand in a people-centred service in exactly the same way I would in a transactional service.   I've learned from this mistake now, and it's something I plan to write about in a future post.  

8. Gather too much data.  I've had people on my team before who were concerned they needed to collect data on thousands upon thousands of demands.  Yes, you need to be somewhat scientific.  But not to the extent of a randomised controlled trial or anything else that requires similarly high levels of rigour. 

When have you studied enough demand?  When the people you're working with have learned what they need to learn, and when the demand has become predictable.  This means you're no longer seeing types of demand you haven't seen before. 

9.  Do nothing else.  I've seen demand analysis be treated as a one-off exercise, done in isolation.   There may sometimes be value in doing this, but you're missing out on a fantastic opportunity to improve the service if you do nothing else.

It's important to next find out how the service responds to value demand, and does what matters for the customer.  You'll then want to learn why the service responds in the way it does.  

*                                                  *                                                  *

How about you?  Have you done or seen any of these mistakes?  Are there any others you've seen that aren't covered here?  Please feel free to comment below, or share this with someone who might be curious.  

11 April 2018

Finding the purpose of a service in four steps


In my last post and the one before I discussed the importance of purpose, and the perspective it should be considered from.  Here I'm going to share some tips I've learned to help others define the purpose of a service – set out in four steps. 

1. Gather people together from different parts of the service
A service is normally made up of teams and people performing different functions (a common aspect of a command and control design).  When you split a service like this, the people in each function tend to become like characters from Rashomon, with different views about what the service exists to do as a whole.  It's helpful to bring those views together, and start aligning them around a common purpose.

2. Ask them what the purpose is
I've adapted an activity Dan Pink does.  Ask everyone "what is the purpose of [this service]?" and  instruct them to write it on Post-It notes.  Stick the answers up, and read them out.  (In local government, I've often had answers like "to carry out our statutory duties to..."   They are used to taking an internal perspective rather than a customer one). Next, have a discussion and agree a working definition of purpose.

At this point I've made mistakes.  I've been impatient.  I've wanted them to nail the purpose too early on, and wasted time debating the issue and getting nowhere. 

3. Go and study demand
Instruct the team to observe customer demands on the service.  Listen to phone calls, observe face-to-face conversations, read emails, etc.  Write down each demand in the customer's own words.  Consider what matters to the customer at the point of making a demand.  If you speak with customers, ask them "what matters?"  In a future post I'll write more about studying demand.

4. Ask them again
This time, be clear that the purpose should be from the customer's point of view.  Reflect on what you've all learned about demand and what matters to the customer.  I've always found that by now the team come up with a definition of purpose that's at least very close to representing what the whole service exists to do for the customer.  It can be useful to have a discussion about how this differs from the first version of purpose, and why it's changed. 
 
*                                                  *                                                  *
The definition of purpose is likely to keep evolving as you learn more about what matters to the customer.  That's good.  The important thing is to be crystal clear about purpose before starting to redesign the service – something to cover in a future post. 

What are your thoughts about these steps?  Would you use them?  How could you improve them?  Feel free to comment below, or share with someone who might be interested. 

4 April 2018

Purpose – from what perspective?


In my last post I discussed some reasons why a clear and meaningful purpose is important.  In this one I'll discuss the most important thing I've learned about defining the purpose of a service: define the purpose of your service from the perspective of the customer.

(Or citizen, service user, resident, client, patient, tenant, etc.  Whatever the most appropriate term is for the people the service is set up to serve.)

Too often organisations instead take an internal view.  The story of Oticon, a Danish hearing aid company, is a good one.  It's told by Lars Kolind in The Second Cycle – Winning the War against Bureaucracy.  Their stated purpose for many years was:

"Leaders in hearing technology"
 
This comes across as a bit 'meh' to me.  Is it clear?  What exactly does 'leaders' mean?  Do the employees care if they are the leaders or not?   Most importantly, does the customer care if Oticon are the leaders in hearing technology? 

As Lars says in his book "I never met any consumer who asked for the world's most advanced hearing aids... In fact I rarely met consumers who asked for hearing aids at all".  What mattered to the customer was to live a normal life with the hearing they had.  They therefore redefined Oticon's purpose as:

"Help people to live as they wish with the hearing they have"
 
I like this.  It describes clearly and succinctly what the organisation exists to do from the perspective of the customer.  I also think it does well in relation to the points discussed in my last post.

They used their new purpose to redesign the organisation, so roles and functions added value for the customer and contributed to achieving purpose.  As a consequence they reduced costs, increased sales, and Oticon became profitable again.

How about you?  What's the purpose of your service – when you consider it from the perspective of the customer?  Feel free to comment below, or share this with someone who might appreciate it. 

In my next post, I'll share a few things I've learned about helping others to define the purpose of their service. 

2 April 2018

What is the purpose of purpose?


The first of W Edwards Deming's 14 points for management is to "create constancy of purpose".  This is still very relevant today. 

For example, I recently listened to a programme called The Charity Business on BBC Radio 4.  In the third episode we learn that some charities are excellent at raising funds, but their impact is minimal.  We also learn there are other charities that make a huge impact, but are not so good at raising funds.  It means we end up giving money to the charities that are good at pulling on our heartstrings and persuading us to donate, while other charities that make a difference go out of business. 

It sounds like some charities focus too much on fund-raising, to the detriment of their original purpose.  This is one reason why constancy of purpose is important – to keep everyone focused on doing the right things. 

Here are four more reasons why purpose is important:

1. Alignment.  In Fourth Generation Management, Brian L Joiner uses a diagram similar to the one below to highlight the importance of a clear purpose. 
The top box represents an organisation with no clear purpose.  Teams or individuals end up operating on their own – each pulling in their own direction.  It can result in chaos and dysfunction.  The bottom box represents an organisation where the purpose is clear and meaningful.  Everyone is pulling in the same direction.

2. Decision making.  In this 30 second clip, Olympic rower Ben Hunt-Davis explains one of the reasons why his team won gold in Sydney.  During training, every time they had a decision to make they'd ask themselves "Will it make the boat go faster?" 


This was the purpose of their training.  It gave them clarity about what they were there to do – even if it meant missing out on trips to the pub.  Likewise, you can ask "will this help us achieve our purpose of...?" when making those difficult decisions in your organisation.

3. Motivation. In Drive by Dan Pink, purpose is one of his three ingredients of intrinsic motivation.  His research finds that "The most deeply motivated – not to mention those who are most productive and satisfied – hitch their desires to a cause larger than themselves".  Later on in the book he also says "If people don't know why they're doing what they're doing, how can you expect them to be motivated to do it?"

4. Measurement.  Purpose defines what your organisation exists to do.  If you don't have measures that relate to purpose, how will you know how well the organisation is performing?  John Seddon puts it nicely with this diagram:
If measures are derived from purpose, it allows you to experiment and innovate with method.  You use your measures to see if your efforts are improving your achievement of purpose.  The mistake a lot of organisations make –  which I'll talk about in a future post – is when measures start with an arbitrary number instead of purpose. 

In my next post I'll share the most important lesson I've learned about defining the purpose of a service.  In the meantime, feel free to comment below, or share this with someone you think could be interested.

28 March 2018

Curiosity – a first step to changing thinking

The only way I've found that works to change management thinking is for people to go through a normative change experience.  This means that, rather than being convinced to change by well-constructed arguments, people discover a new way of thinking for themselves. 

The video below gives – in less than two minutes – a great example of a US senator going through a normative experience.  It's a clip from the excellent documentary film Merchants of Doubt.


By its very nature, you can't make someone go through normative change.  As Chris Argyris says, people need to make a free choice.  They need to discover it for themselves.  One way to encourage them is to make them curious.  If they become curious, they may decide to discover more for themselves.

4 tips for making people curious

Here are a few tips for making people curious.  They're in no way guaranteed to work, but they at least give some structure to an approach.

1. Listen.  And be curious about the other person. There is plenty of information out there about listening skills  – I find Gerard Egan's Skilled Helper approach effective. Don't assume you know what the person's motivations, concerns, and assumptions are.  Find out from them by listening open-mindedly and without judgement.  This can take time – you may need to build a relationship over a number of conversations.  There are no 'quick wins' when it comes to changing thinking.

(The next three tips are taken mostly from the book Curious by Ian Leslie). 

2. Identify knowledge gaps.  Seek to understand the person's knowledge of the subject you want to make them curious about, and the gaps in their knowledge.  If a person already knows something about a subject, they will naturally respond by wanting to know more.  When they know nothing about it, they'll find it hard to engage their brains – they can't imagine finding it interesting, or they feel intimidated.  On the other hand, if they feel they already know lots about a subject, they are unlikely to be interested in more information about it.

3. Use surprise – but not too much.  Get the balance right between low and high surprise – when a violation of a person's expectation is more than tiny and less than enormous.  If violations are minor, people ignore them; when they are massive, people refuse to acknowledge them. 

4. Be like Agatha Christie.  Her books didn't tell you the butler did it on the first page.  They made the reader curious to find out more and keep reading as clues were gradually revealed.  Likewise, don't try to tell the person all the answers straight away (this could also lead to information overload).  Instead, leave a few questions unanswered, prompting them to go and discover for themselves. 

Remember that list in my last post?  Even though those methods may not change thinking, they can help make others curious.  Whenever I now use those rational methods (reports, presentations, meetings, conversations, etc), I've given up on trying to change thinking directly.  My aim is to tailor my message – using some of the tips above – in a way that gets people curious enough to take the first normative steps.

What has made you curious in the past?  Feel free to comment below, or share this blog with someone else who might be curious. 

19 things that don't change thinking

In my last post I spoke about the importance of thinking when it comes to improving work, and some of the mistakes I've made.

Here's a list of things that don't change thinking about management and work.  How do I know?  Because I've tried them all! 
  1. Passionate speeches
  2. Slick and well rehearsed presentations
  3. Cool looking slide decks
  4. Sending videos of other people talking
  5. Case studies
  6. Referring to academic research
  7. Meetings.  Definitely not meetings
  8. Finding examples of other people or organisations who've done things differently
  9. Strategically placed posters
  10. Asking the boss to insist certain people attend a meeting or workshop with you
  11. Process improvement workshops
  12. Lots of data presented using charts that look simply magnificent
  13. Clever comebacks during a heated discussion
  14. Designing and delivering training sessions
  15. Telling stories that take the listener on a hero's journey with obstacles to overcome along the way, and an ultimate triumph
  16. Recommending books or a chapter or even a passage in a book
  17. Re-writing an organisational policy or procedure
  18. Writing a blog post!
  19. Attempting to know all the answers
These don't work because they are attempts at rational change.  What does work?  I'll discuss some possible first steps in my next post.

How about you?  What attempts have you made to change thinking?  What worked and didn't work?  Feel free to comment below, or share this with someone who could be curious. 

Why it's all about the thinking

For my first post, I'm going straight to what I think is the most important topic – management thinking.

One of the biggest lessons I've learned is that if the thinking doesn't change, real and sustainable improvements won't happen. 

If you're familiar with John Seddon or his organisation Vanguard, you'll probably recognise this diagram, showing the relationship between thinking, the system, and performance:

thinking_system_performance
Performance means things like customer satisfaction, costs or revenue, and staff morale.  It comes from the organisational system – rules, structure, performance measures, policies, procedures, roles, IT, etc.  The system isn't there by accident or chance.  It's been designed by people (usually those in charge), based on their thinking – the assumptions and theories about the design and management of work, and about the people who do the work.

Conventional organisations are based on what you might call command and control thinking.  Typically you find top-down control, decision-making removed from the work, targets, functional silos, etc.  Although often well-intentioned, it leads to poorer service, increased costs, and a disgruntled workforce. 

Most of my failed attempts at improvement came about because I didn't change the thinking.  At best, those efforts led to minor and short-lived process improvements.  I've typically made at least one of the following mistakes:

1. Putting too much focus on the analysis, rather than changing thinking.  In the past, my focus has been on gathering and analysing data – for example, big sample sizes of demand and end-to-end measures, or detailed process maps.  Data is important – really important – but on its own it won't change the thinking.

2. Doing all the learning myself.  Or sometimes only doing the learning with the front-line team, and not with the leader who has the authority to change anything.  This comes partly from the previous point – my desire to do all the analysis.  But if I do all the data gathering and analysis, nobody else's thinking will change.  It's also because I've not always put enough focus at the beginning on what Peter Block calls contracting: "an explicit agreement of what the [in my case internal] consultant and client expect from each other and how they are going to work together".

3. Leaving the thinking bit until last.  I used to get all my analysis done, and then finally try to introduce the thinking behind it at the end.  This didn't work well.  Now I take any opportunity to talk about thinking right from the start.  For example, early on when listening to demand we might discover an average call-handling time target.  Before, I would have saved talking about the thinking behind it until the analysis was done.  Now I'll pause and start asking questions like "how does this target affect the customers?", "how do your colleagues feel about it?", "why did someone think setting this target was a good idea?", and "what are the assumptions behind it about how we should manage, and about people and their motivations?"

So now you know some of my mistakes.  You might be wondering a few things.  How do you change thinking?  What should you replace the old command and control thinking with?  And how will you know if the new way of thinking leads to better performance?  These are all subjects I'll attempt to cover in future posts.

For now, what do you think about thinking?  Please feel free to leave a comment below, or share this post with anyone who could be interested.

About this blog

I'm Sam, and this blog is about improving services – about improving the lives of the people a service exists to serve. 

It's also about making it easier for the people who work in that service to get things done.  If the improvements happen to make or save some money then all the better. 

I don't claim to be an expert, but I have been doing this with intent since 2010, and know more than I did a few years ago.   I remember my early struggles, and most of the mistakes I've made.  One important thing I've learned is that if you want to make genuine improvements, you need to think differently about work.  Hence the name Rethinking Service.

I plan to share tips, advice, opinions, and things I've learned along the way – most of it from other people, whom I will endeavour to reference.  It's partly borne out of frustration at not being able to improve things as much as I'd like at my current and previous workplaces.  I've never had the positional authority of someone high up in an organisational hierarchy.  Like many other people I've met, I'm hindered from making improvements by the existing thinking in the organisation.  

My ambition for this blog is to help people like me – people who go to work to make things better.  I'm sure the blog will evolve as I learn more from writing it.  I'm looking forward to learning from others on my journey, so please feel free to comment, or share this blog with anyone you think could be interested. 

You can also follow me on Twitter @rethinkingserv