Twitter and the end of kindness 13 Sep 2017, 5:36 am

When you see somebody with spinach in their teeth, the kind thing to do is to tell them privately. If you point it out in front of a group of friends and strangers, you get the same end result; the spinach gets removed. However, in doing so you draw attention to the problem, and shame the person in the process. So what could have been an act of kindness quickly turns into an act of cruelty and public humiliation.

There was a time, not so long ago, when you would contact a company directly if you had a problem with a product or service. Maybe the product got lost in the post or wasn’t as advertised, maybe the hotel room wasn’t as expected, or the food didn’t come up to scratch. In these situations you’d tell the waiter or manager, drop the company an email, or call customer support.

These days, when you see a problem, the first reaction is often to reach for Twitter and share your frustration with the world. With large companies this often comes from experience. We’ve all had conversations with banks, utility companies and airlines that have gone nowhere, so we end up venting our frustration online.

While it’s easy for companies to brush private conversations under the carpet, it’s much more difficult to do in public, so we’ve quickly learned that if we take our criticisms to Twitter, there’s a better chance they will get dealt with.

I’ve had this experience myself. After several frustrating phone conversations with my airline of choice, I took my complaints to Twitter. They immediately responded, took ownership of the problem and sorted it out straight away. I’m now on some kind of airline social media watchlist (the good kind), reinforcing the fact that if I complain on Twitter, my problem will get solved faster than phoning customer services.

Because companies want to avoid a public relations disaster, complaining on social media draws out their best customer service. This is something the large companies could have avoided by delivering consistently great customer service through traditional channels. As this hasn’t happened, publicly shaming companies has become the go-to way to ensure good customer service.

If this stopped with large companies, or companies with whom you’ve experienced an irreconcilable service failure, I wouldn’t mind. However this has become the standard behaviour with everybody now, from big companies to small companies, from celebrities to friends. Rather than contacting people directly, we’ve started using public shaming as a tool to correct behaviour.

I see it regularly on Twitter. A friend or follower tweets you to highlight some small problem. Maybe there’s a broken page on your website, a typo in your recent Medium post, or you accidentally referenced the wrong user in a recent tweet.

It would be super easy to email or DM the person, but instead you post to their public timeline. Most of the time you mean well, and are simply trying to help. However by posting publicly you draw other people’s attention to the problem, forcing the person to act out of shame and embarrassment rather than gratitude.

People usually post to the public timeline because it represents the least amount of effort to do a good thing. You don’t have to switch panes in your Twitter app, go hunting for their email address, or ask if they’d mind following you so you can direct message them. You can get it off your mind as quickly as possible and move on.

However sometimes it feels like there’s an ulterior motive. That there’s a small amount of joy to be had from spotting that the person you’re following has done something wrong, and flagging it up in public. That you get the public perception of doing a good deed (which is always nice) while making a small but pointed statement, in front of their friends, that they’re not perfect. It’s as though you’ve spotted the spinach in their teeth, but decided that the kindest thing to do was to point it out loudly in a crowd, in front of a thousand of their closest friends.

Personally I’d prefer to know rather than not know, so I’m definitely not suggesting people stop pointing out these small errors and transgressions. However I think we should think twice before posting these things publicly, and if time allows, reach out to your friend or follower directly first. That way you’ll avoid accidentally embarrassing them, or making them feel that they have to act from a sense of public pressure.

More importantly it’s the kind and polite thing to do. It’s also going to make that person think more warmly of you, as you’ve done them a favour without seeking any recognition, while maintaining their dignity and public reputation at the same time.

It’s only a small behavioural nudge, but from now on, when I notice something is amiss, I’m going to do my best to approach people directly first, whether they’re large companies, small businesses, website owners, followers or friends. I urge you to do the same.

The Golden Age of UX may be over, but not for the reasons stated 5 Aug 2017, 7:52 am

Last week an article entitled The Golden Age of UX is Over popped onto my radar, after causing a bit of a stir amongst the design community. If I were being generous I’d say it was a genius title, designed to spark debate amongst UX designers. If I were being slightly less generous, I’d say it was a devilishly brilliant piece of click-bait, designed to drive traffic to an agency site. Either way I had a feeling the article would annoy me, so I spent the next couple of days actively ignoring it. However temptation finally got the better of me and I ended up taking the bait.

On the whole I agree with the sentiment of the title that the “Golden Age of UX” probably is over. I say that as somebody who has been working in the space since the early noughties, set up one of the first UX practices in the UK, and curates the longest running UX conference in Europe.

The field of UX started life as a small but emergent community of practice, on the fringes of conferences like SXSW and the IA Summit. It grew through the blogs of early pioneers, and through the work of consultancies like Adaptive Path and Clearleft. The community accreted around new conferences like UX Week and UX London, which, in their early years, attracted almost the entirety of the UX communities in their respective locations.

I would argue that the quality of innovation, the quality of discourse and the pace of change in the UX space peaked somewhere between 2008 and 2012. For me, this period could arguably be described as the golden age of UX.

As with any gold rush, news of the find spreads quickly, and as more people rush in to make their fortunes, resources get depleted. By the middle of the teens, UX hyperinflation started to occur. Every freelancer and every agency added UX to their titles, without really understanding what the term meant. “UX Designer” featured in lists of the most in-demand new professions, and recruiters rushed to fill the gap, often with disastrous effects. While the number of people who self-identified as UX designers carried on climbing, a deep and detailed understanding of what UX actually was started to ebb away.

The meaning of UX got muddied. Was it the same as UI? Was it another name for interaction design? Where did strategy, research and IA fit in? UX vs UI memes started to form on Twitter, arguments erupted about the existence of unicorns, and seemingly nobody could agree on anything anymore.

For years I fought to maintain a clear definition of UX, one that linked back to the community of practice from which it sprang. However the tidal wave of misunderstanding and misrepresentation became too big to fight, so I eventually gave up trying. UX had become so watered down and misunderstood that popular perception no longer represented the community I knew. I became resigned to the fact that meaning changes based on usage, and if the majority of people see UX as a lightweight blending of prototyping and UI, devoid of any deep research, awareness of business needs, or commercial imperative, so be it.

It was this latter sentiment that annoyed me about the “Golden Age” article: it criticised a discipline based on cargo-cult thinking. In truth, UX has always taken account of business needs and market forces, while Lean start-up was little more than a reformation of user-centred design for a business audience. As a result, the article was less about the golden age being over, and more about the dawning realisation that the authors may have confused a badly drawn map with the territory.

It’s also worth noting that most people tend to associate a “golden age” with their formative years, whether it’s movies, music, or the discovery of a new career. So it’s possible that the golden age may be over for some, but for others it’s just beginning.

Design Leadership Slack Team 30 Jul 2017, 2:30 am

I recently started a Slack Team for Design Leaders. We currently have over 300 members; mostly Heads, Directors and VPs of Design from tech companies like Spotify, Etsy, AirBnB, and Facebook, along with more traditional organisations like Virgin, Tesco, the BBC and Capital One.

We’ve been very careful building the community. As a result the signal to noise ratio is remarkably high. Recent conversations have included:

  • Discussions around recruitment and whether design tasks are a good idea.
  • Various design leaders sharing their career progression ladders.
  • An ongoing debate around the perfect team structure.
  • Whether managing Millennials requires a different set of skills (the general conclusion being they don’t).
  • The challenges of managing fast growing teams.
  • Tactics for your first 90 days in role.

The criteria for joining are fairly straightforward. Are you a Head of Design, Director of Design or VP of Design* for an in-house team? Are you generally a nice person, interested in sharing your thoughts and experience on the subject of design leadership? If so, drop me an email and I’ll hook you up.

Once you’ve joined, please feel free to lurk for a while and get a feel for the place. When you’re ready to jump in, please introduce yourself to the group, letting folks know a little about your background, the company you work for, the team you look after, and the leadership challenges that are interesting you at the moment.

We expect members of the Slack channel to respect each other’s privacy. If you do wish to disclose anything discussed here, please follow the Chatham House Rule and refrain from identifying individuals or their companies.

Members of this group are generally kind, considerate, constructive and helpful. Heated discussions may occur, but they should always be conducted with respect and a desire to advance the conversation. If you witness any disrespectful behaviour, please inform me or one of the admins. We aim to resolve any conflicts peacefully and in a positive manner. However, on the rare occasion this isn’t possible, we may ask individuals to leave.

  * Head, Director or VP of UX also applies.

The Real Value of Original Research 17 May 2017, 6:24 am

User-centred designers typically start a new project with a research phase. This allows them to understand the product or service through the eyes of their customers, explore the limits of the problem space, and come up with recommendations that feel at least partially informed. All useful things from a design perspective.

Sometimes organisations baulk at the idea of doing research, causing the design team to launch into their typical spiel about the value of their approach. In my experience, these objections are rarely about the value of research itself, but more around whether original research is necessary on this occasion.

All organisations of a certain size carry out research as a matter of course. They probably have a marketing department segmenting customers, understanding customer sentiment, and testing new propositions through surveys and focus groups. They also have an analytics team tracking user behaviour, testing the effectiveness of campaigns, and pinpointing areas for improvement. In preparation for this project, the product managers and BAs almost certainly did their own research to help build the business case. They probably have more information than they know what to do with.

Most organisations feel they have a pretty good handle on what’s going on inside their company; they just need you to fix it. They claim there’s no need to do more research. Instead, they will provide you with access to their analytics package, the results of the user testing report they commissioned nine months ago, and copies of their marketing personas. This, combined with a briefing meeting, should be enough to get your team up to speed.

On the surface this makes sense. After all, why pay for original research if you already have the answers you need? Better to save the money and spend it coming up with a solution, especially when resources are scarce.

This attitude is completely understandable, but it hides an unusual and counterintuitive truth about the value of design research. Design research is rarely about the acquisition of new knowledge and information. Instead, the real value of design research comes from the process of gathering and analysing the results. It’s this analysis phase where the data gets processed, information gets turned into knowledge, and understanding becomes tacit.

Existing research will have been gathered to answer general business questions, so it won’t necessarily provide the insights the design team need. Instead, design research is done with a specific product or service improvement in mind; it adds nuance to the problem at hand, and allows the designer to weigh up different options and understand how the various solutions may play out.

Knowledge gained from original research is far more impactful than that gathered elsewhere. Remembering the conversation you had with a frustrated customer becomes part of the narrative, and the resulting insight becomes internalised. This is a very different experience from reading a data point in somebody else’s report, which can easily be downplayed or forgotten.

In psychology this phenomenon is known as embodied cognition—the idea that you think through problems using your whole self, rather than just your mind. This means you learn more by experiencing something in person, than you do by reading about it in a book or report.

For most designers, original research isn’t about gathering facts and data. Instead, it’s the process they use to understand the task at hand. Like warming up before the big race, research allows you to engage body and mind, get the creative muscles working, and be flexible and limber enough to tackle the challenge ahead. It isn’t something you can effectively outsource to your coach or team captain. It’s a vital part of the pre-race process.

This was originally posted on UXMas.

First Direct Trains Customers to be Phishing Victims 2 Mar 2017, 3:55 am

Banking security is a big deal and has been all over the news of late. Most of the coverage focusses on digital security and how to avoid having your account hacked. A common culprit is the phishing attack, where a hacker sends you an email claiming to be from a trusted source and asking for personal information like your password, mother’s maiden name, or date or place of birth. Most security-savvy companies have got wise to this approach, so on every email they will state clearly that they will NEVER request personal information like this.

So I was amazed when I got a phone call, with no caller ID, from somebody claiming to be from my bank. The caller said that before he could speak to me he needed to take me through security and ask me a bunch of personal questions. If you know anything about security, you know that social engineering is one of the easiest attack vectors. With social engineering, somebody phones up claiming to be somebody official—your finance department, your IT team, your bank—and asks you to divulge personal information they can then use to compromise your account. This is essentially the real world—or at least phone world—version of a phishing attack, and is something any good security team should be concerned about.

As somebody who cares about personal security I was shocked, so I immediately called the bank to highlight this glaring security risk. However, rather than caring about the security hole, I was told that this was bank policy, and that if you didn’t want to answer the questions you could always go online or call them instead.

This is a terrible response, as it essentially legitimises and normalises the idea that banks can phone their customers up without notice and expect them to hand over personal information to a stranger. Security-savvy folks like me would decline, but not everybody is as wary as I am. If First Direct trains its customers to hand out personal information to strangers on the phone, this opens up a massive security hole. Any fraudster can now identify First Direct customers (for instance, folks who have interacted with the First Direct Twitter account recently), find their contact details online, phone them up to extract personal security information, and then use that information to break into their accounts.

This feels like a crazy thing for banks to be doing. What’s more, it seems strange that banks should be conscious of this type of security weakness in their digital channels, while actively encouraging it through their phone banking services. There are of course various ways banks could solve this problem, like making automated calls asking the customer to contact the bank using the number on their card. That way, the customer knows they are talking to the bank and can go through the usual security protocol. Instead, it seems that banks like First Direct are sacrificing good data security for the sake of convenience, which should be a worry to all their customers.

Talk Tropes and Conference Cliches 30 Dec 2016, 8:53 am

Over the last 12 years of attending, speaking at and organising conferences, I’ve seen a lot of talks. Probably upwards of a thousand. I’ve seen talks that have inspired me, talks that have challenged me, and talks that have left me welling up. During that time I’ve seen themes start to emerge; topics our industry finds fascinating and loves to revisit time and time again. Many of these topics I’ve used myself, and were I ever to write a “101 Things I Learned in Architecture School” style book for the interaction design industry, these tropes would feature heavily.

After spending two days binge-watching talks in an attempt to find the last couple of speakers for a conference I’m organising, I was amazed at how regularly these tropes appeared. I was also surprised at how certain traits and behaviours kept repeating themselves across speakers. So I thought I’d jot them down, on the off chance people find them useful, either as things you hadn’t heard of before and want to explore further, or as topics and behaviours you want to avoid in a search for originality.

Top Talk Tropes

One of the earliest tropes I can remember is “paving the cow paths”; the idea of designing for observed behaviour rather than imposing strict architectures of control. This concept beautifully illustrates the fields of user-centred design and lean startup. It’s also one of the pervading philosophies behind the web; that there is intelligence in the system, and it will find its way around any blockage. In the retail world, “cow paths” are also known as “desire lines”, and are used to maximise product exposure. In past talks I’ve used this example to explain why milk is always placed at the back of the store, and how casinos in Vegas are designed.

[Image: paving the cow paths]

If desire lines can be seen as a highly optimised user journey, the “peak-end rule” is a similar short-cut our brains make for judging how we experience said journey. Research from the field of hedonic psychology has shown that we tend to judge an experience based on two things; the intensity of the peak moment—positive or negative—and the end state. This is one reason why the most memorable customer experiences are often the result of something bad happening. We remember the intensity of the bad experience, plus the happiness caused by a positive outcome, and the differential between the two frames our perspective. That’s not to say that we should deliberately try to manufacture negative experiences. However it does suggest that people will judge an experience that has peaks and troughs of emotion, but ultimately ends well, more favourably than one that was consistently good, but not noteworthy.

[Image: the peak-end rule]
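
If it helps to make the rule concrete, here’s a minimal toy sketch in Python, assuming the common simplification that a remembered rating is just the average of the peak and final moments; the scores are invented, and real studies are more nuanced:

    # Toy model of the peak-end rule: remembered experience is approximated
    # by the average of the most intense moment and the final moment.
    def remembered_rating(moments):
        peak = max(moments, key=abs)  # most intense moment, good or bad
        end = moments[-1]             # how the experience finished
        return (peak + end) / 2

    consistently_good = [6, 6, 6, 6, 6, 6]    # fine throughout, nothing noteworthy
    rocky_but_redeemed = [5, -7, 2, 8, 9, 9]  # a bad patch, resolved well

    print(remembered_rating(consistently_good))   # 6.0
    print(remembered_rating(rocky_but_redeemed))  # 9.0, remembered more fondly

The second list contains the single worst moment, yet the model, like our memories, rates it higher, because it peaks high and ends well.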

A related cognitive bias is “inattentional blindness”, as illustrated perfectly by the classic basketball video. Viewers are asked to count the number of times the basketball changes hands. So fixated are they on this task that a good proportion of viewers fail to spot the gorilla in the room, both literally and figuratively. This goes to show that even when we think something obvious is happening with our designs, many of the people using our products literally don’t notice the things we’ve carefully designed.

One tool to help craft these peak experiences is the “Kano model”. This model classifies features into three different types: basic needs, performance pay-offs, and delights. I usually describe the Kano model in talks using the analogy of a hotel. A hotel room just wouldn’t function without a bed, a door, access to a bathroom, electricity and a few other must-have items. These are your MVP feature set. However, in order to compete in a crowded market, you can add performance pay-off features like a bigger TV or faster broadband. Over time, these nice-to-have features eventually become basic needs, which is why most MVPs aren’t very minimal. It’s the third type of feature in the Kano model that interests interaction designers and product managers the most: the small additions which have an unusually sizeable effect. This could be the warm cookie waiting for you at check-in, the free bottle of champagne in your room, or something as simple as a handwritten note from the cleaning staff, letting you know what the weather is going to be like tomorrow.

[Image: the Kano model]

All these items can be mapped as peaks on some form of journey map. They can also form part of the classic “hero’s journey”, another common trope. If you’ve not heard of the hero’s journey before, it’s essentially the idea that many well known stories follow a common archetype. Somebody is given a challenge, and they set off on a journey, aided by a wise confidant. They overcome a series of increasingly difficult challenges, only to return to the start a changed person. Stories like The Hobbit and Star Wars follow the hero’s journey closely, which is one of the reasons they have endured so well. Narrative storytelling is all the rage in the interaction design world at the moment, in large part thanks to the work of content strategists. We’ve actually used the hero’s journey framework in the case studies on our new site, making sure to cast the client as the hero, rather than ourselves.

[Image: the hero’s journey]

The preceding tropes are good, but I think my favourite has to be Stewart Brand’s “shearing layers” diagram from his book, How Buildings Learn. The original diagram was used to demonstrate the different speeds at which buildings grow and evolve, as well as the friction caused between layers moving at different speeds. I tend to use this as a metaphor for organisational structure and learning. Add to this the idea of pioneers, settlers and town-planners, and you have a powerful tool for describing why different teams, disciplines and functions within organisations often struggle to work together.

[Image: shearing layers]

There are dozens of other common tropes I could mention here, like Maslow’s hierarchy of needs, the classic three-ringed Venn diagram with whatever the speaker does shown in the middle, or the classic illustration depicting lean start-up by showing a product move from skateboard, to bike, to scooter, then finally ending up as a car. I won’t bore you with my thoughts on that particular diagram here. Suffice to say there are a lot of common trends repeating themselves on the speaker circuit, and for good reason. However, if your goal is to present something new and original, it may be worth picking something slightly more obscure.

Common Conference Cliches

As well as picking up and using a common set of tropes over the past 12 years, I’ve also adopted common traits and behaviours from other speakers. Behaviours like asking the audience about their backgrounds, or whether anybody has heard of the topic you’re about to discuss. It’s a way of building rapport with your audience, while making yourself feel comfortable on stage. However, as audiences become more savvy, asking questions like “who here has heard about Lean?” becomes increasingly meaningless, especially when their response is unlikely to change the direction of your talk. I’ve witnessed several awkward moments where a speaker asked an audience if they knew about a certain thing—when they clearly did—only for that speaker to launch into a 10-minute scripted description of what that thing was, making everybody feel like they hadn’t been listened to.

A similar faux pas is spending a sizeable portion of a talk introducing who you are and where you come from. A little context can be helpful, but I once witnessed a speaker give a 15-minute bio covering pretty much every job they had ever had. By the time they reached the meat of their talk, they had completely lost the audience—in some cases literally, as around 50 people had walked out. The frustrating thing is that the rest of the talk was amazing, or at least what I saw of it was, as the speaker ran out of time and had to cut the bulk of the talk short.

In this case I suspect the speaker simply felt nervous and wanted to justify to the audience why they had earned the right to be on stage. However, from the audience’s perspective the speaker had already earned their place on the stage, and was expanding on information that was already in the conference programme. So rather than being interesting or helpful, it actually came across as self-indulgent and disrespectful of the audience’s time. From that point on I decided never to introduce myself on stage, assuming that if people were interested they would read the schedule or check out my online profile. I’d urge folks to do the same, although if explaining your background is important, a great way to do it is part way in. I call this “The Hollywood opener”, as you throw your audience right into the middle of the action, and only introduce them to the main character once they’re hooked.

Making the audience feel uncomfortable is never a good idea, so having an understanding of the audience and their culture really helps. I’ve seen plenty of amazing speakers have great success with audience participation on their home turf, getting folks to stand up, stretch, introduce themselves to their neighbours, or discuss something that’s challenging them at work. I’ve seen those same speakers crash and burn in more conservative regions, where forced social interaction makes people feel awkward. One particularly uncomfortable incident featured an exuberant North American speaker, a room full of stoic Europeans, and a compulsion to high-five everybody in the front row.

Ironically, I’ve also seen audience participation go too well, with speakers allocating 30 seconds to something that should actually take 5 minutes or more. In that situation the audience is having such a good time chatting to their neighbours that having the speaker cut them off to get back to the talk can actually be quite jarring and a little insensitive.

One of the most challenging forms of audience participation has to be the Q&A at the end of a talk. I’ve seen some fantastic Q&A sessions that were actually more insightful and interesting than the talks that preceded them. However, I’ve also witnessed my fair share of awkward sessions where a shy audience is cajoled into asking meaningless questions, just to break the silence and make the speaker feel liked and appreciated. More often than not these Q&A sessions suck the energy out of a carefully scripted talk; like a director being forced to explain the plot points of a movie once the credits have rolled. They also get in the way of audience members grabbing coffee or food, making an important call, or taking a much-needed comfort break, so they can be a source of discomfort too. As such, I think it’s better to finish a little early than force an unwanted Q&A session.

This brings me to my biggest bugbear of late—not least because I’ve used this one plenty of times myself. It’s people making THAT joke about being “the only thing between you and food/beer”. It was funny the first couple of times I saw a speaker say it, and it’s always resulted in a titter of approval when I’ve used the line myself. However, at a recent conference I saw two consecutive speakers make exactly the same joke, to clearly diminishing returns. As such, I think THAT joke has now jumped the shark, so I’m going to do my best to stop using it.

Conclusion

It’s worth noting that there’s nothing wrong with any of these talk tropes and conference cliches in and of themselves. They all have value if used appropriately, so there’s no judgement on anybody using them. After all, I’ve used most of them myself. Instead I present them to you more as an observation from years of conference speaking, attending and organising, in the hope that both new and experienced speakers find them interesting. What you do with the information, if anything, is up to you. However I thought it was worth adding that caveat, as you know how touchy people on the Internet can be. I’d hate for anybody to overreact or anything like that :)

UX Design and Service Design are Growing Ever Closer 13 Oct 2016, 6:30 am

For the longest time I’ve maintained that Service Design was a specific discipline, distinct from UX Design. It’s true that they have a lot in common, like the way both fields approach problems through a user-centred lens. They also use many of the same tools, such as design games and personas. Even some of their distinctive tools, like the service delivery blueprint, have similarities with our own user journey maps. But if you’d spent any time with a credible Service Design agency five or ten years ago, you’d easily have spotted the differences.

User Experience agencies typically came from a digital background, and were filled with information architects, interaction designers and usability specialists. We primarily focussed on creating products and services with a digital interface, along with the service ecosystem that supported them.

By comparison, European Service Design agencies did a lot of work on the delivery of public services—presumably because the public sector is so strong in the UK—while their US counterparts looked more towards the in-store experience. It wasn’t unusual to find a Service Design consultancy staffed with industrial designers, set-designers and commercial architects.

The two disciplines clearly shared the same ancestry, but somewhere along the evolutionary tree, they took a slightly different branch. Look at a typical UX agency and their portfolio will be full of publishing websites, mobile apps, and startups, while Service Design consultancies are more likely to show phone systems, airline check-in procedures, and better ways to deliver healthcare.

On the surface these outputs may look very different. However, as digital technology increasingly provides the platform on which these services are built, the differences are slowly being stripped away.

Just think about it. What is Uber if not a cleverly crafted service that matches car owners looking for a bit of extra cash to people looking for a ride? The interface may be digital, but make no mistake that this is a carefully choreographed piece of Service Design with lots of stuff happening behind the scenes.

Now look at the airline check-in experience. On the surface it may look like a traditional Service Design project. Dig a bit deeper however, and you’ll see that almost all of the people turning up to the desk bought their tickets online, checked-in via the airline website or mobile app, and either printed out their own boarding passes, or have them stored on their mobile phones. Surely this is a classic User Experience project?

As digital has become one of the primary ways of delivering a service experience, Service Design agencies have needed to become more digital, while digital agencies have needed to become more service oriented, to the point that it’s getting harder to differentiate the two. So much so that, at its highest level, User Experience Design has become indistinguishable from Service Design.

I’ve resisted this sentiment for a while, not least because I think the distinction still provides value to clients. However this value is rapidly diminishing as the industry continues to misunderstand and misrepresent what UX Designers do, incorrectly applying the UX label to interaction designers and digital generalists.

The Government Digital Service took the decision to adopt the term Service Designer, because they understand that the language of government is one of delivering services rather than products or experiences. I suspect many traditional companies feel the same way.

I expect to see more and more high-end consultancies adopt the language of digital service design, as opposed to product design or experience design, to better explain what they do and separate themselves from the herd. What existing Service Design agencies will think of this trend is anybody’s guess. Will they embrace this new influx of digital service designers, or push back? Whatever happens, I have a feeling this change is inevitable.

Developers “Own” The Code, So Shouldn’t Designers “Own” The Experience? 24 Aug 2016, 5:23 am

We’ve all been there. You spent months gathering business requirements, working out complex user journeys, crafting precision interface elements and testing them on a representative sample of users, only to see a final product that bears little resemblance to the desired experience.

Maybe you should have been more forceful and insisted on an agile approach, despite your belief that the organization wasn’t ready? Perhaps you should have done a better job with your pattern portfolios, ensuring that the developers used your modular code library rather than creating five different variations of a carousel. Or maybe you should even have sat next to the development team every day, making sure what you designed actually came to pass.

Instead you’re left with a jumble of UI elements, with all the subtlety stripped out. Couldn’t they see that you worked for days getting the transitions just right, only for them to drop in a default animation library? And where on earth did that extra check-out step come from? I bet marketing threw that in at the last minute. You knew integration was going to be hard and compromises would need to be made, but we’re supposed to be making the users’ lives easier here, not the tech team’s.

When many people are involved in a project, it is very important to make sure that they have a common understanding of the problem and its solution.

Of course, there are loads of good reasons why the site is this way. Different teams with varying levels of skill working on different parts of the project, a bunch of last-minute changes shortening the development cycle, and a whole host of technical challenges. Still, why couldn’t the development team come and ask for your advice on their UI changes? You don’t mess with their code, so why do they have to change your designs around? Especially when the business impact could be huge! You’re only round the corner and would have been happy to help if they had just asked.

While the above story may be fictional, it’s a sentiment I hear from all corners of the design world, whether in-house or agency side. A carefully crafted experience ruined by a heavy-handed development team.

This experience reminds me of a news story I saw on a US local news channel several years ago. A county fair was running an endurance competition where the last person remaining with their hand on a pickup truck won the prize. I often think that design is like a massive game of “touch the truck”, with the development team always walking away with the keys at the end of the contest. Like the last word in an argument, the final person to come in contact with the site holds all the power and can dictate how it works or what it looks like. Especially if they claim that the particular target experience isn’t “technically possible”, which is often shorthand for “really difficult”, “I can’t be bothered doing it that way” or “I think there’s a better way of doing it so am going to pull the dev card”.

Now I know I’m being unfairly harsh about developers here and I don’t mean to be. There are some amazingly talented technologists out there who really care about usability and want to do the best for the user. However, it often feels as though there’s an asymmetric level of respect between disciplines due to a belief that design is easy and therefore something everybody can have an opinion on, while development is hard and only for the specially initiated. So while designers are encouraged (sometimes expected) to involve everybody in the design process, they often aren’t afforded the same luxury.

To be honest, I don’t blame them. After all, I know just enough development to be dangerous, so you’d be an idiot if you wanted my opinion on database structure and code performance (other than I largely think performance is a good thing). Then again, I do know enough to tell when the developers are fudging things, and it’s always fun to come back to them with a working prototype of something they said was impossible or would take months to implement — but I digress.

The problem is, I think a lot of developers are in the same position about design — they just don’t realize it. So when they make a change to an interface element based on something they heard at a conference a few years back, they may be lacking important context. Maybe this was something you’d already tested and discounted because it performed poorly. Perhaps you chose this element over another for a specific reason, like accessibility. Or perhaps the developers’ opinions were just wrong, based on how they experience the web as superusers rather than as the average Joe.

Now let’s get something straight here. I’m not saying that developers shouldn’t show an interest in design or input into the design process. I’m a firm believer in cross-functional pairing and think that some of the best usability solutions emanate from the tech team. There are also a lot of talented people out there who span a multitude of disciplines. However, at some point the experience needs to be owned, and I don’t think it should be owned by the last person to open the HTML file and “touch the truck”.

So, if good designers respect the skill and experience great developers bring to the table, how about a little parity? If designers are happy for developers to “own the code”, why not show a similar amount of respect and let designers “own the experience”?

Everybody has an opinion. However, it’s not a good enough reason to just dive in and start making changes.

Doing this is fairly simple. If you ever find yourself in a situation where you’re not sure why something was designed in a particular way, and think it could be done better, don’t just dive in and start making changes. Similarly, if you hit a technical roadblock and think it would make your lives easier to design something a different way, go talk to your designer. They may be absolutely fine with your suggested changes, or they may want to go away and think about some other ways of solving the same problem.

After all, collaboration goes both ways. So if you don’t want designers to start “optimizing” your code on the live server, outside your version control processes, please stop doing the same to their design.

Originally published at www.smashingmagazine.com on August 9, 2016.

Are we moving towards a post-Agile age? 23 Aug 2016, 9:54 am

Agile has been the dominant development methodology in our industry for some time now. While some teams are just getting to grips with Agile, others have extended it to the point that it’s no longer recognisable as Agile. In fact, many of the most progressive design and development teams are Agile only in name. What they are actually practising is something new, different, and innately more interesting. Something I’ve been calling post-Agile thinking. But what exactly is post-Agile, and how did it come about?

The age of Waterfall

Agile emerged from the world of corporate IT. In this world it was common for teams of business analysts to spend months gathering requirements. These requirements would be fed into the PRINCE2 project management methodology, from which a detailed specification—and Gantt chart—would eventually emerge. The development team would come up with a budget to deliver the required spec, and once they had been negotiated down by the client, work would start.

Systems analysts and technical architects would spend months modelling the data structure of the system. The more enlightened companies would hire Information Architects—and later UX Designers—to understand user needs and create hundreds of wireframes describing the user interface.

Humans are inherently bad at estimating future states and have the tendency to assume the best outcome—this is called estimation bias. As projects grow in size, they also grow in surface area and visibility, gathering more and more input from the organisation. As time marches on, the market changes, team members come and go, and new requirements get uncovered. Scope creep inevitably sets in.

To manage scope creep, digital teams required every change in scope to come in the form of a formal change request. Each change would be separately estimated, and budgets would dramatically increase. This is the reason you still hear of government IT projects going over budget by hundreds of millions of dollars. The Waterfall process, as it became known, makes this almost inevitable.

Ultimately, the traditional IT approach put too much responsibility in the hands of planners and middle managers, who were often removed from the day-to-day needs of the project.

The age of Agile

In response to the failures of traditional IT projects, a radical new development philosophy called Agile began to emerge. This new approach favoured just-in-time planning, conversations over documentation, and running code; effectively trying to counter all the things that went wrong with the typical IT project. The core tenets of this new philosophy were captured in the agile manifesto, a document which has largely stood the test of time.

As happens with most philosophies, people started to develop processes, practices and rituals to help explain how the tenets should be implemented in different situations. Different groups interpreted the manifesto differently, and specific schools started to emerge.

The most common Agile methodology we see on the web today is Scrum, although Kanban is another popular approach.

Rather than spending effort on huge scope documents which invariably change, Agile proponents will typically create a prioritised backlog of tasks. The project is then broken down into smaller chunks of activity which pull tasks from the backlog. These smaller chunks are easier to estimate and allow for much more flexibility. This opens up the possibility for regular re-prioritisation in the face of a changing market.

Agile—possibly unknowingly—adopted the military concepts of situational awareness and commander’s intent to move day-to-day decision making from the planners to the front-line teams. This effectively put control back in the hands of the developers.

This approach has demonstrated many benefits over the traditional IT project. But over time, Agile has become decidedly less agile as dogmas crept in. Today many Agile projects feel as formal and conservative as the approaches they overthrew.

The post-Agile age

Perhaps we’re moving towards a post-Agile world? A world that is informed by the spirit of Agile, but has much more flexibility and nuance built in.

This post-Agile world draws upon the best elements of Agile, while ditching the dogma. It also draws upon the best elements of Design Thinking and even—God forbid—the dreaded Waterfall process.

People working in a post-Agile way don’t care which canon an idea comes from, as long as it works. The post-Agile practitioner cherry-picks from the best tools available, rather than sticking to a rigid framework. Post-Agile is less a philosophy and more a toolkit that has been built up over years of practice.

I believe Lean Startup and Lean UX are early manifestations of post-Agile thinking. Both of these approaches sound like new brands of project management, and each has its own dogma. But if you dig below the surface, both of these practices are surprisingly lacking in process. Instead they represent a small number of tools—like the business model canvas—and a loose set of beliefs, such as testing hypotheses in the most economical way possible.

My initial reaction to Lean was to perceive it as the emperor’s new clothes for this very reason. It came across as a repackaging of what many designers and developers had been doing already. With a general distrust for trademarks and brand names, I naturally pushed back.

What I initially took as a weakness, I now believe is its strength. With very little actual process, designers and developers around the world have imbued Lean with their own values, added their own processes, and made it their own. Lean has become all things to all people, the very definition of a post-Agile approach.

I won’t go into detail how this relates to other movements like post-punk, post-modernism, or the rise of post-factual politics; although I do believe they have similar cultural roots.

Ultimately, post-Agile thinking is what happens when people have lived with Agile for a long time and start to adapt the process. It’s the combination of the practices they have adopted, the ones they have dropped, the new tools they have rolled in, as well as the ones they have rolled back.

Post-Agile is what comes next. Unless you truly believe that Scrum or Kanban is the pinnacle of design and development practice, there is always something new and more interesting around the corner. Let’s drop the dogma and enter this post-Agile world.

Renting software sucks 15 Aug 2016, 8:58 am

Back in the olden days (c. 2000) people used to own software. When a new version of Photoshop or Fireworks came out, you’d assess the new features to decide whether they were worth the price of the upgrade. If you didn’t like what you saw, you could skip a generation or two, waiting until the company had a more compelling offering.

This gave consumers a certain amount of purchasing power, forcing software providers to constantly tweak their products to win customer favour. Of course, not every tweak worked, but the failures were often as instructive as the successes.

This started to change around 2004, when companies like 37signals released Basecamp, their Software as a Service project management tool. The price points were low—maybe only a few dollars a week—reducing the barrier to entry and spreading the cost over a longer period.

Other products quickly followed: accounting tools, invoicing tools, time-tracking tools, prototyping tools, testing tools, analytics tools, design tools. Jump forward to today, and the average freelancer or small design agency could have subscriptions to over a dozen such tools.

Subscription works well for products you use on a daily basis. For designers this could be Photoshop or InVision; for accountants this could be Xero or Float; and for consumers this could be Spotify or Netflix.

Subscription also encourages use—it encourages us to create habits in order to get our money’s worth. Like the free buffet at an all-inclusive hotel, we keep going back for more, even when we’re no longer hungry.

In doing so, subscription also locks us in, making it psychologically harder for us to try alternatives. It makes it less likely that we’ll try that amazing local restaurant, because we’ve already paid for our meals and need to beat the system. The sunk cost fallacy in all its glory.

Problems with the rental model become more apparent when you’re forced to rent things you use infrequently, like survey products or recruitment tools. You pay to maintain the opportunity of use, rather than for use itself.

We recently did an audit of all the small monthly payments going out of the company, and it’s amazing how quickly they mount up. Twenty dollars here and forty dollars there can become thousands each year if you’re not careful. Even more amazing are the number of products we barely used. Products that somebody signed up for a few years back and forgot to cancel.

You could blame us for our lack of diligence. However the gym membership model of rental is explicitly designed to elicit this behaviour. To encourage people to rent the opportunity, safe in the knowledge that the majority of members won’t overburden the system. Unclear billing practices and disincentives for unsubscribing—”if you leave you’ll lose all your data”—are designed for this very purpose.

Then you have the legacy tools. Products that you rarely use, but still need access to. Photoshop is a great example of this. Even if you’ve decided to move to Sketch, you know many of your clients still use Photoshop. In the olden days you would have kept an older version on your machine, costing you nothing. These days you need to maintain your Creative Cloud account across multiple team members, costing you thousands of dollars for something you rarely use.

This article was sparked by a recent Twitter storm I witnessed where Sketch users raised the idea of a rental model and vilified people who felt paying $30 a month for professional software (which currently retails at $99) was too much.

While I understand the sentiment—after all Sketch is the tool many designers use to make their living—you can’t take this monthly cost in isolation. Instead you need to calculate lifetime cost. As we all know from the real world, renting is always more expensive than ownership in the long term.
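
To make that concrete, here’s a rough back-of-the-envelope sketch using the figures above (the mooted $30 a month against the $99 one-off price). It deliberately ignores paid upgrades on the owned licence, which would narrow the gap a little without changing its shape:

    # Rough lifetime cost of renting vs owning, using the figures above:
    # a $30/month subscription versus a $99 one-off purchase. Assumes no
    # paid upgrades on the owned licence, so treat it as an illustration.
    MONTHLY_RENTAL = 30  # dollars per month
    ONE_OFF_PRICE = 99   # dollars, paid once

    for years in (1, 3, 5):
        rented = MONTHLY_RENTAL * 12 * years
        print(f"{years} year(s): rented ${rented:,} vs owned ${ONE_OFF_PRICE}")

    # 1 year(s): rented $360 vs owned $99
    # 3 year(s): rented $1,080 vs owned $99
    # 5 year(s): rented $1,800 vs owned $99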

You also have to consider rental costs in relationship to every other piece of rented software our industry considers necessary. With this number continuously increasing—but no sign of legacy tool rental declining—entering the digital industry is becoming an increasingly costly prospect for new designers and developers.

The thing I find strange is that, while we’ve been trained to believe renting is the norm for software over the past 10 years, few of us think this way about physical goods. We mostly still buy houses, cars, computers and music systems, rather than renting or leasing them. Many believe ownership confers some kind of noble status, despite the environmental cost of owning atoms over bits.

When we do rent physical products, it’s rarely on a subscription basis. Instead we’ll rent an apartment in New York through AirBnB for a weekend, a Zipcar for an afternoon or a Tasker for an hour.

I’m not saying software rental is always bad. I’d just like to see more diversity in SaaS business models. I’d welcome the ability to subscribe to mature services I use on a regular basis, but rent less common tools on a per-use basis. I’d also like to retain ownership of certain tools, like Sketch, as I think this is a better model for innovation.

We talk a lot about user-centered design in the digital world. Isn’t it about time we considered business models through the same critical lens?
