Cennydd Bowles

Scare-quote ethics

Forgive me, I need to sound off about “tech ethics”. Not the topic itself, but those fucking scare quotes: that ostentatious wink to the reader, made by someone who needs you to know they write the phrase with reluctance and heavy irony.

As you’ll see, this trend winds me up. I see it most often from a certain type of academic, particularly those with columns or some other visible presence/following. I love you folks, but can we cut this out? The insinuation – or sometimes the explicit argument – is that “tech ethics” is meaningless; I have seen further and identified that the master’s tools will never dismantle the master’s house; these thinktanks are all funded by Facebook anyway; the issue is deeper and more structural.

As insights go, this is patently obvious. Of course the sorry state of modern tech has multi-layered causes, and of course our interventions need to address these various levels. Obviously there’s structural work to be done, not just tactical work.

But this is your classic ‘yes and’ situation, right? Pull every lever. Like, yes, I fully agree that the incentives of growth-era capitalism are the real problem. But we also need the tactical, immediate stuff that works within (while gently subverting?) existing limitations.

The problem with playing these outflanking cards, as we’ve seen from the web → UX → product → service → strategic design treadmill, is that as you demarcate wider and wider territory, your leverage ebbs away. You move from tangible change to trying to realign entire ecosystems. Genuinely, best of luck with it: it needs doing, but it takes decades, it takes power, and it takes politics. Most of those who try will fail.

I’m not equipped for that kind of work, so I do the work I am equipped for. Teaching engineers, students, and designers basic ethical techniques and thinking doesn’t solve the tech ecosystem’s problems. But I’ve seen it help in small, direct, meaningful ways. So I do it.

So please: spare us the scare quotes. Let’s recognise we’re on the same team, each doing the work we’re best positioned to do, intervening at the points in the system that we can actually affect, each doing what we can to help turn this ugly juggernaut around.


Expo 2020

Back from a let’s-say-unusual few weeks in Dubai. I was meant to give a big talk there – dignitaries/excellencies etc, etiquette-expanding stuff – but contracted this dread virus instead and for a while entertained visions of wheezing my intubated, asthmatic last, many hours away from home. Happily the vaccines did their job and while isolation was grim, my symptoms were entirely weedy. Nonetheless, once I’d recovered, I elected to head home ASAP for hopefully understandable reasons, so had to withdraw from the event.

I was able to squeeze in a quick visit to Expo 2020, however. It deserves caveats. Yes, it’s teeming with cheesy robots and sweeping popsci generalisations about AI. Yes, its primary function is soft power and reputation laundering, although the queues outside the Belarus pavilion were noticeably short. But I still found it interesting, even touching. There’s something compelling and tbh just cool about bringing the world together to talk about futures – and also to do it in a creative, artistic, architectural, and cultural way that engages the public.

Large water feature at Expo 2020

This is the kind of thing modern-era Britain finds deeply uncomfortable, I think. Excluding the flag-shagger fringe, national earnestness pokes uncomfortably against our forcefields, the barriers of cynicism we construct so we don’t have to look each other in the eye and confess our dreams. The only time fervent belonging ever really worked for us was 2012, and that was only thanks to home advantage.

But it has not escaped my attention that I’m an earnest dude and so, yeah: I enjoyed it. High-frequency grins behind the face mask, lots of mindless photos. Even the mediocre drone shows had some charm, although I drew the line at the ‘smart police station’.

Multicoloured Russian pavilion at Expo 2020

It was also a fascinating toe-dip into other cultures. I’m not likely to encounter musical performances from subregions of Pakistan, nor a Saudi mass-unison song – swords aloft, dramatic lighting and everything – in my everyday life. I suppose new experience is the point of travel anyway.

Osaka 2025 is a long way off temporally and spatially but, you know, I’m tempted.

Man in Arabic dress pushing a companion’s wheelchair through an artificial cloud of mist at Expo 2020

Poking at Web3

Like everyone, I’ve been trying to understand the ideas of Web3. Not the mechanics so much as the value proposition, I suppose. Enough people I respect see something worthwhile there to pique my curiosity, and the ‘lol right-click’ critique is tiresome. So I’m poking at the edges.

Honestly, it’s heavy going. The community’s energy is weird and cultish, and the ingroup syntax – both technical and social – is arcane to the point of impenetrability: whatever else Web3 needs, it’s crying out for competent designers and writers.

Most of the art is not to my taste, shall we say. Some of it’s outright dreadful. That’s forgivable. The bigger problem, though, is the centrality of the wallet, the token, and so on. I’m avowedly hostile to crypto’s ecological impact and its inegalitarian, ancappish positioning. Crypto folks have promised change is right around the corner for a long time now – call me when it finally happens.

So… grave reservations. But that aside, there is something conceptually appealing there, right? Mutual flourishing, squads, communities weaving support networks that heal system blindspots. I feel those urges too. Perhaps I’m just a dreamy leftist / ageing Millennial-X cusper, but my current solution to this is simple: give people cash. (More on that later, but as an aside, if you’re lucky enough to have money, consider throwing some at people who are trying to carve out fairer, less exploitative tech too. It’s not a lucrative corner of the industry.)

Anyway, I’m still a Web3 sceptic, but the intentions… yeah, they’re pretty cool. If the community can become more accessible and phase out the ugly stuff (most obviously proof-of-work blockchains, but also this notion that transactions are the true cornerstone of mutuality), I’ll be officially curious.


New role at the RCA

Starting as a (part-time) visiting lecturer at the Royal College of Art this week, teaching & mentoring MA Service Design students on ethical and responsible design. The next generation of designers have important work ahead, and I’m pleased to have the chance to influence them.


The law isn’t enough: we need ethics

When I talk about ethical technology, I hear a common objection: isn’t the law enough? Why do we need ethics?

It’s an appealing argument. After all, every country tries to base its laws on some notion of good and bad, and uses legality as a kind of moral baseline. While there are always slippery interpretations and degrees of severity, law tries to distinguish acceptable behaviour from behaviour that demands punishment. At some point we decide some acts are too harmful to allow, so we make them illegal and set appropriate punishments.

Law has another apparent advantage over ethics: it’s codified. Businesses in particular like the certainty of published definitions. The language may be arcane, but legal specialists can translate and advise what’s allowed and what isn’t. By comparison, ethics seems vague and subjective (it’s not, but that’s another article). Surely clear goalposts are better? If we just do what’s legal, doesn’t that make ethics irrelevant, an unnecessary complication?

It’s an appealing argument that doesn’t work out. The law isn’t a good enough moral baseline: we need ethics too.

Problem 1: Some legal acts are immoral

Liberal nations tread lightly on personal and interpersonal choices that have only minor impacts on wider society. Adultery is usually legal, as are offensive slurs, so long as they’re not directed at an individual or likely to cause wider harm. The right to protest is protected, even if you’re marching in support of awful, immoral causes. Some choices might lead to civil liabilities, but generally these aren’t criminal acts. Some nations are less forgiving, of course – we’ll discuss that in Problem 3.

Even serious moral wrongs can be legal. In 2015, pharma executive Martin Shkreli hiked the price of Daraprim, a drug used to treat HIV patients, from $13.50 to $750 a pill. A dreadful piece of price gouging, but legal; if we don’t like it, capitalism’s advice is to switch to an alternative provider. (Shkreli was later convicted of an unrelated crime.)

Or imagine you witness a young child drowning in a paddling pool. You could easily save her but you choose not to, idly watching as the child dies. This is morally repugnant behaviour, but in the UK, unless you have a duty of care – as the child’s parent, teacher, or minder, say – you’re not legally obligated to rescue the child.

Essentially, if we use the law as our moral baseline, we accept any behaviour except the criminal. It’s entirely possible to behave legally but still be cruel, unreliable, and unjust. This is a ghastly way to live, and we should resist it strongly; if everyone lived by this maxim alone, our society would be a far less trustworthy and more brutal place.

Fortunately, there are other incentives to go beyond the legal minimum. It’s no fun hanging out with someone who doesn’t get their round in, let alone someone who defrauds their employer. Unethical people and companies find themselves distrusted and even ostracised no matter whether their actions are legal or not: violate ethical expectations and you’ll still face consequences, even if you don’t end up in court.

Problem 2: Some moral acts are illegal

On the flip side, some behaviour is ethically justified even though it’s against the law.

When Extinction Rebellion protestors stood trial for vandalising Shell’s London headquarters, the judge told the jury that the law was unambiguous: they must convict the defendants of criminal damage. Nevertheless, the jurors chose to ignore the law and acquitted the protestors.

Disobeying unjust laws is a cornerstone of civil disobedience, helping to draw attention to injustice and pushing for legal and social reforms. In the UK, smoking marijuana is still illegal, despite clear evidence that it doesn’t cause significant social ills. Although I don’t enjoy it myself, I certainly can’t criticise a weed smoker on moral grounds, and the nation’s widespread disregard of this law makes future legalisation look likely.

There’s also a moral case for breaking some laws out of necessity. A man who steals bread to feed his starving family is a criminal, but we surely can’t condemn his actions. Hiding an innocent friend from your government’s secret police may be a moral good, but the illegality puts you at risk too: if you’re unlucky, you might find yourself strapped to the waterboard instead.

Problem 3: Laws change across times and cultures

The list of moral-but-illegal acts grows if we step back in time. Legality isn’t a fixed concern: not long ago, it was legal to own slaves, to deny women the vote, and to profit from child labour.

Martin Luther King Jr’s claim that ‘the arc of the moral universe is long, but it bends toward justice’ gives us hope that we can right historical wrongs and craft laws that are closer to modern morality. But there are always setbacks. Look, for example, at central Europe today, where some right-wing populists are rolling back LGBTQ and abortion rights that most Western nations see as moral obligations.

If we equate the law and morality, aren’t we saying a change in the law must also represent a legitimate shift in moral attitudes? If a government reduces a speed limit, were we being immoral all those years we drove at 100kph rather than 80kph? Is chewing gum ethically wrong in Singapore but acceptable over the border in Malaysia? It can’t be right that a redrawing of legal boundaries is also a redrawing of ethical boundaries: there must be a distinction between the two.

There is, however, a trap here. Moral stances can vary across different times and cultures, but if we take that view to extremes, we succumb to moral relativism or subjectivism. These tell us that ethics is down to local or personal opinion, which leaves the conversation at a dead end. More on this in a future article, but for now I’ll point out that almost every culture agrees on certain rights and wrongs, and to make any progress we must accept some ethical stances are more compelling and defensible than others. Where moral attitudes vary, they still don’t move in lock-step with legal differences.

Problem 4: Invention outpaces the law

The final problem is particularly relevant for those of us who work in technology. Disruptive tech tends to emerge into a legal void. We can’t expect regulators to have anticipated every new innovation, each new device and use case, alongside all their unexpected social impacts. We can hope existing laws provide useful guidance anyway, but the tech sector is learning that new tech poses deep moral questions that simply aren’t covered by legal guidance. The advent of smart glasses alone will mean regulators will have to rethink plenty of privacy and IP law in the coming years.

We can and must push for better regulation of technology. That means helping lawmakers understand tech better, and bringing the public into the conversation too, so we’re not stuck in a technocratic vacuum. But that will take time, and can only ever reduce the legal ambiguity, not eliminate it. The gap between innovation and regulation is here to stay, meaning we’ll always need ethical stances of our own.


Double positive: thoughts on an overflow aesthetic

[Tenuous thoughts about the last two Low albums and (post)digital aesthetics…]

I think Low’s Double Negative (2018) is a legit masterpiece, a shocking right-angle for a band in their fourth active decade. Probably my favourite album of the century so far.

To describe the album’s sound, I’d have to reach for a word like ‘disintegration’. The songs are corroded, like they’re washed in acid, or a block of sandstone crumbling apart to reveal the form underneath. The obvious forefather is Basinski’s Disintegration Loops, which uses an analogue technology (tape and playhead) to create slow sonic degradation.

Double Negative’s vocals aren’t spared this erosion: they’re tarnished and warped to the point of frequent unintelligibility.

Reviewers pointed out Double Negative is the perfect sonic fit for its age. Organic, foreboding, polluted: as a metaphor for the dread and looming collapse we felt in the deepest Trump years, it’s on fucking point.

Hey What, released this month, is no masterpiece. But it’s still a great album, and, like Double Negative, it feels suited to its time. While the music is still heavily distorted, Hey What’s distortion is tellingly different. Rather than the sound being eroded, pushed below its original envelope, Hey What’s distortions come from excess, from overflow.

The idea of too much sound/too much information is how fuzz and overdrive pedals work, but this overflow is distinctly digital, not analogue. It’s not just amps turned up to 11 – it’s acute digital clipping, a virtual mixing desk studded with red warning lights, and millions of spare electrons sloshing around. More double positive than double negative. And unlike its predecessor, Hey What spares its vocals from this treatment, letting them soar as Low vocals historically do.

Brian Eno famously said ‘Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature.’ So yes, artists were always going to mess around with digital distortion and overflow once digital recording & DAWs became mainstream. I hear some of this experimentation in hyperpop, say, while autotune is arguably in the same conceptual ballpark. Although I’m no expert in contemporary visual culture, it seems clear to me the overflow vibe also crops up in digital art, supported by the NFT crowd in particular.

‘Something is happening here’ isn’t itself an exciting thesis, but I’ve found it interesting to poke at the connotations and associations of overflow. While Double Negative is all dread and collapse, Hey What is tonally bright. The world may not have changed all that much in three years, but the sound is nevertheless that of a band that’s come to terms with dread and chooses to meet it head-on: an equal and opposite reaction.

Hey What is still messy, challenging, and ambivalent, but to me, its overflow aesthetic evokes post-scarcity, a future of digitised abundance, in which every variation is algorithmically exploited, but with the human voice always audible above the grey goo. It suggests, dare I say, that we could live in (a bit of) hope.

So I guess I’m wondering… are Low solarpunk now?


Available for new projects

Having wrapped up a major client and taken a little time off, I at last have some capacity for future projects. Via Twitter thread, here’s a little reminder of what I do; please share with anyone who needs help making more responsible and ethical technology.

While I’m self-promoting, I’m told the World Usability Day (11 Nov) theme this year is trust, ethics, and integrity. I’m into that stuff. One advantage of the remote era is I can do multiple talks a day, so drop me a line if you need a keynote.

Oh, and I’m still interested in the odd bit of hands-on design, too. Turns out I’m still decent at it. Hit me up: cennydd@cennydd.com.


Book review: Design for Safety


Just sometimes, the responsible tech movement can be frustratingly myopic. Superintelligence and the addiction economy command the op-eds and documentaries while privacy and disinformation, important as they are, often seem captured by the field’s demagogic fringe. But there are other real and immediate threats we’ve overlooked. In Design for Safety, Eva PenzeyMoog pushes for user safety to be more prominent in the ethical tech conversation, pinpointing how technologies are exploited by abusers and how industry carelessness puts vulnerable users at risk.

The present tense is important here. The book’s sharpest observation, and the one that should sting readers the most, is that the damage is already happening. Anticipating potential harms is a large part of ethical tech practice: what could go wrong despite our best intentions? For PenzeyMoog, the issue isn’t conditional; she rightly points out abusers already track and harm victims using technology.

‘I’m very intentional about discussing that people will abuse our products rather than framing it in terms of what might happen. If abuse is possible, it’s only a matter of time until it happens. There is no might.’

With each new technology, a new vector for domestic abuse and violence. We’re already familiar with the smart hauntings of IoT: abusers meddling with Nest thermostats or flicking on Hue lights, scaring and gaslighting victims. But the threat grows for newer forms of connected technology. Smart cars, cameras, and locks are doubly dangerous in the hands of abusers, who can infringe upon victims’ safety and privacy in their homes or even deny them a means to escape abuse.

While ethical tech books often lean closer to philosophy than practice, A Book Apart publishes works with a practical leaning. PenzeyMoog helpfully illustrates specific design tactics to reduce the risk of abuse, from increased friction for high-risk cases (an important tactic across much of responsible design), through audit logs that offer proof of abuse, to better protocols for joint account ownership: who gets custody of the algorithm after a separation?

Tactics like this need air cover. Given the industry’s blindspot for abuse, company leaders won’t sanction this extra work unless they understand its necessity. PenzeyMoog suggests public data is the most persuasive tool we have. It’s hard to argue against the alarming CDC stat that more than 1 in 3 women and more than 1 in 4 men in the US have experienced rape, physical violence, and/or stalking by an intimate partner.

Central to PenzeyMoog’s process is an admission that empathy has limits. While we should certainly try to anticipate how our decisions may cause harm, our efforts will always be limited by our perspectives:

‘We can’t pretend that our empathy is as good as having lived those experiences ourselves. Empathy is not a stand-in for representation.’

The book therefore tackles this gap head-on, describing how to conduct primary research with both advocates and survivors, adding valuable advice on handling this task with sensitivity while managing your own emotional reaction to challenging testimony.

Tech writers and publishers often seem reluctant to call out bad practice in print, but Design for Safety is unafraid to talk about what really matters. One highlight is a heartening, entirely justified excoriation of Ring. Amazon’s smart doorbell is a dream for curtain-twitchers and authoritarians, eroding personal consent and private space. PenzeyMoog argues one of Ring’s biggest defects is that it pushes the legal and ethical burden onto individual users:

‘Most buyers will reasonably assume that if this product is on the market, using it as the advertising suggests is within their legal rights.’

That legal status is itself far from clear: twelve US states require that all parties in a conversation consent to audio recording. But the moral issue is more important. By claiming the law is a suitable moral baseline, Ring pulls a common sleight of hand, but for obvious reasons (countries and states have different laws; morality and law change with time; many unethical acts are legal) this is sheer sophistry. Ring has deep ethical deficiencies: we mustn’t allow this questionable appeal to legality to deflect from the product’s issues.

Design for Safety also takes a welcome and brave stance on the conundrum of individual vs. systemic change. It’s popular today to wave away individual action, arguing it can’t make a dent in entrenched systems; climate campaigners are familiar with the whataboutery that decries energy giants while ignoring the consumer demand that precipitates these companies’ (admittedly awful) emissions. Design for Safety makes no such faulty dismissals. PenzeyMoog skilfully ‘yes and’s the argument, agreeing that an attack on any one front will always be limited, but contending that we should push tactical product changes while also trying to influence internal and industry-level attitudes and incentives.

‘We don’t need to choose between individual-level and system-level changes; we can do both at once. In fact, we need to do both at once.’

This is precisely the passionate but clear-headed thinking we need from ethical technologists, and it makes Design for Safety an important addition to the responsible design canon. If I have a criticism, it’s the author’s decision to overlook harassment and abuse that originates in technology itself (particularly social media). Instead, PenzeyMoog focuses just on real-world abuse that’s amplified by technology. Having seen Twitter’s woeful inaction over Gamergate from the inside, I know that abuse that emanates from anonymous, hostile users of tech can also damage lives and leave disfiguring scars. The author points out other books on the topic exist – true, but few are written as well and as incisively as this.

Design for Safety is a convincing, actionable, and necessary book that should establish user safety as a frontier of modern design. Technologists are running out of excuses to ignore it.

Buy Design for Safety here.

Ethics statement: I purchased the book through my company NowNext, and received no payment or other incentive for the review. I was previously a paid columnist of A List Apart, the partner publisher of A Book Apart. There are no affiliate links on this post.


Basecamp and politics-free zones

Since the social impact of tech is my bag, a few words on Basecamp’s recent statement. Most of the counters have already been made. Anyone who’s paid attention knows ‘apolitical’ is a delusional adjective: it means ‘aligned to the political status quo’. And of course there’s an embarrassing lack of privilege awareness in the statement/s. Opting out is a luxury many people don’t have.

However much they backtrack and narrow scope now, for me the message is clear: values-driven employees aren’t truly welcome at Basecamp. The leaders have that prerogative, sure, and their staff can decide whether that’s an environment they can flourish in.

I expect Basecamp will find this stance has a noticeable effect on future recruitment. The talent race is also an ethical race. But I also worry about the effect on product quality. If staff are scared to discuss ‘societal politics’ (what a tautology!), they’ll be reluctant to discuss the harms their work can do. And if you can’t discuss potential harms, you can’t mitigate them.


The talent race is also an ethical race

2020 was a bad year to be a worker. Laptops and cameras invaded our homes, whiteboard collaboration was replaced by Zoom glitches and Google Docs disarray, and any vestiges of work-life separation were blown away.

And that’s if you were one of the lucky ones. Millions were simply kicked out of jobs altogether, with Covid causing unprecedented drops in employment and hours worked. Further millions of essential workers had no choice but to continue working in unsafe environments. Thousands died.

After such a testing year, however, perhaps we’re turning a corner. The Economist, a newspaper hardly known for worker solidarity, predicts the arrival of a major power transfer. Through what I imagine are tightly clenched teeth, they describe an imminent swing, a ‘reversal of primacy of capital over labour’. If so, a golden age for workers beckons.

The theory goes that the post-Covid bounce will finally unleash employees’ pent-up frustrations. Eighteen months of working from the kitchen table has convinced many staff they’re done with meagre growth opportunities, stagnant pay, office politics, and – more than anything – the commute. Axios says 26% of employees plan to quit after Covid, while remote and hybrid work will untether the labour market from local employers, allowing people to reach for opportunities in other cities and countries. Talent flight may be a hallmark of the recovery as employees desert bad companies, and competition for top candidates becomes fiercer than ever.

For early signals of what happens when employees hold the cards, look at Big Tech. In-demand technologists are finally realising they hold enormous power. Their skills make them expensive and difficult to hire, and their mission-critical roles mean employees directly control a company’s output. Lift your hands off the keyboard and nothing gets built.

Tech workers are also starting to learn the trick to exploiting this power: collective action. An individual employee may be weak but, by banding together, employees can combine strengths while diluting risks.

This sort of mobilisation makes some executives nervous, in part because it looks like labour activism. Some worker-driven tech movements do focus on established labour issues like pay and conditions, and calls to unionise are gaining pace, thanks to the efforts of groups like the Tech Workers Coalition and industry leaders such as Ethan Marcotte.

But look closer at tech worker activism and you’ll see the primary focus is ethical. Google workers famously protested Project Maven – a Pentagon contract that could be used to aid drone strikes – on ethical grounds, expressing their displeasure through thousands of signatures on an open letter and a handful of resignations. Shortly after Maven, the Google Walkout saw thousands of Googlers take part in brief wildcat strikes over allegations of sexual harassment at the company.

Other tech giants have since seen similar organisation. Amazon has faced internal employee rebellion over climate inaction and warehouse safety; Microsoft staff have come together to protest the company’s work with ICE.

So a swing toward worker power will also be an ethical transition. Salesforce found 79% of the US workforce would consider leaving an employer that demonstrates poor ethics, while 72% of staff want their companies to advocate for human rights. As opportunities start to open up for concerned workers, many will act on these beliefs and look for more moral employers.

Where Big Tech goes, the whole sector soon follows, and sure enough, ethical activism is a rising trend among tech workers worldwide. The #TechWontBuildIt movement typifies this emerging spirit of resistance, with thousands of technologists pledging to oppose and obstruct unethical projects.

This renewed ethical energy is here to stay and, with the eyes of regulators, press, and public alike now firmly on the tech sector, execs have to recognise the risks that await if they fumble the issue. Journalists have now realised compelling stories lurk inside the opaque tech giants and are eager for tales of dissent. Disharmony sells: if even pampered Silicon Valley types are unhappy, something is deeply amiss.

But the larger risk is around talent. Without outstanding and qualified employees you simply can’t compete – particularly in hot fields like data science and machine learning – but good candidates are increasingly dubious of Big Tech’s ethical credentials.

In firing AI ethicists Timnit Gebru and Meg Mitchell, Google leaders doubtless thought they’d found an opportunity to cut two demanding employees loose and proceed on mission. Instead, the company had blundered. The story made international headlines, and Google is now mired in allegations of retaliation. Gebru and Mitchell’s manager recently resigned amid the controversy, and Google’s reputation among data scientists and tech ethicists has been severely damaged. Canadian researcher Luke Stark turned down a $60,000 Google Research grant after the dismissals, and was only too happy to go on record to discuss his decision. Seems the ethics community’s solidarity is stronger than its ties to powerful employers and funders.

Facebook has also seen its candidate pool evaporating. Speaking to CNBC, several former Facebook recruiters reported the firm was struggling to close job offers. In 2016, around 90% of the offers Facebook made to software engineering candidates were accepted. By 2019, after Cambridge Analytica, allegations of cover-ups over electoral interference, and many other scandals, just 50% of the company’s offers were accepted.

Seeing their field as virtually a lifestyle, technologists know the industry intimately and recognise that toxic companies can blight a résumé. Many Uber employees who served during Travis Kalanick’s notorious reign found it difficult to land their next role; it seems hiring managers felt the company’s aggressive, regulation-dodging culture might undesirably infect their own teams.

So as companies stretch their limbs in preparation for the looming talent race, execs must remember this is also an ethical race. Tech workers are demanding that Silicon Valley look beyond disruption and hypergrowth and instead prioritise social impact, justice, and equity. As workers become more literate and confident in collective organising these calls will only get louder. Leaders may or may not agree with their employees’ demands, but the one thing they can’t do is ignore them. Money may still talk – but if the culture’s rotten, talent walks.


It’s fine to call it user testing

This linguistic canard does the rounds every few months, and UXers’ erroneous vehemence about it isn’t… healthy.

In the phrase ‘user testing’, the word user is a qualifying noun, also known as an attributive noun, or adjunct noun. As the name suggests, it modifies the second noun. (Here, testing is a gerund, a verb acting as a noun.) But that modification can have multiple meanings. Sometimes it implies of, but it can also imply, say, with, by, or for. Some languages add extra words for these distinctions; in English, we rely on the context to make it obvious.

  • ‘Mobile design’ does not mean designing mobiles. It means designing for mobile.

  • ‘Charcoal drawing’ does not mean drawing charcoal. It means drawing with charcoal.

  • ‘Customer feedback’ is not feedback on customers. It means feedback from customers.

In 20 years, I’ve never met a client or colleague who thought ‘user testing’ meant actually testing users. Maybe you have. If so, my sympathies: your project likely faces problems more serious than this labelling issue.

There are a thousand more meaningful battles to pick, folks, and you might even be right about some of them. Let this one go.
