Bill Bennett: Reporter's Notebook


False power of an exclusive press release

In Dealing with grumpy editors, Dan Kaufman writes about the exclusive press release:

I don’t understand why PRs give editors exclusives – because for the most part it does the PR and their client more harm than good. You see, if a story is newsworthy then it’ll run anyway – and if it isn’t then giving it as an exclusive isn’t going to make much difference.

Kaufman goes on to say that giving one editor a decent story as an exclusive will upset other editors. His actual phrase was stronger than “upset”, but this is a family website.

This happens all the time here in New Zealand. The practice is counter-productive.

It can certainly destroy trust a PR person or company has built.

Exclusive… oh yeah?

Waking up, reading a so-called exclusive story, then later in the day receiving a press release covering the same ground happens too often in New Zealand.

Often this happens when a public relations person thinks they might get sympathetic or splashy coverage of their story if they play favourites.

PRs have approached me offering to trade an exclusive for a favourable position: often the cover of a print title. They may even ask to vet the copy in return for the story. This, in effect, can mean an editor enters into a conspiracy to mislead readers.

Many ‘exclusives’ are rubbish stories

Often stories ‘leaked’ this way are rubbish – they read more like advertising than news. Editors giving the press release an early run are manipulated into becoming part of a marketing exercise.

My response to this is to stop trusting the PR person behind the leak. This means they’ll have difficulty slipping any of their future propaganda past me. In extreme cases I’ve ignored any further communication from the source. And I’ve been known to make a formal complaint to the client. In one case I had to tell a PR’s other clients I could no longer work with their agent.

And anyway, if a company thinks it is that important to get its message into a publication, it should look at advertising.

If journalists do respond to your press release, make sure you know how to handle their questions professionally:

How to deal with media questions.

Grumpy editors and how to deal with them


Modern public relations people often don’t understand how the media works. Many don’t get journalism.

This wasn’t a problem in the past when most PR people were ex-journalists. Today, many publicists have never seen the inside of an editorial office.

Or if they have, they haven’t seen how editors and journalists work. They know little about what makes journalists tick, what motivates and drives reporters and editors.

Harmful PR failures

As a result many PR people end up harming their client’s chances of getting publicity. Or at least the right publicity. Instead they get in the way of journalists and annoy editors.

Which is where Dan Kaufman’s Dealing with grumpy editors gets its name. To public relations people journalists often appear grumpy, rude and obstructive.

This should not surprise anyone. You wouldn’t believe some of the nonsense editors have to put up with from PR people. Some of that nonsense passes for wisdom or craft in the PR industry.

Rubbish public relations

After 17 years before the editorial masthead Kaufman has seen some rubbish PR. He has also seen some sharp operators. In this book he provides practical advice for communications workers wanting to get an editor’s attention.

If you work in PR, you may not agree with everything Kaufman says. He tells it like it is in straightforward language. It is a valuable work, worth every cent of the ridiculously low $4.99 he is charging for the PDF version.

I can come to your offices – or meet you in a fancy restaurant – and give you the same advice for $150 an hour. So on second thoughts, don’t buy the book. Hire me instead.

Grumpy editors

In the spirit of good journalism, I should disclose my connection with Kaufman. I hired him as a junior journalist some 17 years or so ago. Hopefully he wasn’t thinking of me when he gave his book its title.

Predictable, unimaginative press releases

Many press releases are predictable.

Although original ideas occasionally slip through the net, most press releases follow the same tired pattern.

At times it feels as if public relations people want their press releases to fail.

Bad press releases are a public relations own goal

Gordon G. Andrew has a different take on the problem. He says the PR industry has effectively committed suicide by abusing the news release system to the point where journalists no longer listen.

“News releases became an anachronism. Online news portals and email killed the underlying functionality of paper releases as a news dissemination tool. The internet delivered news faster, and this was a good thing.”

Andrew says public relations will cease to exist as a profession and as a function.

No big deal, you may think. But Andrew works in marketing and worries press releases and similar communications will come to reflect poorly on the companies paying for these services.

Death by Content: How Press Release Abuse Killed Public Relations | Marketing Craftsmanship.

From 2007: Palm T|X handheld computer versus smartphone

I wrote this for the Sydney Morning Herald in 2007. It’s now a piece of history.

If smartphones haven’t killed off traditional handheld computers yet, the day can’t be far away. Sales of non-phone Palm and PocketPC devices are stagnant or falling. There’s been nothing much in the way of new hardware for a couple of years.

Sure, but something huge was on the way.

This is a pity. I’ve found my $500 Palm T|X to be one of my most productive tools. It goes way beyond managing my contact file and calendar information.

My word, what low expectations we had in those days.

The T|X has a 3.8 inch 480 by 320 display. While you wouldn’t call it large, it’s half as big again as the screen on most smartphones.

But tiny by today’s standards.

It makes reading text, browsing web pages, viewing photographs and even watching movies a better experience than squinting at a smartphone display.

Which was true at the time.

The 128MB of built-in memory doesn’t sound much by today’s standards, yet I’ve got a dozen or so applications running on my handheld and scores of stored documents. If I need more memory, I simply slot in an SD card.

It sounds like even less now.

And we’re not talking about any old documents. The T|X comes with a bundled version of Documents To Go, an application that allows you to read and, in a limited way, edit, Word or Excel files. It can also be used to read .pdfs, making it the nearest thing to an electronic book.

OK, this looks a bit daft today, but at the time the T|X was a realistic ebook reader.

The T|X’s best feature is its built-in WiFi. When I’m travelling around the city, I stop for coffee where there’s a free hot spot and catch up on emails. Sure you can do this anywhere with a smartphone – but the bigger screen makes a difference.

WiFi is still wonderful.

I use WiFi to sync my Palm with my desktop before leaving home and then reverse the process when I return.

This was a novelty.

The T|X isn’t perfect: text entry is clumsy and the battery won’t make it through an extended working day if the wireless is switched on. Yet, all-in-all, it manages to better the specification of smartphones in most departments. When I’m on business away from home I carry a smartphone and a T|X.

No doubt a phone manufacturer will marry the features of the T|X with a smartphone before much longer – judging by the announced specifications Apple’s forthcoming iPhone could get there first.

And the rest is history

Farewell home computer pioneer Clive Sinclair

**Originally written September 2021.**

At the Guardian Haroon Siddique writes Home computing pioneer Sir Clive Sinclair dies.

Sir Clive Sinclair, the inventor and entrepreneur who was instrumental in bringing home computers to the masses, has died at the age of 81.

His daughter, Belinda, said he died at home in London on Thursday morning after a long illness. Sinclair invented the pocket calculator but was best known for popularising the home computer, bringing it to British high-street stores at relatively affordable prices.

Many modern-day titans of the games industry got their start on one of his ZX models. For a certain generation of gamer, the computer of choice was either the ZX Spectrum 48K or its rival, the Commodore 64.

My first brush with Sinclair was as an A-level student in the UK. Before he made computers, Sinclair designed a low-cost programmable calculator.

It fascinated me and, thanks to a well-paid part-time job, I managed to buy one. From memory it could only handle a few programmable steps, but it was enough to make complex calculations.

My second job after university was working as a reporter for Practical Computing magazine. I started in January 1980 and quickly became familiar with the original Sinclair ZX80 computer.

Later that year I went to the launch of the ZX81 and met Sinclair for the first time. Over the next few years he became a familiar face.

That modest, clunky ZX81 computer changed everything. Before 1981 was out, the publishing company I worked for started Your Computer magazine which focused on small, low-cost home computers. For the first few issues I was staff reporter on both titles.

The next two years were a wild roller coaster ride. An entire industry emerged and I was in the centre of it.

ZX Spectrum was Sinclair’s definitive product

For me, Sinclair’s most important product was the ZX Spectrum. It was flawed in many ways, but it could do enough to spawn a generation of entrepreneurs and get thousands of young people into computing. I still have one in my attic.

By the time the later Sinclair QL appeared, low-cost computers with decent keyboards and storage were pushing out the minimal, low-cost options Sinclair specialised in.

By now Sinclair was Sir Clive. My last brush with his business was the ill-fated C5 battery-powered vehicle. It failed and Sinclair faded from sight; later the remnants of his computer business were picked up by Amstrad.

My main memories of Sinclair were his enthusiasm and his ambitions to build devices that anyone, regardless of budget, could afford.

Another criticism of Maslow's hierarchy of needs

Maslow’s hierarchy of needs is taught as a way of understanding people’s motivations.

While flawed, the hierarchy of needs is a starting point. Managers often don’t get past first base when it comes to thinking about why people do things.

We all owe Maslow a vote of thanks for getting bosses to think about these things.

Yet Maslow’s theory is not beyond criticism. The hierarchy of needs theory misses the spiritual dimension.

Maslow says people attend to basic needs first. They then progressively deal with more complex matters until they reach a point he calls self-actualisation. This sits at the top of the hierarchy’s pyramid.

Not everyone gets that far in life.

Maslow’s crude assumptions

The theory makes crude assumptions that don’t apply to everyone. It is simplistic. The hierarchy of needs is a blunt instrument. A one-size-fits-all solution for a complex problem.

There’s a reason for that. Maslow’s idea belongs to a time and place.

Maslow was American. He first suggested the hierarchy in the 1940s, and the ideas are specific to America’s individualist culture. America was rich and everyone’s lives were improving. In that culture, middle-class people worry about their personal needs more than any collective needs. It’s all a bit “me… me… me”.

Beyond the individual

He makes no allowances for parents worrying about children or workers being concerned about colleagues.

All-in-all Maslow offers a one-dimensional view of how people think and behave. It’s a first approximation, not a finished story. Even if Maslow’s hierarchy of needs is wrong, it has value. That’s because it teaches managers that looking into people’s motivations is important. Too often managers treat people as if there are no external forces driving them. For some managers, even thinking about people’s motivations is a foreign idea.

Putting motivation back on the agenda is a starting point for other insights.

Take a digital sabbath

I wrote this post in 2009 when spending one day a week offline was far less challenging than it is today. These days I might only get a day away from all digital screens every month or so.

Here’s the idea:

Set aside one day a week when you don’t switch your computer on.

A day when you don’t check mail, update Facebook or tweet.

No firing up the desktop for game playing.

It doesn’t need to be the same day every week. You may have to trim things according to needs and deadlines. You may only be able to manage one day a fortnight.

Go off-line and let the brain rest. Or, if not rest, allow it to change gear.

Take a break instead of constantly responding to incoming messages. Just let them pile up.

There’s always tomorrow.

You can de-stress. And before you say you find it stressful not being in constant touch with cyberspace, think again. You know that isn’t true.

The online world will go on without you.

Read books, chat to friends, play sport, enjoy the sunshine or bake muffins instead.

That way, when you get back online, you’ll be refreshed. It is like a mini holiday. It may sound like a cliché, but I work better after taking a day-long break from my computer.

Digital sabbath not original

The digital sabbath is not an original idea. If you are religious, the first sabbath came at the end of the first recorded week. The Biblical creation story says God rested on the seventh day.

Ancient Jews worked for six days then strictly observed the Shabbat when many everyday things were not allowed. They knew this was mentally and physically healthy. I first heard about the idea of a digital sabbath in an online forum – sadly I don’t recall who or where the original idea comes from.

Problems

It is harder to take even one day’s rest from the digital world if you have a smartphone, an ebook reader or if you use the computer as an entertainment hub for music and video. And you may have a job, or some other responsibilities that make going off-line difficult.

Nevertheless, I suggest you do what you can to give it a try, reconnect once a week with the analogue world.

I’m not perfect

I’d like to report I take a full day away from my computer every week. The truth is, I don’t always manage it. Although I try to schedule a full day off each week, I generally only get a couple of full-blown digital sabbaths each month.

The Amstrad Story - book review

Alan Sugar went on to become Lord Sugar and host of The Apprentice. Amstrad was eventually sold to BSkyB in 2007. This 2010 review examines both a book about Sugar’s early success and the British class attitudes that held back UK technology companies.

Amstrad was one of Britain’s brightest retail electronics businesses in the 1980s. During a time when most British electronic companies suffered setbacks, Amstrad’s annual profits grew from £1.4 million to £160 million.

Founder Alan Sugar was rated among the country’s top entrepreneurs. What made Amstrad great and what makes Alan Sugar tick?

Sadly, these questions are not answered by David Thomas’s book The Amstrad Story.

Thomas’s omissions do not make the book worthless. It has the three i’s required of any lightweight business read.

Flawed Amstrad textbook

Despite its inspirational qualities, the book is flawed as a textbook for budding Sugars.

It offers no insight into Amstrad’s recipe for success. It offers no insight period. The book chronicles Sugar’s business activities with anecdotes and comments from Sugar and his business partners.

Part of the problem is Sugar’s reluctance to open himself up to public scrutiny. The man has a well-known dislike for journalists and likes to keep his personal life to himself.

As a journalist on the Financial Times, Thomas somehow managed to bypass this obstacle and gain access to some of Sugar’s thoughts and a great deal of the more favourable aspects of Amstrad’s growth period.

Puffery

Yet, for the most part the book reads like public relations puffery. Alan Sugar vetted it before publication. Only Thomas’s insistence on recording Sugar’s bad language verbatim saves it from reading like Pollyanna.

At no point did Thomas talk to any of Sugar’s rivals — he offers no critical analysis of Sugar or Amstrad.

Sugar interesting, no saint

As a journalist working in this area through most of this period in the UK, I knew of many who had much to say about Alan Sugar that was far from complimentary.

Criticism, constructive or otherwise, does not diminish Sugar’s achievement. It helps us understand it.

In particular, the book does not tell us enough about how Sugar started.

It seems he left a warehouse one day with a van full of electronic goods and returned that night having sold the lot — I’d love to know how.

Shady? Who knows?

By not telling us the whole story, Thomas leaves readers with the impression there might be something shady in Sugar’s early business dealings. That isn’t fair on the readers and it isn’t fair on Sugar.

The most galling feature of this book is its Cambridge-educated author’s habit of painting Sugar as a Del-Boy or Arthur Daley-type character. Why Sugar’s design notes are reproduced along with spelling errors is beyond me.

English snobbery

The same goes for verbatim quotes complete with bad grammar or foul language. It is as if the author admires Sugar’s gumption and business brain but has to show him up as being an ignorant lout at heart.

This Del-Boy theme repeats elsewhere and it stinks of the worst kind of British class prejudice. It is a reminder of why British industry is in decline. While other nations venerate people who create new wealth, the British prefer to venerate those whose ancestors made it.

Unintended revelations

Perhaps in this roundabout way the author unwittingly pulls back the curtain to show what drives Sugar: a wish to succeed and prove himself the equal or better of those born to a higher position. If making money is a way of measuring these things, Sugar proved himself.

Despite these criticisms the book has value. The stories of how Sugar planned his computers and how he eventually acquired Sir Clive Sinclair’s business are both worth reading. Amstrad was the last major UK computer maker to capture consumer markets.

Sugar’s ability to cut through distractions and get straight to the point (usually money) is spellbinding. And those nuggets of Sugar’s managerial wisdom that peek out from underneath are pure gold.

Moving up and down Maslow’s hierarchy

Abraham Maslow’s **hierarchy of needs** first appeared in 1954. The world has changed enormously over the past 55 years and critics have challenged Maslow.

You can read more about Maslow and his hierarchy of needs in Motivation and the hierarchy of needs. There’s criticism in Challenging Maslow’s Hierarchy of Needs.

Maslow’s hierarchy is often shown as a pyramid. There’s an implication people move up the pyramid as their lives improve.

For example, over time a knowledge worker will gain skills, win responsibility and in turn earn extra income, taking care of the lower levels of the hierarchy.

Self-actualisation is the prize

According to Maslow this makes it possible to move up to self-actualisation – a kind of western nirvana.

Today’s global financial crisis means many workers are moving in the opposite direction.

Being laid off is traumatic. In some cases people can be at the pinnacle of the hierarchy one day and slide all the way to the bottom the moment the pink slip appears. Finding food, shelter and warmth is suddenly the most important thing on the agenda.

Doing it over again

Of course many redundant workers pick themselves up and climb back up the pyramid. The journey is easier the second time around. Knowing the route and recognising the landmarks along the way helps.

Up to a point Maslow’s theory works well enough on the four bottom stages. You only have to look around and you will see people at each level. And occasionally you’ll notice people moving up or down in a grim version of Snakes and Ladders.

You don’t see so many self-actualised pyramid toppers.

Even in the good times before the economy nose-dived, Brahmins were thin on the ground. This would be especially so in the higher echelons of the economy (which is where you might expect to find them given the pyramid). Smug, self-satisfied bastards were everywhere, but they’re rarely what you’d call self-actualised.

What does this tell us?

Maslow’s hierarchy is a useful theory, but it’s not a pyramid. It is a four-step ladder. And each step up the ladder links to slides that will take you back down again. In other words, a game of snakes and ladders.

Challenging Maslow’s Hierarchy of Needs

It’s an oversimplification, but Maslow’s Hierarchy of Needs says that you can predict how people will behave by looking at their underlying needs. Maslow believed a starving person would find food first, putting aside every other consideration, including social niceties.

Maslow’s theory has its uses. Yet most modern management experts and psychologists regard it with suspicion. One criticism is that the hierarchy of needs doesn’t take into account acts of selflessness, bravery, charity and heroism.

You might ask yourself why some German citizens hid Jews from the Nazis. Or why starving soldiers in Japanese prisoner of war camps would give up their own food to help the weak and dying. But then most economists and biologists would also find what look like irrational acts hard to explain.

Painters starving in attics

Likewise, many of the best and most creative painters and poets – who Maslow would describe as self-actualising – were in fact starving in attics when they did their best work.

Where does Vincent van Gogh sit on the hierarchy of needs?

And we can all think of examples of filmmakers, musicians and other artists whose creativity dried up when they hit the big time. Years ago I worked as a music journalist. I found that many rock bands would deliver a brilliant first album, score a huge contract, then wallow self-indulgently in the studio for album number two. Many never got the opportunity to make a third record.

Jim Clemmer and Art MacNeil make an important criticism of Maslow’s Hierarchy of Needs theory in their book “Leadership Skills for Every Manager” (ISBN 0861889630). The book is out of print. But you may find a copy of it in a university library.

Spiritual dimension

Clemmer and MacNeil suggest Maslow misses the point because he left people’s spiritual dimension out of the picture. They say that humans look for meaning in their lives. That meaning transcends any animalistic drives. In their words, “even starving people are not immune to the lure of higher values.”

Think of van Gogh.

A more scientific criticism was published in the 1977 edition of the learned journal, “The Annual Review of Psychology”. Here, A.K. Korman, J.H. Greenhaus and I.J. Badin say there’s no empirical (that is, researched) evidence to support Maslow’s ideas.

In fact, they argue, the empirical evidence points in the opposite direction. Other critics point out that Maslow came up with his theory after observing only a handful of people and that it lacks scientific rigour.

Originally posted February 2009

We’ll leave these debates for the academics.

The important thing about Maslow’s idea is that it is a good, maybe crude, starting point for understanding what drives other people. From our point of view, managing and motivating others, the Hierarchy of Needs is a useful template that sometimes, not always, helps to explain how and why people behave.

Maslow's hierarchy of needs: Motivation

One key to understanding someone’s motivation is understanding what drives them.

In western culture individual needs dominate and other forces take a back seat. Group needs are more important in many other cultures, including Māori, indigenous Australians and Pacific Islanders.

People from these cultures put tribal or family needs before their own. Second generation immigrants from these backgrounds can follow either pattern – or both at once.

Abraham Maslow studied human driving forces and developed the ‘hierarchy of needs’. It can help explain motivation.

Maslow’s hierarchy of needs lists human drivers in order of relative importance. Stronger, instinctive, more animal-like drivers sit at the bottom of the hierarchy. The top of the list has weaker, but more advanced, human needs.

The list ordered from bottom to top:

Physiological

This covers basic needs like breathing, getting enough food, finding a place of shelter, keeping warm and dealing with bodily functions (including sexual gratification).

In crude terms, you can’t progress up the hierarchy if you can’t breathe or you are freezing to death.

Safety

People need to feel safe from physical danger. They also need physical, mental and emotional security. They get out of the firing line before dealing with higher needs.

Social

Everybody, even those who say otherwise, needs human contact and love. They also need to belong to social groups such as families, organisations and gangs.

Esteem

Esteem covers feelings of self-worth and self-reliance. People have a deep-rooted desire for recognition from others in terms of respect, praise and status. The flip side is that people often have low self-esteem or an inferiority complex.

Maslow says because just about everyone in the western world has the bottom three bases covered, the esteem driver lies at the root of most psychological problems. By extension we can see this is the key to many interpersonal relationships in the workplace.

Maslow on self-actualisation

The highest need a person can have is to meet their full potential and maximise their personal development.

Maslow says people generally move up the hierarchy; progressing up the list is the essence of motivation. Once people have enough to eat, they start to look around for physical safety. Once they have esteem they move towards self-actualisation.

On the other hand if something threatens a person’s more basic needs, they will move down the hierarchy to the level necessary to protect that need.

For instance, people trade self-esteem in return for belonging to a social group. They take great risks with personal safety and don’t care about esteem if they face starvation.

Not everyone agrees with Maslow’s hierarchy; it is controversial. Despite the criticisms it makes a great practical tool for managers.

If you are managing someone and you threaten his or her security in some way, you can expect a strong reaction. People go a long way to defend themselves from threats.

On the other side of the ledger, Maslow says once a person has taken care of a particular need on the list, it ceases to be a motivating force and they progress to the next level.

Understanding your psychic contract

John Wareham writes of a psychic contract in his 1991 book, Anatomy of a Great Executive.

Wareham uses psychic contract to describe our subconscious influences.

I met Wareham in Wellington in the early 1990s and we discussed how people could become aware of their psychic contract and use it productively.

A psychic contract is a set of deals we strike with ourselves. We use these deals to define our life goals, how we approach reaching those goals and how we measure success.

Wareham says with the right kind of psychic contract, even ordinary people can do great things. On the other hand, the wrong contract hinders development.

In his book, Wareham goes into depth explaining how we can know our own psychic contracts and how we can reset goals to give ourselves permission to succeed. Knowing your psychic contract is an important part of understanding what drives you.

Here are the five keys:

What he calls the ‘prime parental injunction’ sits at the heart of your subconscious. We go through our lives trying to become the people our parents wanted us to be. Even people who spend their lives trying to become exactly the opposite of what their parents wished are still influenced by this injunction.

Wareham says that three-quarters of people in western societies set out first to equal, then to marginally improve upon the way of life and status level they enjoyed in their childhood home.

This is particularly important for sales people who earn commission. It is common for a salesperson who has earned enough commission to reach their financial comfort level to sit back for rest of the month.

Understanding psychometric tests: A practical guide

_Psychometric testing_ remains controversial, yet it’s common in hiring processes. Human resource managers and recruiters see it as an efficient way of understanding candidates beyond what CVs, interviews and references reveal. While these traditional methods show skills and experience, uncovering personality traits and cultural fit proves more challenging.

The reality of modern testing

Today’s candidates often face a barrage of assessments during the hiring process. Some psychometric tests are automated, with candidates completing them on computers in recruitment offices or even remotely. Others involve paper-based tests supervised by professionals.

The key difference? Without a qualified, experienced professional to interpret results, these tests lose much of their value. The results are complex and proper analysis requires expertise that goes far beyond what automated systems can provide.

My psychometric testing experience

A decade into my career, I encountered this firsthand. After several intense interviews for a senior position, I was asked to complete a comprehensive testing session. It lasted four hours with barely a break.

I started with what appeared to be IQ tests, then moved through logical reasoning exercises. The strangest part was a lengthy exercise where I chose between seemingly random pairs of job titles in order of preference. Some pairings were obvious, others perplexing. The test was clearly designed for an American audience, featuring some job descriptions that, while comprehensible, weren’t familiar to me.

The actual psychometric tests came last. Answering the questions wasn’t difficult – in fact, the tester specifically asked me not to overthink but to trust my first response to each question.

By the end, I was emotionally drained, physically exhausted, thirsty and hungry. After a brief lunch break, I returned for a task-specific Q&A session. A few days later, an industrial psychologist called to discuss the results. Rather than revealing me as an “employment basket case”, the conversation was insightful and positive. He helped me see strengths I hadn’t recognised and suggested career directions I hadn’t considered. As it happens, I got the job.

Understanding what these tests actually measure

Here’s something crucial to understand: you don’t succeed or fail a psychometric test. There are no pass or fail marks. When an employer asks you to take one, they typically want to know if you’re right for a specific role. If you don’t match their needs, they may look for a more suitable opening for you elsewhere.

Some organisations use these tests as a sorting mechanism to make the best use of their employees. The theory is that tests reveal attitudes, beliefs and personality traits. This allows them to place empathetic workers with strong communication skills in customer-facing roles, while keeping more analytical, less social individuals in positions where they’ll thrive without constant interaction.

This approach is controversial. Not everyone agrees psychometric tests have real value. Reducing complex personalities to a handful of key terms is convenient, but it oversimplifies. It can lead to incorrect assumptions about how people react to various circumstances. Additionally, people change – you might get different results taking the same test on different days depending on your mood, stress levels, or recent experiences.

Cheating is pointless (and hard)

While it’s theoretically possible to game a psychometric test to show the personality profile needed for a desirable position, cheating is both difficult and ultimately self-defeating.

Well-designed psychometric tests include subtle cross-references to detect inconsistencies and identify dishonest responses. Testers can recognize when answers aren’t genuine. Showing up as inconsistent or dishonest obviously doesn’t help your case (unless perhaps you’re seeking a career where these traits might somehow be assets). You may simply appear confused or unreliable.

More importantly, cheating defeats the entire purpose. These tests exist to determine whether you’re a good fit for a particular role. Why would you want to trick your way into a position that’s fundamentally unsuited to your actual personality and strengths? Not only would you make yourself unhappy, but you’d likely set yourself up for failure.

Years ago, I interviewed John Wareham, a New Zealand-born recruitment expert who helped develop these tests. He explained that the main trick people learn is to avoid the extremes. Most tests ask you to rate things on a scale of 1 to 5 - if you want to present well, ensure the bulk of your answers cluster around the centre of this range. However, minor alarm bells ring if you fail to select any extreme answers at all. Wareham emphasised that the tests quickly detect dishonesty through cross-referencing, so answering truthfully is your best strategy.

How to get the best results

Since you can’t really “cheat” the test in any meaningful way, focus instead on presenting an accurate picture of who you are at your best:

Before the test:

Get a good night’s sleep. Clear thinking matters.

Relax and calm your nerves. This genuinely isn’t something that will hurt you. You’ll give a more accurate picture of your personality when you’re in a relaxed state of mind.

During the test:

Read the instructions and questions carefully. Reread anything unclear. If the tester says something you don’t understand before starting, ask for clarification.

Make sure you’re physically comfortable before you begin.

Don’t rush. Psychometric tests are rarely timed strictly, so work through questions carefully and consider each answer before responding.

Answer based on how you are at work, not at home or in private life. The company wants to understand you as an employee.

Respond based on how you feel currently, not how you were in the past or hope to be in the future. Organisations want to work with your current personality.

Don’t read too much into individual questions. Single questions don’t have hidden underlying meanings - the subtlety lies in how questions interconnect.

Avoid making too many extreme responses. If you’re marking things on a scale, ensure you have more middle-range answers (2s, 3s, and 4s) than extremes (1s and 5s).

Stay honest and consistent throughout.

After the test:

Ask the tester to discuss the results with you. Even if you don’t get the specific job in question, the test may offer valuable insights into more suitable career paths or aspects of your work style you hadn’t considered.

Two important concerns

Based on my experience and research, I have two lingering concerns about psychometric testing in hiring:

First, despite what professionals claim, people can learn to present themselves favourably without necessarily being dishonest. There’s a difference between outright cheating and understanding how these tests work. This raises questions about whether tests measure actual personality or test-taking sophistication.

Second, there’s a risk that managers use testing to offload decision-making responsibility. External objective measures have value, but they shouldn’t replace human judgment. There’s a temptation to rely solely on printouts and test scores without considering other compelling evidence about a candidate’s suitability.

Useful insight

Psychometric testing, when done properly with qualified professionals interpreting results, can provide useful insights. From my personal experience, I can see merit in establishing objective benchmarks that go beyond the human biases we all carry, even unwittingly. Personality genuinely is crucial when hiring, particularly for senior positions - often more important than specific skills or experience, and as important as aptitude.

The key is approaching these tests with the right mindset: they’re not about passing or failing, but about finding the right fit. Answer honestly, present yourself clearly. And remember that a “bad” result simply means that particular role might not be the best match for who you are - and that’s valuable information for both you and the employer.

Who are the knowledge workers?

Knowledge workers are taking over.

A third of American employees are already knowledge workers. The number is lower in Australia and New Zealand. Yet we’re catching up.

In developed, developing and even some undeveloped countries they are the fastest-growing employment group.

In the developing world, knowledge workers outnumber industrial and agricultural workers. In more advanced countries they outnumber the two groups added together.

America has roughly as many knowledge workers as service industry workers. In most rich countries knowledge work is the most important sector in terms of economic and political clout.

A new idea

The idea that people can earn a living dealing purely with knowledge has only been around for 50 years.

Writer and management expert Peter Drucker is often credited with inventing the term. He first used the term ‘Knowledge Worker’ in his 1959 book “Landmarks of Tomorrow”.

Drucker modestly claims to be only the second person to use the phrase. He says the honour belongs to Fritz Machlup, a Princeton economist.

Drucker popularised the term. He spent 40 years expanding on the original idea, explaining its implications.

Knowledge workers misunderstood

Although the term is widely used and people generally understand what it implies, there is still much misunderstanding about its exact meaning.

One common misconception is that the term applies exclusively to people working in the information technology industry, or to those elsewhere who use products created by IT workers.

While almost all IT workers qualify, they are only a subset.

Anyone who makes a living out of creating, manipulating or spreading knowledge is a knowledge worker.

Broad church

That’s a wide definition. It includes teachers, trainers, university professors and other academics. You can categorise writers, journalists, authors, editors and public relations or communications people as knowledge workers. We’ll put aside for one moment arguments about whether the knowledge created by these people is accurate. Lawyers, scientists and management consultants are all included.

One key difference from other white-collar workers is the level of education and training. Some don’t have a formal tertiary education or high-level training, but they are a minority.

You need a degree, most of the time

As a rule, they have a minimum of a university undergraduate degree. That’s not always the case. Older knowledge workers tend to have fewer formal qualifications than younger ones. That’s partly because higher education wasn’t ubiquitous when they started out — university isn’t the only path to knowledge.

Another reason is that practical experience counts for a lot. The idea here is that each individual possesses their own reservoir of accumulated knowledge they apply in their work.

Compared with other groups of workers, they are well paid. Knowledge workers can belong to unions, but are often not organised in that sense.

This can lead to forms of genteel exploitation. Few knowledge workers get overtime payments. Yet employers expect most to voluntarily work for considerably more than the basic 40 hours a week.

Mobile

On the other hand, knowledge workers are more mobile than industrial workers and can often take their skills elsewhere at the drop of a hat. They often do.

Any employer who abuses knowledge workers’ professionalism is likely to see their most important assets walk out of the door. This applies as much today as it did when there were more jobs around.

Few governments have come to terms with the implications of having a highly mobile, highly educated, knowledge workforce. Many can quickly find a new employer if necessary, most can move freely between countries. Any nation that doesn’t look after its knowledge workforce can expect to lose it.

New Zealand knowledge workers

This applies in New Zealand. We operate a so-called progressive income tax system that, at times, appears deliberately designed to alienate knowledge workers.

The marginal and absolute rates of income tax paid by most New Zealand knowledge professionals are higher than in many competing nations.

From that point of view, Australia looks attractive.

If anything the flow of knowledge workers migrating to more benign economies is accelerating.

Drucker distinguishes between two classes. High-knowledge workers, including professional groups such as doctors and teachers, deal mainly in the realm of the mind, while knowledge technologists work with both their hands and brains in the IT industry, medicine and other areas.

Although both categories are growing, the bulk of growth comes from this second group.

See also: Knowledge work: Reports of its death an exaggeration.

Should touch typing be like learning to drive?

The Stringer family behind Melbourne’s Sunburnt Suburbia web site suggest Australians should be able to touch type:

In Australia, more than 90% of adults have a driver’s licence. To function effectively in the community you need one. As we attempt to become a knowledge-based economy, I think that the majority of Australians should also be able to touch type.

Maybe.

I learnt to touch type on a real typewriter as a trainee journalist long before I ever met a computer keyboard. I’ve found it a useful skill. However, I don’t presume to tell people they should be able to do the same – that’s a decision they can make for themselves.

There are alternatives. If you don’t like typing you can always buy a tablet computer and use a pen to input information. The first generation tablets were unimpressive, but I’ve seen recent models that do a great job of turning electronic script into text. Of course, pen computing is not for everyone. My handwriting isn’t up to scratch – I suspect I’m not alone.

A more high-tech approach is to use voice-recognition software. Like handwriting recognition, voice input has improved greatly in recent years and many people swear by it. The technology is particularly popular with disabled people and those who have developed repetitive strain injuries or similar ailments.

Voice recognition companies claim 99 percent accuracy; in practice it takes a bit of getting used to and a little patience. As an aside, I first saw voice recognition demonstrated on a microcomputer (kids, ask your parents) in 1981. At the time a sales critter confided to me that the technology was just two years away from mainstream adoption. Voice has been just two years away from the mainstream ever since.

The whole idea of touch typing being an essential future skill is built around the assumption that tomorrow’s computers will be like today’s desktops and laptops. I’m not suggesting these are about to disappear, but for many people iPhones, Blackberrys and similar smart phones are replacing conventional computers.

Perhaps texting, Blackberry thumb typing or even picking out words on the iPhone‘s virtual keyboard is the real key to being a future knowledge worker.

Touch typing is a valuable skill that will serve you for some time, but I’m not convinced that having more touch typists is the key to building a knowledge economy. Interesting idea though.

This post was originally published in October 2008.

Unlock your creativity: Getting started with brainstorming

Whatever industry you work in, sooner or later you will need to generate new ideas. Dreaming up new products and services is an important part of any commercial venture. But there’s more to creative thinking than pure innovation. These days careers depend on an ability to conjure up something original.

Even if you work in a stable business where little changes from year to year, eventually you’ll rub up against a problem or challenge that requires you to think outside the square.

Imagination comes naturally to many people, but not everyone has the gift. The good news is that even people who think they lack creativity are capable of coming up with fresh insights — it’s partly a matter of practice, but it also depends on finding clever ways to shed the creativity-hindering baggage.

Brainstorming is the best tool for creative thinking teams

Brainstorming is one of the best tools for doing this. It’s a technique that has often proved its worth over the past 60 years or so and has evolved into an essential workplace discipline. Most of the world’s leading companies use it every day. So do artists, writers, actors and other people in creative professions who need to generate fresh ideas by the truck-load.

Although you can buy software designed to speed or smooth brainstorming, it’s possible to brainstorm without any tools; all you need are two or more active brains, some ground rules and a little imagination.

The first brainstorming sessions took place in the advertising industry more than 60 years ago. In the 1930s, an advertising executive called Alex Osborn found himself becoming increasingly frustrated with the way meetings called to develop advertising strategies often stymied fresh ideas rather than fostering them.

Formal meetings weren’t doing the job

At the time, be-suited executives would troop into a room for a formal business meeting and then carefully work through an agenda. The strict managerial hierarchies of the day meant that junior executives would defer to their seniors; speaking out of turn could be a career-limiting move. Not surprisingly many people were too frightened to speak out so they kept their bright ideas to themselves.

Often, concepts would be discussed in a highly combative way, so that the last man left standing (in those days it was always a man) would get his way. Usually this would be the most senior person in the room or perhaps the person with the most aggressive personality. Alternatively people would come to the meeting with great ideas, but the politics of the meeting saw them work towards a compromise — in the process the ideas would be so diluted that there was little substance left.

Osborn had a master’s degree in philosophy and a great interest in the mechanics of imagination and creativity. He realised that the barriers to inspiration needed to be broken down, so he devised a simple set of rules.

Four rules of brainstorming

The process defined by Osborn’s four rules was known as a “brainstorming session”. His basic set of four ideas remains the core of modern brainstorming today and its application now goes way beyond advertising. You’ll find brainstorming being used in every area of commerce, in government and even in academia.

Sydney-based problem solving facilitator John Sleigh teaches companies how to use brainstorming and also conducts sessions himself. He uses Osborn’s four main rules and adds the requirement of recording all contributions so that they are clearly visible to all participants. He says, “You need a flip chart, a white board or better still, an electronic white board. When I started out in the 1970s we used to clip sheets of butcher’s paper to a table and write ideas there with a marker pen. In some ways the paper flip chart is the most user-friendly brainstorming tool of all.”

When Sleigh runs a brainstorming session he starts by asking participants “what are the issues?” He says, “I just stand there and get people to call things out. People who have done it before have no trouble with this. All the ideas are written on the flip chart or white board so that everyone can see everything.”

Anything goes

The next stage is to get people to think about possible ways of solving the problems; the rule is that anything goes. Sleigh says running a brainstorming session is different from conducting a formal business meeting and people sometimes have difficulty adjusting to the style. It requires a little training, but that shouldn’t take more than an hour. He says once people are freed of convention the ideas flow thick and fast.

If the brainstorming session is specifically geared towards solving a problem, Sleigh gets participants to define success and failure in their own words. He asks them, “What does good look like?” and the answers also go on the flip chart. Then, “What does bad look like?”

All these replies, along with the answers to the earlier questions, are made into one long list of ideas. The second half of the meeting is what he calls the “tidy-up”: a process of sifting through these ideas, imposing order on the elements and looking for improvements.

Diverge then converge

Sleigh says the first part of the brainstorming process is about getting people’s thoughts to diverge; the second part is to make them converge.

It’s possible to conduct a good brainstorming session with just three people, but experts say it is more effective with a larger group of people. If you’re organising a session inside a large organisation, it’s important to get a range of people at different levels and with different responsibilities to take part because you want the subject to be looked at from as many angles as possible.

A relaxed atmosphere is essential. Some organisations have special brainstorming rooms with bean bags or comfy chairs and begin sessions by playing mood music or serving tea and biscuits. You want people to feel that they can say silly things, so one useful technique is to start the session by doing something slightly crazy like giving everyone a funny hat. A more sober but equally effective loosening up might be to start by asking people to describe their favourite pet.

Different styles of brainstorming

There are many different styles of running a brainstorming session. Some leaders ask people to think privately about matters for a set period before switching to a group session. Others go straight to the group.

In some organisations the process is a chaotic free-for-all. In others everyone is asked to contribute to the discussion before someone can speak a second time.

Some managers have tried technical solutions that work somewhat like an online discussion group operating in real-time. There are also idea-generating software packages like Idea Fisher which stimulate free thinking. All of these approaches are valid; brainstorming is not a one-size-fits-all technique.

Knowing when to stop

Perhaps the hardest part of running a brainstorming session lies in knowing when to stop. You need to make sure you generate enough ideas, but it’s good to halt the session when no more new material is forthcoming.

One strategy is to impose a fixed time limit on the meeting and work towards a deadline — this can concentrate minds wonderfully. Half an hour should be enough for most sessions, but you might need a little longer if you have a large group of participants. Most brainstorming sessions wrap up with a list of the better ideas. Depending on your goals this might be the single best suggestion, a top three, top five or even ten items.

Brainstorming.co.uk Be warned: this site is plug ugly (it still has a mid-90s web look and feel). However, it is useful, offering a free brainstorming tutorial, and makes a good jumping-off point for beginners.

Edward de Bono (No longer online) Famous for inventing lateral thinking, Edward de Bono promotes alternatives to traditional thought processes. There’s a wealth of material here, but it primarily exists to sell books and consulting.

Idea mapping is a powerful brainstorming tool for sorting through and organising thoughts. You can use it for something as simple as writing a homework essay.

Top Ten Brainstorming Techniques A list of smart ideas to get your brainstorming sessions cooking.

What’s wrong with brainstorming? A constructive criticism of brainstorming.

The Four Rules of Brainstorming

Your Brain

If your brain was a PC, optimising its performance would be easy. You’d start by backing-up important files, cleaning out the recycle bin and defragging the hard drive.

Then you’d search for unnecessary bits of code swallowing valuable processor cycles. Next you’d check all your important programs and drivers are up to date. After that you’d schedule regular preventative maintenance breaks to stave off problems before they appear. Finally you’d install a decent anti-virus program and a firewall to keep everything safe from harm.

Thankfully, human brains do most of their necessary maintenance work on autopilot. That’s good news because with as many as 100 billion neurons to play with, your brain is considerably more complex than any existing computer and it doesn’t come with much documentation. However, there are things you can do to improve on the autopilot and keep your grey matter ticking over at maximum efficiency.

Get some sleep

The first is to ensure you get enough good quality sleep. Research studies show that even a small amount of sleep loss has a devastating effect on divergent or creative thinking. It takes longer to find key insights and reach decisions. Exactly how much sleep you need depends on your own body, but you should target a minimum of eight hours before any creative work.

Your diet can have a major impact on your ability to think. A well-balanced nutritional diet helps thought processes. Unlike most body cells, brain neurons don’t reproduce so not eating properly can kill your creativity.

Brain neurotransmitters are largely made up of amino acids; you can replenish these by eating eggs, fresh milk, liver, kidneys and cheese. Other good sources are cereals, some kinds of nuts, soybeans and brewers’ yeast. There’s some truth in the old wives’ tale about fish being good for the brain. It contains a chemical called dimethylaminoethanol (DMAE), which is linked to learning, memory and intelligence, and can also increase alertness. Avoid carbohydrates; they tend to cause drowsiness.

Caffeine can help

Caffeine is a sure-fire way to get the brain moving quickly. Research shows people think faster and more clearly after a cup or two of coffee. Be wary of drinking too much; it’ll make you edgy and interfere with sleep.

Exercise and fresh air are great for creative thinkers. This can, but doesn’t necessarily, mean visiting the gym. Many creative workers, journalists included, find inspiration simply by taking a long walk – just walking around is great if you find your creativity is blocked. You may also find it easier to think creatively if you switch off external stimuli.

Lastly, like a knife, your creativity will stay sharp if you use it often, but not so often that it becomes blunt. Train yourself to think creatively in bursts and give yourself rest periods in between.

Indieweb for journalists

There are times when working as a journalist overlaps with the Indieweb movement.

What happened: 2017 to 2026

The ideas sketched here in 2017 largely came to pass, though not always through IndieWeb protocols. The principle—journalists owning their work and distribution—proved correct:

**Independence won:** Substack, Ghost, Microblog and personal newsletters became standard. Journalists learned to build direct reader relationships rather than depending on platform algorithms or legacy publishers.

**Portfolio control matters:** Maintaining your own archive became essential as news organisations collapsed and old URLs disappeared. Journalists who owned their own platforms kept their work accessible.

**The subscription economy:** What the IndieWeb called “owning your content” evolved into sustainable business models where journalists developed direct reader relationships. The 2017 vision was correct: independence from the big tech giants became crucial for journalism sustainability.

The view from 2017

The first and most obvious overlap between journalism practices and the IndieWeb is the idea of having a syndicated work portfolio. If you like, you can create a single source, feed or river of everything written or posted elsewhere.

This means linking back to stories published on mainstream media sites. I want to do this even when those sites don’t reciprocate my links.

At the moment I sometimes write a linking blog post on my site.

Linkrot doesn’t help

One problem with this is the way big newspaper sites change URLs and even drop old content. Keeping links up to date is hard work. Publishers missed opportunities to maintain permanent archives—another reason journalists need control over their own content.

The second Indieweb idea is to somehow consolidate the comments that fill different buckets at places like Facebook, Google+ and Twitter. There are also some on Disqus.

There have been times when there are two or more conversations covering much the same aspects of a story. It would be better if the interested commenters could see what others have to say and interact.

Indieweb central repository

Then there’s my unrealised idea of moving to more of a stream-of-consciousness style of reporting. This is not so much Jack Kerouac style, but more like the daily live blogs you see on sites like The Guardian. I like the idea of writing a post, then updating it as the story evolves. This would be easier to manage with a central repository.

Last and not least, there’s my need as a journalist to own my work outside of the big silos.

I’m not a snob about Facebook or Google, but I am aware their shareholders get the reward for my effort when my work appears there. It won’t happen overnight, but the Indieweb may hold the key to redressing the balance in the future. The subscription economy that emerged proved this concern valid: journalists needed to own their reader relationships, not rent them from the tech giants’ social media services.

There’s a lot to be said for taking back control over how we work with technology.

More on journalism and media: This post is part of ongoing coverage about journalism independence, business models and platform control.

Originally published July 2017. Updated January 2026. Many of these ideas became standard practice as journalists built independent sites.

The Hawthorne effect

Mark Shead at Productivity 501 writes about the Hawthorne effect:

The Hawthorne effect refers to some studies that were done on how training impacts employees’ productivity at work. The studies found that sending someone to training produces employees that work harder. The funny part about it is that you still get the productivity increase even if the training doesn’t teach them how to be better at their jobs. Sending someone to training helps them feel like they are important, like the company is investing in them and they are valuable. Because of this, they work harder.

An explanatory note at the bottom of Shead’s post points out the original tests were to do with changing light levels. You can read Shead’s original story at Hawthorne Effect : Productivity501.

It’s worth reading the Wikipedia entry on the Hawthorne effect. There’s also a good definition of the effect at Donald Clark’s site: The Hawthorne effect.

Clark writes:

The Hawthorne effect – an increase in worker productivity produced by the psychological stimulus of being singled out and made to feel important.

Clark links the effect to work done by Frederick Taylor, who gave birth to the idea of industrial psychology.

My common sense experience as a manager says you should pay attention to workers as a matter of course. Sadly this isn’t obvious to everyone. It certainly wasn’t back in the 1920s and 1930s when these ideas were fresh and new. If the effect is clear among knowledge workers at your workplace, it’s a sign you aren’t managing people correctly.

See also: Taylor’s scientific management doesn’t apply to knowledge work

How to write like an old-time journalist

A blog post, article or other piece of copy is what journalists call a story. Here’s how to write one.

You start a story by telling the reader what it is about. You do this briefly in the headline. Then again in the introduction or intro, which is a stop press paragraph.

Ask yourself: what is the story about?

Sum up the story in your mind in one simple sentence. This is your intro.

Its purpose is to tell the reader what the article is about and draw the reader in. As a rule, readers prefer brief intros.

Write so a reader who only samples your intro still has a basic grasp of your story.

Newspapers teach journalists — on both tabloid and quality papers — to start with a single sentence of between 15 and 21 words. This is what you should strive for, although at times you’ll need to use more words.

As an aside, proper nouns made up of multiple words only count as a single word when you’re calculating the ideal intro length.
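The 15-to-21-word target, with multi-word proper nouns counted as one word, is simple enough to check mechanically. As a rough sketch (the function names, the regex tokenising and the idea of passing in a list of proper nouns are my own assumptions, not any standard editorial tool), a few lines of Python can flag intros that miss the range:

```python
import re

def intro_word_count(intro, proper_nouns=()):
    """Count words in an intro, treating each supplied multi-word
    proper noun (e.g. "New Zealand") as a single word."""
    text = intro
    for noun in proper_nouns:
        # Collapse the proper noun into one underscore-joined token
        # so the word-matching regex counts it once.
        text = text.replace(noun, noun.replace(" ", "_"))
    return len(re.findall(r"[\w'_-]+", text))

def intro_length_ok(intro, proper_nouns=(), low=15, high=21):
    """True when the intro lands inside the 15-21 word target."""
    return low <= intro_word_count(intro, proper_nouns) <= high
```

For example, `intro_word_count("visit New Zealand soon", ("New Zealand",))` counts three words, not four, matching the rule above.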

Your first paragraph can be one sentence or three but keep it short and crisp.

Next comes the how: how did it happen or, more usually in your case, what happens next?

This is background information or explanation.

After the explanation comes amplification. You amplify the point or points following on from the intro.

Make these points one by one and in descending order of importance.

Last, after making all the main points, tie up any loose ends — that is add any extra or background information deemed necessary but of lesser importance.

Originally published March 25th, 2010.

Frederick Herzberg’s Two-Factor Theory

Herzberg’s Two-Factor Theory identifies two sets of workplace factors: motivators and hygiene factors. These are not mirror images of each other; what motivates employees is distinct from what de-motivates them.

Motivational factors belong to an individual. They directly affect performance. Bosses need to pay attention to motivational factors because this is something they can influence or even control.

Being able to tick each motivational factor for everyone on your team is important; missing any of them quickly leads to bad attitudes and negative thinking.

Herzberg’s motivational factors include:

Achievement:

This is the sense of successful conclusion: making a sale, reaching a target or solving a problem. Workers like to feel they do a good job. The sense of achievement is directly related to the size of the challenge. Managers should set achievable goals and acknowledge accomplishments.

Recognition:

Appreciation of a person’s contribution by management or colleagues. It can, but doesn’t necessarily, involve a reward for merit. From a manager’s point of view, it is as simple as saying “thank you”.

Job interest:

The appeal of a particular job. People are more motivated by work that isn’t repetitive or boring.

Responsibility:

Workers need autonomy at work by being allowed to make decisions and being trusted. Many people get real satisfaction from being accountable for the work of others. As a manager you should remember that most employees would be pleased if you delegate important tasks.

Advancement:

Workers need to feel they are going somewhere. Having the opportunity for promotion in either status or responsibility is important, but the prospect of advancement is almost as important as real advancement.

Herzberg called his second group the Hygiene factors. Hygiene factors surround a job.

Companies control hygiene factors at a high level. They should not be confused with organisational culture, but the two are closely related. Hygiene factors won’t necessarily motivate people, and any positive effects are modest or short-term, but if they are absent, workers will be dissatisfied and unmotivated.

Company policy and administration:

Ask yourself, are policies clearly defined? Is there red tape? How efficient is the organisation? Are internal communications effective?

Supervision:

The accessibility, competence, and quality of management impact job satisfaction.

Interpersonal relations:

Positive social interactions, such as casual conversations or shared lunch breaks, improve workplace morale.

Salary:

How a company’s total reward package compares with similar companies. Include factors such as cars, superannuation plans, perks and amount of paid annual leave. If this is not competitive, workers will look elsewhere.

Status:

This is a measure of the status of people within the organisation. They look at their workspace (corner office and privacy rank highly), their job title, a key to the executive washroom, car parking facilities and a company credit card, among other things.

Job Security:

This is not just about whether someone is actually likely to lose their job, but also about how secure they feel in it.

Personal Life:

How does a person’s job affect their life outside of work? Are they expected to work long hours, move to far-flung cities or simply neglect their spouses and children for the sake of corporate goals?

Does the organisation frown on unconventional ways of life even though they have no obvious impact on a person’s work?

Working Conditions:

The physical workplace. The degree of comfort or discomfort has a major effect on satisfaction. Also look at matters like proximity to facilities such as shops, lunch bars and public transport.

More on management and motivation theory

Apparently I’m not a geek

Originally published December 2011. Updated January 2026. After 40+ years in technology journalism, this principle remains central to my work.

Why detachment matters in journalism

The percentage may have changed slightly—technology has seeped deeper into everyone’s lives since 2011—but the core principle hasn’t: maintaining distance from geek culture makes for better technology journalism.

This isn’t about lacking technical knowledge. It’s about perspective. Technology journalists serve readers, not industry insiders. The moment you write primarily for other technology enthusiasts rather than the people who actually use technology in their daily lives and work, you’ve failed your audience.

According to How geeky are you? I’m only 15 per cent geek.

That seems right.

I fail because I don’t like science fiction or any other geeky form of entertainment.

Despite 30 years of writing about technology, geek culture hasn’t rubbed off on me.

I’m not comfortable when I’m with other technology journalists who want to talk about Star Trek or Dungeons and Dragons.

To say these things don’t interest me is an understatement.

We have science fiction books on our shelves at home. Visitors to our house assume they are mine. They are not. They belong to Mrs B. And apart from her reading tastes, she is even less geeky than me.

Computers do not mean geek

Most of the points I scored on the geek test come from work. After all, I’ve spent years writing about computers and technology, so I know the difference between ROM and RAM.

Of course, I have more than one dictionary. It’s a journalist thing – they are tools of my trade. And yes, I confess I correct people’s grammar. Editing has been my job for most of my adult life.

In the past, people have commented on my non-geek status making me the wrong person to edit a newspaper’s computer pages, run a computer magazine or write about technology.

Detached

I disagree. A level of detachment means I can make better rational decisions. I’m less tempted to air my prejudices. It means I write for ordinary people, not geeks. In fact one of the skills I’m most proud of is being able to explain tricky things in plain English.

I’m a journalist first, technology specialist second. I can – and have – written about most subjects.

And anyway, most of my work has been writing for non-geek audiences. My lack of geekiness means I can better serve their needs. This approach proved especially valuable when covering New Zealand’s technology industry. Local companies need journalists who can explain their innovations to potential customers and investors, not just other technologists. Being able to translate technical developments into business and economic terms serves both the industry and the public better than insider jargon ever could.

The same applies when covering telecommunications regulation, business model challenges in media, or the impact of technology on society. These stories require understanding the technology, but they’re fundamentally about people, economics, and social change.

My journalism training taught me to ask “why should readers care?” before “how does this work?” That order matters. Geeks often reverse it.

Journalism first, technology second

This reader-first approach shaped how I’ve covered journalism itself. When publishers struggled with digital transformation, the story wasn’t about the technology—it was about business models, audience relationships and sustainable journalism.

When paywalls and subscriptions became necessary, the challenge wasn’t technical implementation but convincing readers of the value proposition. When ad-blocking threatened publishers, it was fundamentally about the broken relationship between readers, publishers, and advertisers.

Technology enables or constrains these developments, but it’s never the whole story. That’s why detachment from geek culture remains an asset, not a liability.

More on journalism and media: This post is part of ongoing coverage about journalism practice, business models and the craft of technology reporting:

Technology writing - a guide for beginners

Follow a few simple rules and you’ll be able to write decent, readable articles or stories about technology for any audience.

Good technology writing doesn’t come easy. Not at first

Most people can write simple, straightforward text even if they’ve little formal writing experience.

That is the best place to start.

Next you need to learn to put your readers first. Understand what they need to know and the barriers they might face getting to the information.

After that, good technology writing is about understanding your subject matter and clear thinking — then turning your thoughts into words.

If you can do this in a logical way, the shape of your story will lead the reader through the key points.

Step one: Start simple

Start by sticking to basic words and simple sentence structures. Don’t worry if this feels like plodding. You can experiment with language when you feel more confident.

Inexperienced technology writers often share the same faults.

Never worry if geeks tell you your technical writing is too simplistic. They are not the target reader and anyway they probably think they know everything about the subject already.

Hitting the right note

Pitching your copy at the right level is the hardest part of technology writing.

Experienced technology writers know no one ever succeeds by overestimating the reader’s intelligence. They also know no one succeeds by underestimating readers.

Remember that people who are expert in one area of technology may not understand other areas. And a technically literate readership does not give you a licence for sloppy explanations of complex technical matters.

If you find this difficult, imagine you are writing for an intelligent colleague working in another area of your organisation.

Lastly, if you can, always get someone to proofread your copy.

Ask them to point out what doesn’t make sense and to check whether you’ve made any obvious errors. Don’t take offence if they find things that need changing. Your pride will be more wounded if the rest of the world sees your mistakes.

Apply good writing to all your communications

Technology writers spend significant time communicating via email with sources, PR representatives and editors. The same principles of clear, thoughtful writing apply here.

One seemingly small detail matters: avoid starting emails with time-specific greetings like “good morning.” Your message might arrive when it’s not morning, making you look thoughtless. Use greetings that work at any time.

Theory X and Theory Y — looking at motivation

After interviewing managers to find their views and attitudes on work, management theorist Douglas McGregor came up with two models. He called them Theory X and Theory Y.

These were first described in McGregor’s 1960 book, “The Human Side of Enterprise”.

Theory X assumes people dislike work and do what they can to avoid it. This leads to the following:

  1. Because people hate work, bosses have to force, threaten or bribe them before they will work hard enough.

  2. People like being ordered about, they seek security in authority and go out of their way to avoid taking on responsibility.

  3. Money is the main motivating force. Issues to do with personal security come second.

  4. The only creativity most people are able to display is when it comes to avoiding work or finding ways of getting around management edicts.

We need to work, not just for the money

On the other hand Theory Y says people need to work as much as they need to rest or play.

Work is an important part of a person’s psychological growth; many people find it inherently interesting and even enjoy working. This gives rise to four more statements:

  1. People are generally happy to direct themselves towards any acceptable goal or target.

  2. Self-discipline is more effective and, sometimes, more severe than any external direction. Under the right conditions people will seek out and accept responsibility.

  3. Once they have met certain basic needs, people are more likely to be motivated by their internal need to realise their full potential than any base incentive.

  4. Everyone is basically creative and capable of intelligence, but most of the time managers underuse these qualities.

McGregor regards the two theories as basic attitudes. Most managers fall squarely into one camp or the other, but sometimes the theory a manager follows depends on particular circumstances. For example, armed services depend on Theory X; so do many factory managers.

Although his research took place before modern knowledge-based industries developed, McGregor recognised Theory Y style management was better for problem solving. For the most part knowledge workers will be operating along Theory Y lines. However there are some companies and bosses that still subscribe to Theory X.

McGregor believed that if you treat people according to one of these theories, they’d act along the lines expected. In other words, one conclusion of Theory X and Theory Y is if you assume people are lazy, they will be.

Simple writing is good writing - keep it clear and direct

Simple writing is good writing. It is direct, clear and precise. It is unambiguous.

As a writer your goal is to get ideas to your reader.

You want to do this in a way that is fast and accurate.

The best way to do this is by putting as few barriers as possible between your message and your audience.

Forget what you learnt about writing in school

You may have impressed teachers and exam markers with your grasp of obscure long words and clever grammar. In the real world simple, straightforward language works best.

For many would-be writers this is the hardest adjustment to make.

Keeping it simple applies to all types of writing. It applies to every audience.

Think of your readers

Not all your readers are native English speakers. Not all of them are highly educated. And it’s unlikely you’ll impress those who are both with fancy words and cleverness.

Not every reader has intimate knowledge of the subject matter. We all have to begin somewhere. Even experts in one area may be beginners in related areas. And anyway, they don’t want to be challenged all the time.

Apply simple writing to everyday communication

Simple writing principles apply to all your communications, including email.

One common mistake is starting emails with time-specific greetings like “good morning”—these can backfire when your message arrives at the wrong time. Use greetings that work any time of day to avoid looking thoughtless or rude.

📢 If you plan to use an iPad for writing, take a look at A practical guide to writing on the iPad.

Originally published May 2009 | Last reviewed January 2026

Writing for the web in 300 words

All you need to know about web writing in under 300 words. From my 2010 Wordcamp NZ presentation.

  1. Start straight away. Don’t waste time warming up.
  2. Reduce barriers between your ideas and your audience.
  3. Write clearly. Use readily understandable language. Be unambiguous.
  4. Learn grammar. Forget what teachers said about long words making you look smart. It isn’t true.
  5. Instead use simple words, grammar and sentences. It is harder to go wrong.
  6. Go easy on adjectives and adverbs.
  7. Spellcheck.
  8. Try to imagine your reader – an ordinary bloke or woman. Write for that person.
  9. Use ‘be’ verbs sparingly to make your writing more interesting. Use them even less in headlines.
  10. “I would have written a shorter letter, but I did not have the time.” Most people think it was Mark Twain; it was Blaise Pascal, the French mathematician.
  11. Keep sentences short: up to 20 words. A 15-word limit is better.
  12. Keep paragraphs short; usually one to four sentences. Only use more if you need to.
  13. Use plenty of full stops and line breaks. Use lists and bullet points. Be generous with crossheads (secondary headings).
  14. Highlight keywords with bold or italics.
  15. Writing is story telling.
  16. Summarise your story in the headline.
  17. If you write an introduction use it to tell readers what your story is about. Expand on your ideas in the following paragraphs.
  18. Write so you can cut the story at any point yet readers have the maximum information.
  19. Aim for short and crisp. Online readers tire after 200 words and start dropping out at around 300. Keep most stories below this length although you can write longer pieces.

You can find longer explanations of all these points elsewhere on this site.

While all this remains true in 2026, there are good reasons to write more than 300 words. Google favours longer posts and readers are less scared of scrolling down than they were in the past.