The specialist generalist

Over the years I have come to understand that my real aptitude lies in helping to explore and define problems rather than to craft specific and technical solutions. Do this well, as I have at various stages, and you’ll find yourself in positions where the latitude to sit and think is extended significantly and can in fact become the accepted reason for your continued employment and contribution. At that point you have moved beyond being simply a practical generalist and have started to engage with the role of the specialist generalist. This is someone who, rather than simply bringing together a variety of specialties, works in the world of the complex and the unknown, to define and appreciate problems and then to architect the shape of possible approaches and solutions. I must say I am enjoying the freedom of Charles 6.0 to suit myself as to those problem domains and chaotic edges in which as a specialist generalist I choose to dwell.

Since ceasing full-time work and starting my Charles 6.0 transformation, I have met a number of people (some significantly more advanced in years than I am) who continue employment on a consulting or contracting basis. It has sometimes been suggested that this is something I might like to do. Quite apart from being fully occupied without having any work-like obligations, one reflection on this has been that these people normally have a highly specific and singular expertise that is valued in the marketplace, such as database programming, construction engineering or town planning. The world of the consultant contractor is the world of the dedicated specialist. Quite reasonably, most clients are looking for someone to undertake a specific task with well-defined outcomes – that way they know they will get at least an approximation of what they are paying for.

At various times during my working life I have worked as a self-employed consultant/contractor. To be completely frank, I’ve never been all that good at it. While I easily discharged the usually IT-related tasks (such as application coding or database design) entrusted to me by various clients, I generally wanted to do more, and I often found the focused, repetitive work they wanted me to specialise in frustrating and somewhat unfulfilling – once I’ve done something, I generally want to solve a different problem or acquire a fresh skill. The basic problem is that I’m interested in too many things.

Over the years I have come to understand that my real aptitude lies in helping to explore and define problems rather than to craft specific and technical solutions.  However, usually people either feel they have a good handle on what needs doing or they lack the trust necessary to commission someone else to explore the problem space.

By and large there is also an inclination to ‘rush to solution’ – there is little appreciation of the art and skill of sitting with a problem long enough to understand its true demands and dimensions, which quite frequently differ from the immediately presenting issues. Newsflash: that is not a proposition easily sold into a competitive marketplace – there are not many clients willing to pay someone to sit with a problem; they want problems solved – ASAP!

I actually found the best place to practice that particular art is as a full-time employee – oftentimes you can layer the necessary time spent sitting and thinking in among all the busy work that employers seem delighted to visit upon their workers. It is here you can cultivate the position of the generalist employee, easily deployed to various tasks but sometimes lampooned as the ‘jack of all trades that is master of none’. However, as the Wikipedia entry about that saying notes, such an individual may be a master of integration, knowing enough from many learned trades and skills to be able to bring the disciplines together in a practical manner – what I would call a practical generalist.

Do this well, as I have at various stages, and you’ll find yourself in positions where the latitude to sit and think is extended significantly and can in fact become the accepted reason for your continued employment and contribution. At that point you have moved beyond being simply a practical generalist and have started to engage with the role of the specialist generalist. This is someone who, rather than simply bringing together a variety of specialties, works in the world of the complex and the unknown, to define and appreciate problems and then to architect the shape of possible approaches and solutions.

One conceptual tool I have found useful to frame complexity in this context is what is commonly known as the ‘Stacey diagram’, named after the British organizational theorist and Professor of Management Ralph Douglas Stacey. It has frequently been adapted by other writers – as Wikipedia notes, often in ways not consistent with Stacey’s own thinking, to the point that he apparently ‘dropped the diagram and now argues against its use’. I am as guilty of appropriating and extending his original thinking as anyone! But I find it incredibly useful as a framework for analysis and thought, and so I have sketched my own take on it, as illustrated here.

[Sketch: my take on the Stacey diagram]

There are two axes to the diagram – Uncertainty and Disagreement:

  • The horizontal x-axis is Uncertainty. An issue or decision is close to certainty when cause-and-effect linkages can be determined. This is usually the case when a very similar issue or decision has arisen in the past; past experience can then be used to predict the outcome with a good degree of certainty. The other end of the continuum is ‘far from certainty’, where the situation is unique or at least new to the decision makers. The cause-and-effect linkages are not clear, and past experience is not a good method for predicting outcomes.
  • The vertical y-axis is Disagreement. This measures the level of agreement about an issue or decision within the group, team or organisation.  The degree of agreement on what should be done is an important factor in determining success.

I have found a very useful and succinct exploration of the Stacey matrix in relation to the art of management and leadership on this GP training resource archive. It maps various forms of decision making onto the matrix: Technical rational in the ‘simple’ region, close to certainty and close to agreement – in terms of this blog, the place for the specialist; Political for the area with a great deal of certainty about how outcomes are created but high levels of disagreement about which outcomes are desirable; and Judgmental for the opposite set of issues, with a high level of agreement but not much certainty about the cause-and-effect linkages that create the desired outcomes.

Political and Judgmental for my purposes here are the realm of the ‘practical generalist’.

And then there is the Complexity zone, which lies between these regions of traditional management approaches and chaos, and is the natural home of the specialist generalist.
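To make the framing concrete, here is a minimal sketch in Python of how I read the matrix. The zone boundaries are purely illustrative assumptions of my own – nothing here comes from Stacey’s original work:

```python
# A minimal sketch of my take on the Stacey diagram: classify an issue by
# where it sits on the two axes. The numeric thresholds are illustrative
# assumptions, not anything from Stacey's original work.

def stacey_zone(uncertainty: float, disagreement: float) -> str:
    """Both inputs on a 0-1 scale: 0 = close to certainty/agreement,
    1 = far from certainty/agreement."""
    if uncertainty > 0.8 and disagreement > 0.8:
        return "Chaos"
    if uncertainty < 0.3 and disagreement < 0.3:
        return "Simple: technical-rational decisions (the specialist)"
    if uncertainty < 0.3:
        return "Political: certain means, contested ends (practical generalist)"
    if disagreement < 0.3:
        return "Judgmental: agreed ends, uncertain means (practical generalist)"
    return "Complexity: the edge of chaos (the specialist generalist)"

print(stacey_zone(0.6, 0.6))  # -> Complexity: the edge of chaos ...
```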

[Sketch: decision-making approaches mapped onto the Stacey matrix]

A few observations on what is required to work this close to the edge of chaos – for it to be a ‘zone of opportunity’ …

  • Be prepared to have absolutely no idea what you’re doing much of the time! The qualification is the ability to ascertain quickly what needs to be known and to acquire that knowledge rapidly, rather than having a stored repertoire of specialist knowledge to hand.
  • Work on the basis of principles rather than rules. I like this recent post I found on LinkedIn – ‘Burn Your Rule Book and Unlock the Power of Principles’, which observed “Principles, unlike rules, give people something unshakable to hold onto yet also the freedom to take independent decisions and actions to move toward a shared objective. Principles are directional, whereas rules are directive.” But a specialist generalist needs to be prepared for uncertainty even here: for paradigm shifts in the set of principles to be applied in a given space, and for making room for innovation and novel principles to emerge.
  • Be a systems thinker – I like the following illustration of the Tools of a System Thinker (attached to a tweet by @SYDIC_ITALIA, Chapter Italiano della System Dynamics Society Internazionale – no further reference to acknowledge). However, a specialist generalist must be an open-ended systems thinker, sensitive to emergent systems and to proto-systems at the edge of chaos. You cannot insist on systems at all costs, but need to utilise the insights systems thinking can generate. Be a network systems thinker: value the connections in the models you will perceive and generate, as well as utilising networks of skill and knowledge around the problem space.

[Image: Tools of a System Thinker]

It took me a long time to recognize and name myself as a ‘specialist’ generalist. It is a demanding role, one that is difficult to sell and articulate, but one which can deliver dividends with multiplier effects well beyond the contributions of specialists and practical generalists, since it is the role that seeks innovation, requires agility and rewards resilience. That said, in the end, with respect to my specialist computer-related skills, I decided to employ my abilities to my own ends rather than to try to meet the often poorly articulated and contradictory needs of clients, be they internal or external. I must say I am similarly enjoying the freedom of Charles 6.0 to suit myself as to those problem domains and chaotic edges in which as a specialist generalist I choose to dwell.


AI – where might we be going?

This blog is based on my observation of AI since I first came across the idea working at the Royal Blind Society in the early 80s – although what AI was then and what it is now are very different beasts. Thinking about developments for the next decade, I settled on the term ‘Sentience’, which deliberately avoids the term AI (although as a loose umbrella term it can be read in), to choose a word reflecting a more modest level of machine capability. This ‘sub-intelligence’ if you like is conceivable from the intersection of a number of network and ICT trends / technologies emerging by the start of the decade (i.e. 2020). It seems to me that ‘sentience’ might best be described in terms of being surrounded by, embedded in, environments that are in some way aware of the individual and their context – various relationships to things, information and other people. It will not be complete but its emergence seems likely to be a dominant theme for the next decade, a logical inheritor of the consequences from digitalisation, convergence and then network developments, with the same kind of wide ramifications for culture and society.

I have been interested in the interaction of humans with technology since my initial university studies in anthropology and sociology.  We are not defined by our tools, but our tools have functioned as an extension of ourselves and as facilitators of interaction with our environment. Over the past millennia they have produced an accelerating transformation of the shape and pace of human society.

This post is shaping up to be a two-part blog: the first exploring my thoughts and experience with Artificial Intelligence (AI) technology – broadly defined – and the second, next week, looking at the discussion about the implications of this technology for the world and future of work.

I first came across AI when I was working at the Royal Blind Society (RBS) in the early 80s, although what AI was then and what it is now are very different beasts.  At the time I was completing my Master of Commerce, in which I had pursued my interest in technology in various topics ranging from Information Systems Design to Industrial Relations.  It was in the latter that I engaged with the ideas of sociologist Daniel Bell and the notion of the ‘post-industrial society’.


It is interesting that the term ‘convergence’ had been much discussed in the seventies. It was a product of the Cold War – the idea that industrial economies would converge in their structure and organisation and that essentially Russia (then the USSR) would come to resemble the US and Europe. There is a whole thesis waiting to be explored in figuring out what happened to that idea in the vortex of history – too much to go into here.

The key learning for me was the complexity and interrelatedness of technological innovation and change with economic and social factors such as:

  • Factors of production
  • Technological interdependencies and linkages
  • Organizational structures
  • Sectional, regional and individual distribution of income and wealth
  • International interdependencies
  • Public and private demand

I applied this learning at the RBS when researching the impact of technological change on the employment context and prospects for visually impaired people. It was a time of considerable technological excitement as the personal computer began to penetrate the mass market – a signature moment came when TIME Magazine named the computer its ‘Machine of the Year’ in January 1983.


One element of technological change which played into these investigations was machine vision: while wildly futuristic at the time, it was also becoming almost imaginable, and at the apparent rate of change and innovation it seemed possible in a foreseeable future. It turned out that machine vision, particularly ‘in the wild’, was actually much harder than might have been apparent, and is something that is only now (2017) starting to find widespread use in things like self-driving vehicles – and so far as I know it has yet to find practical application in the everyday lives of visually impaired people.

However this sparked my general interest in the whole field of computers mimicking or emulating human cognition or perception. Expert systems were an area of market enthusiasm, one that I found particularly interesting. I actually crafted and experimented with my own primitive expert system shell, written from scratch in Turbo Pascal, which involved delving into the arcane and technical worlds of generative grammars and token parsing, as well as algorithmic inference processing.
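My shell is long gone, but the heart of such systems was a forward-chaining loop over if–then rules. Here is a minimal sketch in Python in the spirit of that era – the rules and facts are invented purely for illustration:

```python
# A minimal forward-chaining inference engine in the spirit of an
# eighties expert system shell (the original was Turbo Pascal; this
# sketch and its example rules are purely illustrative).

RULES = [
    # (antecedents, consequent): if all antecedents are known facts,
    # the consequent can be inferred.
    ({"screen reader available", "text-based interface"}, "job is accessible"),
    ({"job is accessible", "training provided"}, "placement recommended"),
]

def infer(facts):
    """Repeatedly apply rules until no new facts emerge."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

print(infer({"screen reader available", "text-based interface",
             "training provided"}))
```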

But ultimately, both for myself and the world at large, expert systems proved to be a dead end. This was primarily due to the issue of knowledge capture – that is, the sheer effort required to manually encode knowledge into a decision-tree-type language. The other limiting factor was the limited processing power and memory storage available on the computers of the time.

The general interest in AI peaked by the end of the decade – the cover of this Economist 1992 special feature would not have been out of place today, but the discussion is much more about the limitations and cumbersome nature of the technology than grand horizons.

[Image: cover of the Economist’s 1992 special feature on AI]

Increasingly the view came to be that any particularly advanced or clever piece of coding was seen as intelligent while it was a novelty, but it rapidly became ‘part of the furniture’ and thence part of the ‘dumb’ and rather pedestrian reality of IT which came to dominate our working lives.

During the course of the 90s the word ‘convergence’ at least in tech circles changed and came to be much-discussed in terms of the coming together of the traditional silo platforms of broadcasting, telecommunications and print. Pervasive digitalisation  broke the legacy nexus between the shape of content and the container which carried it – a voice call was no longer solely defined by being carried on a plain old Bakelite telephone network; a TV show no longer solely by arriving via a transmission tower and home receiver (the same for radio shows); music spread rapidly beyond the domains of the vinyl record, compact cassette and CD – it got ‘shared’ online; and the Internet carried news much further and faster than a newspaper.  This meant that commerce and regulation constructed on the premise that content could be priced and controlled by how it was delivered increasingly lost its force, both in logic and in practice.

Then over the first decade of the 21st century (the ‘noughties’), IP-based networks and then social networks came to play an ever more important role.  This has meant content became non-linear, interlinked and ‘uncontained’ while people increasingly expected to connect and communicate seamlessly – anywhere, anyhow, anytime. Entire new and massively successful network businesses emerged in the second half of the decade – Google and Facebook to name the most obvious.

‘Silos’ was the convenient way to describe the pre-convergence arrangements, and ‘Layers’ was an important alternative way to look at how the technological environment was changing – a way to describe the actuality of what was called convergence. Layers had been in common technical use for a decade or two before this, but it was around this time that the general utility of the concept became apparent, since it is native to the way in which networks are constructed and the Internet works.

As the noughties wore on, it also became apparent that ‘layers’ as such could not ultimately and successfully grapple with all the developments in the marketplace. The ‘bright lines’ between layers are blurring under the impact of virtualisation and software emulation. An example of virtualisation is the way in which several physical computer servers can emulate a single large (virtual) computer, OR a single large physical computer can emulate several (virtual) computer servers. This has been extended beyond the enterprise and is essentially the basis for cloud computing – the customer buys the computing and storage they need as virtual resources from the supplier, who takes care of the physical requirements. Multiple, inter-networked scale-free networks which can configure to emulate many other network forms better explain the complexity and rapid adaptability of the market in the current decade, whatever we decide to call it (the ‘tweenies’?).
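To make the second case concrete, here is a minimal sketch of a single large physical host being carved into several virtual servers – essentially what a cloud customer buys. The names and capacities are invented for illustration:

```python
# A minimal sketch of one direction of virtualisation: a single large
# physical host carved into several virtual servers. Names and
# capacities are invented for illustration.
from dataclasses import dataclass

@dataclass
class Host:
    cpus: int
    ram_gb: int

@dataclass
class VirtualServer:
    name: str
    cpus: int
    ram_gb: int

def provision(host, requests):
    """Grant each request while the host still has spare capacity."""
    granted = []
    free_cpus, free_ram = host.cpus, host.ram_gb
    for vm in requests:
        if vm.cpus <= free_cpus and vm.ram_gb <= free_ram:
            granted.append(vm)
            free_cpus -= vm.cpus
            free_ram -= vm.ram_gb
    return granted

host = Host(cpus=64, ram_gb=512)
asks = [VirtualServer("web", 8, 32), VirtualServer("db", 16, 128),
        VirtualServer("batch", 48, 256)]
print([vm.name for vm in provision(host, asks)])  # -> ['web', 'db']
```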

So the term ‘silo’ was useful shorthand to describe the pre-nineties technological environment, ‘convergence’ summarized the nineties, ‘layers’ was useful for the noughties and ‘networks’ is perhaps most apt for our current decade, which as you may have noticed, is drawing to a close.

This ‘progression’ is reflected in the movement in the discussion of the technological environment from ‘convergence’ to the ‘networked society’ and ‘connected life’. This shift does not suggest that the transition to the ‘networked society’ is complete, but rather that the concept of the ‘network’ better describes and encapsulates the dominant movement and theme at work and influencing society during the decade. Having remarked on this progression, the obvious question to ask is: what is likely to be the concept that fulfils this role in another decade’s time?

My stab at it a few years ago was ‘Sentience’.  I was deliberately avoiding the term AI (although as a loose umbrella term it can be read in) and chose a word that reflected a more modest level of machine capability. This ‘sub-intelligence’ if you like is conceivable from the intersection of a number of network and ICT trends / technologies emerging by the start of the decade (i.e. 2020).  It seems to me that ‘sentience’ might best be described in terms of being surrounded by, embedded in, environments that are in some way aware of the individual and their context – various relationships  to things, information and other people.  It will not be complete but its emergence seems likely to be a dominant theme for the next decade, a logical inheritor of the consequences from digitalisation, convergence and then network developments, with the same kind of wide ramifications for culture and society.

What is different now, in the contemporary explosion of interest and practical utilisation of AI, is both the remorseless contribution of Moore’s law and the breakthrough in the algorithmic understanding of machine learning and its application in what is called deep learning. This is the technique which led to the victory by AlphaGo (the Google DeepMind program) when it played the human Go master Lee Sedol. It is also evident in everyday examples ranging from face recognition and language translation to predictive text and enhanced search algorithms – things that in the eighties would have been dubbed AI are everywhere!

I am not particularly married to the precise term ‘sentience’ – numerous others exist. For example, Shivon Zilis, an investor at Bloomberg Beta, surveyed every artificial intelligence, machine learning, or data-related startup she could find (her list had 2,529 of them, to be exact). She addressed the labelling issue by using “machine intelligence” to describe how “Computers are learning to think, read, and write. They’re also picking up human sensory function, with the ability to see and hear (arguably to touch, taste, and smell, though those have been of a lesser focus) … cutting across a vast array of problem types (from classification and clustering to natural language processing and computer vision) and methods (from support vector machines to deep belief networks).”

She noted that:

I would have preferred to avoid a different label but when I tried either “artificial intelligence” or “machine learning” both proved to be too narrow: when I called it “artificial intelligence” too many people were distracted by whether certain companies were “true AI,” and when I called it “machine learning,” many thought I wasn’t doing justice to the more “AI-esque” like the various flavors of deep learning. People have immediately grasped “machine intelligence” so here we are.

And I landed on ‘sentience’ – it is important to note that it does not indicate an ‘end state’ but rather flags a way to discuss a possible dominant theme of its decade (say 2020-2030). Another theme will arise and it is relevant to consider what sentience would not describe: to establish the boundary conditions for the concept and think about what may remain ‘undone’ by 2030-ish.  Thinking beyond that boundary may in turn give clues about the shape of the decade and those to follow … noting that such a shape is impossible to discern beyond broad conjecture.

One direction for such conjecture might be about the emergence of ‘machine autonomy’. It can be useful (in terms of imagined scenarios) although increasingly dangerous (due to the temptations and risks of predictive hubris) to speculate even beyond the rise of autonomy to further phases, perhaps the realization of fully conscious artificial intelligence, perhaps the emergence of essentially incomprehensible ‘alien’ machine-based intelligence:

  • 2030s – ‘machine autonomy’?
  • 2040s – AI ‘awareness’?
  • 2050s – ‘Alien’ intelligence?

It occurs to me that perhaps this is where the fruits of ‘convergence’ as mentioned in this blog have come full circle. It seems that the developments which can loosely be pulled together under the umbrella term AI are genuinely flagging the arrival of ‘post-industrial’ society – that the world Daniel Bell conjured with is emerging in front of our eyes, even if we do not (or cannot) accurately perceive it. However, the shapes we can discern are certainly the source of some anxiety, particularly as related to the world of work – and that will be the topic for my next blog post …


Life cycle analysis

The life cycle concept is a useful tool for systems analysis and design thinking – by adding the fundamental idea of renewal to what might otherwise be seen as linear processes. This then captures a more dynamic view of systems, accepts change, favours flexibility and explicitly accommodates feedback into design. The life cycle idea is also closer to the general dynamic of social relationships and can have a powerful personal resonance, which can help bring design closer to people. However, a pace of change and a rate of innovation that simply will not slow may have implications for the life cycle of humanity itself, connecting the current shape of change with the deep development of human language, technology and society. Maybe we are coming to the inflection point where we might contemplate the historic end of the ‘life cycle’ over the next couple of decades, at least as far as human society is concerned.

This was a tricky blog to get started – the topic opened so many doors to dense topics it was hard to get the sense of a thread to join them.  So I went for a run instead and after the quiet reflection that such activity induces I had the beginning and the end … now all I need to do is navigate between them.  I hope you can stay with me for the ride.

I first came across the idea of the ‘life cycle’ as a formal tool for analysis and design in the early eighties, when I picked up managing the new DEC VAX computer system while working at the Royal Blind Society, and subsequently broadened my Master of Commerce degree studies at UNSW to encompass Information Systems Management (ISM).

I took to the concept immediately, firstly because I am something of a natural systems thinker, and secondly perhaps because of my engagement with biology as a subject at high school – it was my enthusiasm for social biology that drove my initial interest in studying sociology at university.  In the early seventies sociology was a new kid on the block at NZ universities – in fact it wasn’t available to first year students when I commenced at Auckland, dictating an initial stint with Anthropology and Psychology instead.  But I digress …

I explored the lifecycle topic quite extensively in an ISM essay from around 1986, which I unearthed in that front room filing cabinet. I noted:

A ‘System Lifecycle’ notion is often used to describe the development, planned or unplanned, of information systems based on computing tools. Although “there are many different methods for representing the lifecycle … all contain essentially the same components” (7, P.13) and these components can be viewed in a simple linear sequence. But when generalizing to the information systems activities of an organization or sizeable organizational unit, a more sophisticated understanding is necessary.

For organizations in the ‘real world’, the stability in simple linear models of change does not exist. A simple linear model would ignore the fundamental characteristic of information systems that they age and wear out like most other assets, and renewal must be part of the planning process. Therefore, as a first complication, we must accept “the complete life cycle of a system, from its initial conception to its ultimate disposal.”

An even broader sense of discontinuity to further complicate planning scenarios is pointed to by Buss when he suggests that “Complete uniformity across all IS projects is likely to be impossible because organizations will be at different stages in their use of … various technologies.”

I have chosen those paragraphs because they continue to ring true for the task of managing information technology today (the language and technology have changed; the challenges remain almost exactly the same!). They also capture the essence of how the life cycle concept contributes to analysis and design – by adding the fundamental idea of renewal to what might otherwise be seen as linear processes. This then captures a more dynamic view of systems, accepts change, favours flexibility and explicitly accommodates feedback into design.

The life cycle idea is also closer to the general dynamic of social relationships and can have a powerful personal resonance, which can help bring design closer to people. This was exemplified for me as I wrote the words to celebrate my eldest son’s marriage a few years ago, where I commented on the family pattern of ‘building up, letting go and welcoming in’ – a pattern that has been continued and confirmed by the most welcome birth of our first grandchild. I have appended the notes for that brief speech, which to my surprise was the subject of a number of favourable remarks, at the end of the blog (lightly edited to protect privacy).


I found the idea of a ‘life cycle’ to be useful well outside the narrow world of IT systems.  For example I used it as a consultant in the late nineties to create a model of Neighbour Aid service delivery, identifying possible benchmark events for these Services. That model was built around the various activities and processes of groups of stakeholders (community management committee, service coordination, clients, and volunteers), reflecting what might be called their ‘life cycle’ in the organisation. The model then neatly provided a starting point to detail functions which Neighbour Aid services might usefully benchmark between themselves.

Single Code Customer Life Cycle

The concept then found applicability in my work as a consumer advocate at CHOICE in the noughties. In 2003 I advocated that a Customer Life Cycle perspective would provide a useful structure for a single Telecommunications Consumer Protection Code. The code surfaced almost a decade later and, rather than explicitly using that framework, assembled various existing codes into chapters – thereby inevitably covering a number of customer life cycle events, but without that overarching logic.

I also used the idea to describe how advocacy worked, which found utility in the work of a consumer focus group convened by the Australian Communications Authority in 2004.  It had the objective:  “To improve the effectiveness of consumer input and influence to the regulation and governance of the communications industry.”

CDC Representational Cycle

We gave ourselves the label Consumer Driven Communications, and we established a powerful logic in drafting our work into the ‘Strategies for Better Representation’ Issues Paper by combining what we dubbed (and rather crudely illustrated) the Representational Cycle with our version of the Regulatory Pyramid (a whole other discussion, probably for a further blog post sometime).

CDC Reg Pyramid / CDC Matrix

This produced a matrix of 36 topics which extensively covered the field of consumer engagement with the telecommunications industry and its governance – a little too comprehensive for the appetites of some, in the end, perhaps. The group produced an extensive discussion document and then generated a final report with numerous recommendations, all of which pretty much disappeared under the tides of history as the Communications Authority merged with the Broadcasting Authority to produce the ACMA in 2005. Digital archaeology on the project is difficult – Google will only unearth scattered, mostly cached results. Such is life, as many famous advocates have said.

Nevertheless, as I hope I have demonstrated, the ‘life cycle’ is a powerful and persuasive tool – it doesn’t fit every situation, but when it does apply it can deliver coherence and insight – both useful for analysis and design.

There is a distinct sense today of technological acceleration, and certainly in the IT world we seem to be seeing ever shorter life cycles, as we move from products and services dependent on hardware to those defined by software, and now to offerings crafted from data analysis by machine learning and AI.

It certainly seems clear that the pace of change and the rate of innovation will not slow: one group with a real-world interest in understanding technological change (the Office of the [US] Secretary of Defense’s Rapid Reaction Technology Office NeXTech project) suggested that “… in the period the team is actually supposed to be planning for, the strategic horizon of the next 25 years, we will see technologies literally one billion times more powerful than today.”

Some draw a conclusion relevant to the life cycle of humanity itself, connecting the current shape of change with the deep development of human language, technology and society. This line of thinking is illustrated by a reviewer’s summary of the thesis in Sapiens: A Brief History of Humankind by Yuval Noah Harari:

For the first half of our existence we potter along unremarkably; then we undergo a series of revolutions. First, the “cognitive” revolution: about 70,000 years ago, we start to behave in far more ingenious ways than before, for reasons that are still obscure, and we spread rapidly across the planet. About 11,000 years ago we enter on the agricultural revolution, converting in increasing numbers from foraging (hunting and gathering) to farming. The “scientific revolution” begins about 500 years ago. It triggers the industrial revolution, about 250 years ago, which triggers in turn the information revolution, about 50 years ago, which triggers the biotechnological revolution, which is still wet behind the ears. Harari suspects that the biotechnological revolution signals the end of sapiens: we will be replaced by bioengineered post-humans, “amortal” cyborgs, capable of living forever.

Perhaps this is, as the reviewer remarks, “exaggeration and sensationalism”: but maybe we are coming to the inflection point where we can contemplate the historic end of the ‘life cycle’ over the next couple of decades, at least as far as human society is concerned.

Do you agree?


Meanwhile on a more immediate & personal note, celebrating my son’s wedding I said:

I have often said that if I had known how great it was

to have kids, I would have started a lot earlier …

but of course then I wouldn’t have been able to

do it with my wife …

and it wouldn’t have been anywhere near as

great or as much fun.

I have personally learnt heaps,

and grown a lot, from being a parent.

By and large I reckon I have got more out of the

 parenting deal than my kids have

… so I wouldn’t have missed it for the world.

As time has gone on, one of the things I have learned

is that the toughest job of parenting is not:

The missed sleep and 24/7 demands of the early years;

Nor the school concerts and soccer matches of the middle years;

Nor the taxi / linen / reserve banker services of later years.

No … those nurturing tasks are comparatively easy … because,

although they can be emotionally demanding

and sometimes physically draining,

you are in control.

I think the truly tough job of parenting is …

the letting go,

acknowledging you are no longer in control.

This was brought home to me when J***** set off on

his first long solo drive.

He was taking himself and his brother T*****

to a St John’s training camp in the Blue Mountains

…  all I (we) could do was wave goodbye and then

trust that he (and T*****) would arrive in one piece.

However what we were trusting was not good luck –

we were trusting that J***** had learnt well

and that he would navigate safely and

independently to his destination.

And of course he did.

And he has been travelling well ever since, working hard,

making good choices and getting good results.

And today J***** is continuing his life’s journey

joining with J##### to build what will be the most

important asset they can possess between them

– a happy and productive partnership.

An old friend once remarked that the investment

he most valued was his marriage – something I have

always remembered –

I always say I have 3 super funds to maintain:

Firstly – my marriage;

Secondly – my fitness / health; and

Thirdly and only then – any actual super account dollars

–  because without the first two,

money alone has little meaning or usefulness.

Growing value in a partnership is about

the opposite of letting go –

it means letting someone else in,

trusting, sharing and learning how to fit together.

What I have found to be quite miraculous about

creating a family is how that unit can grow

and extend.

I remember that when we were expecting T*****

(two and a half years after J*****)

 I expressed some concern to my wife that

I loved J***** so much

I wondered how I could love

another child as much.

What I discovered was that the envelope of

family love extends easily and

T***** immediately had a huge place in our hearts.

So families are about welcoming in

as well as being about letting go.

While today marks for us a continuation

of letting J***** go,

we are also happy and proud

to welcome J##### to our family,

and to the extended family,

who over the last decades

have extended such a warm welcome

to me – one for which I am very grateful.

So, welcome J##### –

 there is always room for one more.

We wish you and J***** all the best

on your journey together,

and as you craft your own pattern of:

building up;

letting go; and

welcoming in.


Fake news anyone?

Fake news is very much in the news recently – the keynote presentation on the topic by Professor Jeff Jarvis at the launch of the new Centre for Media Transition at UTS last week got me thinking about the current erosion of ‘trust’, the shift from ‘siloed’ to ‘networked’ communications and media, and how a viable business model for news media is essential to democracy, which in turn is essential to an innovative and adaptable economy and society.

Last week I went along to the launch of the new Centre for Media Transition at the University of Technology Sydney (UTS).   The Centre has the very useful goal of helping us to understand key areas of current media evolution and how new technologies and digital transition can be harnessed – to develop local media and to enhance the role of journalism in democratic, civil society.

As well as reconnecting with various colleagues from the ACMA and other networks, I very much enjoyed the keynote presentation by Professor Jeff Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at City University New York’s Graduate School of Journalism. He chose as his topic ‘Fake news’, which he then, with a very sensible sense of irony, denounced as a bogus topic. The real topic, he suggested, was the current erosion of – and urgent need to restore – ‘trust’: trust in ‘facts’ as a basis for policy action; in our ability to conduct civil community discourse; in political and other institutions; and in news media.

One observation I would offer on the issue of so-called ‘fake news’ is that there has been a long-established practice in public relations of ‘slanted’ if not ‘fake’ media stories – although it would seem that the velocity of less-than-reliable news has sped up with the rest of the news cycle. Jarvis raised dramatic but not alarmist concerns about the ‘weaponisation’ of information manipulation and the ability of various actors to leverage the media tools now available to foster polarisation and attack the ‘truth’ through the scale and speed of communications.

When pondering the trustworthiness of news, I remembered the notion that traditionally journalism has not been rated as high as many other professions in terms of trustworthiness.  I hunted out the latest Roy Morgan survey on the image of various professions, conducted in May 2017.  This finds only 20% of Australians rate Newspaper Journalists ‘very high’ or ‘high’ for ethics and honesty with 17% so rating TV Reporters.  However, looking at the time series helpfully provided by Roy Morgan, it is worth noting this is actually an all-time high for newspaper journalists on a rising trend, and up from 12% in 1976. TV reporters are shown to be reasonably stable around the mid-teens since 1988 when first measured.

This suggests to me perhaps some support for the avowed optimism Jarvis offered, with strategies and counsel about using traditional and new journalistic practice to counter the attacks on trust, to build news literacy, resurrect civility and encourage responsible sharing. What particularly struck a chord with me was his stress on the need for journalism to develop as an audience-centric service.

In my own thinking about media and communications futures I have found the application of network thinking and analysis to be very useful. The world of communications has moved over the last couple of decades from one of massive ‘silos’ such as TV stations and printing plants to one in which the functions of those silos have been spread out across wide and varied networks, from the electronic hardware of the Internet to the software-based landscape of social media. This has been, to use an often misunderstood and sometimes overused term, a ‘paradigm shift’, which has shaken business models and re-arranged social structures.

Under the ‘silo’ paradigm, agents such as journalists and regulators could see themselves as standing ‘outside’ the silos, but they are now effectively participants, enmeshed in the networks of the new paradigm. And this is where, as I understand it, Jarvis is going with his thinking and teaching: forget about so-called ‘objective’ reporting and engage meaningfully with real communities and deliver them a service they find valuable. Makes sense to me.

One important and obvious dimension of the paradigm shift has been the commercial challenge to the business models of the incumbent media industry ‘silo-owners’. That in turn has been an ever-increasing threat to the business-as-usual activities and very livelihoods of the people working in them – such as musicians, photographers and journalists. This was an ever-present motif in the presentation and the Q&A that followed: how can the activities of journalists be made commercially viable?

I was reminded of the classic and prescient 2009 article by US digital analyst Clay Shirky, ‘Newspapers and Thinking the Unthinkable’.  His persuasive analysis was that print media does much of society’s heavy journalistic lifting and the work of print journalists is used by everyone from politicians to district attorneys to talk radio hosts to bloggers.  However, the marriage of this heavy-lifting journalism to the stream of advertising revenue was essentially coincidental.

Shirky notes “This wasn’t because of any deep link between advertising and reporting … that the relationship between advertisers, publishers, and journalists has been ratified by a century of cultural practice doesn’t make it any less accidental.”   And if this relationship was under stress in 2009, things are reaching breaking point a decade or so on and Shirky’s wry observation “that ‘You’re gonna miss us when we’re gone!’ has never been much of a business model” is becoming very real indeed.

The persistence but ‘hollowing out’ of established masthead media was chronicled by media and technology editor Nic Christensen in his final day at Mumbrella, writing about the “… massive changes in the media, with more to come. We are living through a media revolution driven largely by the rise of digital, but with it comes the consequence for the journalism profession of multiple ongoing rounds of redundancies, as the media business model looks to reinvent itself within what is a seismic transition.”

The same sentiments were reported for Canada by Nieman Lab: “To be clear, though, almost all daily publishers have found themselves forced to cut, given the cascading losses of their broken print business. … We’re not mourning the death of printed newspapers, but of all the reporting — pixels or paper — that’s been disappearing for a decade.”

Clearly a business model beyond click-bait is needed. What that might be is a matter of urgent inquiry by many and anxious anticipation by others – despite what may be an emerging market failure, such a thing will be next to impossible to regulate into existence. News itself may be a commodity, but without a fountain-head of reliable reporting about things the great and the good might prefer we remain ignorant of, democracy has a profound challenge.

And without the great capacity of genuine democracy to renew and sustain an innovative and adaptable economy and society we all risk being significantly poorer, materially and in spirit.  Hopefully journalists find ways to be engaged, adaptive, entrepreneurial and commercially viable – all of which it must be said is much more easily advised than done – and the new UTS Centre for Media Transition can assist.

The Zone of Opportunity

Croquet is the first and only competitive game or sport I have ever played, and taking up such a pursuit later in life has presented a fascinating opportunity to observe myself learn and develop. I never really ‘got’ (or liked) the sporting analogies many people use in their business vocabulary. But coaching has emerged as an important common ground, since hitting the relevant ‘zone’ helps participants identify and realise opportunity, be it scoring croquet hoops or delivering career outcomes.

Playing croquet for the last six years or so has been an extremely interesting and instructive journey.    I slowly whittled down my handicap as the necessities and interruptions of full-time work allowed, and now a major re-invention project is to play more competition croquet.

As it happened, shortly after taking up croquet, I did an intensive leadership development course, which included a number of residential sessions at the Mt Eliza campus of the University of Melbourne’s Melbourne Business School. That was a pretty special place – it was apparently sold in 2016 to a retirement village operator, which feels a bit like the end of an era. It boasted a vineyard, private beach access, 95-bedroom accommodation, conference and training facilities and four dining facilities. The course was a memorable experience of intensive cohort learning with a number of my colleagues.

The thing is, croquet is the first and only competitive game or sport I have ever played, and taking up such a pursuit later in life has presented a fascinating opportunity to observe myself learn and develop.  The Mt Eliza experience and focus on complex adaptive systems thinking provided many tools and insights to inform and energise that observation.  Croquet provided a valuable additional case study over the nine month duration of the course, and in subsequent reflection and use of that training.

One such tool was the notion of ‘double loop learning’.  Essentially the concept is that as well as learning the simple linear skill, you also observe and think about how the learning itself is happening, and make adaptive changes to that process as is useful.  The idea is well explained in the classic article by Chris Argyris, ‘Teaching Smart People How to Learn’. I have found this useful and important both in management and in my chosen game; because as a manager often the challenge is guiding your best people to be even better, and because croquet tends to be a game that attracts smart people.

Until I actually played a competitive sport I never really ‘got’ the sporting analogies many people are fond of employing in their business vocabulary. I observed that these analogies often created in- and out-groups, appeared to discourage diversity and often favoured male values. One of the attractive things for me about croquet is that by and large it is gender-neutral, with men and women playing on equal terms. It is also very age-inclusive.

Without abandoning those observations, I have found a greater ability to relate to appropriate sporting insights, properly delivered.  In particular coaching emerges as an important common ground.

A couple of months ago I completed a Croquet Australia coaching course and as a result was endorsed as a Foundation Coach (level 1) for the three codes of Association, Ricochet and Golf croquet – I even got a badge!

I found many points of resonance between the material we covered and my management practice and learning over many years, some of which may unpack into future blogs … I don’t pretend to be anything but a fledgling sporting coach, but I am an expert generalist manager.

One notion I picked up on in particular was the ‘Zone of Opportunity’, which forms the title for this blog.  In croquet it has a highly technical application, but it resonates much more widely for me. It fits snugly into the complex adaptive thinking body of thought, exemplified in the sapling that clings to its opportunistic niche in the feature image.


To the technicality – if your croquet ball is much more than 30 degrees off the centre-line of the hoop you are attempting to run, it is simply not possible for it to be hit through.  Skill and practice can shade the edge of the zone, but clearly it materially improves your chances if your approach shot lands you comfortably within the zone.
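For the geometrically inclined, the test is easy to sketch. Here is a rough Python illustration – the coordinates and the helper function are my own invention; only the roughly 30-degree limit comes from the croquet technicality above:

```python
# A rough sketch of the croquet 'zone of opportunity': is the ball within
# roughly 30 degrees of the hoop's centre-line? The 30-degree limit is
# from the text; the geometry helper is illustrative.
import math

def in_zone(ball_xy, hoop_xy, hoop_direction_deg, limit_deg=30.0):
    """True if the ball lies within limit_deg of the hoop's centre-line."""
    dx = hoop_xy[0] - ball_xy[0]
    dy = hoop_xy[1] - ball_xy[1]
    approach_deg = math.degrees(math.atan2(dy, dx))
    # Smallest angular difference between the approach and the centre-line.
    off_centre = abs((approach_deg - hoop_direction_deg + 180) % 360 - 180)
    return off_centre <= limit_deg

# A ball 2 m back and 1 m to the side of a hoop whose centre-line runs
# along the x-axis: about 26.6 degrees off centre, so just inside the zone.
print(in_zone((-2.0, 1.0), (0.0, 0.0), 0.0))  # -> True
```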

Single-loop learning might focus on practicing how to run difficult angles, while double-loop thinking might suggest practicing approaches that consistently land well within the zone as more fruitful.

So the more general use of the term is the coaching necessity to help anybody you are helping to develop, in whatever field of endeavour, to best apply their abilities to solve the skill-related problem as it is relevant to them. Finding and exploiting their zone will help them identify and realise opportunity, be it scoring croquet hoops or delivering career outcomes. Obviously this should not simply be a matter of finding a ‘comfort zone’, and sensible coaching sets a path of achievable development to levels of greater performance.

This references another use of the word ‘zone’, where players often refer to ‘being in the zone’. The psychologist Mihály Csíkszentmihályi coined the term ‘flow’ to describe this feeling. In essence, flow is characterized by achieving complete absorption in what you are doing, and thereby losing all track of time and sense of your surroundings. Flow is broadly defined by a balance between ability and challenge: when your abilities match the specific challenge you can enter the flow state – croquet players as they build their break, craftsmen when they employ their skills, artists when they paint, writers when they craft their words.

In fact, in a double-loop style observation, this is essentially why I write this blog – as I write, time flies, my skills develop and I have fun: what more reward can anyone sensibly seek in life?


The narrative necessity

We are currently seeing the dark side of the narrative necessity being played out in politics and social media – never mind the experts, just give me the real story with facts I can conveniently believe in …
However, there is an abiding, important and useful role for positive narratives in our lives.  

One thread of commentary about the recently concluded G20 Summit Meeting has been the loss of a coherent narrative flowing from the leaders at the event. There is a deep-seated need in humans for explanatory narratives, and ‘sense-making’ in terms of crafting and articulating such narratives is a critical role for leadership. We seem to need a narrative flow to give a sense of momentum and coherence to our lives as we transition from moment to moment; without that sense of temporal structure we just have a collection of moments.

In the data-driven world of today, discerning and creating narratives to make sense of the myriad data points is more essential than ever. We are surrounded by more and more dots, and the effort of joining them can be exhausting and at times overwhelming. While ‘being in the moment’ is great counsel and a source of comfort in the face of life’s pressures, the narrative ‘engine’ is the key to joining the dots, establishing direction and getting stuff done.

But here’s the thing – people just want a narrative that helps make sense, preferably one that helps simplify and streamline their world.  It doesn’t necessarily have to be true, but it needs to be believable and consistent with the facts on the ground as we perceive them.  And in a circular twist, our preferred narrative then guides our perception and selection of ‘facts’.

This is the stuff of cognitive biases, of which we are becoming more and more aware through studies such as behavioural economics. A recent blog in the Economist reported an interesting reflection on the persistence of beliefs in the face of contrary facts, especially noting “motivated reasoning, [which] is a cognitive bias to which better-educated people are especially prone.”


Being smart is no get-out-of-jail-free card!

If we are not careful, we can simply (or very cleverly) project what we want to see onto the essentially blank world of noisy and jumbled data.  This human tendency to perceive meaningful patterns within random data has been termed ‘apophenia’.  In the world of data, for example, this manifests itself in ‘overfitting’, where a statistical model emerges to fit noise rather than signal and/or ‘confirmation bias’, where information is sought or interpreted in ways that seek to prove ideas rather than test them.
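Overfitting is easy to demonstrate. Here is a toy sketch (all numbers made up): fit both a straight line and a wildly flexible polynomial to noisy samples of a simple linear trend, then test on fresh data from the same process – the flexible model has ‘found’ patterns that are only noise:

```python
# A toy illustration of overfitting: the data are random noise around a
# straight line, yet a high-degree polynomial 'finds' elaborate patterns
# in it - apophenia in statistical form. All numbers are made up.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 1, 20)
y = 2 * x + rng.normal(0, 0.3, size=x.size)   # true signal: y = 2x, plus noise

line = np.polynomial.Polynomial.fit(x, y, deg=1)     # simple model
wiggle = np.polynomial.Polynomial.fit(x, y, deg=15)  # over-flexible model

# Fresh data from the same process: the simple model generalises better.
x_new = np.linspace(0.02, 0.98, 50)
y_new = 2 * x_new + rng.normal(0, 0.3, size=x_new.size)
for name, model in [("degree 1", line), ("degree 15", wiggle)]:
    err = np.mean((model(x_new) - y_new) ** 2)
    print(f"{name}: test mean-squared error = {err:.3f}")
```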

That’s the down side and I think we are currently seeing the dark side of the narrative necessity being played out in politics and social media – never mind the experts, just give me the real story with facts I can conveniently believe in …


However, there is an abiding, important and useful role for positive narratives in our lives. This was beautifully enunciated by Viktor Frankl in his book Man’s Search for Meaning, first published in 1946. An Austrian psychiatrist before (and after) WW2, he drew on his experiences as an Auschwitz concentration camp inmate to document how, in even the most extreme circumstances, the human urge to seek and create meaning is crucial.

He shows that we have amazing powers of endurance, so long as it somehow makes sense to us to go on living: “He who has a why to live can bear with almost any how”.  On this broad basis he worked out what he called ‘logotherapy’, a technique oriented to enable men and women to see meaning in their suffering, aiming to set them free from despair and find new courage to face circumstances which seem beyond them.

Frankl suggests “that mental health is based on a certain degree of tension, the tension between what one has already achieved and what one still ought to accomplish, or the gap between what one is and what one should become. Such a tension is inherent in the human being and therefore is indispensable to mental well-being.” That tension is narrative tension as we work on the story arc of our lives, and Frankl observed in the extreme circumstances of his heinous captivity that “The prisoner who had lost faith in the future [lost that narrative tension] was doomed.”

Beyond the personal, sensible evidence-backed policy is more important than ever, and policy makers need to acknowledge and resist various cognitive biases in their decision making. Those in leadership positions have a necessity and an obligation to help people develop and sustain unifying stories about what they are doing and why.

Authentic narrative is essential to meaningful existence.  I attended the NSW U3A Network 2017 annual conference a couple of weeks ago, and one of the sessions was about Big History – unsurprisingly, space here does not permit a full exposition of the history of the entire universe.  However the speaker, Prof David Christian of Macquarie Uni, did a fine job which is replicated in his TED talk on the subject: “The history of our world in 18 minutes” – can I strongly suggest taking a look?

Suffice it to say that his narrative arc from the big bang to the present ‘anthropocene’ provides a very interesting story and perspective – if anyone has the ear of a G20 attendee they might send them the link!