A light, easy introduction to the psychological concepts underpinning overwhelm, including alignment with personal values, boundary-setting, procrastination and perfectionism. There's nothing particularly new or revelatory here, but this is a great starting point for those wanting to dip their toes into the topic.
User Profile
technology. cybernetics. systems. science fiction. languages. machine learning. speech recognition.
KathyReid@bookwyrm.social's books
User Activity
KathyReid@bookwyrm.social rated Summary and Analysis of Hunger: 5 stars
KathyReid@bookwyrm.social rated Hunger: 5 stars
Hunger by Roxane Gay
KathyReid@bookwyrm.social rated Deep Work: 4 stars
Deep Work by Cal Newport
One of the most valuable skills in our economy is becoming increasingly rare. If you master this skill, you'll achieve …
Review of 'The Reflective Practice Guide' on 'Goodreads'
4 stars
This is an accessible, well-structured guide both for those new to reflective practice, and those guiding or instructing others in the discipline of reflective practice.
It provides solid, but not overwhelming, theoretical foundations for different approaches to reflective practice, and pragmatic, easily implementable strategies for structuring reflective writing, responding to emotions in reflective ways, and understanding the role reflective practice plays in life-long learning and professional development.
I only wish this book had been recommended to me much earlier.
KathyReid@bookwyrm.social rated Science Fiction Prototyping: 3 stars
KathyReid@bookwyrm.social reviewed AARNet by Glenda Korporaal
Review of 'AARNet' on 'Goodreads'
4 stars
Korporaal's well-researched book chronicles an incredibly important time in Australia's technical genealogy, exploring the relationships, political influences, and strokes of luck and ill fortune that have all shaped AARNet today.
More than a decade after the period covered in the book, her weaving of multiple threads of people, personalities and partnerships resonates.
Review of "R.U.R. (Rossum's universal robots)" on 'Goodreads'
4 stars
This dystopian landmark challenges our notions of what it means to be human, the value of labour and the creation of meaning through struggle and suffering.
While not explicitly Marxist in outlook, it echoes principles put forward by Paulo Freire around the education of oppressed populations. Čapek resolves this tension not with a new relationship between student and teacher, but by eradicating the human race, encouraging us to go back to the beginning of the lesson.
Gender roles are challenged directly, which was prescient given the time and place in which the book was written - 1920 Czechoslovakia, bordering newly-Bolshevik Russia - with the only female lead of the play at first being portrayed as beautiful and engaging, but devoid of intellectual power. Čapek challenges us to consider these as uniquely human traits, contrasting with the efficiency, strength and stamina of Rossum's Universal Robots. Ultimately it is the female protagonist - Helena - who takes the fateful action that resets humanity's path.
Review of 'Made by Humans: The AI Condition' on 'Goodreads'
5 stars
Ellen Broad’s multi-faceted exploration of the many intertwined aspects of artificial intelligence embarks and concludes at the same salient juncture: emerging technologies are conceived, shaped, used, governed and iterated by humans. Just as humans are inherently neither good nor bad, the systems we construct echo our moral plurality, our unconscious biases and, frequently, our unwillingness to be critically interrogated.
That this is Broad’s first book – given its well-researched examples, coherent structure and intellectual incisiveness – is surprising. Its clarion call – for greater care, more rigorous thinking and a more holistic approach to the almost-infantile adoption of artificial intelligence, machine learning and autonomous decision-making – is not.
Structured in three distinct parts – Humans as Data, Humans as Designers and Making Humans Accountable – the book covers much territory: from systemic and cultural biases in how the data used by machine learning is selected and captured, to the errors introduced to data sets by humans, to decisions made about system trade-offs, what privacy means in different contexts, how open a system is to inspection and intelligibility, how diverse that system is, and who is accountable for a system's impacts. Real-life examples are interwoven with provocative and often confronting questions.
Broad does not set out – at least in this tome – to answer these questions; rather, she lays a foundation for examining each of them in more depth. Personally, I’d like to see a follow-up that covers attempts to standardise practices in machine learning and artificial intelligence – the often-competing frameworks and benchmarks that have been proposed – alongside efforts at industry adoption and the barriers that will likely be faced.
Review of 'Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech' on 'Goodreads'
5 stars
A must read for anyone who designs digital experiences, and doesn't want to be an inadvertent dude-bro.
Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease. With ease, but not easily. Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders".
As Wachter-Boettcher is at pains to highlight, all of this is not intentional - but the result of a lack of critical evaluation, thought and reflection on the consequences of seemingly minor technical design and development decisions. Over time, these compound to create systemic barriers to technology use and employment - feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders - and reinforcement that technology is the domain of rich, white, young men.
The examples that frame the narrative are disarming in their simplicity. The high school graduand whose Latino/Caucasian hyphenated surname doesn't fit into the form field. The person of mixed racial heritage who can't work out which single box to check on a form. The person who is gender non-conforming and doesn't fit into the binary polarisation of 'Male' or 'Female'. Beware: these are not edge cases! The most powerful take-away for me personally from this text is that in design practice, edge cases are not the minority. They exist to make us recognise the diversity of the user base we design for.
Think "stress cases" not "edge cases". If your design doesn't cater for stress cases, it's not a good design.
While we may have technical coding standards and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline we have a long way to go in doing the same for user experience outputs. There are a finite number of ways to write a syntactically correct PHP function. Give me 100 form designers, and I will give you 100 different forms that provide 100 user experiences. And at least some of those 100 users will be left without "delight" - a nebulous buzzword for rating the success (or otherwise) of digital experiences.
Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail - application defaults - exposing their (at best) benign, and, at times, malignant utilisation to manipulate users into freely submitting their personal data. It is designing not for delight, but for deception.
"Default settings can be helpful or deceptive, thoughtful or frustrating. But they're never neutral."
Here the clarion call for action is not aimed at technology developers themselves, but at users, urging us to be more careful, more critical, and more vocal about how applications interact with us.
Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable - targeting or ignoring whole cohorts of people, depending on the (unquestioned) assumptions built into machine learning models. Big data is retrospective, but not necessarily predictive. Just because a dataset showed a pattern in the past does not mean that pattern will hold true in the future. Yet governments, corporations and other large institutions are basing major policies and practice areas on algorithms that remain opaque. And while responsibility for decision-making might be delegated to machines, accountability for how those decisions are made cannot be.
The parting thought of this book is that good intentions aren't enough. The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity and maturity. That will be delightful!
KathyReid@bookwyrm.social reviewed Essentialism by Greg McKeown
Review of 'Essentialism' on 'Goodreads'
3 stars
Greg McKeown's easy-to-read tome on 'Essentialism' is a field manual - a guide for the busy manager or multi-tasker who is poor at saying no to commitments, and who erroneously believes we can do it all. Reading this book is a valuable use of time for the new manager, or the seasoned leader who finds their success has bred too many different projects.
The overarching frame of reference is that there are two types of managerial and leadership behaviour (the book conflates management and leadership) - Essentialist and Non-Essentialist - and that effectiveness is the product only of the former.
The book is well structured and each chapter clearly articulates an aspect of being 'non-essential' - illustrating the consequences with (at times, kitsch) anecdotes. The solution is then provided, in the form of take-away behaviours that can be practised over time.
This book would have been improved with the addition of the following artefacts:
- A wall guide or infographic contrasting 'Essentialist' and 'Non-Essentialist' behaviours for easy reference
- A maturity model or similar allowing the fledgling leader to self-rate their behaviours
I also found this book lacking in solid empirical research; much of the narrative is fleshed out with anecdotes rather than research- or evidence-based information, which detracts from the overall credibility of the book.
Review of 'Rise of the Robots' on 'Goodreads'
4 stars
A robust, evidence-based journey through the key arcs of the rise of technology - in particular robotics, artificial intelligence and machine learning - coupled with major labour market disruption and stagnation. My only criticism of this engaging, accessible and well-argued tome is that it offers only one broad solution to the problems articulated: universal basic income, and slight variations thereof.