Technology and Organizations

Vint Cerf: Adding Delay and Disruption Tolerance to our List of Requirements

Dr. Vinton Cerf (NSF video, bio) continues to energize, as well as build, the Internet. One of the true fathers of the Internet, his first role (while a grad student) was to write some of the initial code to allow packets of information to move reliably from one computer to another.

Location Awareness and Changing Perceptions

It seems that another train has left the station. Location-aware computing is here, and its ubiquitous integration with our cell phones, laptops, and persons is only a matter of time. In January, Mathew Honan described (Wired) his experiments with a location-aware lifestyle. Yet as recently as 2007, locational privacy was still on the table in discussions of Google Earth (Randall Stross, in Planet Google, pp. 142-151). (Google Privacy page)

Clearly, times are changing. I wish I'd kept the 1989 peer reviews that called me a fascist because my dissertation examined the effects of computer performance monitoring. Never mind that in the later field research the chip inspectors found electronic monitoring less invasive than physical (watching over your shoulder) monitoring. My argument has always been that monitoring is a tool, and that outcomes are a combination of the tool's capabilities with the people and organizational practices involved. These issues highlight a key skill we need in our roles as accidental systems designers: the ability to understand and manage privacy and information access. Motahari, Manikopoulos, Hiltz, & Jones (2007) describe seven privacy worries in their paper "Seven Privacy Worries in Ubiquitous Social Computing" (pdf):

  1. Inappropriate Use by Administrators: e.g., the system administrator sells personal data without permission.
  2. Legal Obligations: the system administrator is compelled by an organization such as the police to reveal personal data.
  3. Inadequate Security.
  4. Designed Invasion (Poor Features): e.g., a cell phone application that reveals location to friends, but does so without informing the user or providing control over the feature.
  5. Social Inference through Lack of Entropy: see the CampusWiki example...
  6. Social Inference through Persistent User Observation: e.g., Bob is so often in Alice's office that their relationship must be romantic.
  7. Social Leveraging of Privileged Data: e.g., David can't access my location, but Jane can, so David asks Jane for my location.
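Worry 5 turns on a bit of information theory: when only a few people could plausibly have generated a location record, the record identifies its subject even without a name attached. A minimal sketch of the idea (the function name and sample data here are mine for illustration, not from the Motahari et al. paper):

```python
import math
from collections import Counter

def location_entropy(plausible_users):
    """Shannon entropy (in bits) over which user a location record could refer to.

    Low entropy means an 'anonymous' location report can be linked to a
    specific person -- the social-inference-through-lack-of-entropy worry.
    """
    counts = Counter(plausible_users)
    total = sum(counts.values())
    # 0.0 - sum(...) keeps the zero-entropy case at +0.0 rather than -0.0.
    return 0.0 - sum((c / total) * math.log2(c / total) for c in counts.values())

# A shared lab that four people use equally: an anonymous record is ambiguous.
print(location_entropy(["alice", "bob", "carol", "dave"]))  # 2.0 bits

# A single-occupant office: "someone is in room 214" identifies Bob.
print(location_entropy(["bob"]))  # 0.0 bits
```

The design point for accidental systems designers: aggregating location data (by room, floor, or time window) raises the entropy of any single record and blunts this kind of inference.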

The Economist published "Every Move You Make" in 2008, noting the following about Waber and Pentland's study of locational monitoring (via identity badges that could track location and even the timbre and inflection of the wearer's voice!) at a US high-tech firm and a German bank:

An interesting experiment, then. But how widely this approach can work in practice is unclear. Many people may object to having their behaviour scrutinised so closely and Mr Waber and Dr Pentland are, indeed, sensitive to privacy. They believe that the risk of rejection can be minimised by using the badges only for short periods of time, so that they do not become part of a routine monitoring system. It will also help, they believe, if everyone is treated equally, so that the boss’s actions, foibles and shortcomings are as transparent as those of his minions. Now that really would be a revolution in management science.

Contrast the above with the sentiments expressed by this Carnegie Mellon graduate student (using Locaccino, 2:43min video) and the popularity of iStanford. I continue, without success, to search for serious sociological research on our changing views regarding locational and other monitoring. Pointers appreciated.

System Conductivity

Look for people with tolerance for ambiguity and intrinsic motivation for engaging technical and organizational systems, and I think you'll find people who are likely to try new technology tools at work. I'm still playing with the ideas of All of Us as Accidental Systems Designers and trying to understand why some people take to this role easily and others do not.

Hoffer, George, & Valacich describe one of the characteristics of successful systems design teams (they focus on teams as it is so rare that true systems design can be done individually) as: "Tolerance of diversity, uncertainty, ambiguity." Tolerance of ambiguity is "... 'the tendency to perceive ambiguous situations as desirable,' whereas the intolerance of ambiguity refers to 'the tendency to perceive (i.e., interpret) ambiguous situations as sources of threat' (p. 29)." (Judge et al., 1999, based on Budner (1962), full pdf)

Think about how people approach new technology tools.

My classic example is how different people react to the suggestion to use a wiki to manage a team project. Tolerance for ambiguity, as described above, describes the response I see from some of my colleagues: they are willing to give it a shot -- they may roll their eyes at yet another of my suggestions, but they are game to try. Intolerance of ambiguity describes another (luckily, much smaller) portion of my colleagues: they see a wiki-like tool as something that will lose their data, lose their rights to their data, or add one more thing to their already long list of things to do. Another response is more extreme: a colleague hears my suggestion and proposes we take it to the next level. Something like, "Wiki sounds great, but let's also try using a group Twitter tool so we can stay up-to-date on the progress of the project. I've never done it, but this would be a great time to try."

I think this is someone who has both tolerance for ambiguity and intrinsic motivation for engaging technical and organizational systems. The trait -- tolerance for ambiguity -- reduces the perceived hurdle of trying something new, and the task-specific state -- intrinsic motivation around organizational and technical engagement -- combines with that tolerance to push us in a new direction. This kind of person is well situated to be an accidental, but elated, systems designer -- they are getting to do what they enjoy, and they have the internal capability to put up with the challenges. On intrinsic motivation (from Hennessey & Amabile, 2005):

Intrinsic motivation is the motivation to do something for its own sake, for the sheer enjoyment of the task itself.... Theorists have emphasized the role of certain psychological states in the experience of intrinsic motivation, including a sense of self-determination or perceived control over task engagement (Deci and Ryan, 1985) and a sense of optimal challenge (Csikszentmihalyi, 1997) that enhances self-perceptions of competence (Deci and Ryan, 1985). The highest level of intrinsic motivation state has been labeled "optimal experience" or "flow" (Csikszentmihalyi, 1997).

Who are these people who actually enjoy engaging technical and organizational systems?

Maybe in the old days they were time and motion study experts (the kind with a passion for improving the organization, not the kind focused on driving the worker out). The Gilbreths would be in this group. Their family life was the source of the book (1948) and movie (1950) "Cheaper by the Dozen." Modern versions could include your colleague who's always pestering you to try a slightly better way to do things in your team, or professors of management, at least the ones with a technical bent.

I'm going to call the combination of tolerance for ambiguity and intrinsic motivation for engaging technical and organizational systems: System Conductivity -- both as it relates to the ability to direct technical and organizational systems toward new designs (cf. symphony conductor); and the ability to carry the necessary energy for new design (cf. electrical conduction).

I expect people with both high tolerance for ambiguity and high intrinsic motivation for engaging technical and organizational systems to be proactive, rather than accidental, designers -- people who look for opportunities to improve, or at least try to improve, the performance of their systems. The best designers will also have an underlying systems sense that helps them understand the costs, benefits, opportunities, and hurdles of attempting to use technologies that are not quite stable, practices that may or may not be better than what we currently use, and ongoing revisions to work practices built on such systems.

By thinking about system conductivity, we can better understand our own motivations and attitudes, as well as those of our colleagues -- and how these motivations and attitudes are likely to influence willingness to try new technology tools and organizational practices. And here's to all those tech support folks who must put up with those who lack system conductivity: httpv://www.youtube.com/watch?v=pQHX-SjgQvQ

Juries Run Amok via iPhones, BlackBerries, etc.

Jury trials are providing vivid examples of how difficult -- and important -- it is to manage technologies, organizational practice, and people all at the same time.

Seamless Response to Our Information Needs

Remember the user interface in Minority Report? Tom Cruise standing in front of a clear screen and, with super-cool hand gestures, getting access to all the information he needs to solve a crime-to-be? httpv://www.youtube.com/watch?v=NwVBzx0LMNQ Mauricio Palomar, one of the students in my Technology & Innovation Management course, forwarded the TED demo by MIT Professor Pattie Maes and Ph.D. student Pranav Mistry.
