David D. Woods

Average rating: 4.27 · 105 ratings · 5 reviews · 11 distinct works

Behind Human Error · 4.32 avg rating · 63 ratings · published 1994 · 14 editions
Resilience Engineering: Con... · 4.07 avg rating · 46 ratings · published 2006 · 11 editions
Resilience Engineering in P... · 4.09 avg rating · 22 ratings · published 2011 · 8 editions
Joint Cognitive Systems: Pa... · 4.60 avg rating · 15 ratings · published 2006 · 6 editions
Joint Cognitive Systems: Fo... · 4.23 avg rating · 13 ratings · published 2005 · 9 editions
Outmaneuvering Complexity: ... · 4.00 avg rating · 1 rating · 2 editions
Proactive safety management... · 4.00 avg rating · 1 rating
Intelligent Decision Suppor... · 4.00 avg rating · 1 rating
Behind Human Error: Cogniti... · 0.00 avg rating · 0 ratings
Can We Ever Escape from Dat... · 0.00 avg rating · 0 ratings
Quotes by David D. Woods

“The absence of failure was taken as positive indication that hazards are not present or that countermeasures are effective. In this context, it is very difficult to gather or see if evidence is building up that should trigger a re-evaluation and revision of the organization’s model of vulnerabilities. If an organization is not able to change its model of itself unless and until completely clear-cut evidence accumulates, that organization will tend to learn late, that is, it will revise its model of vulnerabilities only after serious events occur. On the other hand, high-reliability organizations assume their model of risks and countermeasures is fragile and even seek out evidence about the need to revise and update this model (Rochlin, 1999). They do not assume their model is correct and then wait for evidence of risk to come to their attention, for to do so will guarantee an organization that acts more riskily than it desires.”
David D. Woods, Behind Human Error

“Research done on handovers, which is one coordinative device to avert the fragmentation of problem-solving (Patterson, Roth, Woods, Chow, and Gomez, 2004), has identified some of the potential costs of failing to be told, forgetting or misunderstanding information communicated. These costs, for the incoming crew, include: having an incomplete model of the system’s state; being unaware of significant data or events; being unprepared to deal with impacts from previous events; failing to anticipate future events; lacking knowledge that is necessary to perform tasks safely; dropping or reworking activities that are in progress or that the team has agreed to do; creating an unwarranted shift in goals, decisions, priorities or plans.”
David D. Woods, Behind Human Error


