Patient Quality and Safety Tools

Safety Tools

According to the Institute of Medicine (USA), patient safety is part of the discipline of Quality. Around the world, people speak of Quality and Patient Safety together, and in many cases of Quality Assurance as a whole, encompassing both disciplines.

Although the two disciplines share many tools, they differ: Quality is oriented toward improving the performance or efficiency of the system, while Patient Safety is oriented toward reducing error.

Safety has received a great deal of attention and has many people working on it, but we should not forget Quality, which is a broader concept and also improves safety.

Example: the time it takes to administer a drug after an incident is a Quality topic. The failure to consult the Pharmacy about missing data is a Patient Safety topic. Both affect the quality of care.

It does not make sense to work on Safety without working simultaneously to improve Quality.

Both issues have important financial implications. Improving Quality reduces waste (frequently estimated at 30%), and improving Safety reduces rework and the patient's length of stay, among other things.

In health care, CQI, or Continuous Quality Improvement, is a trend based on TQM, or Total Quality Management, which Dr. Deming applied to quality improvement in Japan.


Plan – Do – Check – Act (prepare, make the change, check the results, act on the next step).

Also called PDSA (Plan, Do, Study, Act). We put it first in this blog because it should be the most used tool of all. For those responsible for quality it is almost second nature, a way of seeing things; they keep several PDCAs running simultaneously. It is a tool that can be applied to any type of organizational problem, even to the study of a queue. It is best used successively: after finishing one PDCA, another begins on the same subject, improving on the indicators of the previous one. When the problem is already well bounded and finer adjustment is needed, other tools such as DMAIC are recommended.

PDCA (Plan, Do, Check, Act)

The PDCA name indicates the following steps:

  • Plan:
    • Define the problem to be improved.
    • Define the purpose of the study.
    • Focus on it.
    • Plan the study.
  • Do:
    • Train the people who will carry out the change.
    • Run the study.
  • Check:
    • Check how the change was carried out.
    • Prepare the indicators.
  • Act:
    • If the intended objectives were achieved, confirm and secure the change with further measures.
    • If not, define what should change and start a new PDCA.
PDCA – Next steps

The successive use of PDCA is a necessity and defines the potential of this tool to get to the core of the issue. The objectives of the Plan phase of the second PDCA can be modified and improved relative to the previous PDCA.
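The successive-cycle idea can be sketched in code. This is a purely illustrative Python loop (the indicator name and all numbers are hypothetical, not from the text) in which each cycle's Plan phase starts from the indicator the previous cycle achieved:

```python
# Purely illustrative sketch (names and numbers are hypothetical):
# successive PDCA cycles where each Plan phase starts from the
# indicator the previous cycle achieved.

def run_pdca_cycle(baseline):
    """One PDCA cycle on an indicator we want to reduce."""
    target = baseline * 0.80      # Plan: aim to cut the indicator by 20%
    achieved = baseline * 0.84    # Do/Check: hypothetical partial success
    met = achieved <= target      # Act: was the change secured?
    return achieved, met

indicator = 100.0                 # e.g. error reports per month (hypothetical)
history = [indicator]
for _ in range(3):                # three successive PDCA cycles
    indicator, met = run_pdca_cycle(indicator)
    history.append(round(indicator, 1))

print(history)                    # each cycle improves on the previous one
```

The point of the sketch is only the structure: the output of one cycle becomes the baseline of the next.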


WHY? – WHY? – WHY? – WHY? – WHY?

It is a simple technique that forces us to avoid simplistic answers and to keep digging toward the root causes.

Precisely one of the keys of HROs, or High Reliability Organizations, is to never accept answers that have not dug deep into the root of the problem.

Example: the nurse did not ask her question about the use of a medication.

  • Why? The pharmacist's phone was busy for several minutes.
  • Why? That phone is always busy.
  • Why? The Pharmacy takes a long time on some calls.
  • Why? Because the information has to be searched in physical files.
  • Why? Because there is no computer system at the pharmacy desk to find the answers quickly.

You do not need to stop here; you can go further, until you have a satisfactory answer indicating that the root cause has been reached.
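The chain above can be recorded as simple data. A minimal Python sketch, using the nurse/pharmacy example from the text:

```python
# Minimal sketch: a Five Whys chain as a list, using the nurse/pharmacy
# example from the text. In practice you stop when the answer is
# actionable, not at a fixed count of five.

problem = "The nurse did not ask her question about the use of a medication."
whys = [
    "The pharmacist's phone was busy for several minutes.",
    "That phone is always busy.",
    "The Pharmacy takes a long time on some calls.",
    "The information has to be searched in physical files.",
    "There is no computer system at the pharmacy desk to answer quickly.",
]

print(problem)
for depth, answer in enumerate(whys, start=1):
    print(f"  Why #{depth}: {answer}")

root_cause = whys[-1]   # the deepest answer is the working root cause
```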


This diagram seeks to arrive at the initial causes of each problem. It began with Ishikawa, who applied it to industry and classified causes into 4 items, or 4Ms: manpower, materials, machines and methods. It was later extended to other categories of causes and adapted to the health sector.

Ishikawa Diagram - Fishbone


Each spine branches into new causes; all causes of the same category are grouped together.


The Pareto principle tells us that 80% of the effects originate in 20% of the causes.

Vilfredo Pareto, a railway engineer who worked in Italy and was born in France in 1848, discovered that this principle holds in many fields of study.

Although the ratio is not exactly 80–20 in every case, the principle has surprising general validity and helps us focus on a small number of elements to study.

A derivation of the principle is the cumulative Pareto diagram, which plots the accumulated frequency of the causes:

Pareto - cumulative frequency

As factors A and B together reach 77%, very close to 80%, you will probably want to stop and study only these two factors, leaving the rest for another study.

After correcting the problem, you make a new Pareto; the proportions will have changed, and perhaps factor C will now enter the new set of causes to study.
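The cumulative table behind a Pareto diagram is easy to build. A short Python sketch with hypothetical cause labels and counts (chosen so that, as in the text's example, the first two factors reach 77%):

```python
# Sketch of building a cumulative Pareto table from cause counts.
# The cause labels and counts are hypothetical.

counts = {"A": 45, "B": 32, "C": 12, "D": 7, "E": 4}
total = sum(counts.values())   # 100 occurrences in this example

ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
cumulative = 0
table = []
for cause, n in ranked:
    cumulative += n
    table.append((cause, n, round(100 * cumulative / total, 1)))

for cause, n, cum_pct in table:
    print(f"{cause}: {n:3d}  cumulative {cum_pct}%")

# The "vital few": causes up to ~80% cumulative frequency
vital_few = [cause for cause, n, cum in table if cum <= 80]
```

With these numbers, `vital_few` contains only A and B, the two factors you would study first.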


The Swiss Cheese Model was created by Dr. James Reason, a psychologist specialized in the study of accidents and risks, while he was a professor in England.

The model introduces the concept of latent failures inside a system. It uses the concept of catalyst elements, or triggers, which came from complexity theory, and proposes that the absence of barrier elements, another concept it introduced, makes an error or incident possible. The holes in the cheese are potential failures, overlooked in each layer, each with the potential to let the problem pass.

The simplicity of the presentation, which says that an incident occurs when the holes in the cheese line up, drove the Model's popularization. It serves for the analysis of any kind of failure. Dr. Reason argues that one or two barriers at the end of the processes prevent the failure from materializing.

Swiss Cheese Model

We recommend reading the book The Human Contribution by James Reason, which points out the importance of humans in control tasks and recommends the adoption of control barriers at the end of the processes.
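The multiplicative effect of independent barriers can be illustrated numerically. In this hedged sketch, each barrier fails with some probability (a "hole"), and an incident only passes when every barrier fails at once; the failure rates are invented for the example:

```python
# Hedged illustration of the Swiss Cheese idea: assuming independent
# barriers, an incident passes only when every barrier fails at once
# (the holes line up). The failure rates below are invented.

def incident_probability(barrier_failure_rates):
    """Probability that all barriers fail simultaneously."""
    p = 1.0
    for rate in barrier_failure_rates:
        p *= rate
    return p

one_barrier = incident_probability([0.05])               # ~5% chance to pass
three_barriers = incident_probability([0.05, 0.1, 0.1])  # ~0.05%

print(one_barrier)
print(three_barriers)   # adding barriers multiplies the protection
```

The independence assumption is the sketch's simplification; Reason's point is precisely that latent conditions can correlate the holes.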

Control Barriers


Root Cause Analysis (RCA) is usually used as a retrospective (reactive) study after an incident; it searches, layer by layer, for the cause or causes of each fault. It is based on Graph Theory, the branch of logic and mathematics that represents, with graphs of nodes, the relations between causes and effects.

This method mines (digs into) the information, also using the why-why technique at each stage.

In medicine, Aggregate RCA has been used as a proactive method to study several cases together and identify trends that lead to a decrease in error.

There is also RCFA, Root Cause Failure Analysis, brought from engineering.

RCFA (Root Cause Failure Analysis)


Used to prevent incidents, FMEA (Failure Mode and Effects Analysis), a tool adopted by NASA, introduces probabilistic calculation, both for the probability of an incident occurring at a certain level and for the potential effects of that incident.

In health care, HFMEA, or Healthcare Failure Mode and Effects Analysis, has been used. Its 5 steps are:

  1. Define the topic.
  2. Assemble the team of professionals.
  3. Develop the map of the process to be studied.
  4. Conduct risk analysis, through a cause-and-effect tree.
  5. Develop the actions to take to achieve the desired results and indicate who will be responsible for each action.

This tool has frequently led to recognizing the need for a defined Taxonomy (categorization) and a standardized terminology, disseminated in training to all professionals involved.
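The risk-analysis step (step 4) is usually carried out by scoring each failure mode. A hedged sketch in the spirit of (H)FMEA hazard scoring: each failure mode in the mapped process gets a score (severity times probability), and the highest scores are addressed first. The scales, failure modes and scores below are illustrative assumptions, not from the text:

```python
# Hedged sketch of failure-mode risk scoring in the spirit of (H)FMEA:
# hazard score = severity x probability; highest scores are treated first.
# Failure modes, scales (1-4) and scores are illustrative assumptions.

failure_modes = [
    # (process step, failure mode, severity 1-4, probability 1-4)
    ("prescription", "illegible dose", 3, 3),
    ("dispensing", "look-alike packaging mix-up", 4, 2),
    ("administration", "wrong infusion rate", 4, 3),
]

scored = [(step, mode, sev * prob) for step, mode, sev, prob in failure_modes]
scored.sort(key=lambda item: item[2], reverse=True)

for step, mode, hazard in scored:
    print(f"{hazard:2d}  {step}: {mode}")
```

The ranking tells the team where to aim the actions of step 5.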


The combination of the LEAN and Six Sigma methodologies gave birth to the Lean Six Sigma discipline.

Spread through professional certifications such as Green Belt and Black Belt, it trains people in a system that aims to decrease both waste and errors.

Its main tool is DMAIC: Define, Measure, Analyze, Improve, Control. That is, define the subject or process to study, measure it, analyze it, improve it and keep it under control.

SIX SIGMA refers to the statistical level at which there is a 99.99966% probability that a defect does not occur, that is, 3.4 defects per million opportunities. It was developed by Motorola and then spread to all industries. The method is based on measuring problems accurately until they are corrected with its tools.
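The 3.4-per-million figure can be reproduced from the normal distribution. A short Python sketch: the six-sigma level, combined with the conventional 1.5-sigma long-term shift used in Six Sigma practice, gives about 3.4 defects per million opportunities (DPMO):

```python
# Sketch of the defects-per-million-opportunities (DPMO) figure behind
# the Six Sigma target, using the conventional 1.5-sigma long-term shift.

from statistics import NormalDist

def dpmo_at_sigma(sigma_level, shift=1.5):
    """DPMO at a given short-term sigma level with the 1.5-sigma shift."""
    defect_rate = 1 - NormalDist().cdf(sigma_level - shift)
    return defect_rate * 1_000_000

print(round(dpmo_at_sigma(6), 1))   # ~3.4 defects per million
print(round(dpmo_at_sigma(3)))      # a 3-sigma process: ~66,800 per million
```

The contrast between the two printed values shows why moving a process from 3 to 6 sigma is such a large change.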

LEAN is a methodology developed mainly from Toyota's production control techniques. Based on eliminating everything that does not deliver value to the customer, it aims to reduce waste of time, material, space and so on.

It uses the insertion of "positive controls," or poka-yoke devices, that prevent errors by the very design of the mechanisms.

Digital Six Sigma DMAIC Improvement Process

DMAIC is undoubtedly the most complete tool among all of the above.

We wanted to increase the interest of professionals in these issues.

Full mastery of them requires an appropriate and extensive course, which is beyond the scope of this blog.

The Human Aspects of Safety

Patient safety

Through our education and culture we have incorporated beliefs such as the following:

We are the result of our dedication and effort. We are the product of our desire to be and to do. Good people get the best results and do not make mistakes. We cannot fail. We have free will and we do things our way.

These sentences are not entirely true. The context in which we move determines both our results and our will and dedication. However, we feel the weight of responsibility as if everything depended only on us; it is a feature of our individualistic culture. Knowing the context will help us improve medical care.

In Health Safety, health professionals improve their performance by understanding their relation with the Context.

The operation of a hospital is a complex system; it is called a socio-complex system because of its high dependence on people's abilities to operate the processes.

Complex systems have a non-linear relationship between causes and effects. Several factors influence a result, which in turn bears on other factors. There are inhibitors, triggering catalysts, and attractors that produce their own gravitational field, with positive or negative feedback between drivers.

Human beings tend to seek causal, linear relationships and to think serially (one problem after another). We normally do not see the whole picture, only the part that corresponds to us. Thus paradoxical (unexpected) results appear. In this context, with limited resources, incomplete information, pressure to deliver, fatigue and multitasking, we must provide a service without errors or transgressions. It seems an unequal fight.

Patient Safety undergoes continual comparison with aviation regarding the improvements aviation has achieved. The world of medicine is much wider than the world of aviation and many comparisons are not valid; even so, we can learn from them.

A large part of accident and adverse-event studies focus on human error as the trigger of the disaster. Sometimes we consider the professional a hero; however, he is generally seen as the one who solved 100 problems but could not deal with one.

Recent studies on transgressions of pilots' operating standards indicated that a very high number of procedural violations were actually necessary for the safety of the aircraft.

We are not proposing to violate the rules, but to understand why many rules are violated. Some rules are transgressed simply because it is easier to do things that way; others because the rule is not clear.

Making mistakes is part of our human condition. Some errors cannot be avoided, but they can be anticipated and resolved in time. "We cannot change the human condition, but we can change the conditions in which we work, to have fewer mistakes and easier recovery." (James Reason, The Human Contribution, 2008)

Errors happen at three levels of control: automatic, mixed and conscious. These operate in three kinds of situations: routine situations, situations we are prepared to solve, and new situations.

In the following table, James Reason shows us the three performance levels: skill-based, rule-based and knowledge-based.

Modes of consciousness according to James Reason, based on knowledge, rules and skills

1. Skill-based errors. These are committed when operating at an automatic (non-conscious) level and are usually called slips or lapses.
2. Rule-based errors. The failure comes from applying a rule wrongly or from violating the rule.
3. Knowledge-based errors. Facing a new situation, the appropriate solution is not applied, because of a memory error or lack of knowledge.

Operating consciously all the time would not be possible. The brain uses shortcuts to save energy, and our attention is limited: if we receive information in the middle of a task, we may lose track of an operation we know well. When the mind searches our memory for information (stored in packets), it does so by similarity and by most frequent or recent use. We compare incoming information with stored information following these norms: similarity and frequency. This can fool us very easily because, to save energy, the mind does not analyze all possibilities. If you are working at the knowledge-based level and memory returns erroneous data or an erroneous solution, things can go wrong.

The only way to reduce our risk is to create multiple layers of barriers to avoid error or, when it happens, to soften its consequences.

Barriers can be personal habits or systemic procedures.

It is necessary to understand the difference between conditions and causes of errors. The conditions are present both in cases with poor results and in others where nothing serious happened. The condition for failure, also called a pathogenic condition or latent failure, does not cause any problem until the cause appears. The cause is the trigger of an existing fault, asleep within the organization or process.

The model below is an evolution of the Swiss Cheese Model by the same James Reason, an author who dedicated more than 40 years to the study of accidents and the conditions of error.

Latent failure: an evolution of the Swiss Cheese Model, by the same James Reason

In this way, the organizational culture of blame and the work environment are separated. Although both are part of the context, the first has a more generic, cultural character and the second a more physical one, tied to the moment of the incident.

We can also obtain important information from the safety strategies of High Reliability Organizations (HRO).

In all of them the consequences of an error are very serious; these high-reliability industries and organizations follow very strict guidelines, from which we can extract some safety guidelines.


Everybody makes mistakes. It is a characteristic that is part of our humanity.

Accepting them and being prepared to avoid mistakes, or to correct them without causing damage, is what the champions of safety do. The same applies individually and collectively.

  • The best organizations adopt a culture of transparency and mutual trust.
  • Errors allow us to learn and improve. You can learn a lot, at low cost in damage, from near misses.
  • Strongly hierarchical (power-based) systems do not help develop safety. Responsibility should be given to those best able to solve safety problems (experience and expertise).
  • People and organizations work by the law of minimum effort. Established rules will be violated if the procedure is not easy to follow.
  • The solution to a complex problem must have a higher level of complexity than the problem itself. A simple answer to a complex problem is a methodological error, whatever the answer.
  • Each person and each organization has its own "dormant errors or resident pathogens" waiting for a trigger to materialize.
  • We must escape from superficial responses focused on experience or on common-sense criteria. We must deepen the questions and answers for every problem.
  • Because the mind searches memory for a pattern based on what is similar and frequent, not on all the information we have, we are always prone to errors. Developing habits to prevent these mistakes is our goal as professionals.
  • The main processes of high-reliability organizations (HRO) are controlled by computers (up to 320 applications on an Airbus), and a person or supervisor oversees the operation of the whole system, because computers do not make mistakes, but the people who program them or feed them information do. This is the principle of redundancy.

In short, there are recommendations and methodologies to reduce adverse events. We have not created a list of procedures here; we have concentrated on concepts and attitudes.

Understanding the psychological aspect of an error or a violation leads us to replace punitive behavior with deeper investigation.

HRO, how to reach Zero Error?


YES, there are organizations that work with zero error, or at least do not consider a single error an option (although sometimes it happens). These are called High Reliability Organizations (HRO).

When we see that some industries have a thousand times fewer errors than the health industry, we can learn something from their methods.

The best-known HROs are nuclear plants, aircraft carriers, missile facilities, semiconductor factories, air traffic control and the aviation industry in general.

They are organizations where there is no acceptable error rate. They work with the criterion of never events: under no circumstances can such events happen.

Many reasons explain this result: the level of investment (whatever is necessary is invested to reach zero error); economic penalties for errors (in health, many events do not economically penalize the institution and may even benefit it, as with an extended stay); the degree of automation of operations; training and qualification; and a systemic safety culture, which we will see below.

During 2015 the aviation industry made 29 million flights with 16 accidents.

That is, 0.55 accidents per million flights.

In health, the World Health Organization (WHO) and many other sources indicate that 10% of patients suffer an adverse event and that 50% of these could be avoided; some consider the real figure may be triple that.

Ten percent of 1 million is 100,000, and if we keep only the avoidable ones, we reach 50,000.

That is, 0.55 versus 50,000 per million, or about 90,000 times higher.
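The comparison above is simple arithmetic, reproduced here in Python using only the figures quoted in the text:

```python
# Reproducing the text's rough comparison between aviation and
# health care rates, using only the figures quoted in the text.

flights = 29_000_000
aviation_accidents = 16
aviation_rate = aviation_accidents / flights * 1_000_000  # per million flights

admissions = 1_000_000               # a reference block of one million admissions
adverse_events = admissions * 0.10   # WHO figure: ~10% of patients suffer an AE
avoidable = adverse_events * 0.50    # ~50% of those considered avoidable

ratio = avoidable / aviation_rate
print(round(aviation_rate, 2))   # ~0.55 accidents per million flights
print(int(avoidable))            # 50,000 avoidable events per million admissions
print(round(ratio))              # ~90,600, i.e. about 90,000 times higher
```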

If we compare the number of deaths instead, the relationship is much worse.

After visiting hospitals in more than 10 countries, I find great disparity in their levels of safety.

  • In many countries, there is still no talk of patient safety.
  • We are still fighting to remove potassium chloride from the nursing units and keep it only in the pharmacy. A simple task becomes something heroic.
  • We have many identical drug packages.
  • We do not use a basic technology such as bar codes.
  • There is around 30% waste, which prevents investing in automation.
  • There is no senior management of Quality.
  • A blame policy is applied to health personnel.
  • There is an authority gradient among people working in health, which inhibits speaking up.
  • Governments buy expensive equipment but do not invest in the population's basic health, such as potable water and sewage.
  • Little is invested in prevention and general health education, which would reduce the cost of correcting the problems caused.
  • Other industries, for example the laboratory, food and education industries, are under no obligation to help.
  • Etc.

This means we have a lot to do in the health system, and the first change we must make is in Attitude.

An organization is Reliable when, facing abnormal fluctuation of the internal and external conditions of the system, it keeps the results within the desirable range.

Having desirable results when input parameters are controlled is not enough.

Regarding the change in Attitude, there is growing interest in the influence of cognitive processes on results. Here we speak of the technique of mindfulness.

Mindfulness refers to the quality of our attention, rather than merely sustaining attention.

Therefore, the knowledge or perception of the context is very important.

The repertoire of HRO criteria that we can apply is summarized as:

  • Preoccupation with failure.

There is an awareness that failures can occur. To err is human; failure is almost a natural condition. Automatic systems, with their algorithms, do not interpret every situation of reality, only the logical situations foreseen. The incidence of several simultaneous factors leads to unplanned results. HROs stop to study minor errors and near misses with the same determination as if they were major failures. Management is as oriented to studying failures as to maintaining productivity.

  • Reluctance to simplify interpretations.

In a complex system, simplification is a methodological error. In no way can we give a simple or simplistic answer to a problem occurring in a complex system. We must consider the limitations imposed by the context: our mental system, the physical structure, the limits of logical thinking, the parts of the whole we are not seeing; and we must avoid disregarding intuitions.

  • Sensitivity to Operations.

"Having the bubble" is a term used in the navy for when a commander perceives the whole set of elements of the complex reality of the environment, together with the operational dynamics: grasping the whole situation through instruments integrated with human action, and staying aware of possible misinterpretations, near misses, system overloads, distractions, surprises, confusing signals, interactions, and so on.

  • Commitment to Resilience.

It refers to anticipating possible problems, even somewhat simultaneous ones, and training to solve such situations when they come. Resilience here means overcoming the surprises that incidents may produce: be prepared for the error; accept the inevitability of error and prepare for it.

  • More horizontal hierarchical structures.

Studies show that in institutions with a strong hierarchy, errors spread faster, helped by organizational efficiency; this happens mostly when the incident occurs at high organizational levels rather than at the bottom. An organization that works more by consensus helps prevent these problems. The term "organized anarchy" has even been coined for this mode of control. The responsibility for solving an incident falls more to the experts on that subject than to the hierarchical leaders; being present at the time of the event also determines who can take decisions. A weak hierarchical structure is, here, a desired characteristic.

HRO - Filtering

The HRT, or High Reliability Theory (Charles Perrow), holds that it is not the stability of the inputs that will give us stability of the results.

We must also mention that our vision of safety follows what James Reason explains in his book The Human Contribution.

Our preferred model has two systems working simultaneously: the computer system, which through predefined algorithms automatically manages all operations, and human control, following events step by step to correct and improve safety.

Nurse in a Control Room

Administration Errors of Medicine and Financial Consequences


Andrea Righi de Oliveira Kelian

Nurse Specialist in Nursing Management, Hospital Management and Health Quality Management.



Among the adverse events (AEs) discussed, those causing the most concern are related to medication errors, and more specifically to drug administration. This study aims to gather scientific data in order to propose a model for calculating the cost of Adverse Events in Brazil. A literature review of national and international articles published between 1995 and 2014 was carried out. Based on those articles, we propose a model for the minimum estimated cost of AEs related to administration errors. We stress the importance of investing in actions that mitigate the risk of AEs, instead of bearing the high cost of treating them.

Key words: patient safety, adverse events, medication errors, medication system.


Currently, medication errors are causing very serious harm and raising concerns about the responsibility to provide a safe, high-quality environment for nursing care.

The administration of drugs is one of nursing's most serious and most responsible activities; carrying it out requires applying various scientific principles that underpin nurses' actions to ensure patient safety.

Medicating patients depends entirely on human actions, and mistakes are part of human nature; however, a well-structured medication system should create conditions that help minimize and prevent errors, implementing standards, rules, actions and procedures to assist the professionals involved.

Adverse events related to medications can lead to major problems for patients' health, with significant economic and social repercussions.

The use of medications is one of the interventions most frequently employed in the hospital environment; however, studies over the past few years have shown errors in medication treatment causing harm to patients, ranging from non-receipt of a necessary drug to injuries and deaths (LEAPE et al., 1995; TAXIS & BARBER, 2003). The administration of medication is the last opportunity to prevent a medication error that may have arisen in the prescription or dispensing of medicines.

Health professionals should be aware of and alert to this fact and permanently seek error-prevention measures, through new knowledge, behaviors or strategies, to protect all involved, especially the patient.

A comprehensive view of the medication system gives professionals the conditions to analyze it and intervene, ensuring responsible and safe care for patients and for themselves.

Medication errors are considered preventable drug-related adverse events; they may or may not cause harm to the patient and can occur at one or several points within the medication process (BATES et al., 1995; LEAPE et al., 1995).

The Institute of Medicine of the USA published in 1999 a book-length report entitled To Err is Human: Building a Safer Health System.

The mortality attributed to adverse events in health care was higher (in the US at the time of publication) than that from car accidents (43,458 deaths per year), breast cancer (42,297 deaths per year) or AIDS (16,516 deaths per year), placing deaths from health-care errors as the 8th leading cause of death in the US.

This publication also brought an estimate of the cost generated by adverse events over a year in the USA: US$ 17–29 billion per year.

This publication caused a big impact on public opinion in the USA. Then-President Bill Clinton called on the federal health agencies to implement the recommendations of the Institute of Medicine.

Data from Brazil, from 2006, show about 11,000,000 admissions through the SUS (National Health System) and 4,000,000 hospitalizations in the private sector, an estimated total of 15,000,000 admissions in one year.

According to Brazilian studies, we would have an incidence of 7.6% of patients with adverse events. We would therefore have 1,140,000 patients suffering adverse events in Brazil per year.
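The Brazilian estimate is a direct multiplication of the figures above:

```python
# The Brazilian estimate from the text, as arithmetic: public (SUS) plus
# private admissions, times the 7.6% adverse-event incidence reported
# by Brazilian studies.

sus_admissions = 11_000_000
private_admissions = 4_000_000
total_admissions = sus_admissions + private_admissions   # 15 million per year

ae_incidence = 0.076                  # 7.6% of patients suffer an AE
patients_with_ae = total_admissions * ae_incidence

print(round(patients_with_ae))        # 1,140,000 patients per year
```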

In March 2014, the Ministry of Health, the Oswaldo Cruz Foundation and ANVISA launched the "Reference Document for the National Program for Patient Safety," which also mentions that 10% of patients suffer an adverse event and that 50% of these are avoidable. This data refers to a study conducted in several countries, including Australia, England, New Zealand, Canada and Brazil (DE VRIES, 2008).

Patient Safety means reducing the risk of unnecessary harm associated with health care (adverse events) to an acceptable minimum. The incidence of hospitalized patients who suffer an adverse event can reach almost 17%, depending on the study.

These adverse events include increased length of hospital stay, temporary or permanent injuries and even death. It is essential to consider the importance of risk management focused on patient safety.

The report "To Err is Human: Building a Safer Health System" of the Institute of Medicine (USA), published in 1999 and based on studies conducted in Colorado, Utah and New York, points out that of the 33.6 million admissions to US hospitals in 1997, around 44,000 to 98,000 Americans died because of problems caused by medical errors (KOHN et al., 2001).

In this way, and based on the literature reviewed, we propose a minimum estimated cost model for AEs related to administration errors.


The results of a study conducted at two large tertiary hospitals in the United States, Brigham and Women's Hospital and Massachusetts General Hospital, pointed to an average of 6.5 adverse drug events per 100 admissions, of which 28% could have been prevented (BATES et al., 1995).

Errors in medication can often cause harm to the patient and, according to Bates (1996), about 30% of the harm during hospitalization is associated with medication errors, which also brings serious economic consequences to health institutions. An expenditure of approximately US$ 4,700 per preventable adverse drug event is estimated, or about US$ 2.8 million annually in a 700-bed teaching hospital. The annual cost of morbidity and mortality related to medication errors in the USA has been estimated at around US$ 76.6 billion (BERWICK & LEAPE, 1999; KOHN et al., 2001; ANDERSON, 2002).
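A minimal sketch of the kind of cost model the article proposes: multiply the estimated number of preventable adverse drug events (ADEs) by a unit cost per event. The US$ 4,700 unit cost and the 6.5-per-100 / 28% rates are the figures quoted above; the admission count is an illustrative assumption:

```python
# Minimal cost-model sketch. Unit cost and event rates are the figures
# quoted in the text (Bates et al.); the admission count is hypothetical.

COST_PER_PREVENTABLE_ADE_USD = 4_700   # Bates's estimate, from the text

def minimum_ade_cost(admissions, ade_per_100_admissions=6.5,
                     preventable_share=0.28):
    """Lower-bound cost estimate: 6.5 ADEs per 100 admissions,
    28% of them preventable, times the unit cost per event."""
    ades = admissions * ade_per_100_admissions / 100
    preventable = ades * preventable_share
    return preventable * COST_PER_PREVENTABLE_ADE_USD

# Hypothetical hospital with 30,000 admissions/year:
print(round(minimum_ade_cost(30_000)))   # ~US$ 2.57 million per year
```

With these rates, a 30,000-admission hospital lands near the US$ 2.8 million figure the text cites for a 700-bed teaching hospital, which is the sanity check for the model.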

Medicating patients requires an effective communication process to be performed successfully. Communication problems can be one of the causes of medication errors and originate in various situations encountered in professional day-to-day work (BARKER & HELLER, 1964 apud RIBEIRO, 1991, p. 70).

It is known that errors are part of human nature; therefore medication systems should be well structured to create conditions that help minimize and prevent errors, planning processes and implementing standards, rules and actions.

The medication system in the hospital is open and complex, involving several interrelated steps interconnected by various actions, with 20 to 30 different steps across the prescription, dispensing and administration of drugs, always involving many individuals and multiple hand-offs of requests or materials, which can lead to medication errors (LEAPE et al., 2000).

In 1989, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) identified five components or processes of the medication system: selection and procurement of the medicine; prescription; preparation and dispensing; medication administration; and patient follow-up regarding the drug's effects (NADZAM, 1998).

According to a study by INCA (National Cancer Institute) from November 2010, the most common errors in administering medication are inadequate technique, wrong site, inadequate preparation, and administration without observing the precautions for that drug. Errors can result in serious problems for the patient and family: disabilities, prolonged hospital stay and recovery, exposure to more procedures, delay in or inability to resume social functions, and even death.

Medication administration can be considered a high-risk activity in a health institution: it is inherent in daily patient care, and its practice is closely linked to and dependent on human action.

Nowadays, about 88% of patients seeking medical care receive drug prescriptions (CASSIANI, 2005).

In a study conducted in several countries, including Brazil, 1,328 ICU patients were observed over 24 hours with respect to medication administration, and the most common errors found were:

  • Wrong dose;
  • Wrong medication;
  • Wrong route;
  • Wrong time of administration;
  • Omitted administration.

Among the 1,328 patients observed over 24 hours, 861 errors were found in 441 patients; that is, some patients experienced more than one error during those 24 hours. In 12 of them, there was permanent damage or death.

In this analysis, we can see that 33% of the observed patients suffered an error related to medication administration.

A study made by the University of São Paulo, conducted in 2010, examined five university hospitals and showed a 30% rate of errors in medication administration: of these, 77.3% were at the wrong time, 14.4% wrong doses, 6.1% wrong route, 1.7% unauthorized medication delivery and 0.5% wrong patient. This study analyzed 4,958 intravenous dose administrations and found 1,500 errors.

Within these studies, the proportion of adverse events that would be preventable approaches 60% in the situations described.

Considering the deaths and serious injuries that can occur, preventing these events could avoid much suffering for patients and their families.

Based on the collected articles, we can infer a scheme for a better understanding of the data found:


Didactic scheme of the structure of Adverse Events, related to Administration Errors

Among the surveyed studies, few materials were found on the cost of non-quality, specifically on the cost of an adverse event related to medication errors.

Most studies estimate the cost of adverse events in general, that is, the cost of any type of adverse event.


In the 1950s and 1960s there were already records of the occurrence of adverse events (AEs) in health services, but it was in the 1990s that studies showed the importance of the social and economic costs of these events. The Harvard Medical Practice Study (HMPS) estimated that AEs occurred in 3.7% of admissions to acute care hospitals in the state of New York, United States of America (USA), in 1984 (PORTO, 2010).

The methodology used by HMPS formed the basis for other studies in different countries.

A study by Thomas et al. in hospitals in the states of Utah and Colorado, USA, showed that AEs occurred in 3% of hospitalizations, of which 33% (Utah) and 27% (Colorado) were preventable. They estimated that eliminating preventable AEs could save between 17 and 29 billion dollars, including lost income from inactivity, disability and medical expenses.

A study in the UK estimated that AEs occurred in 10.8% of hospitalized patients, 48% of them considered preventable. Patients with AEs stayed an average of 8.5 days longer (range 0 to 70 days) compared with the average stay of patients without AEs. The authors estimated a cost of about 2 billion pounds per year associated exclusively with the increased duration of hospitalization of patients who experienced AEs.

An Australian study found adverse events in 7% of hospitalizations, with an average increase of 10 days in length of stay. The costs of adverse events accounted for 16% of direct hospital costs, representing an increase of 19% in the admissions budget.

Studies show that patient harm arising from health care has a significant impact on hospital spending.

There are reasons to assume that the costs of adverse events in Brazil are underestimated. First, the financial information used does not include the salaries of the medical staff, because these are public hospitals, although spending on outsourced personnel services is included. The second aspect concerns the use, in the financial analysis, of the average amount paid per day of hospitalization. Studies have shown that hospitalization days after the occurrence of an AE are more expensive than the preceding ones. As this discrimination could not be performed in the study, it is assumed that the observed average value is smaller than the actual average value (PORTO, 2010).

After analyzing the collected studies, we propose three cases with similar values for the cost of adverse events.

The examples follow below:

As an example, we propose a 200-bed hospital with 5,400 patient-days per month, from which we can infer that about 65,000 doses of drugs are administered monthly. We will use these data to calculate the costs of preventable adverse events.

Sentinel events, deaths and judicial costs are not included in these calculations.

The values shown represent an estimated minimum cost.

1. Applying the didactic scheme of the adverse event structure related to medication errors, we have:

200-bed hospital – 5,400 patient-days/month

·  540 AE/month;

·  270 avoidable AE/month;

·  81 medication AE/month;

·  25 AE/month at the administration step;

·  12.15 AE/month with moderate or severe damage.

We estimate an average of three days of recovery in the ICU at a cost of R$ 3,200 per day, according to Anvisa data.

Cost: 12.15 AE × 3 days × R$ 3,200 = R$ 116,640/month.
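The cascade above can be written out as a short calculation. This is a sketch only: the step percentages are assumptions inferred from the figures in the text, not official rates.

```python
# Sketch of the AE cascade for the 200-bed example hospital; the step
# percentages are inferred from the figures above, not official rates.
patient_days = 5400                      # patient-days per month
ae = patient_days * 0.10                 # ~10% generate incidents -> 540 AE/month
avoidable = ae * 0.50                    # ~50% avoidable          -> 270 AE/month
medication = avoidable * 0.30            # ~30% medication-related -> 81 AE/month
administration = medication * 0.30       # ~30% at administration  -> ~25 AE/month
severe = administration * 0.50           # ~50% moderate/severe    -> 12.15 AE/month

icu_days, cost_per_day = 3, 3200         # 3 ICU days at R$ 3,200/day (Anvisa)
monthly_cost = severe * icu_days * cost_per_day
print(round(monthly_cost))               # -> 116640 (R$/month)
```

Running the cascade end to end reproduces the R$ 116,640/month figure of case 1.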

2. A FIOCRUZ/RIO study by Mendes and colleagues, published in 2010 in the Portuguese Journal of Public Health, brought data similar to the previous case. Applied to the suggested scenario of a 200-bed hospital, it gives the following:

200-bed hospital – 5,400 patient-days/month

·  540 AE/month;

·  270 avoidable AE/month;

·  135 AE/month with moderate or severe damage;

·  40 medication AE/month;

·  12 AE/month at the administration step.

FIOCRUZ - study of Adverse Events in a hospital with 200 beds

Again, according to Anvisa data, we calculate an average of three days of recovery in intensive care at a cost of R$ 3,200 per day.

Cost: 12 AE × 3 days × R$ 3,200 = R$ 115,200/month

3. Another study, published in 2013 in the Journal of the Brazilian Medical Association by Mendes and colleagues, presented the following data: in a sample of 1,103 patients, 65 preventable AEs were identified in 56 patients; that is, some patients experienced more than one avoidable AE. These preventable AEs produced 373 additional days of hospitalization, and 4.6% of them were due to medication errors. Therefore, a projection of avoidable costs follows, applying these data to the suggested 200-bed hospital. We also apply the figure of 30% of medication AEs being related to medication administration, as exemplified in the previous studies:

5,400 patient-days/month × (373 extra days / 1,103 patients) × 4.6% for medication × 30% for administration ≈ 25 days/month.

25 days × R$ 3,200 = R$ 80,000/month
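The projection above can be checked in a few lines; a sketch only, using the sample figures quoted from Mendes et al. (2013):

```python
# Projection of avoidable administration-error days, scaled to the
# 200-bed example hospital (sample figures from Mendes et al., 2013).
patient_days = 5400                 # patient-days/month in the example hospital
extra_days = 373 / 1103             # extra hospitalization days per sampled patient
medication_share = 0.046            # 4.6% of preventable AEs due to medication
administration_share = 0.30         # 30% of medication AEs at administration

days_per_month = patient_days * extra_days * medication_share * administration_share
print(round(days_per_month))        # -> 25 (days/month)
print(round(days_per_month) * 3200) # -> 80000 (R$/month)
```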

From the cases presented above, we can conclude that the cost values in all the models are similar to each other. Remember that we are considering only avoidable AEs.


This first study showed that many countries have research reflecting the concern to mitigate the AEs that occur in their institutions. It is noteworthy that in the US this issue has been on the agenda for decades and, therefore, monitoring and risk management measures are more mature in that market.

It also shows the change in perception of the impact that AEs can cause.

It brings out the need to assess more carefully the consequences of these AEs, for example a medication error, for the patient, the family and the institution. Unfortunately, adverse events, specifically errors related to medication administration, occur much more often than they should, as national and international studies on the subject show, often causing long hospital stays for the patient.

Another aspect to consider is that the financial resources used to treat adverse events with moderate or severe damage, or even death, could be invested in other areas such as new technologies, new buildings, human resources, media and others.

Therefore, the study of this subject, its discussion and, above all, new proposals to mitigate the risk of new adverse events are important and urgent.


ANDERSON, J. G. et al. Evaluating the capability of information technology to prevent adverse drug events: a computer simulation approach. J. Am. Med. Inform Assoc., v. 9, n. 5, p. 479-490, 2002.

BARKER KN, FLYNN EA, PEPPER GA, BATES DW, MIKEAL RL. Medication errors observed in 36 health care facilities. Arch Intern Med. 2002;162(16):1897-903.

BATES, D. W. et al. Incidence of adverse drug events and potential adverse drug events: implications for prevention. JAMA, v. 274, n. 1, p. 29-34, 1995.

BERWICK, D. M.; LEAPE, L. L. Reducing errors in medicine. BMJ, v. 319, p. 136-137, 1999.

BOHOMOL ER. Erro de medicação: importância da notificação no gerenciamento da segurança do paciente. Rev Bras Enferm 2007; 60.

CAMARGO B. CASSIANI B, Oliveira C. Estratégias para prevenção de erros na medicação no setor de emergência. Rev Bras Enferm 2005; 58(4).

CARVALHO VT, CASSIANI SHB, CHIERICATO C, MIASSO AI. Erros mais comuns e fatores de risco na administração de medicamentos em unidades básicas de saúde. Rev Latino-am Enfermagem 1999; 7(5):67-75.

CASSIANI B, SILVIA C. Administração de medicamentos: uma visão sistêmica para o desenvolvimento de medidas preventivas dos erros na administração. Rev Eletron Enferm 2004; 6(2).

CASSIANI B. Segurança do paciente e o paradoxo no uso de medicamentos. Rev Bras Enferm 2005; 58(1): 95-9.

COHEN MM, et al. Medication safety program reduces adverse drug events in a community hospital. Qual Saf Health Care 2005; 14:169-174.

DAVIS, P et al. Adverse events in New Zealand public hospitals I: occurrence and impact NZMJ,2002; 115(1167):1-9

DAVIS, P et al. Adverse events in New Zealand public hospitals II: preventability and clinical context NZMJ, 2003; 116(1183):1-11

DE VRIES EN, RAMRATTAN MA, SMORENBURG SM, GOUMA DJ, BOERMEESTER MA. The incidence and nature of in-hospital adverse events: a systematic review. Qual Saf Health Care. 2008; 17: 216-223.

FONSECA AS, PETERLINI FL, COSTA DA. Segurança do Paciente. 1 edição. Ed. Martinari, 2014. Sao Paulo – SP.

Joint Commission International [Internet page]. USA: Joint Commission International: medication safety.

KOHN, L. T. et al. To Err is Human: building a safer health system. Washington: Committee on Quality of Health Care in America, Institute of Medicine, 2001.

LEAPE, L. L. et al. Reducing adverse drug events: lessons from a breakthrough series collaborative. Jt. Comm. J. Qual. Improv., v. 26, n. 6, p. 321-331, 2000.

MENDES, W. et al. The assessment of adverse events in hospitals in Brazil International Journal for Quality in Health Care 2009; 21(4):279–284

MENDES, W. et al. Caracteristicas de eventos adversos evitáveis em hospitais do Rio de Janeiro. Rev. Assoc. Med. Bras 2013;59(5):421-428

MIASSO, Adriana Inocenti et al. Erros de medicação: tipos, fatores causais e providências tomadas em quatro hospitais brasileiros. Rev. esc. enferm. USP, São Paulo, v. 40, n. 4, Dec.  2006.

NADZAM, D. M. A system approach to medication use. In: COUSINS, D. M. Medication use: a system approach to reducing errors. Oakbrook Terrace: 1998. p. 5-18.

OLIVEIRA RC, CAMARGO AEB, CASSIANI SHB. Estratégias para prevenção de erros na medicação no setor de emergência. Rev. bras. enferm., Brasília, v. 58, n. 4, Aug.  2005

OTERO MJ, DOMÍNGUES AG. Acontecimientos adversos por medicamentos: una patología emergente. Farm. Hosp. 2000;24(4):258-266.

PORTO S, et al. A magnitude financeira dos eventos adversos em hospitais brasileiros. Rev. Port. Saude Publica. 2010; VolTemat (10):74-80.

RIBEIRO, E. Dose Unitária – Sistema de distribuição de medicamentos em hospitais. São Paulo, 1991. 476 p. Dissertação (Mestrado) Escola de Administração Hospitalar e de Sistemas de Saúde da Fundação Getúlio Vargas.

SILVA, AEBC.; CASSIANI, SHB. – Administração de medicamentos: uma visão sistêmica para o desenvolvimento de medidas preventivas dos erros na medicação. Revista Eletrônica de Enfermagem, v. 06, n. 02, 2004.

TAXIS, K.; BARBER, N. Ethnographic study of incidence and severity of intravenous drug errors. BMJ, v. 326, n. 7391, p. 684-687, mar. 2003

THOMAS EJ, STUDDERT DM, NEWHOUSE JP, ZBAR BI, HOWARD KM, WILLIAMS EJ, et al. Costs of medical injuries in Utah and Colorado. Inquiry. 1999; 36:255-64.

VINCENT C, NEALE G, Woloshynowych M. Adverse Events in British Hospitals: Preliminary Retrospective Record Review. Br Med J. 2001; 322:517-9.


Recommendations to design the information to be included in the Unit Dose or Unitarized Package


Defining processes and standardizing procedures is an objective to be achieved within the activities of the hospital and the hospital pharmacy. We know that a high percentage of Adverse Events stems from communication failures. Problems of similarity between medicines must be attacked; there is a large group of them, called LASA (look-alike, sound-alike) drugs, requiring differentiation of the packages and communication of the main information, even with redundancy, according to the rules of safe operations.

These recommendations are based mainly on documents from the US Institute for Safe Medication Practices (ISMP) and other readings with similar content, whose objective is to improve communication between the Hospital Pharmacy (HP) and other sectors, preventing errors and providing alerts. It is the responsibility of the HP to differentiate the packages that will circulate inside the hospital and to educate the nursing staff.

The improvement of quality and the decrease of Adverse Events (AE) can only occur when all the information delivered to the nurses is adequately expressed and presented, so as to comply with the "five rights".

As in any safe and reliable system, there must be two systems working simultaneously. That is why we define the barcode (BC) as the main element, the main parachute, and the nurse's reading of the information related to the route of administration and its procedure, together with the alerts and warnings, as the secondary or safety parachute.

Would you jump from a plane with only the main parachute, leaving the reserve one on the ground?

Other industries are safer because they have redundancy procedures, where human action controls automated procedures. It is what Dr. James Reason calls "The Human Contribution", because electronics and algorithms alone fail to cover every real-life situation.

Let us express our recommendations in summary form:

  • Always use the basic (generic) drug name and do not include the commercial name of the drug.
  • Never abbreviate the name of the drug.
  • Do not use trailing zeros after the decimal point. For example, write 5 mg and never 5.0 mg (if the decimal point is not well printed, 10 times the required dose may be administered).
  • Always place a zero before the decimal point. For example, 0.3 mg and never .3 mg.
  • Write the word "Units" in full and never use just U. For example, 50 U can be read as 500.
  • Write Units instead of IU or international units.
  • Use separators to group the zeros of thousands. Example: 5,000 Units.
  • Do not use M to abbreviate thousands, as it can be misread as millions.
  • Write the words thousand or million instead of strings of zeros. For example, 150 thousand instead of 150,000 and 150 million instead of 150,000,000.
  • Avoid medical jargon and simplify the language appropriately.
  • Do not abbreviate the manufacturer's name.
  • Avoid abbreviations. If they must be used, follow these instructions:
    • Use mL for milliliter.
    • Use mg for milligram.
    • Write microgram in full; do not abbreviate it as mcg.
  • Do not use other abbreviations. They are not recommended by the ISMP or the Joint Commission.
  • Do not place the hospital logo on the package. It does not help to avoid mistakes.
  • To write the name of the basic drug and the form of presentation or dilution, use condensed Arial, size 12, bold. Verdana is also allowed, but it takes more space and may force abbreviation.
  • You can switch to a smaller font (10 or 11) for other recommendations and alerts.
  • Avoid all-uppercase text; it decreases reading accuracy.
  • Do not reduce the space between words, and always aim for clean, clear text.
  • Always use numerals, not words, to express quantities.
  • Use horizontal text only.
  • Increase the space between lines as much as possible.
  • Always use black ink on a white background; other combinations have lower reading definition.
  • Always include the route of administration, in detail, especially when it is not oral.
  • Use alerts and warnings.
  • Do not pack in too much information, as it may reduce the clarity of the message.
  • The Opuspac system places only 3 reading fields on large packages; more information no longer helps.
  • This system provides in its database a list of 70 layouts with route-of-administration information, alerts and warnings, already linked to more than 700 medicines.
  • It also provides more than 150 drug names in combinations of uppercase and lowercase letters, following ISMP recommendations for LASA drugs and helping to differentiate the packages.
  • This information is open to any interested party.
  • In addition, it uses figure designs studied to draw the reader's attention, accompanied by text to identify the message precisely.
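Several of the numeric rules above (no trailing zeros, a leading zero before the decimal point, "Units" spelled out, thousands grouped) can be captured in a small formatting helper. This is a minimal sketch for illustration only; the function name and unit table are hypothetical and not part of the Opuspac system.

```python
def format_dose(value, unit):
    """Format a dose string following the labeling rules above (illustrative)."""
    # Spell out error-prone units in full (never a bare "U" or "IU").
    unit_names = {"U": "Units", "IU": "Units", "mcg": "microgram"}
    unit = unit_names.get(unit, unit)

    if value == int(value):
        number = f"{int(value):,}"   # no trailing zeros: 5, not 5.0; 5,000 grouped
    else:
        number = f"{value:g}"        # keeps the leading zero: 0.3, never .3
    return f"{number} {unit}"

print(format_dose(5.0, "mg"))    # 5 mg
print(format_dose(0.3, "mg"))    # 0.3 mg
print(format_dose(5000, "U"))    # 5,000 Units
```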


Methodology of Changes


Implementing a change is a great effort, whether it is a personal matter, or within a group or institution.

Just as the force of habit dominates us personally, so it happens with human groups. The improvements that must be made inside a hospital are, without doubt, changes. Sometimes they are small changes of learning techniques, and at other times great changes of culture.

Many changes fail at the moment of making them, because the real difficulty of changing is not yet appreciated.

The strategy for change leads us to study it like a real battle (figuratively speaking):

  • There will be winners and losers.
  • There will be friends and enemies (or simply people who oppose).
  • It is better to prepare the troops, that is, to bring together all those who can help us, and to instruct and motivate them to support the change.
  • There will be an evolution, that is, a process. Not everything will happen in an instant, but in a sequence that must be studied and conducted.
  • Change will bring a new reality, and new problems to be anticipated.
  • If the desired change cannot be fully achieved, this situation must also be anticipated, and there will be a new stable situation.
  • Is it possible and convenient to divide the change into several stages?

Moving a change forward requires a level of leadership. The person should stand out by position, history, virtues or knowledge. Inwardly, the leader must have great personal discipline and emotional self-mastery.

We normally classify changes as large, medium and small.

  • The big ones will be resisted and probably prevented, as they will seriously affect someone.
  • The small ones will be quietly undone once the wave of change passes, and there will be no change.
  • Carrying out several medium changes amounts to a great change and is the most likely way to evolve while overcoming the resistance.

Darwin quotation

We need to emphasize the need to prepare people for change. Fears, and especially the fear of the unknown, create a great force of opposition to change. It is easier to change when the consequences are fully known and help is requested to move forward.

The hospital environment, being a human collective of many people at different levels, with a gradient of authority and a strong tendency to exercise positions of power, is especially sensitive to all changes.

Change first occurs in people's heads and then in reality. That is why transmitting a vision of the post-change reality, and selling that idea properly, is the way to a successful change.

It is often necessary to argue that the change will occur anyway, now or later, and that this resistance will be overcome by another attempt, with a better or worse outcome than the current one. People should also envision non-change: how a situation can degrade through lack of conduction toward a positive change, leading to deterioration. Although we often oppose a change in order to maintain the status quo, in fact there is always change, and in that case, for lack of a driving force, it will be for the worse.

There are top-down changes in the hierarchical scale and bottom-up changes. Both may fail for lack of support. It is necessary to recognize which position you hold and to complement it with what you lack; that is, both supports are necessary and must be developed.

After the change, do not trust that everything is already achieved. Changes roll back as easily as they advance. It is necessary to follow up and to correct or improve, tying up all loose ends.

In short, we can summarize this way:

  • Study and prepare yourself for change.
  • Take effective leadership, with the necessary supports.
  • Transmit a vision.
  • Choose several medium changes to make.
  • Track and improve post-change.

Change, death and taxes are the only certain things in life.

When an Error is an Adverse Event?


Adverse Event is an Error with Damage.

It is important to have a clear definition of Adverse Events (AE) because there are many variants of it in literature.

If there is no damage, it is not an Adverse Event.

If the incident or event does not reach the patient, it is not AE.

An error that is not completed and does not come into contact with the patient is a near miss. Example: a wrong preparation of a dilution that is never administered to the patient. This type of incident is also called a "close call" in English.

Of all medical procedures, a fraction become incidents. Official information from the World Health Organization tells us that it is approximately 10%.

These incidents are classified as avoidable and unavoidable.

There are medical procedures and medications that have unavoidable consequences or contraindications. These are not Adverse Events, as they are not errors.

Adverse events can occur in any sector of the hospital, but they are always related to the patient and must cause damage, to be considered as such.
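The definitions above can be condensed into a small classification sketch. The function and its labels are illustrative, not a standard taxonomy.

```python
def classify_incident(was_error, reached_patient, caused_damage):
    """Classify an incident using the definitions in this post (illustrative)."""
    if not was_error:
        # Unavoidable consequences or contraindications are not adverse events.
        return "not an adverse event (no error)"
    if not reached_patient:
        return "near miss (close call)"
    if not caused_damage:
        return "error without damage (not an adverse event)"
    return "adverse event (error with damage)"

# A wrong dilution prepared but never administered to the patient:
print(classify_incident(was_error=True, reached_patient=False, caused_damage=False))
# -> near miss (close call)
```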

AEs should be reported to the system responsible for Patient Safety.

The percentage of self-reported AE varies from country to country, because it depends on the culture of the hospital. Where there is a culture of guilt, there is much less reporting. Usually only 10% of real AEs are reported.

There are systems to estimate the level of AEs, such as the Global Trigger Tool from the IHI (Institute for Healthcare Improvement), which is independent of reporting and can increase the number of AEs to be considered by up to 10 times.

Only 2 to 3% of AEs are the responsibility of the individual hospital operator; they are almost entirely a system problem.

Many hospitals commit to not blaming medical operators, in order to obtain more self-reporting and improve quality.

Following the trail of near misses is a smart way to study how to improve processes without entering into questions of culture, because since no error was completed, there is no guilty party.

There are errors of commission (something done) and errors of omission (something not done).

Both types are included in the Adverse Event studies.


Although human intervention is always necessary for a mistake to be committed, systemic failure is the main factor responsible in over 95% of cases.

Stress, overwork and long working hours are the main causes of distraction and mistakes; working in automatic or subconscious mode can cause errors.

Only a small part of AEs are caused by gross negligence, for example by people who systematically violate procedural rules; these are the only ones who should be blamed.

As Dr. James Reason, creator of the Swiss cheese model, says:

“We cannot change the human being, but we can change the conditions in which he works.”

Why to Unitarize?


Unitarizing (unit-dose repackaging) is the process of preparing drugs in ready-to-administer form for the patient, that is, without any preparation needing to be performed in a subsequent operation. This process is normally carried out in the Hospital Pharmacy, the drugstore sector or a unitarization center serving several hospitals, but always under the responsibility and control of the Hospital Pharmacy.

Although an ampoule that must still be diluted on the ward is not strictly a unit dose, by extension we speak of unitarization when it is converted into dosage units for a patient.

Unitarization involves two main processes: packaging, and printing of the label or unit package.

This can be done manually or with machines.

Other tasks of the unitarization process are: 1) taking the medicines out of stock; 2) assembling a unitarization order with the corresponding information: authorization, barcode, quantity of medicines, expiry date, etc.; 3) obtaining the materials for unitarizing; 4) cutting the blister; 5) separating the drugs from their secondary packages or boxes; 6) printing the labels (in the case of manual unitarization); 7) packing and labeling; 8) forming suitable groups for the subsequent handling of stock; 9) returning the items to the warehouse.

This process is performed immediately upon receiving the drugs at the hospital and before entering the main stock (stock 1) in almost all cases of unitarization with automatic machines. In cases of manual unitarization it is not always possible to do so.

Nowadays there are also machines that cut blisters automatically and that are used by hospitals with more than 200 beds.

There are also semiautomatic unitarization machines, devices that perform the whole process in an automated way without the constant presence of the operator in every cycle.

Information can be found on the website:

Why to unitarize? Mainly to centralize the control of the medicines in the Hospital Pharmacy and to differentiate the packages. Also, to place a hospital bar code.

The centralization of control allows a reduction of up to 57% in Adverse Events, according to two studies conducted in Germany and the USA.

There is also a decrease in the time nurses spend unitarizing on the floors, and less waste.

This process began to spread around the world from 1965 onward, and today it is the main global trend of change.

But its application is still not widespread in all countries; in Europe, several countries still dispense drugs to the wards in boxes that are then unitarized on the spot.

In the US, many drugs are received in large bottles and unitarized without blisters, then sent to the electronic dispensers of the wards or sectors.

Unitarize to Differentiate

Blisters and ampoules received from laboratories do not have the necessary differentiation to be used inside the hospital.

Each hospital and pharmacy professional has its own rules on how to "customize" the packaging presentation. In general, recent incidents influence the criteria adopted to protect against new incidents. This creates a wide variety of differentiation criteria across hospitals, which the pharmaceutical industry cannot address.

The placement of hospital barcodes, and even serialized codes (each package carrying a different code, so that it can be tracked individually from the beginning to the end of its use and linked to a patient), gives the whole unitarization process more practical sense.

Appearances of ampoules are very similar

Unitarize to differentiate since drugs delivered by the pharmaceutical industry are very similar

Taxis et al., Pharm World Sci: in Germany, they demonstrated a 53% reduction in AEs in favor of the unit dose.

Barker, 1965: a study in an Arkansas hospital showed a 57% reduction in AEs when moving from the collective system to the unit dose.

When Culture is What Matters Most

What do we understand when we talk about culture in a hospital environment?

We refer to the attitude of each person, and of the whole, towards others and towards the environment.

We define the best culture as one whose attitude is fair, horizontal and transparent.

Perhaps the worst culture is the one where people think everything will stay the same; that no matter what happens, nothing will change or improve, whatever you do.

We can distinguish at least two kinds of culture: those centered on power and those centered on values.

The power-centered ones, that is, very hierarchical or vertical cultures, usually create an atmosphere of fear. People operate under a strong fear of authority and its consequences.

The values-centered ones, on the other hand, are more horizontal, as there is no excessive gradient of authority between one operator and another. Focusing on values means that justice must act to support those values. Rules are also part of a value system.

Why is it dysfunctional to have a culture of power inside a hospital environment?

Many authors have defined the hospital environment as a complex system. What is a complex system?

A complex environment is one where several actions or "drivers" promote actions that generate reactions. There is no direct relation of cause and effect between them, nor between them and the reactions or consequences. Sometimes the reactions are disproportionate to the changes in the drivers. That does not mean there is chaos; there is another kind of order, neither direct nor linear.

In a complex system, we have inhibitory factors acting to preserve the system, and catalysts, that is, factors that promote change and can be stimulated.

Some drivers are positively fed by the action of others, and some are negatively fed. There is no proportionality in the consequences.

There are also attractors, that is, elements, like the force of gravity, that always maintain their influence in every situation.

The fact that we do not know how a complex system works does not mean that it is chaotic. It only means that the rule 1 + 1 = 2 does not apply. Linear thinking will not help us. A relational or relative logic, where there is no absolute right or absolute wrong, will guide us better.

There will be many cases of paradoxical results. They are simply those whose logic we do not yet understand; they are no less logical for that.


A hospital, through the integration of a human environment, a technological environment, and private and patient interests, has the characteristics of a socio-complex system. But that does not mean it is unnecessary to carry out many operations with the deterministic sequence of a military operation. The latter must apply in logistics, cleaning, production, administrative tasks, etc.

For all those operations where decision-making requires the participation of several people and criteria, a culture based on values stands above a culture supported by power.

The gradient of authority: in the aviation industry, after finding that in some accidents the co-pilot did not have the courage to tell the captain that there was an error in the procedure, which led them to disaster, crews began to be trained together rather than separately. The communication skills of team members were evaluated and communication was encouraged, reducing the authority gradient that had prevented a lower-level person in the group from communicating a significant incident to the leader.

A horizontal culture does not mean a totally flat one. It is not a democracy, where procedures are put to a vote. It is a culture in which an assistant can inform a great surgeon that there was a deviation or an error in the procedure. Leaders are still needed in situations that remain complex. In a good culture, solutions are sought organically from the expert people, those with the knowledge and skills, and not from the chain of command, that is, from the bosses.

A good culture is nourished by transparency. For a culture based on values, everyone must have access to information, with great transparency. Transparency is the ethics of the 21st century. Systems are more complex than before and require the participation and collaboration of all to reduce errors.

Much is said about the culture of blame, which faults people for mistakes instead of looking for the failures that exist in the system. With few exceptions, errors are caused by a flawed or unsafe system.

The dysfunction of a power-based culture within complex systems is that group decisions, drawing on different specialists and points of view, are necessary for success. A simplistic solution is, by definition, a wrong answer to a complex problem. It is necessary to dig deeper. The solution must match, at least, the complexity level of the problem.

Human beings make mistakes, and any system must be designed with this in mind, preventing the consequences when natural mistakes happen.

It takes a long time to change a culture, since many minds and memories are involved in the change, but it is one of the factors that yields the most results. Technological change helps, but it is not by itself enough to produce all the needed evolution.

The Big Numbers of Patient Safety


We have reviewed more than 100 articles and works, and all of them present error statistics. Even so, it is very difficult to arrive at firm numbers. For those of us used to the engineering sciences, it is hard to understand such differences between one work and another. There is still a lack of clarity about what is being measured, which process, under what conditions, etc.

Often we find numbers that seem exaggerated, and we hesitate to present them in our own work because they would cast doubt on all the figures presented.

However, the error numbers are so high compared to other industries that it makes no sense to argue whether the correct value is 6% or 8%, when the desired value should be 100 times lower.

According to IATA data, during 2014, 3,100 million passengers flew and 265 died. (Small aircraft were not considered, as they do not constitute a commercial airline industry in themselves.)

This gives us 0.085 deaths per million passengers (265 / 3,100 = 0.085).

In the health sector, the equivalent index would be about 3 deaths per 1,000 people treated in a hospital (To Err Is Human, page 1: 98,000 deaths out of 33.6 million cases).

That gives the health sector about 3,000 deaths per million.

So the comparison is 0.085 vs 3,000 deaths per million.

The ratio between the two is roughly 3,000 / 0.085 ≈ 35,300 (35,300 times more).
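The arithmetic above can be reproduced in a few lines. The figures are the ones quoted in the text (IATA 2014 and To Err Is Human); note that the ratio of about 35,300 in the text comes from the rounded values 3,000 / 0.085, while the unrounded figures give roughly 34,100:

```python
# Aviation (IATA, 2014): 265 deaths among 3,100 million passengers
aviation_deaths = 265
aviation_passengers_millions = 3_100
aviation_rate = aviation_deaths / aviation_passengers_millions  # deaths per million

# Health care (To Err Is Human): 98,000 deaths among 33.6 million hospital cases
health_deaths = 98_000
health_cases_millions = 33.6
health_rate = health_deaths / health_cases_millions  # deaths per million

print(round(aviation_rate, 3))  # 0.085 deaths per million passengers
print(round(health_rate))       # 2917, i.e. about 3,000 per million
print(health_rate / aviation_rate)  # ~34,100 times higher (unrounded inputs)
```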

When one observation method (record review) gives a value 10 times lower than another (direct observation), or an objective method such as the Global Trigger Tool finds 10 times more adverse events (AEs) than those reported, we must stop arguing about whether the value is 8 or 22. It no longer matters: all of these values are exaggeratedly high in relation to other industries, whose levels show what the value should be and what we may achieve in the future. Exact numbers cannot be the goal of a study when they are all equally, disproportionately exorbitant.


These are the most consensual numbers:

  • 10% of attendances involve an AE (ANVISA and WHO data).
  • 50% of these are avoidable (researchers largely agree on this value).
  • 30% of hospital AEs occur in the medication process (D. Bates).
  • 30% of medication errors result in harm.
  • 50% of drug-administration AEs are severe or moderate; the rest are mild.

We have found great disparity in the level of concern for Patient Safety and Quality across countries, even within Europe.

Many values are obtained from self-reporting, which remains a resisted practice throughout the world. Self-reported values vary greatly from one culture to another, and the statistics reflect this disparity. It should not surprise us.

The indicators in developing countries are estimated to be 3 times higher than in developed ones.

By this we mean that the indexes do not matter as much as we think: all the values are very high. We must change procedures and offer health professionals every means to help them do their job well. Perhaps because health is a right recognized in almost all the world, and often sustained by governments, it does not receive as much investment as other industries.

It is incredible that many health institutions have no strong economic motivation to improve Quality. In many cases, if the patient must remain 10 more days due to an AE, the hospital continues to receive payment for each occupied bed-day. In other words, the hospital earns more with an AE.



To Err Is Human: Building a Safer Health System. The title of this report encapsulates its purpose. Human beings, in all lines of work, make errors. Errors can be prevented by designing systems that make it hard for people to do the wrong thing and easy for people to do the right thing. Cars are designed so that drivers cannot start them while in reverse, because that prevents accidents. Work schedules for pilots are designed so they don’t fly too many consecutive hours without rest, because alertness and performance are compromised.

In health care, building a safer system means designing processes of care to ensure that patients are safe from accidental injury. When agreement has been reached to pursue a course of medical treatment, patients should have the assurance that it will proceed correctly and safely so they have the best chance possible of achieving the desired outcome.

This report, published in 1999, was the starting point for all the great work on quality over the next 15 years.

But this report also said: to err is human and natural, and we cannot change that. So we must prepare for when errors happen: accept that we are going to err, and create systems that contain and resolve those incidents.

James Reason: we cannot change the human condition, but we can change the conditions under which humans work, to improve the results.

We advocate the use of automated or computerized systems to conduct processes, but always under human supervision and vigilance.

We must give people all the necessary conditions to do a good job.