
Contractors in the Process Industries

Suddenly, a heated exchange took place between the king and the moat contractor (Larson)

The material in this post is extracted from Chapter 19 of the book Process Risk and Reliability Management and from Chapter 6 of the 2nd edition of the book Offshore Safety Management.

Contractors play a vital and increasingly important role in the design, construction, operation and maintenance of process and energy facilities, as can be seen from the chart shown below. Over the last twenty years or so, the number of contractor work hours has increased by about a factor of fifteen, whereas the number of hours worked by employees of the host companies has barely doubled. (The offshore oil and gas industry is particularly reliant on contractors.)

Contractor-Growth

We have prepared three videos to do with contractors in the process industries. They are:

  1. Contractors and Operators
  2. Regulations and Standards
  3. Management of Contractors

The first video — Contractors and Operators — lists the different types of contractor and contract company, describes some of the issues and difficulties that occur at the operator/contractor interface, and highlights some of the legal issues that need to be thought about. Strategies for contractor selection are described, and references to the Center for Offshore Safety Contractor-Operators templates are provided.

The second video — Regulations and Standards — reviews regulations to do with contractors from OSHA (the Occupational Safety & Health Administration) and BSEE (the Bureau of Safety and Environmental Enforcement), along with the industry standard API Recommended Practice 76. These regulations and standards provide a basis for developing contractor management systems, regardless of location or industry.

The third video — Management of Contractors — discusses how an operating company can develop bridging documents with the hundreds of contractor companies with which it works. The development of maps using a Safety Management System, such as shown in the drawing below, is one way of doing this efficiently and with minimal redundancy.

Bridging Document


Risk Perception

Rock-Climbing-1

In earlier posts we discussed the concepts of Perfect Safety and Acceptable Risk. This post concludes this short sequence with some thoughts to do with the subjective nature of risk. The material below has been adapted from the book Process Risk and Reliability Management.

The discussion would seem to have little relevance to those working in the process industries. But such a conclusion would be misleading, as can be seen from a recent LinkedIn post, ‘Perception trumps truth’. The premise of the post, and of many of the comments on it, is that the environmental consequences of fracking are not as bad as many of its opponents proclaim, hence we can go ahead as long as we put in effective environmental controls. But some environmentalists have a world view that holds that, since fracking inevitably causes at least some environmental damage, the activity should be banned altogether.

The two parties have different belief systems and are therefore talking entirely at cross purposes. Indeed, “A truth ceases to be a truth as soon as two people perceive it” (Oscar Wilde, 1854-1900).

Wilde-Oscar-3

Oscar Wilde

The above observation regarding different truths applies to hazards analysis and risk management work. Each person participating in a hazards analysis has his or her own opinions, memories, attitudes and overall ‘world view’. Most people are — in the strict sense of the word — prejudiced; that is, they pre-judge situations rather than trying to analyze the facts rationally and logically. People jump to preconceived conclusions, and those conclusions will often differ from those of other people who are looking at the same information with their own world view. With regard to risk management, even highly-trained, seasoned experts — who generally regard themselves as being governed only by the facts — will reach different conclusions when presented with the same data. Indeed, Slovic (1992) states that there is no such thing as ‘real risk’ or ‘objective risk’. His point is that if risk can never be measured objectively then objective risk does not exist at all.

In his book Bad Science the author Ben Goldacre (Goldacre 2008) has a chapter entitled “Why Clever People Believe Stupid Things”. Many of his insights can be applied to the risk assessment of process facilities. He arrives at the following conclusions:

  1. We see patterns where there is only random noise.
  2. We see causal relationships where there are none.
  3. We overvalue confirmatory information for any given hypothesis.
  4. We seek out confirmatory information for any given hypothesis.
  5. Our assessment of the quality of new evidence is biased by our previous beliefs.

The subjective component of risk becomes even more pronounced when the perceptions of non-specialists, particularly members of the public, are considered. Hence successful risk management involves understanding the opinions, emotions, hopes and fears of many people, including managers, workers and members of the public.

Some of the factors that affect risk perception are discussed below.

Degree of Control

Voluntary risks are accepted more readily than those that are imposed. For example, someone who believes that the presence of a chemical facility in his community poses an unacceptable risk to himself and his family may willingly go rock-climbing on weekends because he feels that he has some control over the risk associated with the latter activity, whereas he has no control at all over the chemical facility, or over the mysterious odors it produces. Hence rock climbers will quickly point to evidence that their sport is safer than, say, driving to the mountains. But their response misses the point — they are going to climb rocks anyway; they then assemble the evidence to justify what they are doing (see point #4 above).

Similarly, most people feel safer when driving a car rather than riding as a passenger, even though half of them must be wrong. The feeling of being in control is one of the reasons that people accept highway fatalities more readily than the same number of fatalities in airplane crashes.

The desire for control also means that most people generally resist risks that they feel they are being forced to accept; they will magnify the perceived risk associated with tasks that are forced upon them.

Familiarity with the Hazard

Most people understand and accept the risks associated with day-to-day living, but they do not understand the risks associated with industrial processes, thus making those risks less acceptable. A cabinet full of household cleaning agents, for example, may actually pose more danger to an individual than the emissions from the factory that makes those chemicals. But the perceived risk is less.

Hazards that are both unfamiliar and mysterious are particularly unacceptable, as can be seen by the deep distrust that the public feels with regard to nuclear power facilities.

Direct Benefit

People are more willing to accept risk if they are direct recipients of the benefits associated with that risk. The reality is that most industrial facilities provide little special benefit to the immediate community apart from offering some job opportunities and an increased local tax base. On the other hand, it is the community that has to bear all of the risk associated with those facilities, thus creating the NIMBY (‘Not in My Backyard’) response.

Personal Impact

The effect of the consequence term will depend to some degree on the persons who are impacted by it. For example, if an office worker suffers a sprained ankle he or she may be able to continue work during the recovery period; an outside operator, however, may not be able to work at his normal job during that time. Or, to take another example, the consequence of a broken finger will be more significant to a concert pianist than to a process engineer.

Natural vs. Man-Made Risks

Natural risks are generally considered to be more acceptable than man-made risks. For example, communities located in areas of high seismic activity understand and accept the risks associated with earthquakes. Similarly people living in hurricane-prone areas regard major storms as being a normal part of life. However, these same people are less likely to understand or accept the risks associated with industrial facilities.

Recency of Events

People tend to attribute a higher level of risk to events that have actually occurred in the recent past. For example, the concerns to do with nuclear power facilities in the 1980s and 90s were very high because the memories of Chernobyl and Three Mile Island were so recent. This concern is easing given that these two events occurred decades ago, and few people have a direct memory of them.

Perception of the Consequence Term

The Risk Equation (1.1) is linear; it gives equal value to changes in the consequence and frequency terms, implying a linear trade-off between the two. For example, according to Equation (1.1), a hazard resulting in one fatality every hundred years has the same risk value as a hazard resulting in ten fatalities every thousand years. In both cases the fatality rate is one in a hundred years, or 0.01 fatalities per year. But the two risks are not perceived to be the same. In general, people feel that high-consequence events that occur only rarely are less acceptable than more frequent, low consequence accidents. Hence, the second of the two alternatives shown above is perceived as being worse than the first.

The same way of looking at risk can be seen in everyday life. In a typical large American city around 500 people die each year in road accidents. Although many efforts are made to reduce this fatality rate, the fact remains that this loss of life is perceived as a necessary component of modern life, hence there is little outrage on the part of the public. Yet, were an airplane carrying 500 people to crash at that same city’s airport every year, there would be an outcry. But the fatality rate is the same in each case, i.e., 500 deaths per city per year. The difference between the two risks is a perception rooted in feelings and values.

To accommodate the difference in perception regarding risk Equation (1.1) can be modified so as to take the form of Equation (1.3).

   Risk(Hazard)  =  Consequence^n  *  Likelihood          (1.3)

   where  n > 1

Equation (1.3) shows that the contribution of the consequence term has been raised by the exponent n, where n > 1. In other words, high consequence/low frequency accidents are assigned a higher perceived risk value than low consequence/high frequency accidents.

Since the variable ‘n’ represents subjective feelings it is impossible to assign it an objective value. However, if a value of, say, 1.5 is given to ‘n’ then Equation (1.3) for the two scenarios just discussed — the airplane crash and the highway fatalities — becomes Equations (1.4) and (1.5) respectively.

   Risk(airplane)  =  500^1.5  *  1  ≈  11,180          (1.4)

   Risk(auto)  =  1^1.5  *  500  =  500          (1.5)

The single airplane crash, with its 500 fatalities, carries a perceived risk equivalent to more than 11,000 separate automobile fatalities, i.e., the apparent risk to do with the airplane crash is about 22 times greater than for the multiple automobile fatalities.
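The arithmetic in Equations (1.3) to (1.5) can be sketched in a few lines of code. The function name is invented for illustration, and the value n = 1.5 is the same illustrative choice as in the text:

```python
def perceived_risk(consequence, likelihood, n=1.5):
    """Perceived risk per Equation (1.3): the consequence term is
    raised to an exponent n > 1 before multiplying by likelihood."""
    return consequence ** n * likelihood

# One crash per year killing 500 people, vs. 500 crashes per year killing 1 each
risk_airplane = perceived_risk(500, 1)   # about 11,180
risk_auto = perceived_risk(1, 500)       # 500
```

Although both scenarios have the same objective fatality rate, the exponent makes the single large crash score far higher in perceived-risk terms.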

In the case of hazards that have very high consequences, such as the meltdown of the core of a nuclear power facility, perceived risk rises very fast as a result of the exponential term in Equation (1.3), thus explaining public fear to do with such facilities. Over the years, managers and engineers in such facilities have reduced the objective risk associated with nuclear power plants to an extremely low value, largely through the extensive use of sophisticated instrumentation systems. However, since the worst-case scenario — core meltdown — remains the same the public remains nervous and antagonistic. In such cases management would be better advised to address the consequence term rather than the likelihood term. With regard to nuclear power, the route to public acceptance is to make the absolute worst-case scenario one of low consequence.

The subjective and emotional nature of risk is summarized by Brander (1995) with reference to the changes in safety standards that were introduced following the Titanic tragedy.

They [scientists and engineers] tend to argue with facts, formulas, simulations, and other kinds of sweet reason. These don’t work well. What does work well are shameless appeals to emotion – like political cartoons. Like baby seals covered in oil. And always, always, casualty lists. Best of all are individual stories of casualties, to make the deaths real. We only learn from blood.

Comprehension Time

When people are informed that a significant new risk has entered their lives it can take time for them to digest that information. For example, when a patient is informed by a doctor that he or she has a serious medical condition, the doctor should not immediately launch into a discussion of possible treatments. He or she should allow the patient time to absorb the news before moving on to the next step. So it is with industrial risk. If people — particularly members of the public — are informed of a new risk associated with a process facility, then those people need time to grasp and come to terms with what has been said. There is a difference between having an intellectual grasp of risk and subjectively understanding how things have changed.

Randomness

Human beings tend to create order out of a random series of events. People have to do this in order to make sense of the world in which they live. The catch is that there is a tendency to create order, even when events are statistically independent of one another.

For example, psychologists gave test subjects a set of headphones and then played a series of random beeps. The subjects were told to imagine that each beep corresponded to an automobile going by. They were then asked if the beeps were coming in batches, such as would occur when cars were leaving a red traffic light, or whether the cars were spaced randomly, such as would happen on a freeway. The subjects generally said that the beeps were in groups, even though they were in fact occurring at random.

Therefore it is important for those working in process risk management not to create patterns and order out of randomly occurring events. For example, if two or three near miss incidents can be attributed to a failure of the Management of Change (MOC) system this does not necessarily mean that the MOC system is any more deficient than the other elements of the process safety program.
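The tendency is easy to demonstrate with a small simulation (the parameters here are invented for illustration): in a genuinely random sequence, runs and clusters appear far more often than intuition expects. A fair coin flipped 200 times almost always produces a run long enough that most observers would call it a pattern:

```python
import random

random.seed(7)  # fixed seed so the demonstration is repeatable

def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for a, b in zip(flips, flips[1:]):
        current = current + 1 if a == b else 1
        best = max(best, current)
    return best

flips = [random.choice("HT") for _ in range(200)]
# A run of five or more heads (or tails) is typical in 200 fair flips,
# yet such a streak is exactly what people point to as a "pattern"
streak = longest_run(flips)
```

The same reasoning applies to incident data: a short streak of MOC-related near misses may be nothing more than a run in a random sequence.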

Regression to the Mean

Related to the above discussion concerning the tendency to create non-existent order out of random events, people will also create causal relationships where there are none, particularly when a system is simply regressing to the mean.

For example, a facility may have suffered from a series of serious incidents. In response to this situation management implements a much more rigorous process safety management program than they had before. The number of incidents then drops. It is natural to explain the improvement as a consequence of the new PSM program. Yet, if the serious events were occurring randomly then it is likely that their frequency would have gone down anyway because systems generally have a tendency to revert to the mean.
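This effect can also be shown with a simulation. In the sketch below (the incident rate and the threshold for a ‘bad year’ are invented for illustration), incidents occur randomly at a constant underlying rate; the year following an unusually bad year averages close to the long-run mean even though nothing about the system changed:

```python
import math
import random

random.seed(1)
RATE = 4.0  # long-run average incidents per year (illustrative)

def poisson(lam):
    """Draw a Poisson-distributed count (Knuth's algorithm)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

years = [poisson(RATE) for _ in range(10000)]
# Years immediately following an unusually bad year (8 or more incidents)
after_bad = [years[i + 1] for i in range(len(years) - 1) if years[i] >= 8]
mean_after = sum(after_bad) / len(after_bad)  # close to 4, not to 8
```

Any program launched right after a bad year would appear to ‘work’, simply because the counts were heading back toward the mean anyway.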

Bias toward Positive Evidence / Prior Beliefs

People tend to seek out information that confirms their opinions, and they tend to overvalue information that confirms those opinions. It is particularly important to recognize this trait when conducting incident investigations. As discussed in Chapter 12, it is vital that the persons leading an investigation listen to what is being said without interjecting their own opinions or prejudices.

We also tend to expose ourselves to situations and people that confirm our existing beliefs. For example, most people will watch TV channels that reinforce their political opinions. This can lead to shock when it turns out in an election that those beliefs did not constitute a majority opinion.

Availability

People tend to notice items which are outstanding or different in some way. For example, someone entering their own house will not see all of the items of furniture, but she will notice that the television has been stolen or that a saucepan has boiled dry. Similarly anecdotes and emotional descriptions have a disproportionate impact on people’s perceptions (as illustrated in the discussion to do with the Titanic tragedy provided earlier in this chapter).

Goldacre notes that, as information about the dangers of cigarette smoking became more available, it was the oncologists and chest surgeons who were the first to quit smoking because they were the ones who saw the damage caused to human lungs by cigarettes.

Acceptable Risk

dice-4

The topic of “Acceptable Risk” is a difficult one because most process safety professionals aim for an environment of no incidents — they have trouble accepting the fact that some incidents are always going to occur, even though they know that risk can never be zero.

In an earlier post we discussed the concept of Perfect Safety. This post develops some of the ideas presented there. The material has been extracted from the book Process Risk and Reliability Management.

****************************** 

A fundamental aspect of understanding culture is to have a clear understanding as to what levels of risk are acceptable. Given that risk is basically subjective it is not possible to dispassionately define what level of risk is acceptable and what is not. After all, if a facility operates for long enough, it is certain – statistically speaking – that there will be an accident. Yet, given that real-world targets are needed for investing in PSM, a target for “acceptable safety” is needed. This is tricky. Regulatory agencies in particular will never place a numerical value on human life and suffering because any number that they develop would inevitably generate controversy. Yet working targets have to be provided, otherwise the facility personnel do not know what they are shooting for.

The difficulty with attempting to identify an acceptable level of risk is that, as discussed in the sections above, the amount of risk people are willing to accept depends on many, hard-to-pin down factors. Hence no external agency, whether it be a regulatory body, a professional society or the author of a book such as this can provide an objective value for risk. Yet individuals and organizations are constantly gauging the level of risk that they face in their personal and work lives, and then acting on their assessment of that risk. For example, at a personal level, an individual has to make a judgment as to whether it is safe or not to cross a busy road. In industrial facilities managers make risk-based decisions regarding issues such as whether to shut down an equipment item for maintenance or to keep it running for another week. Other risk-based decisions made by managers are whether or not an operator needs additional training, whether to install an additional safety shower in a hazardous area, and whether a full Hazard and Operability Analysis (HAZOP) is needed to review a proposed change. Engineering standards, and other professional documents, can provide guidance. But, at the end of the day, the manager has a risk-based decision to make. That decision implies that some estimate of ‘acceptable risk’ has been made.

One company provided the criteria shown in Table 1.8 for its design personnel.

Table 1.8
Example of Risk Thresholds

  Risk level               Fatalities per year (employees and contractors)
  Intolerable risk         > 5 × 10⁻⁴
  High risk                < 5 × 10⁻⁴ and > 1 × 10⁻⁶
  Broadly tolerable risk   < 1 × 10⁻⁶

Their instructions were that risk must never be in the ‘intolerable’ range. High risk scenarios are ‘tolerable’, but every effort must be made to reduce the risk level, i.e., to the ‘broadly tolerable’ level.
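Criteria such as those in Table 1.8 translate directly into a simple screening check. The sketch below uses that one company’s threshold values, which are examples rather than universal limits:

```python
def classify_risk(fatalities_per_year):
    """Screen an annual individual fatality risk against the
    example thresholds shown in Table 1.8 (illustrative values)."""
    if fatalities_per_year > 5e-4:
        return "intolerable"
    if fatalities_per_year > 1e-6:
        return "high: tolerable, but reduce toward broadly tolerable"
    return "broadly tolerable"
```

For example, a scenario estimated at 10⁻⁵ fatalities per year falls in the ‘high’ band and so would trigger further risk-reduction effort.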

The Third Law

The Third Law of Thermodynamics states that it is impossible for any system to reach a state of zero entropy — perfect order — in a finite number of operations. By analogy, no safety program can ever reach a perfectly ordered, incident-free state. This makes intuitive sense: no person is perfect, and no organization is perfect. No matter how much time, effort, money and goodwill we spend on improving safety, incidents will occur. Indeed, the data shown in Figure 1.15 suggest that safety trends offshore have reached an asymptote. The data, which were published by the United States Bureau of Safety and Environmental Enforcement (BSEE), show a steady improvement from the mid-1990s to the year 2008. But since then there seems to have been a leveling out. Whether this trend will continue remains to be seen, but it does suggest that some type of limit may have been reached.

Figure 1.15
Offshore Safety Trends (United States)


Looked at in this light, perfect safety can never happen. Nevertheless we should strive toward it because otherwise we accept that people will be injured — which is something that none of us want or accept, and we certainly do not want to quantify (although a goal of zero incidents over a specified time frame may be achievable).

Perfection as a Slogan

Although perfect safety may not be theoretically achievable, many companies will use slogans such as Accidents Big or Small, Avoid them All. The idea behind such slogans is that the organization should strive for perfect safety, even though it is technically not achievable.

Whether such slogans have a positive effect is debatable. Many people view them as being simplistic and not reflecting the real world of process safety. They seem to over-simplify a discipline that requires dedication, hard work, education, imagination and a substantial investment. For example, a large sign at the front gate of a facility showing the number of days since a lost-time injury is not likely to change the behavior of the workers at that facility. Indeed, it may encourage them to cover up events that really should have been reported. Or to be cynical about the reporting system.

As Low as Reasonably Practical – ALARP

Some risk analysts use the term ‘As Low as Reasonably Practical (ALARP)’ when setting a value for acceptable risk. The basic idea behind this concept is that risk should be reduced to a level that is as low as possible without requiring ‘excessive’ investment. Boundaries of risk that are ‘definitely acceptable’ or ‘definitely not acceptable’ are established, as shown in Figure 1.16, which is a family of FN curves. Between those boundaries, a balance between risk and benefit must be established. If a facility proposes to take a high level of risk, then the resulting benefit must be very high.

Figure 1.16
Risk Boundaries

Risk-Boundaries

Risk matrices (discussed below) can be used to set the boundaries of acceptable and unacceptable risk. The middle squares in such a matrix represent the risk levels that are marginally acceptable.
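A scenario’s position relative to the two boundaries can be checked with a criterion-line calculation of the kind used on FN diagrams. The boundary constants and slope below are invented for illustration; real criterion lines vary by regulator and company:

```python
def alarp_region(n_fatalities, frequency,
                 upper_c=1e-3, lower_c=1e-5, slope=1.0):
    """Locate a scenario on a simple F-N diagram.
    Each boundary is a criterion line of the form F = C / N**slope;
    the constants here are illustrative, not regulatory values."""
    upper = upper_c / n_fatalities ** slope  # 'definitely not acceptable' line
    lower = lower_c / n_fatalities ** slope  # 'definitely acceptable' line
    if frequency > upper:
        return "unacceptable"
    if frequency < lower:
        return "acceptable"
    return "in between: balance risk against benefit"
```

Points that land between the two lines fall in the region where the risk/benefit balance must be argued case by case.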

One panel has developed the following guidance for determining the meaning of the term ‘As Low as Reasonably Practical’.

  • Use of best available technology capable of being installed, operated and maintained in the work environment by the people prepared to work in that environment;
  • Use of the best operations and maintenance management systems relevant to safety;
  • Maintenance of the equipment and management systems to a high standard;
  • Exposure of employees to a low level of risk.

The fundamental difficulty with the concept of ALARP is that the term is inherently circular and self-referential. For example, the phrase ‘best available technology’ used in the list above can be defined as that level of technology which reduces risk to an acceptable level – in other words to the ALARP level. Terms such as ‘best operations’ and ‘high standard’ are equally question-begging.

Another difficulty with the use of ALARP is that the term is defined by those who will not be exposed to the risk, i.e., the managers, consultants and engineers who work safely in offices located a long way from the facility being analyzed. Were the workers at the site allowed to define ALARP, it is more than likely that they would come up with a much lower value.

Realistically, it has to be concluded that the term ‘ALARP’ does not provide much help to risk management professionals and facility managers in defining what levels of risk are acceptable. It may be for this reason that the United Kingdom HSE (Health and Safety Executive) chose in the year 2006 to reduce its emphasis on ALARP requirements in the Safety Case Regime for offshore facilities. Some major companies have also elected to move away from ALARP toward a continuous risk reduction model (Broadribb 2008).

De Minimis Risk

The notion of de minimis risk is similar to that of ALARP. A risk threshold is deemed to exist for all activities. Any activity whose risk falls below that threshold value can be ignored; no action needs to be taken to manage this de minimis risk. The term is borrowed from common law, where it is used in the expression of the doctrine de minimis non curat lex, or, ‘the law does not concern itself with trifles’. In other words, there is no need to worry about low risk situations. Once more, however, an inherent circularity becomes apparent: for a risk to be de minimis it must be ‘low’, but no prescriptive guidance as to the meaning of the word ‘low’ is provided.

Citations / ‘Case Law’

Citations from regulatory agencies provide some measure of acceptable risk. For example, if an agency fines a company say $50,000 following a fatal accident, then it could be argued that the agency has set $50,000 as the value of a human life. (Naturally, the agency’s authority over what level of fines to set is constrained by many legal, political and precedent boundaries outside its control, so the above line of reasoning provides only limited guidance at best.) Even if the magnitude of the penalties is ignored, an agency’s investigative and citation record serves to show which issues are of the greatest concern to it and to the community at large.

RAGAGEP

With regard to acceptable risk in the context of engineering design, a term that is sometimes used is ‘Recognized and Generally Accepted Good Engineering Practice’ (RAGAGEP). The term is described in Chapter 9 — Asset Integrity of Process Risk and Reliability Management.

Indexing Methods

Some companies and industries use indexing methods to evaluate acceptable risk. A facility receives positive and negative scores for design, environmental and operating factors. For example, a pipeline would receive positive points if it were in a remote location or if the fluid inside the pipe were neither toxic nor flammable (Muhlbauer 2003). It would be assigned negative points if the pipeline were corroded or if the operators had not had sufficient training. The overall score is then compared to a target value (the acceptable risk level) in order to determine whether the operation, in its current mode, is safe or not.

Although indexing systems are very useful, particularly for comparing alternatives, it has to be recognized that, as with ALARP, a fundamental circularity exists. Not only must an arbitrary target value be assigned, but the ranking system itself is built on judgment and experience; it is therefore basically subjective. The biggest benefit of such systems, as with so many other risk-ranking exercises, lies in comparing options. The focus is on relative risk, not on trying to determine absolute values for risk and for threshold values.
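A minimal sketch of such an index follows. The factor names and point values are invented for illustration and are not taken from Muhlbauer’s method:

```python
def pipeline_index(factors):
    """Sum positive (risk-reducing) and negative (risk-increasing) points
    for one pipeline segment. Higher totals indicate lower relative risk."""
    return sum(factors.values())

# Two hypothetical segments of the same pipeline
segment_a = {"remote location": 20, "non-flammable fluid": 15,
             "corrosion found": -25, "operator training gap": -10}
segment_b = {"remote location": 20, "non-flammable fluid": 15,
             "corrosion found": 0, "operator training gap": 0}
```

Used comparatively, segment_b outscores segment_a, so it is the lower-risk option; as the text notes, the absolute scores and any pass/fail target remain judgment calls.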

Employee Participation and Culture

Thinker-1

Rodin’s Thinker

The topic of “Safety Culture” is one that has received much attention in recent years. Yet “culture” is very similar to “Employee Participation” — a topic that has been part of the OSHA Process Safety standard for a generation and that is now being incorporated into BSEE’s SEMS rule. Therefore it is useful to review what has been learned about Employee Participation. The following material is extracted from the book Process Risk and Reliability Management.

Employee participation lies at the heart of any process safety management program. It is probably for this reason that OSHA placed the topic of Employee Participation, also known as Workforce Involvement, as the first of its fourteen Process Safety Management elements.

All employees (including contract workers) must be involved in the program. Although PSM programs are often conceived of primarily in terms of technical topics such as hazards analysis, risk quantification and fire and explosion modeling, the involvement of all employees at every level is fundamental to the success of such programs. When employees feel involved they are much more likely to make suggestions for improvement, participate in new initiatives and “walk the extra mile”. Moreover, the effective involvement of the workforce provides a sanity check for new ideas, projects and analyses. Anything new or unusual should be reviewed by the employees; they will immediately identify any common sense problems because they are the ones who know the facility best.

It is important to note that this element is called Employee Participation, not Employee Communication. The intent is that employees fully engage in the spirit of the process safety program. For example, a process hazards analysis (PHA) offers an opportunity for participation in two ways.

First, all employees should be encouraged to participate in the PHA meetings themselves. They should have a chance to contribute their knowledge, experience and ideas. Second, and maybe more important, carrying out PHAs creates a state of mind for all employees; they will start to look at everything they do in terms of its risk impact. The insights generated will then suggest ideas for reducing that risk. In other words, the purpose of a PHA is not just to identify hazards, but also to encourage a particular way of thinking among all employees. So, an operator working by himself at one o’clock in the morning may be about to open a valve on a line that connects two tanks. If, before doing so, he spends a few moments going through some of the PHA guidewords such as “reverse flow” or “contamination”, he may identify a possible accident situation, and decide not to open the valve until he has talked over the proposed action with his supervisor or colleagues. When the operator acts in this manner both the participation and the PHA elements of the process safety program are working perfectly. Employee participation is not a stand-alone activity; instead it should be woven into the fabric of all the elements of a risk management program.

Additional examples of workforce involvement occur when a pipefitter learns that a new chemical is about to be used in the process. He may question whether the current gaskets are safe in the new service. Or an outside contractor may feel that he or she has not been given sufficient instructions as to what to do and where to go in an emergency, and makes that concern known to the host company.

Although there are many benefits to do with participation, management has to recognize that, by asking employees to get involved in decision making, they are also asking those employees to take more risk with regard to their careers and reputations. It is much easier for an employee merely to follow orders — even if he or she knows that those orders are not sensible — than to take initiative. Moreover, increased employee participation may run into road blocks with unions and other organizations that represent those employees. Consequently, employees must feel that they are sufficiently rewarded for participating in management programs. One way of achieving this is to provide employees with long-term rewards if the company does well, for example by giving them stock rather than cash bonuses.

Developing Employee Participation

Management and the employees should develop a written plan showing how they intend to implement Workforce Involvement. An example of one of these is shown below.

  • The PSM program will involve all employees and contract workers, as appropriate to their job function and experience level.
  • The program will involve the full participation of “employee representatives” – where such duly elected representatives exist.
  • “Employees” includes not only full-time workers, but also temporary, part-time and contract workers.
  • Decisions as to which kinds or classes of employees should be consulted regarding specific PSM matters will take into account factors such as job function, experience, degree of involvement with PSM, and general background.

Safety Committees

Safety committees provide a formal channel through which management and the employees can communicate with regard to process safety issues and overall company culture. There are many references to the involvement of employee representatives in the OSHA standard. These would usually be on the safety committee. If the facility is non-union, it is essential that the employees’ representative is selected by the employees, not appointed by management. But it is important to ensure that the committee is not isolated; the effective implementation of this element requires that everyone participate in the process safety program.

Involvement in PSM Elements

Employees can participate in the PSM program by taking leadership of some of the elements of process safety. This type of involvement does not have to be universal; employees will be selected based on their understanding and knowledge of the topic in question. Nevertheless, it is a good idea to involve employees with lower levels of experience wherever possible in order to train them in the details of the process safety program.

Difficulties with Workforce Involvement

Although effective workforce involvement and employee participation bring many benefits, there are costs and drawbacks, as discussed below.

Inefficiencies

Increased participation of employees in the PSM program can lead to short-term inefficiencies brought about by spreading work among a large number of people, rather than assigning it to a small number of full-time specialists. For example, rotating operators through a Hazard and Operability Study means that the analysis will be slowed down because the newcomers will have to get up to speed on what has already been covered by the previous team members.

Another example of this type of problem (and opportunity) occurs when the operators are each asked to check the P&IDs for a small section of the plant. It would be much quicker to have one designer go out and do the whole job — but doing so would lose the important benefits that would be gained when the operators check their own unit line by line and valve by valve. Furthermore, the operators may be able to identify problems with the P&IDs because they know how “things really are”. Ultimately, the short-term inefficiencies consequent on using all the operators to perform such tasks will be more than compensated for by the gains in the overall knowledge and understanding of the operational integrity system.

Unwillingness to Accept Change

Implementation of workforce involvement can create anxiety — particularly among managers — because they are likely to hear facts about their organization that are critical of their efforts. Moreover, many workers prefer to work in a “command and control” management system because they can thereby avoid the responsibility for mistakes that are made and because thinking is such hard work.

Labor / Management Relations

It has to be recognized that the ideal workforce involvement situation depends heavily on good labor/management relations. If there is a good deal of strife and disagreement between the two parties, then, realistically, progress in this area is likely to be difficult. For this reason, it is important to set realistic goals, and not to over-commit as to how much progress can be made in this area.

OSHA Standard

The OSHA standard and guidance to do with Employee Participation are shown below.

(1)  Employers shall develop a written plan of action regarding the implementation of the Employee Participation required by this paragraph.

(2)  Employers shall consult with employees and their representatives on the conduct and development of process hazards analyses and on the development of the other elements of process safety management in this standard.

(3)  Employers shall provide to employees and their representatives access to process hazard analyses and to all other information required to be developed under this standard.

Guidance

Employers are to consult with their employees and their representatives regarding the employer’s efforts in the development and implementation of the process safety management program elements and hazard assessments. [Employers must] train and educate their employees and inform affected employees of the findings from incident investigations required by the process safety management program. Many employers, under their safety and health programs, have already established means and methods to keep employees and their representatives informed about relevant safety and health issues, and employers may be able to adapt these practices and procedures to meet their obligations under this standard. Employers who have not implemented an occupational safety and health program may wish to form a safety and health committee of employees and management representatives to help the employer meet the obligations specified by this standard. These committees can become a significant ally in helping the employer to implement and maintain an effective process safety management program for all employees.

Written Plan of Action

OSHA requires that the Employee Participation program be written down. This can be difficult to do well because Employee Participation is involved in so many areas of process safety and because participation represents a state of mind rather than a specific program.

The plan of action should identify who is responsible for the management of the PSM program, how employees can learn about it, and how suggestions for improvement can be implemented.

Consultation

As already discussed, employees must be involved in all aspects of PSM, not merely informed about decisions that have been made by other people. Their opinions matter, and should always be given genuine consideration. Even when an idea is rejected, management should always explain to the employee why that decision was made.

On union plants, the employee representatives will be appointed by the union. On non-union plants, the employees may choose someone to represent their interests. The appointment must be made by the employees, not management.

Access to Information

In addition to consulting with employees, it is important that management makes sure that employees know that they have a right to access to information to do with process safety. The fact that Process Hazards Analyses (PHAs) are specifically identified within this element has prompted many companies to make sure that operators participate in the PHAs, often on a rotating basis.

How to Read and Why


Harold Bloom

Recent posts at this blog have discussed the importance of written communications as part of the process safety profession. They include:

Much of the discussion in these posts has been to do with the importance of writing well — process safety professionals often have to write reports based on tasks and projects such as hazards analyses, incident investigations and prestartup reviews. These reports need to be clear, succinct and readable. Yet writing well is not sufficient. It is equally important that the reader of the report actually knows how to read. (It is often assumed that, if a written report fails to communicate its message, then the writer has a problem and needs to improve his or her technique. But another response to the difficulty is that maybe the reader needs to improve his or her reading skills.)

Now, in this context the word ‘read’ does not mean the ability to understand written statements such as “Reverse flow could cause corrosion of the impeller of Pump, P-101”. It means understanding the underlying causes of the problem and developing an understanding of management system failures; reading well will help identify hidden messages.

Harold Bloom

In the year 2000 Yale Professor Harold Bloom (b. 1930) published the book How to Read and Why. Although his book is directed to those reading classical literature, some of his thoughts and insights can be applied to the more banal activity of reading process safety reports.

His aphorism, “Clear your mind of cant”, is particularly important. Cant means “Monotonous talk filled with platitudes” or “Hypocritically pious language”. The safety business is prone to such platitudes and to pious language. For example, when discussing a major incident that has killed and injured many people it is common to use the word ‘tragedy’ when describing the event. And of course it is a tragedy – for those affected personally. But for people who were not involved in the event in any way, use of the word ‘tragedy’ often seems somewhat sanctimonious. It is probably best simply to use the word ‘incident’.

More broadly, Bloom’s advice can mean simply “Clear your mind”.

Francis Bacon (1561-1626) expressed the same concept when he said,


Francis Bacon

Read not to contradict and confute, not to believe and take for granted, not to find talk and discourse, but to weigh and consider.

What both of these writers are saying is that, when reading, we should refrain from being ‘prejudiced’ in the literal sense of the word: ‘pre + judge’. We should open our minds, as best we are able, to understanding what the writer is really saying, not to what we think he or she is saying.

This is difficult. As Oscar Wilde (1854–1900) said, “A truth ceases to be a truth as soon as two people perceive it.” In other words, facts are never truly objective; each person has his or her own perception of the same reality. His insight also suggests that there is no such entity as ‘common sense’ — no two people have a common view of the world, so they can never share a ‘common sense’.

In addition to his condemnation of cant, Bloom suggests that an understanding of irony is also part of deep reading. But this insight does not apply to the process safety world. All reports should be written ‘straight’, with no use of word play.

Example

Effective reading in the process safety world is analogous to incident analysis and attempting to identify root causes.

For example, a report to do with a Prestartup Safety Review may state, “The start-up of the modified system could not proceed because the safety-critical pressure gauge downstream of Pump, P-101A had not been installed”.

The plant manager on reading this may react in a ‘prejudiced’ manner by stating that he always knew that the company that makes that type of gauge was not to be trusted. But a deeper reading of the report may proceed as follows.

  • The safety-critical pressure gauge was not installed.
    Why not?
  • The gauge had been delivered on time but it had been put in the wrong location in the facility warehouse.
    Why?
  • The warehouse manager was on vacation and her substitute did not understand the parts database system.
    Why not?
  • No one in the warehouse had ever received formal training.
    Why not?
  • Because the process safety training program is directed just to line operators and maintenance personnel.

A deep reading of just one sentence has led to useful process safety insights.
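The why-chain above can also be captured in a simple structured form, which some teams find useful for recording root-cause analyses so that each finding and the deeper finding beneath it are kept together. A minimal sketch in Python (the findings are paraphrased from the example above; the function name is illustrative, not from any standard):

```python
# Minimal "why chain" record: each entry is one finding; each subsequent
# entry answers "why?" for the entry before it.
why_chain = [
    "The safety-critical pressure gauge was not installed.",
    "The gauge was delivered on time but put in the wrong warehouse location.",
    "The warehouse manager's substitute did not understand the parts database.",
    "No one in the warehouse had ever received formal training.",
    "The process safety training program covers only operators and maintenance.",
]

def render_five_whys(findings):
    """Return the chain as indented text, one 'why' level per line."""
    lines = []
    for depth, finding in enumerate(findings):
        # Indent each level to show that it explains the level above it.
        lines.append("  " * depth + f"{depth + 1}. {finding}")
    return "\n".join(lines)

print(render_five_whys(why_chain))
```

A record like this makes it easy to see at a glance how far below the surface symptom the analysis actually went.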

Conclusion

Many companies encourage their employees to attend courses on improving their “Communication Skills”. Such courses tend to focus on how to write clearly and economically. Such training can be invaluable, but its effect would be greatly enhanced if process safety professionals and their managers were also trained in deep and thoughtful reading.

The Risk Management Consultant


The material in this post is extracted from Chapter 20 of the book Process Risk and Reliability Management.

Last week’s post — The Risk Management Professional — discussed some of the attributes that help make a successful risk management/process safety professional. This week we take a look at a related topic: the attributes of an effective process risk management consultant.

When it comes to consulting, the most important fact that consultants need to remember is that they are not wanted. The only reason for hiring a consultant is to solve a problem — a problem that the client management wishes would just go away. The presence of the consultant is a constant, nagging reminder that time and money are being spent on solving the problem. Therefore, even if the client and consultant get on well personally, their relationship will always be tense; the best thing that the consultant can do is solve the problem and then go away.

Companies hire consultants to help them with their risk management programs for the following reasons.

  • Some of the elements of the program may be new to a company; in such cases a consultant can help them get started. For example, in the late 1980s and early 1990s Process Hazard Analyses (PHAs) were a new technique in most facilities. Hence a small consulting and software industry developed to conduct PHAs and to train clients in their use and application. Now that PHAs are part of the furniture for most companies the need for this particular consulting service is not so great (although many of the same people continue to assist with the implementation of the PHAs — but as such they are serving as contract workers, not consultants).
  • A company may be struggling with the logistics of its risk management program. Costs may be out of hand and/or the program may be way behind schedule. A consultant can work with the management team to bring the project back on track.
  • Consultants often make good auditors. Their expert knowledge of the principles of risk management and of process safety regulations provides a solid foundation for their findings. And consultants are particularly well qualified to conduct assessments of a facility’s risk management program.
  • A consultant can provide fresh ideas as to how to perform well-understood tasks. For example, in Chapter 5 it was pointed out that there is a wide variety of process hazards analysis techniques that can be used. If a company has become stuck with one method, say the HAZOP technique, a consultant can help them evaluate and use other methods, such as What-If or FMEA.
  • A company may require detailed help concerning the interpretation of a regulation or ruling. A consultant can provide benchmarks from other companies. Indeed, one of the most common questions that consultants have to answer is, “How do other people do it?”, where the word ‘it’ refers to an activity that they themselves are having trouble addressing.

True Expertise

Consultants must be true experts. Many people know “quite a lot” about a topic, but that does not make them true experts. In the example quoted above concerning PHAs, by the early 1990s many engineers and other technical specialists had become very familiar with the process of leading hazard analyses. This did not, however, qualify them to become PHA consultants. Their experience merely qualified them to lead hazards analyses, not to design, implement and run PHA systems.

The Consultant as Outsider

The consultant should be an outsider. This is important because he or she may be called upon to present unpalatable truths to management. In many situations the cause of a problem such as a deteriorating safety record is understood by the people at the working level. However, no one within the organization feels that they can present “the truth” to management for fear of retribution. (This is not always a management problem, however.  The consultant may find that management is quite flexible, and willing to adopt new techniques.  The resistance may come from supervisors and working-level people who have become entrenched in the current mode of operating.)

A consultant may be able to present bad news more effectively than an employee for three reasons. First, the worst that the client company can do is to terminate the consultant’s contract. Since the consultant usually has other assignments, this loss of work is not as critical as it would be to a full-time employee. Second, outsiders are often perceived as being more credible than insiders, even though they present exactly the same facts. (This is why consulting companies themselves sometimes have to hire consultants to tell them “the truth”. It is also the rationale behind the quotation, “An expert is someone who is more than fifty miles away.”) The third advantage of using an outsider to present bad news is that management is not quite sure where to “place” the consultant. Consultants are often perceived as being “above” line employees, particularly if it is suspected that they have the ear of senior management. Therefore, comments from consultants are often treated with a good deal of respect and consideration.

The importance of being an outsider raises a concern about the use of “internal consultants” — a phrase which some might regard as being an oxymoron. If the consultant and the client work for the same organization, sooner or later their chains of command will meet. Hence, neither is truly independent from the other. Furthermore, as their respective careers progress, it is possible that they will find themselves working for or with one another. This knowledge is likely to cloud the objectivity of the client-consultant relationship.

The consultant should also be an outsider because it is his or her knowledge of “how other people do it” that can be so valuable to an organization that has become trapped in its own systems and ways of thinking.

Ironically, one of the problems that consultants can run into is that they themselves can become stuck in their own rut; they may have trouble accepting that other people’s ideas may be as good as or even better than theirs.  Therefore, it is important to make sure that the consultant is truly up to date, and that he or she is constantly evaluating and testing their own ideas, and abandoning those that are out of date. This being the case, one question that the client company may want to ask a consultant before hiring him or her is, “Which of your opinions and ideas have you changed during the last few years?”

Consultants — Not Contractors

An appropriate analogy can be made here with respect to education and training, as discussed in Chapter 7. Someone who is educated in a topic understands its fundamental principles whereas someone who is merely trained in that topic knows “how to do it.” So it is with consultants and practitioners. Consultants provide insights to do with fundamental principles; practitioners, on the other hand, simply know what to do.

Consultants provide advice — they do not put that advice into practice. A consultant looks at organizational issues and advises management on how to address them. This is why the end product of most consulting contracts is a report and a presentation to management. If the consultant is asked to implement some of the recommendations contained in the report, he or she switches roles from adviser to doer.

Good consultants work by generalizing from the specific and then drawing specific conclusions from their generalizations. They go into a situation and investigate the facts of the current situation. From these facts they come up with a general analysis from which they develop specific recommendations. This ability to form general conclusions is also an important attribute of an incident investigator.

Consultants must possess good client-relations skills. They have to be aware not only of technical issues, but also of the internal company dynamics and politics. Process safety consultants frequently have a technical background — many of them are chemical engineers — and therefore tend to perceive the world as being rational and objective. They may fail to grasp that their clients, like all customers, base many of their decisions on a combination of both emotion and fact.

The distinction between “doing” and “consulting” can be frustrating for many consultants. Many of them have had a career in industry, often at quite senior levels.  They are used to taking charge and having their ideas put into practice. Hence, the need to persuade rather than command can be a challenge for such consultants, particularly when the client chooses to ignore the consultant’s recommendations.

A facility may choose to use contract help with many of its risk management activities, particularly those that are labor-intensive, such as writing operating procedures. Using consultants or contract workers in this manner moves away from the principles of employee participation and involvement.

Cuts Gordian Knots

In the 4th century B.C. the king of Gordium (Gordias in most tellings, his son Midas in others), in what is now the nation of Turkey, tied an ox-cart to a post with an intricate knot. It was prophesied that whoever could undo the knot would become ruler of all Asia. In 333 B.C. Alexander the Great attempted to untie the knot. He could not find an end to the rope, so he simply cut through the knot with his sword. He went on to conquer most of the known world, including Persia.

The story symbolizes the resolution of an intractable problem with a swift, unconventional stroke. Good consultants have the ability to cut the Gordian knots that clients have created for themselves.

Quick Study

Although a consultant may be expert in many areas of business or technology, he or she will never possess detailed technical knowledge of every task faced. For example, each new assignment may require working with a new type of chemical process technology. This means that an effective consultant is a quick study, i.e., he or she must be able to enter a situation, learn it sufficiently well to understand the management issues involved, and then make sensible recommendations. This is analogous to what a trial lawyer does. He will learn the details of a case very rapidly, organize the case that is to be presented to the court, make the presentation, and then almost immediately forget the details as he moves on to the next case.

Role of the Client

The client must realize that the success of the consultant’s work will depend largely on the attitude and degree of cooperation provided by the facility employees. In particular, client personnel must try to be open-minded and objective. The consultant has been hired because he or she represents an outside point of view. Hence the findings are likely to upset some people on the client side because old and comfortable ways of doing business will be challenged. The client should try to understand that there may be new and better ways of operating; in particular, everyone should try to avoid using the phrase, “we’ve always done it that way and it’s never been a problem” (with the implication that it never will be a problem).

Response to Criticism

Consultants must have thick skins. It is almost certain that their ideas and recommendations will be critiqued and criticized. Oftentimes, the people doing the criticizing will be considerably less qualified than the consultant. Also they will have spent less time studying the problem being analyzed, and will probably have motives and agendas of their own. In these situations, the consultant must work as hard as possible to communicate the findings of the analysis to all concerned, but he or she must also recognize that the client is paying the bills, and ultimately makes the final decisions. The consultant is an advisor, not a decision-maker.

Marketing

Consultants must market their services. At the same time they must maintain a professional profile. For most consultants, marketing will be based on a web page that provides information on services offered. This will be supplemented by direct mail and carefully managed email campaigns (which are best done through a service that provides full opt-out capabilities).

Social media also provide an opportunity for professional marketing. By writing articles and blogs for LinkedIn and other similar sites, the professional gains exposure (and also develops his or her own ideas).

Maintaining a professional and independent profile is particularly important for consultants who serve as expert witnesses (a topic that is discussed below). They have to avoid the perception of being a “professional expert” — a hired gun.

The Risk Management Professional

A successful risk management professional needs to have personal attributes that match his or her technical knowledge and skills. Some of these attributes are discussed below. Of course, no single person can possess all of them, but the list does provide an outline of goals to aim for.

Education and Certification

Most risk management professionals have a technical education — often in engineering or environmental science. Such an education provides the necessary skills to handle the technical and quantitative aspects of the work, particularly with regard to the analysis of risk, fires and explosions, and gas dispersion.

Technical Knowledge

The risk management professional should have a thorough understanding of the many technical topics that the discipline covers. Obviously, no one person can be an expert in all of the technical areas that make up risk analysis, but he or she should possess enough knowledge of them to develop the correct parameters for risk analyses and to understand the findings and reports that the experts provide.

Holistic

A person who thinks and works holistically is not limited to a single, narrow specialty; instead he or she can understand management, technical and human systems, and how they interact with one another. A risk management professional understands how his or her profession is composed of a wide range of disparate topics such as human factors engineering, Boolean algebra, government regulations, starting up a process plant and the design of instrument systems.

If a risk management professional is to be effective at integrating different types of knowledge, he or she must possess a good grasp of those topics. This does not mean that the professional has to be an expert in everything — such a goal is obviously unrealistic — but it does mean that he or she needs to have a working knowledge of multifarious topics, and to have a comprehension as to how they fit together. The phrase, “jack of all trades, but master of none”, is usually considered pejorative. However, with regard to the risk management professional, it is a sensible job description.

Numerate

As has been stressed throughout this book, risk has both objective and subjective elements. The objective part of the work means that those working in the area of risk management need to be numerate; they need to be comfortable with a variety of quantitative topics such as gas dispersion modeling, the development of F-N curves and the use of Boolean algebra.
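As a small illustration of the numerate side of the work: an F-N curve plots the cumulative frequency F of events that cause N or more fatalities. The sketch below builds the curve points from a list of scenario outcomes; the scenario data are invented purely for illustration and do not come from any real analysis:

```python
# Sketch: build F-N curve points from (fatalities, frequency per year)
# scenario pairs. F(N) is the total frequency of events causing N or
# more fatalities. The scenario data below are illustrative only.
scenarios = [(1, 1e-3), (3, 4e-4), (10, 5e-5), (50, 1e-6)]

def fn_curve(events):
    """Return sorted (N, F) pairs, where F sums frequencies with n >= N."""
    ns = sorted({n for n, _ in events})
    return [(N, sum(f for n, f in events if n >= N)) for N in ns]

for N, F in fn_curve(scenarios):
    print(f"N >= {N:3d}: F = {F:.1e} per year")
```

Because F is cumulative, the curve is necessarily non-increasing in N; plotting the pairs on log-log axes gives the familiar F-N diagram used to compare against societal risk criteria.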

Communication Skills

Risk management professionals spend much of their time communicating with others in a variety of ways such as writing reports, listening to client needs, delivering presentations and listening to anecdotes. Hence the risk management professional must be a good speaker, writer, listener and reader. Discussion of these topics is provided later in this chapter.

Industrial Experience

There is really no substitute for industrial experience. It is one thing to learn about a topic from books such as this, and by reviewing incidents that have occurred elsewhere, but it is quite another to actually learn from the school of hard knocks. Industrial experience includes not only a hands-on knowledge of industrial processes and equipment, but also an understanding of the realities of client/consultant relationships, the resistance that managers have toward spending money on safety, problems at the management/union interface and how government agencies actually enforce regulations.

Knowledge of Past Events


Watson and Holmes

The risk management professional should know about incidents and events (both good and bad) that have occurred in other companies and locations. He or she can use these events to understand and identify patterns in current operations.

The importance of understanding the past is illustrated by (the fictional) Dr. Watson’s ruminations as to what his new friend Sherlock Holmes does for a living, not long after they first meet. Watson summarizes Holmes’ attributes. The list includes the following statement:

[Knowledge of] Sensational Literature — Immense. He appears to know every detail of every horror perpetrated in the century.

So it is for the risk management professional; he or she should possess an “immense knowledge” of incidents that have occurred and what lessons can be drawn from them. An overview of some major incidents in the process industries is provided in Chapter 1.

In this context it is interesting to note that the recently released proposed update to the OSHA PSM standard (Chapter 2) relies heavily on actual incidents. Almost all of the proposed changes are justified by showing how such changes could have helped prevent the cited incident.

Professional Involvement

Risk management professionals should be involved in their community. This is usually done by working with professional societies or independent trade organizations — often by helping with the organization of meetings, editing papers and articles, and writing technical standards. Reasons for being involved include the following:

  • It is a way for experienced professionals to give back to their community and to help young people who are entering the field.
  • Development of personal reputation and contacts within the community that could lead to more interesting and rewarding work and assignments.
  • Enhancement of the reputation of the company or organization for which the professional works.
  • The writing of articles and papers requires the author to carry out thorough research on the topic about which he or she is writing.
  • Helping others to prepare and publish their work increases the knowledge and skills of all parties.

Network

A well-known proverb states, “It’s not what you know, it’s who you know”. This proverb is only half correct — technical knowledge and personal skills are vital to any professional. Yet it is important to maintain a network of qualified contacts. In particular, when an expert has to address a challenging problem it is useful to have someone to call who can help out as a friend and colleague.

The Resumé / CV

The expert’s knowledge, skills and attributes are summarized in his or her resumé or curriculum vitae (CV).

It is critical that the resumé be accurate and verifiable, especially with regard to statements such as the possession of advanced degrees or major work experience. Accuracy of the resumé is particularly important when the risk management professional is involved in litigation. He or she must expect to have his or her qualifications challenged because, if the resumé can be discredited, then the expert’s statements can be discredited also.

Many professionals fail to keep their resumés up to date. It is a good idea to check it and modify as needed every three months or so, particularly when new types of work or project are being carried out.

a) Level of Detail

An expert’s resumé can become very lengthy because he or she is likely to have years of experience in a wide range of tasks and projects. Such length has its drawbacks — it can make the resumé difficult to read and lacking in focus. For this reason it is often a good idea to have a short (say half-page) summary at the start of the resumé, supplemented by an attachment that provides the detailed information.

b) Publications

An expert’s resumé is greatly enhanced if he or she has published professional papers, articles and books. Books, in particular, can make a very strong impact — the risk management professional can say, “I wrote the book on that. Here it is!”

Involvement with professional societies, as discussed in the previous section, also looks very good on the resumé.

c) Gaps / Negative Facts

After many years of work experience, no one will have a perfect work record. Everyone’s career hits the occasional bump in the road. In particular, there will often be gaps in the work record for times when the professional was unemployed or was trying to land new contracts. These gaps can be filled with information about background work, such as the preparation of seminars or professional papers, or with time spent on continuing education.

d) Multiple Resumés

Some risk management professionals have multiple resumés, with each version emphasizing particular qualities. For example, one version may stress, say, design experience, whereas another may place a greater emphasis on field operational work.

Although this practice may help in specific situations, it is generally best not to have more than one resumé. This is particularly true with respect to litigation work, because an opposing attorney may use the differing versions to “demonstrate” that the witness is not to be trusted, particularly if the professional appears to have a “plaintiff resumé” and a “defendant resumé”.

e) Declining Experience

One of the traps that experts can fall into is that, if they fail to keep up with the latest knowledge and practice in their field, they may not really be qualified to help a client in an area that is shown on their resumé. The expert may fail to recognize that his or her knowledge and judgment are out of date.

A related problem is that some process risk experts may have worked for just one company for the duration of their careers. On retirement they seek to become consultants with other companies, but find that their deep, but narrow, experience can be quite limiting.

********************

Monthly PSM Report

Sutton Technical Books

If you would like to receive a copy of our monthly letter “The PSM Report” please register at our Sutton Technical Books site. We use Constant Contact software to ensure privacy and to control spam. A sample letter is available here.

Thank you.